seldonian.optimizers.gradient_descent

Functions

gradient_descent_adam(primary_objective, ...)

Implements KKT optimization, i.e., simultaneous gradient descent (on theta) and ascent (on lambda) using the Adam optimizer on the Lagrangian: L(theta, lambda) = f(theta) + lambda^T g(theta), where f is the primary objective, lambda is a vector of Lagrange multipliers, and g is a vector of the upper bound functions.
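The following is a minimal, self-contained sketch of this technique, not the toolkit's gradient_descent_adam implementation: it runs simultaneous Adam descent/ascent on a toy Lagrangian with a single hypothetical constraint g, using autograd for gradients. All function names and hyperparameters below are illustrative assumptions.

    # Sketch: simultaneous descent/ascent on L(theta, lambda) = f(theta) + lambda * g(theta)
    # with Adam steps for both players. Toy objective and constraint; not the toolkit's code.
    import autograd.numpy as np
    from autograd import grad

    def f(theta):
        # Toy primary objective: a simple quadratic
        return np.sum((theta - 2.0) ** 2)

    def g(theta):
        # Toy upper-bound function; the constraint is satisfied when g(theta) <= 0
        return np.sum(theta) - 1.0

    grad_f = grad(f)
    grad_g = grad(g)

    def adam_update(grad_val, m, v, t, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
        # One Adam step; returns the parameter increment and the updated moments
        m = b1 * m + (1 - b1) * grad_val
        v = b2 * v + (1 - b2) * grad_val ** 2
        m_hat = m / (1 - b1 ** t)
        v_hat = v / (1 - b2 ** t)
        return lr * m_hat / (np.sqrt(v_hat) + eps), m, v

    theta = np.zeros(2)
    lam = 0.0                        # single Lagrange multiplier for the single constraint
    m_t, v_t = np.zeros(2), np.zeros(2)
    m_l, v_l = 0.0, 0.0

    for t in range(1, 501):
        # Gradient of the Lagrangian w.r.t. theta: descend
        dL_dtheta = grad_f(theta) + lam * grad_g(theta)
        step, m_t, v_t = adam_update(dL_dtheta, m_t, v_t, t)
        theta = theta - step

        # Gradient of the Lagrangian w.r.t. lambda is g(theta): ascend,
        # then project lambda back onto the non-negative orthant
        dL_dlam = g(theta)
        step_l, m_l, v_l = adam_update(dL_dlam, m_l, v_l, t)
        lam = max(0.0, lam + step_l)

    print("theta:", theta, "lambda:", lam, "g(theta):", g(theta))

With this toy choice of f and g, the iterates approach the constrained optimum near theta = (0.5, 0.5), where the constraint sum(theta) <= 1 is active.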

setup_gradients(gradient_library, ...)

Wrapper to obtain the gradient functions of the primary objective and the upper bound functions, given a gradient library.
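Below is a hedged sketch of what such a wrapper might do, assuming autograd is the gradient library; the function name, signature, and return convention here are hypothetical and do not reflect the toolkit's setup_gradients API.

    # Sketch: return gradient functions for the primary objective and an
    # upper-bound function, given the name of a gradient library.
    import autograd.numpy as np
    from autograd import grad

    def setup_gradients_sketch(gradient_library, primary_objective, upper_bound_function):
        # Hypothetical wrapper: only autograd is handled in this illustration
        if gradient_library == "autograd":
            return grad(primary_objective), grad(upper_bound_function)
        raise NotImplementedError(f"Unsupported gradient library: {gradient_library}")

    # Example usage with toy functions
    primary = lambda theta: np.mean(theta ** 2)
    upper_bound = lambda theta: np.sum(theta) - 1.0

    grad_primary, grad_upper = setup_gradients_sketch("autograd", primary, upper_bound)
    print(grad_primary(np.array([1.0, 2.0])))   # gradient of the primary objective
    print(grad_upper(np.array([1.0, 2.0])))     # gradient of the upper-bound function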