garage.tf.optimizers.first_order_optimizer module
class FirstOrderOptimizer(tf_optimizer_cls=None, tf_optimizer_args=None, max_epochs=1000, tolerance=1e-06, batch_size=32, callback=None, verbose=False, name='FirstOrderOptimizer', **kwargs)

    Bases: object

    Performs (stochastic) gradient descent, optionally using adaptive first-order methods such as Adam.
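To make "gradient descent using methods such as Adam" concrete, here is a minimal, self-contained sketch of the Adam update rule in plain Python. This is illustrative only and is not garage's implementation; the function name and flat-list parameter layout are assumptions for the example.

```python
import math

def adam_step(params, grads, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update over flat parameter lists (illustrative, not garage's API)."""
    new_params = []
    for i, (p, g) in enumerate(zip(params, grads)):
        m[i] = b1 * m[i] + (1 - b1) * g          # first-moment (mean) estimate
        v[i] = b2 * v[i] + (1 - b2) * g * g      # second-moment (variance) estimate
        m_hat = m[i] / (1 - b1 ** t)             # bias-corrected moments
        v_hat = v[i] / (1 - b2 ** t)
        new_params.append(p - lr * m_hat / (math.sqrt(v_hat) + eps))
    return new_params

# Minimize f(x) = x^2 starting from x = 1.0
params, m, v = [1.0], [0.0], [0.0]
for t in range(1, 2001):
    grads = [2 * params[0]]                      # df/dx = 2x
    params = adam_step(params, grads, m, v, t, lr=0.01)
```

In the real class, the TensorFlow optimizer doing this work is selected via `tf_optimizer_cls` and configured via `tf_optimizer_args`.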
update_opt(loss, target, inputs, extra_inputs=None, **kwargs)

    Parameters:
        - loss – Symbolic expression for the loss function.
        - target – A parameterized object to optimize over. It should implement methods of the garage.core.parameterized.Parameterized class.
        - inputs – A list of symbolic variables as inputs.
        - leq_constraint – A constraint provided as a tuple (f, epsilon), of the form f(*inputs) <= epsilon.

    Returns:
        No return value.