# Addons - Optimizers

## Components

https://www.tensorflow.org/addons/api_docs/python/tfa/optimizers

## Contribution Guidelines

### Standard API

In order to conform with the current API standard, all optimizers must do the following (a minimal sketch is shown after this list):

* Inherit from either `keras.optimizer_v2.OptimizerV2` or one of its subclasses.
* Register as a Keras global object so it can be serialized properly: `@tf.keras.utils.register_keras_serializable(package='Addons')`.

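For illustration, here is a minimal sketch of an optimizer that satisfies both points. `ExampleSGD` is hypothetical and not part of Addons; the sketch assumes a TensorFlow version in which `tf.keras.optimizers.Optimizer` resolves to `OptimizerV2` (on newer releases the same base class is exposed as `tf.keras.optimizers.legacy.Optimizer`).

```python
import tensorflow as tf


# Hypothetical example, not part of Addons: a bare-bones SGD-like optimizer
# that follows the two requirements above.
@tf.keras.utils.register_keras_serializable(package="Addons")
class ExampleSGD(tf.keras.optimizers.Optimizer):
    """Plain gradient descent: var <- var - learning_rate * grad."""

    def __init__(self, learning_rate=0.01, name="ExampleSGD", **kwargs):
        super().__init__(name, **kwargs)
        # _set_hyper makes the hyperparameter work with learning-rate
        # schedules and with the serialization machinery of OptimizerV2.
        self._set_hyper("learning_rate", kwargs.get("lr", learning_rate))

    def _resource_apply_dense(self, grad, var, apply_state=None):
        lr = tf.cast(self._get_hyper("learning_rate"), var.dtype)
        return var.assign_sub(lr * grad)

    def _resource_apply_sparse(self, grad, var, indices, apply_state=None):
        lr = tf.cast(self._get_hyper("learning_rate"), var.dtype)
        return self._resource_scatter_add(var, indices, -lr * grad)

    def get_config(self):
        config = super().get_config()
        config.update(
            {"learning_rate": self._serialize_hyperparameter("learning_rate")}
        )
        return config
```

Because the class is registered under the `Addons` package, `tf.keras.models.load_model` and `tf.keras.optimizers.deserialize` can reconstruct it from a saved config without a `custom_objects` argument.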
### Testing Requirements

* To run your `tf.function`s in both eager mode and graph mode in the tests, you can use the `@pytest.mark.usefixtures("maybe_run_functions_eagerly")` decorator. This will run the tests twice: once normally, and once with `tf.config.run_functions_eagerly(True)`. A hypothetical test using the fixture is sketched below.

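The following sketch shows how a test might use the decorator. It assumes the `maybe_run_functions_eagerly` fixture is made available through the repository's `conftest.py`, and it uses `LazyAdam` purely as an example optimizer; the test name and assertion are illustrative only.

```python
import numpy as np
import pytest
import tensorflow as tf

from tensorflow_addons.optimizers import LazyAdam


# The fixture is provided by the repository's test configuration, so the
# test module itself only needs the decorator.
@pytest.mark.usefixtures("maybe_run_functions_eagerly")
def test_single_minimize_step():
    var = tf.Variable([1.0, 2.0])
    opt = LazyAdam(learning_rate=0.1)

    # One optimizer step on loss = sum(var ** 2) should move the variable
    # strictly toward the minimum at the origin.
    opt.minimize(lambda: tf.reduce_sum(var ** 2), var_list=[var])

    np.testing.assert_array_less(np.abs(var.numpy()), [1.0, 2.0])
```

Because the fixture toggles `tf.config.run_functions_eagerly(True)` on the second run, the same test exercises both the traced graph path and the eager path of any `tf.function` the optimizer defines.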
### Documentation Requirements