Commit Graph

19 Commits

Author SHA1 Message Date
who who who e458390678
fix wrong hints (#2817) 2023-02-28 20:33:56 -08:00
Chen Qian 1f14395415
Make compatible with optimizer migration in TF 2.11 (#2766)
* Make compatible with optimizer migration in TF 2.11

* Fix comments

* add adamw to the change

* Add type exception
2022-10-11 22:37:29 +02:00
Chen Qian 339159fc36
Point optimizer to tf.keras.optimizers.legacy.Optimizer to be compatib… (#2706)
* Point optimizer to tf.keras.optimizers.legacy.Optimizer to be compatible with Keras optimizer migration

* small fix

* add version control

* small fix

* Update discriminative_layer_training.py

* fix version control

* small fix

* move optimizer class to __init__.py

* small fix

* fix problems

* small fix

* Rename BaseOptimizer to KerasLegacyOptimizer

* exclude keras optimizer from type check

* fix import
2022-06-20 07:46:55 +08:00
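The version gating described in the two commits above can be sketched in plain Python. The helper name and the 2.11 cutoff below are assumptions drawn from the commit messages (#2706, #2766), not the actual tensorflow_addons source:

```python
# Illustrative sketch of version-gated optimizer aliasing. The function
# name, the dotted paths, and the 2.11 cutoff are assumptions based on
# the commit messages above, not the real TF Addons implementation.

def pick_base_optimizer_path(tf_version: str) -> str:
    """Return the dotted path of the base optimizer class that a
    KerasLegacyOptimizer alias should point to for a given TF version."""
    major, minor = (int(part) for part in tf_version.split(".")[:2])
    if (major, minor) >= (2, 11):
        # From TF 2.11 the default tf.keras.optimizers.Optimizer is the
        # new experimental class, so wrappers built on the old API must
        # target the legacy namespace explicitly.
        return "tf.keras.optimizers.legacy.Optimizer"
    return "tf.keras.optimizers.Optimizer"
```

A wrapper class can then subclass whatever class this path resolves to, so the same Addons code loads on both sides of the Keras optimizer migration.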
Ran Chen 09befb1ae1
Calculate decay once per step instead of once per variable #moving_average (#2342)
Calling _get_decay once per variable can greatly slow down kernel
launches because it contains control flow. By putting decay into apply state
we only need to calculate it once per step. The same optimization
technique is applied in the Keras optimizer for the learning rate scheduler.
2021-01-12 20:53:58 -08:00
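The per-step caching described above can be sketched without TensorFlow: compute the decay once in a per-step "apply state" and reuse it in every variable update. Class and method names here are illustrative stand-ins, not the Addons internals:

```python
# Hedged sketch of the optimization in #2342: decay is computed once per
# step (in a prepare/apply-state phase) rather than once per variable.
# All names are illustrative; the real optimizer works on TF tensors.

class MovingAverageSketch:
    def __init__(self, average_decay=0.99):
        self.average_decay = average_decay
        self.decay_calls = 0  # counts how often decay is actually computed

    def _get_decay(self, step):
        # In the real optimizer this involves control flow (e.g. a
        # warm-up ramp), which is what made per-variable calls costly.
        self.decay_calls += 1
        return min(self.average_decay, (1.0 + step) / (10.0 + step))

    def _prepare(self, step):
        # Computed once per step and shared by every variable update,
        # analogous to stashing the value in apply_state.
        return {"decay": self._get_decay(step)}

    def apply_step(self, variables, averages, step):
        state = self._prepare(step)
        decay = state["decay"]
        return [a - (1.0 - decay) * (a - v) for v, a in zip(variables, averages)]
```

However many variables the step touches, `_get_decay` runs exactly once.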
Tzu-Wei Sung a21a32a1c6
Warn users when unable to assign average slot (#2261)
* Warn users when unable to assign average slot to variables in var_list
2020-12-07 19:21:56 -08:00
Sean Morgan 1e732fc8dc
Remove deprecated sequential update (#2249) 2020-11-22 17:53:31 -08:00
bhack 2bf57f8383
Base optimizer tracking (#2126)
* Update lookahead.py

Initial fix of
https://github.com/tensorflow/addons/issues/2094
https://github.com/tensorflow/addons/pull/2102

* Fix linting

* Resolve name conflict with mixed precision

* Track baseline optimizer in avg
2020-09-01 10:48:02 -07:00
who who who 08741c997b
Add experimental_aggregate_gradients support (#2137)
* Add experimental_aggregate_gradients support

* format code

* fix test case
2020-08-30 19:13:20 -07:00
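Adding `experimental_aggregate_gradients` support amounts to forwarding the flag from the wrapper to the wrapped optimizer. A minimal stand-in sketch (the class names here are hypothetical, not the Addons API):

```python
# Hedged sketch of #2137: an averaging wrapper forwards the
# `experimental_aggregate_gradients` keyword to its inner optimizer so
# cross-replica gradient aggregation behaves the same as when the inner
# optimizer is used alone. Both classes below are illustrative stubs.

class InnerOptimizer:
    def apply_gradients(self, grads_and_vars, *, experimental_aggregate_gradients=True):
        # Record the flag so the forwarding behaviour is observable.
        self.last_flag = experimental_aggregate_gradients


class AverageWrapperSketch:
    def __init__(self, inner):
        self._optimizer = inner

    def apply_gradients(self, grads_and_vars, *, experimental_aggregate_gradients=True):
        # Pass the flag through unchanged instead of silently dropping it.
        return self._optimizer.apply_gradients(
            grads_and_vars,
            experimental_aggregate_gradients=experimental_aggregate_gradients,
        )
```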
Gabriel de Marmiesse 9acab6a942
Bump black (#2135)
* Bump the black version. Fixes the ci.

* Removed trailing commas.

* Update tensorflow_addons/image/filters.py

* Update tensorflow_addons/image/cutout_ops.py

* Removed trailing commas.

* Removed commas.

* Removed some more commas.

* Removed some commas.

* Removed some commas.

* Update tensorflow_addons/optimizers/moving_average.py

* Update tensorflow_addons/losses/tests/kappa_loss_test.py

* Update tensorflow_addons/layers/tests/wrappers_test.py

* Update tensorflow_addons/layers/tests/wrappers_test.py

* Update tensorflow_addons/layers/tests/wrappers_test.py

* Update tensorflow_addons/layers/tests/spectral_normalization_test.py

* Update tensorflow_addons/layers/tests/spectral_normalization_test.py

* Update tensorflow_addons/losses/tests/triplet_test.py

* Update tensorflow_addons/metrics/tests/hamming_test.py

* Update tensorflow_addons/metrics/tests/multilabel_confusion_matrix_test.py

* Update tensorflow_addons/optimizers/lookahead.py

* Update tensorflow_addons/metrics/tests/multilabel_confusion_matrix_test.py

* Update tensorflow_addons/metrics/r_square.py

* Removed commas.

* Removed commas.
2020-08-28 19:56:11 -07:00
Tzu-Wei Sung a9595fa11a
Clean up .assign in optimizers (#1994)
* Use use_locking when doing assignment

* Remove redundant .op

* Run formatter

* Use temp var to store update
2020-07-15 09:00:59 -07:00
Dheeraj R Reddy b329a066f9
Remove `sequential_update` from AverageWrapper (#1807)
* Remove `sequential_update` from AverageWrapper

In TF2.0, sequential_update is redundant. This allows
the removal of `tf.control_dependencies` from
average_wrapper and its subclasses: moving_average
and stochastic_weight_averaging.

* Revert "Remove `sequential_update` from AverageWrapper"

This reverts commit 7cf4201d83.

* Remove `tf.control_dependencies` from AverageWrapper

Add deprecation warning for `sequential_update`.

* Set type of sequential_update to Optional[bool]

`sequential_update` is no longer part of the optimizer's
configuration. Loading an older configuration throws the
DeprecationWarning.

* black format
2020-05-16 22:32:35 +02:00
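The deprecation path described above can be sketched as: accept the old `sequential_update` argument, warn, ignore it (updates are already sequential in TF2), and drop it from the serialized config. Details below are assumed from the commit body, not copied from the Addons source:

```python
import warnings

# Hedged sketch of the `sequential_update` deprecation in #1807. The
# class is a stand-in; only the warn-and-ignore pattern is the point.

class AverageWrapperSketch:
    def __init__(self, optimizer, sequential_update=None):
        if sequential_update is not None:
            # Loading an older configuration still works, but warns.
            warnings.warn(
                "`sequential_update` is deprecated and has no effect in TF 2.x.",
                DeprecationWarning,
            )
        self._optimizer = optimizer

    def get_config(self):
        # `sequential_update` is intentionally no longer part of the
        # optimizer's configuration.
        return {"optimizer": self._optimizer}
```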
Dheeraj R Reddy 9d9d484dba
Create new type for Optimizer (#1806) 2020-05-10 15:00:53 +02:00
Jhuo IH f42ca1dd74
Typing optimizers (#978)
* optimizers typing

* minor

* missing checking

* consistent to others with scale_fn

* correction of scale_fn

* missing name

* remove unused import

* revision based on comments

* add typecheck

* using typing.Type
2020-02-01 03:19:04 +01:00
Sean Morgan b7a66a7f3a CLN: Remove pylint and yapf (#946) 2020-01-27 09:13:16 +01:00
Gabriel de Marmiesse ca71fc861f Formatted with black all the files not being worked on at the moment. (#942)
* Keep all files being worked on in pull requests.
* Two files failed to be parsed.
* Formatted with black all the files not being worked on at the moment.
* Added e231 rule.
2020-01-26 18:44:26 -05:00
Gabriel de Marmiesse ee8eab02eb Removed all usages of six (#923) 2020-01-23 17:14:51 -05:00
Gabriel de Marmiesse e04093f883 Use dict expansion (#888) 2020-01-15 21:44:41 +05:30
Gabriel de Marmiesse 9e44ee2332 Used pyupgrade on the optimizers files. (#883) 2020-01-14 20:48:21 -05:00
Dheeraj R Reddy b74d5a2c1c Create new AveragedOptimizerWrapper optimizer (#760)
* Create new AverageWrapper optimizer
* SWA and MovingAverage extend AverageWrapper and implement
`average_op(var, average_var)`
* Add support for BatchNorm
* Add fit_bn in optimizers.utils
2019-12-31 14:25:08 -05:00
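The wrapper pattern introduced in #760 can be sketched in plain Python: the base class owns the bookkeeping, and each subclass supplies only `average_op(var, average_var)`. The arithmetic and class names below are illustrative, not the TF Addons implementation:

```python
# Hedged sketch of the AveragedOptimizerWrapper pattern from #760:
# subclasses implement only average_op(var, average_var); the base
# class applies it across all variables. Pure-Python stand-ins.

class AveragedOptimizerWrapperSketch:
    def average_op(self, var, average_var):
        raise NotImplementedError

    def update_averages(self, variables, averages):
        # Shared bookkeeping: apply the subclass rule to every pair.
        return [self.average_op(v, a) for v, a in zip(variables, averages)]


class MovingAverageSketch(AveragedOptimizerWrapperSketch):
    def __init__(self, decay=0.99):
        self.decay = decay

    def average_op(self, var, average_var):
        # Exponential moving average toward the current value.
        return average_var - (1.0 - self.decay) * (average_var - var)


class SWASketch(AveragedOptimizerWrapperSketch):
    def __init__(self):
        self.n = 0

    def average_op(self, var, average_var):
        # Running (stochastic weight) average over observed values.
        self.n += 1
        return average_var + (var - average_var) / self.n
```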