Adam optimizer "decay" argument is deprecated
In newer versions of TensorFlow, the decay argument of the Adam optimizer is deprecated; it now throws an error when used.
I believe the way they want us to handle decay now is through a learning rate schedule: https://www.tensorflow.org/api_docs/python/tf/keras/optimizers/schedules
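For example, something like this should roughly reproduce the old decay behaviour (the numbers are just placeholders):

```python
import tensorflow as tf

# Old (now errors in recent TF versions):
# opt = tf.keras.optimizers.Adam(learning_rate=0.001, decay=1e-4)

# New: pass a LearningRateSchedule as the learning_rate.
# InverseTimeDecay with decay_steps=1 roughly matches the old
# decay behaviour: lr = initial_lr / (1 + decay * step)
schedule = tf.keras.optimizers.schedules.InverseTimeDecay(
    initial_learning_rate=0.001,
    decay_steps=1,
    decay_rate=1e-4,
)
opt = tf.keras.optimizers.Adam(learning_rate=schedule)
```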
The issue here, and correct me if I am wrong, is that there is no easy way in ketos to define the scheduler in the recipes: the interaction between the RecipeCompat class, the build methods of each interface, and each interface's _optimizer_from_recipe does not support declaring a schedule as the optimizer.
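To make the problem concrete, here is a rough sketch of how I understand a recipe currently declares the optimizer (the import path and kwargs are taken from my reading of the ketos tutorials, so treat them as illustrative). Since the recipe values are plain serializable scalars, I don't see where a schedule object would go:

```python
import tensorflow as tf
from ketos.neural_networks.dev_utils.nn_interface import RecipeCompat

# Typical recipe entry: the optimizer is wrapped in a RecipeCompat
# together with plain, serializable kwargs.
optimizer_recipe = RecipeCompat("Adam", tf.keras.optimizers.Adam, learning_rate=0.005)

# A LearningRateSchedule is a Python object, not a scalar, so it cannot
# be written to / read from the recipe file the same way:
# schedule = tf.keras.optimizers.schedules.InverseTimeDecay(0.005, 1, 1e-4)
# optimizer_recipe = RecipeCompat("Adam", tf.keras.optimizers.Adam,
#                                 learning_rate=schedule)  # not recipe-friendly
```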
The only way for the user to define the scheduler is through add_learning_rate_scheduler, which, as its own docstring says, must be called before training and after an optimizer has been defined.
And I don't think that is ideal.
Since you developed this part, @frazao, do you have more insight into what could be done, or whether I am wrong in some of my assumptions? Or suggestions on how we could approach this?
Alternatively, we could fall back to the legacy optimizer: https://www.tensorflow.org/api_docs/python/tf/keras/optimizers/legacy/Adam
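The legacy class still accepts the decay keyword, so existing recipes could keep working by swapping in that class, e.g.:

```python
import tensorflow as tf

# The legacy Adam implementation still accepts `decay`.
opt = tf.keras.optimizers.legacy.Adam(learning_rate=0.001, decay=1e-4)
```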