Adam optimizer
A stochastic gradient descent method based on adaptive estimation of the first- and second-order moments of the gradients.
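As a sketch, the update rule can be written as a small function for a single scalar parameter; the hyperparameter defaults below (lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8) follow the values commonly cited for Adam, and all names here are illustrative:

```python
import math

def adam_step(theta, grad, m, v, t,
              lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a single scalar parameter.

    m, v : running estimates of the gradient's first and second moments
    t    : 1-based step count, used for bias correction
    """
    m = beta1 * m + (1 - beta1) * grad          # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad * grad   # second-moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)                # correct the zero-initialization bias
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Illustrative usage: minimize f(theta) = theta**2, whose gradient is 2*theta.
theta, m, v = 5.0, 0.0, 0.0
for t in range(1, 5001):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.01)
```

Because the effective step size is roughly `lr` regardless of gradient magnitude, `theta` walks steadily toward the minimum at 0 rather than overshooting on large gradients.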
--Agreed Upon Solutions

