torch.optim.adam.adam
- torch.optim.adam.adam(params, grads, exp_avgs, exp_avg_sqs, max_exp_avg_sqs, state_steps, foreach=None, capturable=False, differentiable=False, fused=None, grad_scale=None, found_inf=None, has_complex=False, decoupled_weight_decay=False, *, amsgrad, beta1, beta2, lr, weight_decay, eps, maximize)
Functional API that performs Adam algorithm computation.

See Adam for details.
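Because this is the low-level functional form, the caller owns the per-parameter optimizer state that torch.optim.Adam would normally maintain. Below is a minimal sketch of a single update step, assuming one CPU parameter and the standard Adam defaults (beta1=0.9, beta2=0.999, eps=1e-8); the tensor values are illustrative only:

```python
import torch
from torch.optim.adam import adam

# One parameter and its gradient (normally produced by loss.backward()).
param = torch.randn(3)
grad = torch.randn(3)

# Per-parameter Adam state, normally tracked by torch.optim.Adam:
exp_avg = torch.zeros_like(param)     # first-moment (mean) estimate
exp_avg_sq = torch.zeros_like(param)  # second-moment (uncentered variance) estimate
step = torch.tensor(0.0)              # step count, as a singleton tensor

# Perform one Adam update in place on `param`.
adam(
    [param],
    [grad],
    [exp_avg],
    [exp_avg_sq],
    [],        # max_exp_avg_sqs: only consumed when amsgrad=True
    [step],
    amsgrad=False,
    beta1=0.9,
    beta2=0.999,
    lr=1e-3,
    weight_decay=0.0,
    eps=1e-8,
    maximize=False,
)
```

The positional arguments are parallel lists with one entry per parameter, and the step counters are incremented in place by the call. In typical training code you would use torch.optim.Adam instead, which manages this state for you.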