ForBo7 // Salman Naqvi

Optimizer

The name given to the algorithm that updates a model's weights. Note that it does not compute the gradients itself — computing gradients is the job of backpropagation — it only applies them.

An optimizer in its most basic form looks like this:

with torch.no_grad():                # exclude the updates from the computation graph
    for p in model.parameters():
        p -= p.grad * lr             # step each weight against its gradient
    model.zero_grad()                # clear gradients for the next iteration
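To see the same update rule outside of any framework, here is a hypothetical, minimal sketch in plain Python: it repeatedly applies `p -= lr * grad` to minimize a simple quadratic. The function name `sgd_step` and the toy objective are assumptions for illustration, not part of the original entry.

```python
# Hypothetical minimal optimizer sketch: repeated SGD steps on
# f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
def sgd_step(params, grads, lr):
    """The optimizer's whole job: p <- p - lr * grad for each parameter."""
    return [p - lr * g for p, g in zip(params, grads)]

w = [0.0]   # a single "weight", starting at 0
lr = 0.1
for _ in range(100):
    # The gradients would normally come from backpropagation,
    # not from the optimizer itself.
    grads = [2 * (p - 3) for p in w]
    w = sgd_step(w, grads, lr)

print(w[0])   # converges towards the minimum at w = 3
```

The separation mirrors the point above: `grads` is produced elsewhere (backpropagation), and the optimizer only consumes it to move the weights.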
