mlx-optimizers 0.4.1 documentation

A library to experiment with new optimization algorithms in MLX.

  • Diverse Exploration: includes established and experimental optimizers such as DiffGrad and QHAdam.

  • Easy Integration: compatible with MLX for straightforward experimentation and downstream adoption.

  • Benchmark Examples: enables quick testing on classic optimization and machine learning tasks.

See the API Reference for the full list of available optimizers.

Example Usage
   import mlx_optimizers as optim

   # ... define model and compute grads, etc.
   optimizer = optim.DiffGrad(learning_rate=0.001)

   # Apply the computed gradients to the model's parameters.
   optimizer.update(model, grads)
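
For a fuller picture of how this fits a standard MLX training step, here is a minimal end-to-end sketch. The toy linear model, loss function, and random data are illustrative assumptions, not part of the library; only DiffGrad and its learning rate come from the example above.

   import mlx.core as mx
   import mlx.nn as nn
   import mlx_optimizers as optim

   # Toy model and data (hypothetical, for illustration only).
   model = nn.Linear(4, 1)
   X = mx.random.normal((32, 4))
   y = mx.random.normal((32, 1))

   def loss_fn(model, X, y):
       return nn.losses.mse_loss(model(X), y)

   optimizer = optim.DiffGrad(learning_rate=0.001)

   # Compute the loss and gradients, then take one optimizer step.
   loss_and_grad = nn.value_and_grad(model, loss_fn)
   loss, grads = loss_and_grad(model, X, y)
   optimizer.update(model, grads)
   mx.eval(model.parameters(), optimizer.state)

Any optimizer listed in the API Reference can be swapped in the same way.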
