# Continuous Optimization

When your objective is a smooth function and gradient information is available, you want continuous optimization. These methods power machine learning, curve fitting, and any problem where "take a step downhill" makes sense.
## Solvers

### Gradient-Based
| Solver | Memory | Adaptive LR? | Best For |
|---|---|---|---|
| `gradient_descent` | None | No | Simple problems, understanding |
| `momentum` | Velocity | No | Reducing oscillations |
| `rmsprop` | Gradient squares | Yes | Different scales per parameter |
| `adam` | Velocity + gradient² | Yes | Default choice, works almost everywhere |
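To make the "velocity + gradient²" row concrete, here is a minimal sketch of the Adam update rule — not the library's implementation, just the textbook recipe: a decayed velocity (first moment), a decayed sum of squared gradients (second moment), and bias correction for the early steps.

```python
import math

def adam_sketch(grad, x0, lr=0.1, beta1=0.9, beta2=0.999, eps=1e-8, steps=500):
    """Minimal Adam loop: velocity plus a per-parameter step scale."""
    x = list(x0)
    m = [0.0] * len(x)  # first moment: exponentially decayed velocity
    v = [0.0] * len(x)  # second moment: exponentially decayed gradient squares
    for t in range(1, steps + 1):
        g = grad(x)
        for i in range(len(x)):
            m[i] = beta1 * m[i] + (1 - beta1) * g[i]
            v[i] = beta2 * v[i] + (1 - beta2) * g[i] ** 2
            m_hat = m[i] / (1 - beta1 ** t)  # correct startup bias toward zero
            v_hat = v[i] / (1 - beta2 ** t)
            x[i] -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return x

# Minimize x² + y² from a far starting point
best = adam_sketch(lambda x: [2 * x[0], 2 * x[1]], [5.0, 5.0])
```

The division by `sqrt(v_hat)` is what makes the learning rate adaptive: parameters with consistently large gradients take proportionally smaller steps.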
### Quasi-Newton
| Solver | Memory | Best For |
|---|---|---|
| `bfgs` | O(n²) | Fast convergence, smooth functions |
| `lbfgs` | O(n × m), m = history length | Large-scale problems |
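The O(n × m) figure comes from L-BFGS never forming the n × n inverse-Hessian estimate: the classic two-loop recursion rebuilds its action on the gradient from the last m step/gradient-change pairs. A sketch (with NumPy, independent of this library's internals):

```python
import numpy as np

def lbfgs_direction(g, s_hist, y_hist):
    """Two-loop recursion: apply an implicit inverse-Hessian estimate to
    gradient g using only the stored (s, y) pairs — O(n * m) memory."""
    q = g.astype(float).copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_hist, y_hist)]
    alphas = []
    for s, y, rho in reversed(list(zip(s_hist, y_hist, rhos))):  # newest first
        a = rho * s.dot(q)
        alphas.append(a)
        q = q - a * y
    if s_hist:  # scale by a curvature estimate from the newest pair
        q = q * (s_hist[-1].dot(y_hist[-1]) / y_hist[-1].dot(y_hist[-1]))
    for (s, y, rho), a in zip(zip(s_hist, y_hist, rhos), reversed(alphas)):  # oldest first
        b = rho * y.dot(q)
        q = q + (a - b) * s
    return -q  # quasi-Newton descent direction
```

With an empty history the direction reduces to plain steepest descent `-g`; each stored pair bends it toward the Newton direction.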
### Derivative-Free
| Solver | Best For |
|---|---|
| `nelder_mead` | No gradients, noisy objectives |
| `powell` | Non-smooth, no derivatives |
| `bayesian_opt` | Expensive evaluations (10-100 total) |
| `differential_evolution` | Global search, continuous |
| `particle_swarm` | Global search, swarm intelligence |
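Derivative-free global search needs no gradient at all. As an illustration of the idea behind `differential_evolution`, here is a minimal DE/rand/1/bin sketch (standard algorithm, not this library's implementation): mutate with scaled difference vectors, crossover, and keep whichever of parent and trial scores better.

```python
import random

def de_sketch(f, bounds, pop_size=20, F=0.8, CR=0.9, gens=100):
    """Minimal differential evolution: gradient-free global search."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Pick three distinct partners and build a mutant vector
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = random.randrange(dim)  # force at least one mutated gene
            trial = []
            for j in range(dim):
                if random.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)  # clip to bounds
                else:
                    v = pop[i][j]
                trial.append(v)
            ft = f(trial)
            if ft <= fit[i]:  # greedy selection
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=lambda k: fit[k])
    return pop[best], fit[best]

random.seed(0)
point, value = de_sketch(lambda x: x[0] ** 2 + x[1] ** 2, [(-5, 5), (-5, 5)])
```

Note that the objective is only ever *evaluated*, never differentiated — which is exactly why these solvers tolerate noise and non-smoothness.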
## When to Use

Perfect for:

- Machine learning training
- Curve fitting and regression
- Parameter tuning for differentiable systems
- Smooth, continuous objective functions
## When NOT to Use

- **No gradient information** - use metaheuristics
- **Discrete variables** - use MILP or CP-SAT
- **Massively non-convex** - start with metaheuristics, refine with gradients
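The "start global, refine with gradients" hybrid from the last bullet can be sketched without any library at all. Here cheap random sampling stands in for a metaheuristic, and plain gradient descent polishes the best basin it finds on the (highly multimodal) Rastrigin function:

```python
import math
import random

def rastrigin(x):
    # Multimodal benchmark: many local minima, global minimum 0 at the origin
    return sum(xi * xi - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

def rastrigin_grad(x):
    return [2 * xi + 20 * math.pi * math.sin(2 * math.pi * xi) for xi in x]

random.seed(0)

# Stage 1: global sampling (stand-in for a metaheuristic) escapes bad basins
samples = [[random.uniform(-5.12, 5.12) for _ in range(2)] for _ in range(2000)]
start = min(samples, key=rastrigin)

# Stage 2: gradient descent polishes within the chosen basin
x = list(start)
for _ in range(300):
    g = rastrigin_grad(x)
    x = [xi - 0.001 * gi for xi, gi in zip(x, g)]
```

Gradient descent alone would stall in whatever local minimum is nearest the starting point; the global stage chooses which basin is worth polishing.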
## Quick Example

```python
from solvor import adam, bayesian_opt

# Adam: minimize x² + y²
def grad(x):
    return [2 * x[0], 2 * x[1]]

result = adam(grad, x0=[5.0, 5.0], lr=0.1)
print(result.solution)  # close to [0, 0]

# Bayesian optimization: expensive black-box objective
def expensive_objective(x):
    return (x[0] - 2) ** 2 + (x[1] + 1) ** 2

result = bayesian_opt(expensive_objective, bounds=[(-5, 5), (-5, 5)], max_iter=30)
print(result.solution)  # close to [2, -1]
```
## Rule of Thumb

Start with `adam`. If it feels like overkill, try `gradient_descent` with a line search. If evaluations are expensive, use `bayesian_opt`.
## See Also

- Metaheuristics - when you don't have gradients
- Linear Programming - for linear objectives