Globtim.jl Documentation
Global optimization of continuous functions via polynomial approximation
The Problem
Finding all local minima of a continuous function over a bounded domain is fundamentally hard. Standard optimization algorithms (gradient descent, BFGS, etc.) find one local minimum from a given starting point—but how do you know there isn't a better one elsewhere?
The Approach
Globtim solves this by replacing your function with a polynomial approximation. Why polynomials?
- Smooth functions are well-approximated by polynomials — Chebyshev and Legendre expansions converge rapidly for smooth functions
- Polynomial critical points can be found exactly — Setting ∇p(x) = 0 gives a polynomial system, which has finitely many solutions that can be computed using homotopy continuation
- Refinement recovers true minima — Each polynomial critical point seeds a local optimization (BFGS) on the original function
The result: a systematic way to find all local minima, not just the nearest one.
Algorithm Overview
```
f(x) → Polynomial p(x) → Solve ∇p = 0 → Refine with BFGS → All minima
       (Chebyshev fit)   (numerical or exact)
```
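In code, the pipeline looks roughly like the following 1D sketch. It uses general-purpose packages (Polynomials.jl for the Chebyshev fit, Optim.jl for BFGS refinement) rather than Globtim's own API, and the objective, degree, and tolerances are chosen only for illustration.

```julia
# Minimal 1D sketch of the fit → solve → refine pipeline.
# Not Globtim's API: Polynomials.jl and Optim.jl stand in for the real machinery.
using Polynomials, Optim

f(x) = sin(5x) + x^2                      # toy objective on [-1, 1]

# 1. Approximate f with a Chebyshev polynomial fitted to samples
xs = collect(range(-1, 1; length = 200))
p  = fit(ChebyshevT, xs, f.(xs), 20)

# 2. Critical points of the approximant: real roots of p'(x) = 0
dp    = derivative(p)
cands = [real(r) for r in roots(convert(Polynomial, dp))
         if abs(imag(r)) < 1e-8 && -1 <= real(r) <= 1]

# 3. Keep candidates that look like minima and refine each on the original f
d2p  = derivative(dp)
mins = [x for x in cands if d2p(x) > 0]
refined = [Optim.minimizer(optimize(v -> f(v[1]), [x0], BFGS()))[1] for x0 in mins]
println(sort(unique(round.(refined; digits = 4))))
```

Globtim generalizes this to multivariate functions, where ∇p = 0 becomes a polynomial system solved numerically by homotopy continuation or exactly with msolve.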
For functions that vary on different scales in different regions, Globtim uses adaptive subdivision to build piecewise polynomial approximations that maintain accuracy everywhere.
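The idea behind the subdivision can be sketched in one dimension: recursively bisect the domain until a fixed-degree fit on each piece meets an error tolerance. The code below is an illustration using Polynomials.jl, not Globtim's internal subdivision logic; the test function, degree, and tolerance are arbitrary.

```julia
# Sketch of adaptive subdivision in 1D: bisect an interval until a fixed-degree
# Chebyshev fit on each piece reaches the target accuracy.
using Polynomials

function adaptive_fit(f, a, b; deg = 10, tol = 1e-6, nsamples = 50)
    t  = range(-1, 1; length = nsamples)      # Chebyshev's natural domain
    xs = @. a + (t + 1) * (b - a) / 2         # corresponding points in [a, b]
    p  = fit(ChebyshevT, collect(t), f.(xs), deg)
    err = maximum(abs.(p.(t) .- f.(xs)))
    if err <= tol || (b - a) < 1e-3
        return [(a, b, p)]                    # this piece is accurate enough
    end
    m = (a + b) / 2                           # otherwise split and recurse
    return vcat(adaptive_fit(f, a, m; deg, tol, nsamples),
                adaptive_fit(f, m, b; deg, tol, nsamples))
end

# A function that oscillates much faster near x = -1 than near x = 1
pieces = adaptive_fit(x -> sin(1 / (x + 1.1)), -1.0, 1.0)
println(length(pieces), " subdomains needed")
```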
Installation
```
julia> ]
pkg> add Globtim
```
Additional Dependencies
- Visualization: `add CairoMakie` or `add GLMakie`
- Exact solving: Install msolve (symbolic method based on Gröbner basis computations)
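The same packages can be added non-interactively through the Pkg API, for example from a setup script (package names as above):

```julia
using Pkg
Pkg.add("Globtim")
Pkg.add("CairoMakie")   # optional plotting backend; GLMakie works as well
```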
Getting Started
To see Globtim in action, run the demo script:
```
julia --project=. examples/quick_subdivision_demo.jl
```
This tests adaptive subdivision on multiple functions (sphere, Rosenbrock, Rastrigin) and shows the algorithm's behavior.
For a detailed walkthrough, see Getting Started.