# pymoo Algorithms Reference

Comprehensive reference for optimization algorithms available in pymoo.
## Genetic Algorithm (GA)

- **Purpose:** General-purpose single-objective evolutionary optimization
- **Best for:** Continuous, discrete, or mixed-variable problems
- **Algorithm type:** (μ+λ) genetic algorithm
**Key parameters:**

- `pop_size`: Population size (default: 100)
- `sampling`: Initial population generation strategy
- `selection`: Parent selection mechanism (default: tournament)
- `crossover`: Recombination operator (default: SBX)
- `mutation`: Variation operator (default: polynomial)
- `eliminate_duplicates`: Remove redundant solutions (default: True)
- `n_offsprings`: Number of offspring per generation

**Usage:**
```python
from pymoo.algorithms.soo.nonconvex.ga import GA

algorithm = GA(pop_size=100, eliminate_duplicates=True)
```
## Differential Evolution (DE)

- **Purpose:** Single-objective continuous optimization
- **Best for:** Continuous parameter optimization with good global search
- **Algorithm type:** Population-based differential evolution
- **Variants:** Multiple DE strategies available (rand/1/bin, best/1/bin, etc.)
## Particle Swarm Optimization (PSO)

- **Purpose:** Single-objective optimization through swarm intelligence
- **Best for:** Continuous problems; fast convergence on smooth landscapes
## CMA-ES

- **Purpose:** Single-objective continuous optimization via Covariance Matrix Adaptation Evolution Strategy
- **Best for:** Continuous optimization, particularly noisy or ill-conditioned problems
## Pattern Search

- **Purpose:** Direct search method
- **Best for:** Problems where gradient information is unavailable
## Nelder-Mead

- **Purpose:** Simplex-based optimization
- **Best for:** Local optimization of continuous functions
## NSGA-II

- **Purpose:** Multi-objective optimization with 2-3 objectives
- **Best for:** Bi- and tri-objective problems requiring well-distributed Pareto fronts
- **Selection strategy:** Non-dominated sorting + crowding distance

**Key features:**

- Fast non-dominated sorting into Pareto ranks
- Crowding distance as a secondary, diversity-preserving criterion
- Elitist (μ+λ) survival

**Key parameters:**

- `pop_size`: Population size (default: 100)
- `sampling`: Initial population strategy
- `crossover`: Default SBX for continuous variables
- `mutation`: Default polynomial mutation
- `survival`: RankAndCrowding

**Usage:**
```python
from pymoo.algorithms.moo.nsga2 import NSGA2

algorithm = NSGA2(pop_size=100)
```
**When to use:**

- 2-3 objectives; crowding distance loses selection pressure as the number of objectives grows
- A well-distributed approximation of the entire Pareto front is desired
- A robust, well-tested default for multi-objective problems is needed
## NSGA-III

- **Purpose:** Many-objective optimization (4+ objectives)
- **Best for:** Problems with 4 or more objectives requiring uniform Pareto front coverage
- **Selection strategy:** Reference direction-based diversity maintenance

**Key features:**

- Non-dominated sorting combined with niching along predefined reference directions
- Maintains a uniform spread even in high-dimensional objective spaces

**Key parameters:**

- `ref_dirs`: Reference directions (required)
- `pop_size`: Defaults to the number of reference directions
- `crossover`: Default SBX
- `mutation`: Default polynomial mutation

**Usage:**
```python
from pymoo.algorithms.moo.nsga3 import NSGA3
from pymoo.util.ref_dirs import get_reference_directions

ref_dirs = get_reference_directions("das-dennis", n_dim=4, n_partitions=12)
algorithm = NSGA3(ref_dirs=ref_dirs)
```
**NSGA-II vs NSGA-III:** Prefer NSGA-II for 2-3 objectives, where crowding distance gives a good spread at low cost. Prefer NSGA-III for 4+ objectives, where crowding distance breaks down and reference directions are needed to keep the population uniformly distributed over the Pareto front.
## R-NSGA-II

- **Purpose:** Multi-objective optimization with preference articulation
- **Best for:** When the decision maker has preferred regions of the Pareto front
## U-NSGA-III

- **Purpose:** Unified version of NSGA-III
- **Best for:** Applying one algorithm across single-, multi-, and many-objective problems with additional robustness
## MOEA/D

- **Purpose:** Decomposition-based multi-objective optimization
- **Best for:** Problems where decomposition into scalar subproblems is effective
## AGE-MOEA

- **Purpose:** Multi-objective optimization via adaptive estimation of the Pareto front geometry
- **Best for:** Multi- and many-objective problems with adaptive mechanisms
## RVEA

- **Purpose:** Reference vector-based many-objective optimization
- **Best for:** Many-objective problems with adaptive reference vectors
## SMS-EMOA

- **Purpose:** S-Metric Selection Evolutionary Multi-objective Algorithm
- **Best for:** Problems where the hypervolume indicator is critical
- **Selection:** Uses dominated hypervolume contribution
## D-NSGA-II

- **Purpose:** Dynamic multi-objective optimization
- **Best for:** Time-varying objective functions or constraints
## KGB-DMOEA

- **Purpose:** Knowledge-guided dynamic multi-objective optimization
- **Best for:** Dynamic problems leveraging historical information
## SRES

- **Purpose:** Single-objective constrained optimization via stochastic ranking
- **Best for:** Heavily constrained problems
## ISRES

- **Purpose:** Improved stochastic ranking evolution strategy for constrained optimization
- **Best for:** Complex constraint landscapes
## Algorithm Selection Guide

**For single-objective problems:** start with GA (general-purpose, handles mixed variables) or DE (strong global search on continuous problems); use PSO for fast convergence on smooth landscapes, CMA-ES for noisy or ill-conditioned problems, and Pattern Search or Nelder-Mead for derivative-free local refinement.

**For multi-objective problems:** NSGA-II is the default for 2-3 objectives; switch to NSGA-III, MOEA/D, RVEA, or AGE-MOEA for 4+ objectives; use SMS-EMOA when hypervolume is the quality measure of interest, and R-NSGA-II when the decision maker can articulate preferences.

**For constrained problems:** SRES or ISRES for heavily constrained single-objective problems.

**For dynamic problems:** D-NSGA-II for time-varying objectives or constraints; KGB-DMOEA when historical information can be exploited.