Web Simulation 


Genetic Algorithm (Smart Rockets) Tutorial 

This interactive tutorial demonstrates Genetic Algorithms (GAs): optimization techniques inspired by natural selection and genetics that evolve solutions to complex problems through selection, crossover, and mutation. A genetic algorithm maintains a population of candidate solutions (individuals), evaluates their fitness (how well they solve the problem), selects the fittest individuals as parents, creates offspring through crossover (combining parent traits), and introduces random mutations to maintain genetic diversity. Repeating this cycle improves the population over generations until an optimal or near-optimal solution emerges. The tutorial visualizes the process through a "Smart Rockets" simulation, in which a population of rockets learns to navigate around an obstacle to reach a target, showing how random initial behaviors evolve into intelligent navigation strategies.

The visualization displays a single interactive canvas showing: (1) The Target (red circle at the top) - the goal that rockets must reach, (2) The Obstacle (gray rectangle in the middle) - a barrier blocking the direct path, forcing rockets to learn navigation strategies, (3) The Rockets (white triangles) - a population of agents, each with unique "DNA" (a sequence of force vectors that control their movement), evolving over generations to reach the target. Rockets that successfully reach the target turn green, while rockets that crash into the obstacle or boundaries turn red. The canvas is rendered using p5.js for real-time visualization with a dark theme (dark gray background) and bright colors for optimal visibility. Real-time statistics display the current generation number, frame count, best fitness, average fitness, and counts of completed and crashed rockets.

The simulator implements the complete genetic algorithm cycle:

  • DNA (Genotype) - each rocket has a sequence of force vectors (directions) that control its movement over its lifespan.
  • Fitness Evaluation - fitness is calculated from distance to target (closer = higher fitness), a speed bonus (reaching the target faster = higher fitness), and a collision penalty (hitting the obstacle sharply reduces fitness).
  • Selection - rockets with higher fitness are more likely to be selected as parents (fitness-proportional selection).
  • Crossover - offspring DNA is created by combining DNA from two parents at a random midpoint.
  • Mutation - each gene (force vector) has a small probability of being randomized to maintain genetic diversity.

You can adjust the population size (range: 20-200, default: 100), lifespan in frames (range: 100-800, default: 400), mutation rate (range: 0-10%, default: 1%), and maximum force (range: 0.05-0.5, default: 0.2) using sliders, and observe how these parameters affect the evolutionary process. Control buttons allow you to Play (start evolution), Pause (freeze the current generation), and Reset (restart from generation 0).

NOTE: The tutorial uses standard genetic algorithm terminology: DNA/Genotype (the genetic code - sequence of force vectors), Phenotype (the physical representation - the rocket), Fitness (measure of how well a rocket performs), Selection (choosing parents based on fitness), Crossover (combining parent DNA to create offspring), Mutation (random changes to DNA). The simulation demonstrates the fundamental principle that genetic algorithms evolve solutions through iterative improvement: random initial populations gradually converge toward optimal behaviors through the combined effects of selection (keeping good solutions), crossover (combining good traits), and mutation (exploring new possibilities). This "Smart Rockets" example is the "Hello World" of visual genetic algorithms, providing immediate, intuitive visual feedback on how simple rules create complex intelligent behavior.

Mathematical Model

The "Smart Rockets" simulation is an application of a Genetic Algorithm (GA) to a continuous control problem. The system optimizes a trajectory path through a 2D space using evolutionary principles.

1. Core Definitions

The system distinguishes between the genotype (the encoded data) and the phenotype (the expressed behavior).

  • Genotype (DNA): A discrete sequence of L force vectors, representing the thrust applied at each time step t ∈ [0, L): DNA = {f0, f1, ..., fL-1}, where each fi is a 2D vector with magnitude limit |fi| ≤ Fmax.
  • Phenotype (Rocket State): The physical manifestation of the genotype over time, defined by position p(t), velocity v(t), and acceleration a(t).
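As a concrete sketch of the genotype (with our own illustrative names, not the simulator's actual source), DNA can be generated as L random 2D vectors whose magnitude is capped at Fmax:

```javascript
// Illustrative genotype: `lifespan` random 2D force vectors, each with
// magnitude at most maxForce. randomGene/randomDNA are our own names.
function randomGene(maxForce) {
  const angle = Math.random() * 2 * Math.PI;   // uniform direction
  const mag = Math.random() * maxForce;        // magnitude in [0, maxForce]
  return { x: mag * Math.cos(angle), y: mag * Math.sin(angle) };
}

function randomDNA(lifespan, maxForce) {
  return Array.from({ length: lifespan }, () => randomGene(maxForce));
}

const dna = randomDNA(400, 0.2);              // L = 400 genes, Fmax = 0.2
console.log(dna.length);                      // 400
```

Each frame of the lifespan thus corresponds to exactly one gene, which is why changing the lifespan changes the DNA length.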

2. Phenotype Dynamics (The Physics Engine)

For each frame t, the phenotype determines its state using Euler Integration. We assume unit mass (m = 1).

  • Force Application: a(t) = ft
  • Velocity Update: v(t + 1) = v(t) + a(t)
  • Position Update: p(t + 1) = p(t) + v(t)
  • Heading: The visual orientation θ is derived from the velocity vector: θ = atan2(vy, vx).
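The update rules above can be sketched as a single Euler step (a minimal sketch with unit mass, so a = f; position uses the pre-update velocity v(t), as in the equations):

```javascript
// One Euler integration step for the rocket phenotype.
function eulerStep(state, force) {
  return {
    p: { x: state.p.x + state.v.x, y: state.p.y + state.v.y }, // p(t+1) = p(t) + v(t)
    v: { x: state.v.x + force.x, y: state.v.y + force.y },     // v(t+1) = v(t) + a(t)
  };
}

let s = { p: { x: 0, y: 0 }, v: { x: 1, y: 0 } };
s = eulerStep(s, { x: 0, y: 0.2 });            // apply one gene as thrust
const heading = Math.atan2(s.v.y, s.v.x);      // θ = atan2(vy, vx)
console.log(s.p, s.v, heading);
```

In p5.js this is typically expressed with `p5.Vector.add`, but the arithmetic is identical.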

3. The Objective Function (Fitness)

The Fitness Function F(i) evaluates the performance of individual i. It is designed as a maximization problem where higher values indicate better solutions.

Base Fitness Calculation:

We use an Inverse Distance weighting. Let d be the Euclidean distance between the rocket's final position p(L) and the target ptarget:

Fbase = 1 / (d + 1)

Multipliers (Environmental Feedback):

To accelerate convergence, we apply discrete multipliers based on collision states:

  • Completion Bonus: If d < rtarget (target reached), then F = Fbase × 10 × (L / tcomplete)
  • Crash Penalty: If collision detected, then F = Fbase × 0.1

Note: The speed bonus L / tcomplete rewards rockets that reach the target using fewer frames.
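Assembled into code, the objective function looks roughly like this (a sketch with our own variable names; `completedAt` is the frame at which the target was reached, or `null` if it never was):

```javascript
// Fitness per the formulas above: inverse distance to the target, a 10x
// completion bonus scaled by speed, and a 0.1x crash penalty.
function fitness(finalPos, target, lifespan, completedAt, crashed) {
  const d = Math.hypot(finalPos.x - target.x, finalPos.y - target.y);
  let f = 1 / (d + 1);                         // F_base = 1 / (d + 1)
  if (completedAt !== null) {
    f *= 10 * (lifespan / completedAt);        // completion bonus × speed bonus
  }
  if (crashed) f *= 0.1;                       // crash penalty
  return f;
}

// A rocket that lands on the target at frame 200 of a 400-frame lifespan:
console.log(fitness({ x: 380, y: 50 }, { x: 380, y: 50 }, 400, 200, false)); // 20
```

Note the multiplicative structure: a crashed rocket still retains a small distance-based signal, which keeps selection gradients alive in early generations.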

4. Evolutionary Operators

The population evolves from Generation g to g + 1 using the following stochastic operators:

4.1 Selection (Roulette Wheel / Rejection Sampling)

The probability of selecting individual i as a parent is proportional to its normalized fitness:

P(select i) ∝ Fnorm(i) = F(i) / Fbest

where Fbest is the highest fitness in the current generation (distinct from the force limit Fmax).

Implementation: We normalize all fitness values to the range [0, 1] and build a "Mating Pool" array in which individual i appears floor(Fnorm(i) × 100) times; drawing parents uniformly at random from this pool realizes the fitness-proportional selection probabilities.
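A sketch of the pool construction (our own names; it assumes at least one fitness value is positive, which holds here because the inverse-distance base fitness is always > 0):

```javascript
// Mating pool: individual i appears floor((F(i) / maxFitness) * 100) times,
// so parents drawn uniformly from the pool are fitness-proportional.
function buildMatingPool(fitnesses) {
  const maxF = Math.max(...fitnesses);
  const pool = [];
  fitnesses.forEach((f, i) => {
    const copies = Math.floor((f / maxF) * 100);
    for (let k = 0; k < copies; k++) pool.push(i);
  });
  return pool;
}

// The fittest individual gets 100 "tickets", a half-as-fit one gets 50:
const pool = buildMatingPool([0.8, 0.4]);
console.log(pool.length);  // 150
```

This is the classic "roulette wheel by array" trick: simple and visualizable, at the cost of O(100 × N) memory versus sampling the cumulative distribution directly.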

4.2 Crossover (Single-Point)

Given two parents PA and PB with DNA length L, we select a random midpoint m ∈ [0, L). The Child C inherits:

  • C.genes[0...m-1] = PA.genes[0...m-1]
  • C.genes[m...L-1] = PB.genes[m...L-1]
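In code, single-point crossover is a slice-and-concatenate (the explicit `m` argument makes the example deterministic; the simulator would draw m at random):

```javascript
// Single-point crossover: child takes genes [0, m) from parent A
// and [m, L) from parent B.
function crossover(genesA, genesB, m) {
  if (m === undefined) m = Math.floor(Math.random() * genesA.length);
  return genesA.slice(0, m).concat(genesB.slice(m));
}

console.log(crossover([1, 2, 3, 4], [5, 6, 7, 8], 2)); // [ 1, 2, 7, 8 ]
```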

4.3 Mutation (Stochastic Variation)

To prevent local optima, every gene in the child's DNA is subject to random mutation based on rate μ (typically 0.01):

For each gene j: if random() < μ, then fj = random2DVector(|f| ≤ Fmax)
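A sketch of the mutation operator (the random-gene construction is inlined so the example stands alone; names are illustrative):

```javascript
// Per-gene mutation: each gene is independently replaced with a fresh
// random vector (magnitude <= maxForce) with probability mu.
function mutate(genes, mu, maxForce) {
  return genes.map((g) => {
    if (Math.random() >= mu) return g;   // gene survives unchanged
    const angle = Math.random() * 2 * Math.PI;
    const mag = Math.random() * maxForce;
    return { x: mag * Math.cos(angle), y: mag * Math.sin(angle) };
  });
}

const genes = [{ x: 0.1, y: 0 }, { x: 0, y: 0.1 }];
console.log(mutate(genes, 0, 0.2));      // mu = 0: DNA passes through untouched
```

Because the check is per gene, a rocket with a 400-gene DNA and μ = 0.01 mutates about 4 genes per generation on average.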

5. High-Level Algorithm Cycle

  1. Initialize: Generate Population P0 with random DNA vectors.
  2. Simulation Loop (Repeat for Lifespan L):
    • Apply force ft from DNA to physics engine.
    • Update Position/Velocity using Euler integration.
    • Check collisions (Wall/Target).
  3. Evaluation:
    • Calculate Fitness F(i) for all N rockets.
    • Find Max Fitness (for statistics).
  4. Reproduction:
    • Normalize fitness scores.
    • Generate Mating Pool based on relative fitness.
    • Loop N times:
      • Select Parent A and Parent B from Pool.
      • Perform Crossover → Child DNA.
      • Perform Mutation on Child DNA.
      • Add Child to New Population.
  5. Advance: Replace old population with new population. Increment Generation count. Repeat.
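To see the whole cycle in one place without the physics, here is a toy 1D version (maximize closeness to a target number) using the same operators: inverse-distance fitness, a fitness-proportional mating pool, crossover (averaging two parents, the 1D analogue of a midpoint split), and occasional random mutation. All names and constants are illustrative, not the simulator's actual code.

```javascript
// One generation of the GA cycle on a toy 1D problem: each individual is
// a number, and fitness is inverse distance to `target`.
function evolve(pop, target, mu) {
  const fit = pop.map((x) => 1 / (Math.abs(x - target) + 1));    // 3. evaluate
  const maxF = Math.max(...fit);
  const pool = [];                                               // 4. mating pool
  fit.forEach((f, i) => {
    for (let k = 0; k < Math.floor((f / maxF) * 100); k++) pool.push(i);
  });
  const pick = () => pop[pool[Math.floor(Math.random() * pool.length)]];
  return pop.map(() => {
    let child = (pick() + pick()) / 2;                           // crossover (1D)
    if (Math.random() < mu) child += (Math.random() - 0.5) * 10; // mutation
    return child;
  });                                                            // 5. replace
}

let pop = Array.from({ length: 50 }, () => Math.random() * 100); // 1. initialize
for (let g = 0; g < 100; g++) pop = evolve(pop, 42, 0.01);
const mean = pop.reduce((a, b) => a + b, 0) / pop.length;
console.log(mean.toFixed(1));  // drifts toward 42 over the generations
```

The rockets replace the scalar individual with a DNA array and the scalar distance with the physics rollout, but the generational skeleton is exactly this loop.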


Usage Example

Follow these steps to explore the Genetic Algorithm (Smart Rockets) simulation:

  1. Initial State: When you first load the simulation, you'll see a canvas with: (1) The Target (red circle at the top center) - the goal that rockets must reach, (2) The Obstacle (gray rectangle in the middle) - a barrier blocking the direct path, forcing rockets to learn navigation strategies, (3) The Rockets (100 white triangles at the bottom) - a population of agents, each with unique random DNA (force vectors), starting from the bottom center. Real-time statistics display the current generation (0), frame count (0/400), best fitness (0.0000), average fitness (0.0000), and counts of completed and crashed rockets. Notice that with the default parameters (Population Size: 100, Lifespan: 400 frames, Mutation Rate: 1%, Max Force: 0.2), the rockets start with completely random behavior - they move chaotically and most crash into the obstacle or boundaries.
  2. Start Evolution: Click the "Play" button to start the simulation. Observe that: (1) All 100 rockets begin moving simultaneously, each following its unique DNA sequence (force vectors), (2) Most rockets move randomly and crash into the obstacle (gray rectangle) or boundaries (turning red), (3) A few rockets may get closer to the target, but none reach it in the first generation, (4) The frame counter increments from 0 to 400 (the lifespan), (5) When the lifespan ends (frame 400), the generation counter increments, fitness is calculated, and a new generation is created through selection, crossover, and mutation. This demonstrates the initial random state: the population starts with no knowledge, just random DNA sequences.
  3. Observe Evolution Over Generations: Watch the simulation run for several generations (10-20 generations). Observe how the population gradually improves: (1) In early generations (0-5), most rockets crash randomly, but a few get slightly closer to the target, (2) In middle generations (5-15), you'll see patterns emerging - rockets start learning to "go up" to avoid the obstacle, (3) In later generations (15+), most rockets learn to navigate around the obstacle and reach the target (turning green), (4) The best fitness and average fitness values increase over time, (5) The number of completed rockets increases while crashed rockets decrease. This demonstrates evolution in action: successful behaviors (DNA patterns) spread through the population via selection and crossover, while mutation explores new possibilities.
  4. Adjust Population Size: Use the "Population Size" slider to change the number of rockets (range: 20-200, default: 100). Observe how: (1) Smaller populations (20-50) evolve faster but may get stuck in local optima (suboptimal solutions), (2) Larger populations (150-200) evolve more slowly but explore more thoroughly and find better solutions, (3) The diversity of behaviors increases with population size - more rockets means more genetic diversity, (4) Reset the simulation after changing population size to see the effect from generation 0. Try setting Population Size = 50: evolution happens faster, but the population may converge to a suboptimal strategy. Try Population Size = 200: evolution is slower but more thorough, eventually finding better navigation strategies.
  5. Adjust Mutation Rate: Use the "Mutation Rate" slider to change the probability of random DNA changes (range: 0-10%, default: 1%). Observe how: (1) Low mutation rates (0-0.5%) cause slow evolution or premature convergence (getting stuck in suboptimal solutions), (2) Medium mutation rates (1-3%) provide a good balance between exploration and exploitation, (3) High mutation rates (5-10%) create chaos - rockets jitter randomly and evolution becomes unstable, (4) The mutation rate affects how quickly new strategies are explored - too low and the population gets stuck, too high and it never converges. Try setting Mutation Rate = 0%: watch how the population quickly converges but may get stuck hitting the obstacle. Try Mutation Rate = 5%: observe the chaotic behavior - rockets never settle into a stable strategy.
  6. Adjust Lifespan: Use the "Lifespan (frames)" slider to change how long each rocket lives (range: 100-800, default: 400). Observe how: (1) Shorter lifespans (100-200) make the problem harder - rockets have less time to reach the target, requiring more efficient navigation, (2) Longer lifespans (600-800) make the problem easier - rockets have more time to explore and reach the target, (3) The lifespan determines how many force vectors (DNA genes) each rocket has - shorter lifespan = fewer genes = less complex behavior, (4) Reset the simulation after changing lifespan to see the effect. Try setting Lifespan = 200: observe how rockets must learn more efficient paths. Try Lifespan = 600: notice how rockets have more time to explore and eventually find paths.
  7. Adjust Max Force: Use the "Max Force" slider to change the maximum magnitude of force vectors (range: 0.05-0.5, default: 0.2). Observe how: (1) Lower max force (0.05-0.1) makes rockets move slowly and precisely, requiring more frames to reach the target, (2) Higher max force (0.3-0.5) makes rockets move quickly but less precisely, potentially overshooting the target, (3) The max force affects the "aggressiveness" of movement - lower values create smooth, careful navigation, higher values create fast, aggressive movement, (4) Reset the simulation after changing max force to see the effect. Try setting Max Force = 0.1: watch how rockets move slowly and carefully. Try Max Force = 0.4: observe how rockets move quickly but may overshoot.
  8. Use Pause to Observe: Click "Pause" to freeze the current generation. This allows you to: (1) Examine individual rocket positions and trajectories in detail, (2) Observe which rockets are close to the target (high fitness) and which have crashed (low fitness), (3) Analyze the current state before evolution continues, (4) Adjust parameters while paused, then resume to see the effect. Use Pause to carefully study how rockets navigate - you'll see that successful rockets follow curved paths around the obstacle, while unsuccessful rockets crash into it or boundaries.
  9. Use Reset to Start Over: Click "Reset" to restart the simulation from generation 0 with a new random population. This is useful for: (1) Testing how different parameter combinations affect evolution from the start, (2) Observing the variability in evolutionary outcomes (each reset creates a different initial population), (3) Comparing evolution speed with different settings, (4) Starting fresh after making parameter changes. Try resetting multiple times with the same parameters - you'll notice that evolution follows similar patterns but with slight variations due to randomness in initial DNA and mutation.
  10. Understand Fitness and Selection: Watch the statistics display to understand how fitness drives evolution: (1) Best Fitness shows the highest fitness in the current generation (rockets closer to target have higher fitness), (2) Average Fitness shows the population average (increases over generations as the population improves), (3) Completed shows how many rockets reached the target (increases as evolution progresses), (4) Crashed shows how many rockets hit obstacles or boundaries (decreases as evolution progresses). The fitness values determine which rockets become parents - higher fitness = more "tickets" in the mating pool = more likely to pass on DNA. This is fitness-proportional selection: it doesn't guarantee the best rockets become parents, but it biases toward better solutions.
  11. Observe Color Changes: Watch how rocket colors indicate their state: (1) White rockets (with transparency) are active and moving, (2) Green rockets have successfully reached the target (completed = true), (3) Red rockets have crashed into the obstacle or boundaries (crashed = true). The color changes provide immediate visual feedback on performance - you can quickly see which rockets are successful (green) and which have failed (red). As evolution progresses, you'll see more green rockets and fewer red rockets, demonstrating the population's improvement.

Tip: The key to understanding Genetic Algorithms is recognizing how simple rules (selection, crossover, mutation) create complex intelligent behavior through iterative improvement. Start with default parameters to observe the full evolutionary process - watch how random initial behavior gradually evolves into intelligent navigation. Experiment with different parameter combinations to see how they affect evolution: smaller populations evolve faster but may get stuck, larger populations explore more thoroughly, lower mutation rates converge faster but may miss better solutions, higher mutation rates explore more but may never converge. The "Smart Rockets" simulation is the "Hello World" of visual genetic algorithms - it provides immediate, intuitive feedback on how evolution works. Remember that there is no explicit intelligence or planning - just a list of force vectors (DNA) being refined through selection, crossover, and mutation. The intelligence emerges from the evolutionary process itself.

Parameters

The following are short descriptions of each parameter:
  • Population Size: The number of rockets (individuals) in each generation. Range: 20-200. Default: 100. Higher values mean more genetic diversity and thorough exploration, but slower evolution. Lower values mean faster evolution but may get stuck in local optima (suboptimal solutions). The population size determines how many candidate solutions are evaluated each generation - larger populations explore more thoroughly but require more computation. In the visualization, this is the number of white triangles (rockets) visible on the canvas. The parameter affects genetic diversity: larger populations maintain more diverse DNA, reducing the risk of premature convergence.
  • Lifespan (frames): The number of frames (time steps) each rocket lives. Range: 100-800. Default: 400. Higher values give rockets more time to reach the target (easier problem), while lower values require more efficient navigation (harder problem). The lifespan determines the length of each rocket's DNA - each frame corresponds to one force vector (gene) in the DNA sequence. Longer lifespans allow more complex behaviors but require more genes to evolve. The frame counter displays the current frame (0 to lifespan), and when it reaches the lifespan, the generation ends and evolution occurs.
  • Mutation Rate: The probability that each gene (force vector) in a child's DNA will be randomized. Range: 0-10%. Default: 1%. Higher values introduce more randomness (exploration) but may prevent convergence. Lower values allow faster convergence but may cause premature convergence to suboptimal solutions. The mutation rate is applied independently to each gene during mutation - each force vector has a mutationRate chance of being replaced with a random vector. This parameter balances exploration (finding new solutions) and exploitation (refining existing solutions). Too high mutation creates chaos, too low causes stagnation.
  • Max Force: The maximum magnitude of force vectors in the DNA. Range: 0.05-0.5. Default: 0.2. Higher values make rockets move faster and more aggressively, while lower values create slower, more precise movement. The max force limits the magnitude of each force vector in the DNA - all force vectors are normalized to have magnitude ≤ maxForce. This parameter controls the "aggressiveness" of rocket movement: higher values allow faster navigation but may cause overshooting, lower values create careful, precise movement but may be too slow to reach the target within the lifespan.
  • DNA (Genotype): The genetic code - a sequence of force vectors (directions) that control rocket movement. Each rocket has DNA.length = lifespan force vectors, one for each frame. Initially, DNA is random (random 2D vectors with magnitude ≤ maxForce). Over generations, successful DNA patterns (e.g., "go up and left to avoid obstacle") are preserved and refined through selection and crossover, while mutation introduces new variations. The DNA represents the "instructions" that control behavior - there is no explicit intelligence, just a list of force vectors being refined through evolution.
  • Fitness: A measure of how well a rocket performs. Calculated as fitness = 1 / (distance + 1) × bonus × penalty, where distance is the final distance to the target, bonus = 10 × (lifespan / tcomplete) if the rocket reached the target (so finishing faster earns a larger bonus), and penalty = 0.1 if the rocket crashed. Higher fitness means better performance. Fitness determines which rockets are more likely to become parents - rockets with higher fitness have more "tickets" in the mating pool (fitness-proportional selection). The best fitness and average fitness values are displayed in real-time, showing how the population improves over generations.
  • Target: The goal that rockets must reach (red circle at top center of canvas). Rockets that reach within 20 pixels of the target are marked as "completed" (turn green) and receive a 10× fitness bonus plus a speed bonus (reaching faster = higher bonus). The target position is fixed at (width/2, 50) - rockets start at (width/2, height-20) and must navigate around the obstacle to reach it. The target represents the optimization goal - the genetic algorithm evolves solutions (rocket DNA) that maximize fitness by reaching the target.
  • Obstacle (Barrier): A rectangular barrier (gray rectangle) blocking the direct path from start to target. The obstacle is positioned at (width/2 - 100, height/2) with dimensions 200×10 pixels. Rockets that collide with the obstacle are marked as "crashed" (turn red) and receive a 0.1× fitness penalty. The obstacle forces rockets to learn navigation strategies - they cannot simply go straight up, but must learn to go around (e.g., "go up and left, then up and right"). This makes the problem more interesting and demonstrates how genetic algorithms can solve complex navigation problems.
  • Generation: The current evolutionary generation (iteration). Starts at 0 and increments each time the lifespan ends. Each generation consists of: (1) All rockets run their DNA (apply force vectors over lifespan), (2) Fitness is calculated for each rocket, (3) A mating pool is created (fitness-proportional selection), (4) New population is generated through crossover and mutation, (5) Generation counter increments. The generation number is displayed in real-time, showing how many evolutionary cycles have occurred. Early generations (0-10) show rapid improvement as successful patterns emerge, while later generations (20+) show refinement and convergence.
  • Selection: The process of choosing parents based on fitness. Uses fitness-proportional selection: rockets with higher fitness have more "tickets" (copies) in the mating pool. When selecting parents, we randomly pick from the mating pool, so high-fitness rockets are statistically more likely to be chosen. This doesn't guarantee the best rockets become parents, but it biases toward better solutions while maintaining diversity. Selection is the "survival of the fittest" mechanism - successful behaviors (DNA patterns) are more likely to be passed on to the next generation.
  • Crossover: The process of combining DNA from two parents to create offspring. A random midpoint is chosen in the DNA sequence, and the child gets genes from parent A before the midpoint and genes from parent B after the midpoint (or vice versa). This allows successful traits from both parents to be combined - for example, if parent A learned to "go up" and parent B learned to "go left," the child might get "go up" from A and "go left" from B, potentially learning to "go up and left" to navigate around the obstacle. Crossover is the "recombination" mechanism - it combines good traits from multiple parents.
  • Mutation: Random changes to DNA with probability mutationRate. Each gene (force vector) has a small chance of being completely randomized (replaced with a random 2D vector with magnitude ≤ maxForce). Mutation is essential for: (1) Exploring new possibilities (discovering new navigation strategies), (2) Escaping local optima (avoiding getting stuck in suboptimal solutions), (3) Maintaining genetic diversity (preventing premature convergence). Too much mutation creates chaos, too little causes slow evolution or stagnation. The default 1% mutation rate provides a balance between exploration and exploitation.

Controls and Visualizations

The following are short descriptions of each control:
  • Play Button: Starts or resumes the simulation, beginning the evolutionary process. Located in the control panel. When clicked, the simulation runs continuously: rockets move according to their DNA, the frame counter increments, and when the lifespan ends, evolution occurs (fitness calculation, selection, crossover, mutation, new generation). The simulation continues until paused. Use Play to start evolution and observe how the population improves over generations.
  • Pause Button: Pauses the simulation, freezing the current generation at the current frame. Located in the control panel. When clicked, the simulation stops, allowing you to examine rocket positions and trajectories in detail, adjust parameters, or analyze the current state. Rockets remain visible with their current positions and states (white = active, green = completed, red = crashed). Use Pause to freeze the simulation for detailed observation, then click Play to resume.
  • Reset Button: Restarts the simulation from generation 0 with a new random population. Located in the control panel. When clicked, the simulation clears all state: generation counter resets to 0, frame counter resets to 0, and a new random population is created. This is useful for testing different parameter combinations from the start or observing the variability in evolutionary outcomes. Use Reset to start fresh after making parameter changes.
  • Population Size Slider: Controls the number of rockets in each generation (range: 20-200, default: 100). Located in the control panel with label and value display. Higher values mean more genetic diversity and thorough exploration, but slower evolution. Lower values mean faster evolution but may get stuck in local optima. The slider updates in real-time, but changes only take effect after Reset (to create a new population with the new size). Adjust this parameter to balance exploration (larger populations) vs. speed (smaller populations).
  • Lifespan (frames) Slider: Controls how long each rocket lives in frames (range: 100-800, default: 400). Located in the control panel with label and value display. Higher values give rockets more time to reach the target (easier problem), while lower values require more efficient navigation (harder problem). The slider updates in real-time, but changes only take effect after Reset (to create new DNA with the new length). The lifespan determines the length of each rocket's DNA sequence - each frame corresponds to one force vector (gene).
  • Mutation Rate Slider: Controls the probability of random DNA changes (range: 0-10%, default: 1%). Located in the control panel with label and value display. Higher values introduce more randomness (exploration) but may prevent convergence. Lower values allow faster convergence but may cause premature convergence. The slider updates in real-time, immediately affecting mutation during evolution. Adjust this parameter to balance exploration (higher mutation) vs. exploitation (lower mutation).
  • Max Force Slider: Controls the maximum magnitude of force vectors (range: 0.05-0.5, default: 0.2). Located in the control panel with label and value display. Higher values make rockets move faster and more aggressively, while lower values create slower, more precise movement. The slider updates in real-time, but changes only take effect after Reset (to create new DNA with the new force limit). Adjust this parameter to control the "aggressiveness" of rocket movement.
  • Visualization Canvas: The main canvas displaying the Smart Rockets simulation. Shows: (1) The target (red circle at top center), (2) The obstacle (gray rectangle in the middle), (3) The rockets (white triangles that rotate to face their velocity direction). Rockets change color based on state: white (active/moving), green (completed - reached target), red (crashed - hit obstacle or boundaries). The canvas uses a dark gray background (51) with bright colors for optimal visibility. The canvas size is 760×600 pixels, rendered using p5.js for smooth animation.
  • Statistics Display: Text overlay on the canvas and separate panel displaying real-time statistics. Shows: (1) Generation number (current evolutionary generation), (2) Frame count (current frame / total lifespan), (3) Best fitness (highest fitness in current generation), (4) Average fitness (population average fitness), (5) Completed count (rockets that reached target), (6) Crashed count (rockets that hit obstacle or boundaries). The statistics update continuously as the simulation runs, showing the current state and progress of evolution. Uses white text on dark background for visibility.
  • Control Panel: The UI panel containing all controls and statistics. Located above the visualization canvas. Contains: (1) Simulation Controls section (Play, Pause, Reset buttons), (2) Parameters section (sliders for Population Size, Lifespan, Mutation Rate, Max Force), (3) Statistics Display section (Generation, Best Fitness, Average Fitness, Completed, Crashed counts). The control panel uses a dark theme (black background #111111) with bright text and borders for visibility, matching the overall simulation aesthetic.
  • Rocket Visualization: Each rocket is drawn as a triangle that rotates to face its velocity direction. Rockets start at the bottom center (width/2, height-20) and move according to their DNA (force vectors). The triangle points in the direction of movement, providing visual feedback on rocket orientation and trajectory. Rockets are drawn with transparency (alpha = 150) when active, making overlapping rockets visible. The visualization clearly shows how rockets navigate - successful rockets follow curved paths around the obstacle, while unsuccessful rockets crash into it.
  • Color Coding: Visual indicators for rocket state: (1) White (with transparency) = active rockets that are still moving, (2) Green = completed rockets that successfully reached the target, (3) Red = crashed rockets that hit the obstacle or boundaries. The color changes provide immediate visual feedback on performance - you can quickly see which rockets are successful (green) and which have failed (red). As evolution progresses, you'll see more green rockets and fewer red rockets, demonstrating the population's improvement.
  • Real-Time Updates: All visualizations and statistics update in real-time as the simulation runs. The canvas redraws every frame (60 FPS), showing rocket movement, the frame counter increments, and statistics update continuously. When evolution occurs (lifespan ends), the generation counter increments, fitness is calculated, and a new population is created instantly. Parameter sliders update their value displays in real-time, and changes take effect immediately (for mutation rate) or after Reset (for population size, lifespan, max force). This real-time interaction helps you understand how each parameter affects the evolutionary process.

Key Concepts

  • Genetic Algorithm: An optimization technique inspired by natural selection and genetics. Genetic algorithms maintain a population of candidate solutions (individuals), evaluate their fitness (how well they solve the problem), select the fittest as parents, create offspring through crossover (combining parent traits) and mutation (random changes), and iteratively improve the population over generations until an optimal or near-optimal solution emerges. The "Smart Rockets" simulation demonstrates this process: rockets with better DNA (force vectors that navigate successfully) have higher fitness, are more likely to become parents, and pass on their successful traits to the next generation. Genetic algorithms are particularly effective for complex optimization problems where traditional methods may fall short, such as navigation, scheduling, design optimization, and machine learning.
  • DNA (Genotype): The genetic code - a sequence of force vectors (directions) that control rocket behavior. Each rocket's DNA consists of lifespan force vectors, one for each frame. Initially, DNA is random (random 2D vectors with magnitude ≤ maxForce). Over generations, successful DNA patterns (e.g., "go up and left to avoid obstacle") are preserved and refined through selection and crossover, while mutation introduces new variations. The DNA represents the "instructions" that control behavior - there is no explicit intelligence or planning, just a list of force vectors being refined through evolution. This demonstrates the key insight of genetic algorithms: complex intelligent behavior can emerge from simple genetic code through iterative improvement.
  • Fitness Function: A measure of how well a candidate solution (rocket) performs. In the Smart Rockets simulation, fitness is calculated as: fitness = 1 / (distance + 1) × bonus × penalty, where distance is the distance to the target, bonus is a 10× multiplier (plus a speed bonus: reaching the target in fewer frames yields a larger bonus) applied only if the rocket completed (reached the target), and penalty is a 0.1× multiplier applied only if the rocket crashed. Higher fitness means better performance. The fitness function is crucial - it determines which solutions are "good" and should be preserved. Fitness drives selection: rockets with higher fitness are more likely to become parents and pass on their DNA. The fitness function must accurately reflect the optimization goal - in this case, reaching the target quickly while avoiding obstacles.
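A minimal sketch of this fitness rule follows. The exact form of the speed bonus is an assumption (scaling by lifespan / finishFrame); the tutorial only states that finishing faster yields a higher bonus.

```javascript
// Fitness = base proximity score, scaled up on completion
// (with a speed bonus) and scaled down to 10% on a crash.
function computeFitness(distance, completed, crashed, finishFrame, lifespan) {
  let f = 1 / (distance + 1);          // closer to target => higher base fitness
  if (completed) {
    // 10x completion bonus; the lifespan/finishFrame factor is an
    // assumed speed bonus (earlier finish => larger multiplier).
    f *= 10 * (lifespan / finishFrame);
  }
  if (crashed) f *= 0.1;               // collision penalty
  return f;
}
```

For example, a rocket that reaches the target at frame 100 of a 400-frame lifespan scores higher than one arriving at frame 400, and any crash cuts fitness to a tenth.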
  • Selection: The process of choosing parents based on fitness. Uses fitness-proportional selection: rockets with higher fitness have more "tickets" (copies) in the mating pool. When selecting parents, we randomly pick from the mating pool, so high-fitness rockets are statistically more likely to be chosen. This doesn't guarantee the best rockets become parents, but it biases toward better solutions while maintaining diversity. Selection is the "survival of the fittest" mechanism - it ensures that successful behaviors (DNA patterns) are more likely to be passed on to the next generation. This is analogous to natural selection in biology: organisms with better traits are more likely to survive and reproduce.
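The "tickets in a mating pool" scheme above is classic roulette-wheel selection. A hedged sketch, assuming fitness values are normalized against the best rocket and each rocket gets up to 100 tickets (the ticket count is an illustrative choice, not from the tutorial's source):

```javascript
// Fitness-proportional selection: each individual gets a number of
// "tickets" (copies in the pool) proportional to its fitness.
function buildMatingPool(population, fitnesses) {
  const maxFit = Math.max(...fitnesses);
  const pool = [];
  population.forEach((individual, i) => {
    // Normalize against the best fitness, then hand out up to 100 tickets.
    const tickets = Math.floor((fitnesses[i] / maxFit) * 100);
    for (let j = 0; j < tickets; j++) pool.push(individual);
  });
  return pool;
}

// Picking uniformly from the pool is biased toward high-fitness parents,
// because they appear more often - but low-fitness rockets still have a chance.
function pickParent(pool) {
  return pool[Math.floor(Math.random() * pool.length)];
}
```

Note that this preserves diversity: a rocket with a third of the best fitness still gets roughly a third as many tickets, so it can occasionally be chosen as a parent.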
  • Crossover (Recombination): The process of combining DNA from two parents to create offspring. A random midpoint is chosen in the DNA sequence, and the child gets genes from parent A before the midpoint and genes from parent B after the midpoint (or vice versa). This allows successful traits from both parents to be combined - for example, if parent A learned to "go up" and parent B learned to "go left," the child might get "go up" from A and "go left" from B, potentially learning to "go up and left" to navigate around the obstacle. Crossover is the "recombination" mechanism - it combines good traits from multiple parents, potentially creating better solutions than either parent alone. This is analogous to sexual reproduction in biology: offspring inherit traits from both parents.
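Midpoint crossover as described above is only a few lines. This sketch assumes both parents have equal-length DNA (true here, since every rocket's DNA length equals the lifespan):

```javascript
// Single-point crossover: child takes parent A's genes up to a random
// cut point, then parent B's genes from there on.
function crossover(dnaA, dnaB) {
  const mid = Math.floor(Math.random() * dnaA.length);
  return dnaA.slice(0, mid).concat(dnaB.slice(mid));
}
```

If parent A's early genes encode "go up" and parent B's later genes encode "veer left," a cut point between those regions produces a child that does both.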
  • Mutation: Random changes to DNA with probability mutationRate. Each gene (force vector) has a small chance of being completely randomized (replaced with a random 2D vector with magnitude ≤ maxForce). Mutation is essential for: (1) Exploring new possibilities (discovering new navigation strategies), (2) Escaping local optima (avoiding getting stuck in suboptimal solutions), (3) Maintaining genetic diversity (preventing premature convergence). Too much mutation creates chaos - rockets jitter randomly and evolution becomes unstable. Too little mutation causes slow evolution or stagnation - the population may get stuck in local optima. The default 1% mutation rate provides a balance between exploration (finding new solutions) and exploitation (refining existing solutions). This is analogous to genetic mutations in biology: random changes that can introduce beneficial new traits.
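Per-gene mutation can be sketched as below, assuming a randomGene helper that draws a fresh random vector with magnitude ≤ maxForce (the helper is included so the sketch stands alone):

```javascript
// A fresh random gene: random direction, magnitude <= maxForce.
function randomGene(maxForce) {
  const angle = Math.random() * 2 * Math.PI;
  const mag = Math.random() * maxForce;
  return { x: mag * Math.cos(angle), y: mag * Math.sin(angle) };
}

// Each gene is independently replaced with probability mutationRate
// (default 1%, i.e. mutationRate = 0.01); otherwise it passes through unchanged.
function mutate(dna, mutationRate, maxForce) {
  return dna.map(gene =>
    Math.random() < mutationRate ? randomGene(maxForce) : gene
  );
}
```

At the default 1% rate, a 400-gene DNA sees about 4 mutated genes per offspring - enough to keep exploring without destroying inherited structure.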
  • Evolutionary Process: The iterative cycle of evaluation, selection, crossover, and mutation that improves the population over generations. Each generation consists of: (1) All rockets run their DNA (apply force vectors over lifespan), (2) Fitness is calculated for each rocket, (3) A mating pool is created (fitness-proportional selection), (4) A new population is generated through crossover and mutation, (5) The generation counter increments. Over generations, the population gradually improves: initially, rockets move randomly and crash. As successful patterns emerge (e.g., "go up to avoid obstacle"), they spread through the population via selection and crossover. Eventually, most rockets learn to navigate around the obstacle and reach the target. This demonstrates how simple rules (selection, crossover, mutation) create complex intelligent behavior through iterative improvement. The evolutionary process continues until a stopping criterion is met (e.g., target fitness reached, maximum generations, convergence).
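Steps (3) and (4) of the cycle can be condensed into a single generation-step function. This is an illustrative sketch, not the tutorial's source: fitness values are supplied by the caller (in the simulation they come from actually flying the rockets), and the 100-ticket mating pool is an assumed convention.

```javascript
// One generation step: mating pool -> crossover -> mutation -> new population.
function nextGeneration(population, fitnesses, mutationRate, maxForce) {
  // (3) Build the mating pool: tickets proportional to normalized fitness.
  const maxFit = Math.max(...fitnesses);
  const pool = [];
  population.forEach((dna, i) => {
    const tickets = Math.floor((fitnesses[i] / maxFit) * 100);
    for (let j = 0; j < tickets; j++) pool.push(dna);
  });
  const pick = () => pool[Math.floor(Math.random() * pool.length)];

  // (4) Same-sized new population via crossover + per-gene mutation.
  return population.map(() => {
    const a = pick(), b = pick();
    const mid = Math.floor(Math.random() * a.length);
    const child = a.slice(0, mid).concat(b.slice(mid));
    return child.map(gene => {
      if (Math.random() >= mutationRate) return gene;   // keep inherited gene
      const angle = Math.random() * 2 * Math.PI;        // else replace with
      const mag = Math.random() * maxForce;             // a fresh random gene
      return { x: mag * Math.cos(angle), y: mag * Math.sin(angle) };
    });
  });
}
```

Calling this once per lifespan-worth of frames, with fitnesses from the just-finished run, is the whole evolutionary loop.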
  • Exploration vs. Exploitation: The fundamental trade-off in genetic algorithms. Exploration (finding new solutions) is achieved through mutation and diversity - exploring the solution space to discover potentially better regions. Exploitation (refining existing solutions) is achieved through selection and crossover - focusing on promising regions to improve solutions. The balance between exploration and exploitation is controlled by parameters: higher mutation rates favor exploration, lower mutation rates favor exploitation; larger populations maintain more diversity (exploration), smaller populations converge faster (exploitation). The optimal balance depends on the problem: complex problems with many local optima need more exploration, while simple problems with clear optima can focus on exploitation. The Smart Rockets simulation demonstrates this trade-off: too much mutation (exploration) creates chaos, too little mutation (exploitation) causes premature convergence to suboptimal solutions.
  • Local Optima vs. Global Optimum: A fundamental challenge in optimization. Local optima are suboptimal solutions that are better than nearby solutions but not the best overall. Global optimum is the best possible solution. Genetic algorithms can get stuck in local optima if the population converges prematurely (all individuals become similar). Mutation helps escape local optima by introducing random changes that can discover better regions. The Smart Rockets simulation demonstrates this: if mutation is too low, the population may converge to a suboptimal strategy (e.g., all rockets try to go straight up and crash into the obstacle), missing the better strategy (going around the obstacle). Higher mutation rates or larger populations help explore more thoroughly and find the global optimum (successful navigation around the obstacle).
  • Emergent Intelligence: The phenomenon where complex intelligent behavior emerges from simple rules and interactions. In the Smart Rockets simulation, there is no explicit intelligence, planning, or pathfinding algorithm - just a list of force vectors (DNA) being refined through evolution. Yet, over generations, the population "learns" to navigate around the obstacle and reach the target. This intelligence emerges from the evolutionary process itself: successful behaviors (DNA patterns) are preserved and refined, while unsuccessful behaviors are eliminated. The simulation demonstrates that intelligence doesn't require explicit programming - it can emerge from simple evolutionary mechanisms. This is a key insight of genetic algorithms and evolutionary computation: complex solutions can be found through iterative improvement of simple components.
  • Applications: Genetic algorithms have applications in many fields: (1) Optimization - function optimization, parameter tuning, design optimization, (2) Machine Learning - neural network architecture search, hyperparameter optimization, feature selection, (3) Robotics - path planning, gait optimization, controller design, (4) Scheduling - job scheduling, resource allocation, timetabling, (5) Game AI - strategy optimization, behavior evolution, (6) Engineering - structural design, antenna design, circuit optimization. The Smart Rockets simulation demonstrates the core concepts that apply to all these applications: encoding solutions as DNA, evaluating fitness, and evolving better solutions through selection, crossover, and mutation. Understanding genetic algorithms provides a powerful tool for solving complex optimization problems where traditional methods may fall short.
  • Practical Considerations: Real-world genetic algorithm applications face several challenges: (1) Fitness Function Design - must accurately reflect the optimization goal, balancing multiple objectives, (2) Parameter Tuning - population size, mutation rate, crossover rate must be tuned for each problem, (3) Convergence Criteria - when to stop evolution (target fitness, maximum generations, convergence detection), (4) Computational Cost - evaluating fitness for large populations can be expensive, (5) Encoding - how to represent solutions as DNA (genotype-phenotype mapping). The Smart Rockets simulation uses a simple encoding (force vectors) and fitness function (distance to target), but real applications may require more sophisticated encodings and multi-objective fitness functions. The simulation demonstrates the fundamental concepts, but real systems must account for these practical considerations.
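The convergence criteria listed in point (3) above can be made concrete. The following is a hedged sketch with assumed threshold values (target fitness 0.95, 20-generation patience window, 500-generation cap); real applications tune these per problem.

```javascript
// Decide whether evolution should stop, given a history of the best
// fitness recorded at each generation. All thresholds are assumed defaults.
function shouldStop(history, { targetFitness = 0.95, patience = 20, maxGenerations = 500 } = {}) {
  const gen = history.length;
  if (gen === 0) return false;
  const best = history[gen - 1];
  if (best >= targetFitness) return true;   // (a) target fitness reached
  if (gen >= maxGenerations) return true;   // (b) generation cap hit
  if (gen > patience) {
    // (c) convergence: no improvement over the last `patience` generations
    const recent = history.slice(-patience);
    if (Math.max(...recent) <= history[gen - 1 - patience]) return true;
  }
  return false;
}
```

A driver loop would call this after each generation and stop evolving (or restart with a higher mutation rate) once it returns true.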