Understanding Hyperparameters

Genetic Algorithms in Elixir — by Sean Moriarity (24 / 101)

The Pragmatic Programmers


In machine learning, hyperparameters are the parts of the algorithm you set before training starts. Internally, the algorithm learns parameters that help it perform a task. Externally, the programmer controls hyperparameters that dictate how the algorithm trains.

In the context of genetic algorithms, hyperparameters refer to things like population size, mutation rate, and so on, that you choose before running the algorithm.
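As a sketch of what this looks like in Elixir, hyperparameters are naturally expressed as a keyword list. The specific names and values below (:population_size, :mutation_rate, :selection_rate) are illustrative assumptions, not the book's final configuration:

```elixir
# Hypothetical hyperparameters for a genetic algorithm, chosen before the run.
# These names and defaults are examples, not part of the framework yet.
hyperparameters = [
  population_size: 100,  # chromosomes per generation
  mutation_rate: 0.05,   # probability an individual is mutated
  selection_rate: 0.8    # fraction of the population selected as parents
]

Keyword.get(hyperparameters, :population_size)
# => 100
```

A keyword list keeps the configuration flat and easy to override, which is exactly what the opts mechanism below takes advantage of.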

Because your hyperparameters can have a huge impact on the outcome of your algorithms, it’s important to be able to change them rapidly. To make that painless, build a simple configuration mechanism into your framework that separates the hyperparameters from the overall structure of the algorithm.

To start, change the signature of both run/3 and evolve/4 to accept an additional parameter:

    def run(genotype, fitness_function, max_fitness, opts \\ []) do
      # ...omitted...
    end

    def evolve(population, fitness_function, max_fitness, opts \\ []) do
      # ...omitted...
    end

opts \\ [] indicates an optional parameter that will default to an empty list if you pass nothing in its place. You can use opts to…
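One common way to read values out of opts is Keyword.get/3, which returns a default when the caller didn’t pass the option. The option name :population_size and its default of 100 below are assumptions for illustration:

```elixir
defmodule OptsSketch do
  # Hypothetical run/4 showing how opts \\ [] lets callers override a
  # hyperparameter while falling back to a default when it's omitted.
  def run(_genotype, _fitness_function, _max_fitness, opts \\ []) do
    population_size = Keyword.get(opts, :population_size, 100)
    population_size
  end
end

OptsSketch.run(nil, nil, nil)
# => 100 (default applies)

OptsSketch.run(nil, nil, nil, population_size: 250)
# => 250 (caller's value wins)
```

Because opts defaults to an empty list, existing call sites keep working unchanged while new ones can tune the algorithm per run.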
