Approximating Polynomials Using CMA-ES

Luca Rolshoven
Published in The Startup · 3 min read · Sep 4, 2020
Cover image: aerial photo of a street that looks like a function curve, by Lucas Clara on Unsplash.

For my upcoming master's thesis, I am currently reading a lot about Evolution Strategies. Recently, I read about the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) [1]. My supervisor suggested that I try the pycma package, a Python implementation of CMA-ES. To get familiar with the package, I built my own example in which I approximate the coefficients of a polynomial. So if you are also looking for an intuitive introduction to pycma, feel free to follow along with this little tutorial.

Getting Started

I will be using numpy to randomly generate the coefficients of the original polynomial that will be approximated using CMA-ES. Moreover, the Python re module and IPython will be used to generate a LaTeX string representation of the polynomial. So let's get started by importing everything we need.
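A minimal set of imports for this, assuming pycma is installed via pip install cma, might look like this:

```python
import re

import cma
import numpy as np
from IPython.display import Math, display
```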

Now let's define some properties. For example, we might want to change the order of the polynomial and the range of its coefficients. I also defined the numpy.random seeds used to generate the coefficients and the evaluation points in the objective function, as well as the number of function evaluations used to compute the RMSE loss.
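A sketch of these properties as constants; POLY_ORDER and NUM_LOSS_EVAL are referenced later in this post, while the remaining names and all concrete values are illustrative assumptions:

```python
POLY_ORDER = 5             # order of the ground-truth polynomial
COEFF_RANGE = (-10, 10)    # coefficients are drawn uniformly from this range
COEFF_SEED = 42            # numpy.random seed for the coefficients (illustrative)
EVAL_SEED = 123            # numpy.random seed for the evaluation points (illustrative)
NUM_LOSS_EVAL = 1000       # number of points used to compute the RMSE loss
```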

Creating and Displaying the Ground Truth Polynomial

We can just generate a polynomial function using numpy:
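A minimal sketch using np.poly1d, which builds a callable polynomial from coefficients ordered from the highest degree down:

```python
# Draw the ground-truth coefficients and build a callable polynomial.
rng = np.random.RandomState(COEFF_SEED)
coefficients = rng.uniform(*COEFF_RANGE, size=POLY_ORDER + 1)
polynomial = np.poly1d(coefficients)  # polynomial(x) evaluates it at x
```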

I ran this small experiment in a Jupyter Notebook, so I wanted a readable way to display the polynomial. I found a solution using IPython, which can be seen below.
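My exact helper lives in the notebook; a simplified version that formats the coefficients, cleans the string up with re, and renders it with IPython could look like this:

```python
def poly_to_latex(coeffs):
    """Build a LaTeX string for a polynomial, highest-degree term first."""
    order = len(coeffs) - 1
    terms = [f"{c:+.2f}x^{{{order - i}}}" for i, c in enumerate(coeffs)]
    latex = "".join(terms)
    latex = re.sub(r"x\^\{1\}", "x", latex)  # write x instead of x^1
    latex = re.sub(r"x\^\{0\}", "", latex)   # drop x^0 from the constant term
    return latex.lstrip("+")                 # no leading plus sign

display(Math(poly_to_latex(coefficients)))
```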

Defining the Loss Function

Now we can define the loss function. It takes the learned coefficients as input and compares the original polynomial with the learned one by evaluating both at a given number of points (in my case NUM_LOSS_EVAL = 1000) and computing the root-mean-square error (RMSE) between the function values.
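A sketch of such a loss function; the evaluation interval [-1, 1] below is an illustrative choice, and the notebook may sample a different range:

```python
# Fixed evaluation points for the loss, drawn once with the evaluation seed.
rng_eval = np.random.RandomState(EVAL_SEED)
eval_points = rng_eval.uniform(-1, 1, size=NUM_LOSS_EVAL)

def loss(learned_coeffs):
    """RMSE between the ground-truth and the learned polynomial."""
    learned_poly = np.poly1d(learned_coeffs)
    errors = polynomial(eval_points) - learned_poly(eval_points)
    return np.sqrt(np.mean(errors ** 2))
```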

Run CMA-ES and Compare Learned Polynomial to the Ground Truth

We are ready to start the optimization process. Initially, all parameters are set to zero and the standard deviation is set to 0.5. I set the verbosity to -3 because otherwise warnings would clutter my Jupyter Notebook. If you want to know more about how to use CMAEvolutionStrategy, please refer to the API documentation. But now, let's run the algorithm.
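In code, this could look roughly as follows; CMAEvolutionStrategy takes the initial point, the initial standard deviation, and an options dictionary:

```python
# Start from all-zero coefficients with initial step size 0.5.
# 'verbose': -3 silences pycma's console output.
es = cma.CMAEvolutionStrategy(np.zeros(POLY_ORDER + 1), 0.5, {'verbose': -3})
es.optimize(loss)
learned_coefficients = es.result.xbest
```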

After some time (about 2 minutes in my case), CMA-ES finishes and we can access the best parameters found via es.result.xbest. To compare the learned coefficients with the original ones, I plotted the two polynomials in the Jupyter Notebook.
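A simple comparison plot, assuming matplotlib is available, could look like this:

```python
import matplotlib.pyplot as plt

xs = np.linspace(-1, 1, 200)
plt.plot(xs, polynomial(xs), label="original")
plt.plot(xs, np.poly1d(learned_coefficients)(xs), "--", label="learned")
plt.xlabel("x")
plt.ylabel("y")
plt.legend()
plt.show()
```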

The result can be seen below. Notice how CMA-ES was able to approximate the polynomial very well. While this little experiment was successful, it would be interesting to test some different scenarios. For example, you could set the order of the learned polynomial higher than the order of the original polynomial. Intuitively, CMA-ES should learn to set the coefficients of the terms whose degree is higher than POLY_ORDER to zero.

Source Code

The Jupyter Notebook with my code can be found here. The repository also contains a PDF file, which is a readable version of the notebook.

References

[1] Hansen, N. and Ostermeier, A. (1996). Adapting arbitrary normal mutation distributions in evolution strategies: The covariance matrix adaptation. In Proceedings of the 1996 IEEE International Conference on Evolutionary Computation, pp. 312–317.


Luca Rolshoven

I'm a Computer Science student currently pursuing my master's degree at the University of Bern, Switzerland, with a specialization in Data Science.