TDS Archive

An archive of data science, data analytics, data engineering, machine learning, and artificial intelligence writing from the former Towards Data Science Medium publication.

The Beauty of Bayesian Optimization, Explained in Simple Terms

The intuition behind an ingenious algorithm

Andre Ye
7 min read · Sep 12, 2020

Here’s a function: f(x). It’s expensive to calculate, not necessarily an analytic expression, and you don’t know its derivative.

Your task: find the global minimum.
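To make the setup concrete, here is a minimal Python sketch of what such a black-box objective looks like to the optimizer; the formula and the sleep are hypothetical stand-ins for a genuinely costly evaluation, such as training a model with hyperparameter x.

```python
import math
import time

def f(x):
    """A hypothetical stand-in for an expensive black-box objective.

    The optimizer sees only input -> output: no analytic expression,
    no derivative. The sleep simulates the cost of each evaluation
    (in practice, a full model-training run or a physical experiment).
    """
    time.sleep(1.0)  # every query is expensive
    return math.sin(3 * x) + 0.1 * (x - 1) ** 2
```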

This is a genuinely difficult task, harder than most other optimization problems in machine learning. Gradient descent, for one, has access to the function’s derivatives and exploits mathematical shortcuts for faster evaluation.
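For contrast, here is a minimal gradient descent sketch on a toy function whose derivative we know in closed form; this is exactly the information our black-box f(x) withholds.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Follow the negative gradient downhill -- only possible
    because grad(x) can be queried directly."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Toy example: minimize g(x) = (x - 3)^2, with derivative 2(x - 3).
print(gradient_descent(lambda x: 2 * (x - 3), x0=0.0))  # approaches 3.0
```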

Alternatively, in some optimization scenarios the function is cheap to evaluate. If we can get hundreds of results for variants of an input x in a few seconds, a simple grid search can be employed with good results.
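When evaluations are that cheap, the whole strategy fits in a few lines; a sketch with a hypothetical inexpensive objective:

```python
import numpy as np

def cheap_f(x):
    # Hypothetical objective: thousands of calls cost almost nothing.
    return np.sin(3 * x) + 0.1 * (x - 1) ** 2

grid = np.linspace(-2.0, 4.0, 1_000)   # dense grid over the search range
values = cheap_f(grid)                 # evaluate everywhere at once
print(grid[values.argmin()], values.min())
```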

Or, a whole host of unconventional, gradient-free optimization methods can be used, like particle swarm optimization or simulated annealing.
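As one illustration, here is a minimal simulated annealing sketch (a generic derivative-free optimizer, not anything specific to this article): it proposes random moves, always accepts improvements, and occasionally accepts worse moves while the “temperature” is high, which helps it escape local minima.

```python
import math
import random

def simulated_annealing(f, x0, temp=1.0, cooling=0.995, steps=2000):
    x, fx = x0, f(x0)
    best_x, best_fx = x, fx
    for _ in range(steps):
        candidate = x + random.gauss(0.0, temp)   # random local move
        fc = f(candidate)
        # Metropolis rule: always accept improvements; accept worse
        # moves with probability exp((fx - fc) / temp).
        if fc < fx or random.random() < math.exp((fx - fc) / temp):
            x, fx = candidate, fc
            if fx < best_fx:
                best_x, best_fx = x, fx
        temp *= cooling   # gradually shrink the exploration
    return best_x, best_fx

print(simulated_annealing(lambda x: math.sin(3 * x) + 0.1 * (x - 1) ** 2, x0=0.0))
```

Note that even this sketch queries f thousands of times, which brings us back to our constraints.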

Unfortunately, the current task has none of these luxuries. Our optimization is constrained on several fronts, notably:

  • It’s expensive to calculate. Ideally we would be able to query the function enough to essentially replicate it, but our optimization method must work with a limited sampling of inputs.
  • The derivative is…
