Hyperopt tutorial for Optimizing Neural Networks’ Hyperparameters

What is Hyperopt?
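Hyperopt is a Python library for optimizing over awkward search spaces: real-valued, discrete, and conditional dimensions alike. It ships two search algorithms, random search and the Tree-structured Parzen Estimator (TPE), and exposes a single entry point, fmin. A minimal sketch (the objective and bounds here are placeholders, not part of this tutorial's experiments):

from hyperopt import fmin, tpe, hp

# Minimize a toy objective over a single real-valued parameter.
best = fmin(
    fn=lambda x: x ** 2,             # objective to minimize
    space=hp.uniform('x', -10, 10),  # search space for the parameter
    algo=tpe.suggest,                # Tree-structured Parzen Estimator
    max_evals=100,                   # trial budget
)
print(best)  # a dict such as {'x': ...}, with x close to 0

fmin returns the best parameter values found, as a dict keyed by the labels given to the hp expressions.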

How to define Hyperopt parameters?

  • hp.randint(label, upper)
  • hp.uniform(label, low, high)
  • hp.loguniform(label, low, high)
  • hp.normal(label, mu, sigma)
  • hp.lognormal(label, mu, sigma)
  • hp.quniform(label, low, high, q)
  • hp.qloguniform(label, low, high, q)
  • hp.qnormal(label, mu, sigma, q)
  • hp.qlognormal(label, mu, sigma, q)
  • hp.choice(label, ["list", "of", "potential", "choices"])
  • hp.choice(label, [hp.uniform(sub_label_1, low, high), hp.normal(sub_label_2, mu, sigma), None, 0, 1, "anything"])
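To see what any of these expressions actually generates, you can draw samples from it directly with hyperopt.pyll.stochastic.sample; this is also how histograms like the ones below can be produced. A quick sketch, with illustrative labels and bounds:

import numpy as np
from hyperopt import hp
from hyperopt.pyll.stochastic import sample

# A few of the parameter expressions listed above, with made-up bounds.
space = {
    'learning_rate': hp.loguniform('learning_rate', np.log(1e-5), np.log(1e-1)),
    'hidden_units': hp.quniform('hidden_units', 32, 512, 32),
    'activation': hp.choice('activation', ['relu', 'tanh']),
}

# Draw a handful of samples to inspect what each expression produces.
for _ in range(5):
    print(sample(space))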
[Figures: histograms of samples drawn from the Hyperopt randint, uniform, loguniform, normal, and lognormal distributions]

On the loguniform and lognormal distributions
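The detail that trips people up: for hp.loguniform(label, low, high), the returned value is exp(uniform(low, high)), so low and high are the logarithms of the bounds you actually want; likewise, hp.lognormal returns the exponential of a normal draw. A quick check (the bounds are illustrative):

import numpy as np
from hyperopt import hp
from hyperopt.pyll.stochastic import sample

# To get values between 1e-4 and 1e-1, pass the logs of those bounds.
lr_space = hp.loguniform('lr', np.log(1e-4), np.log(1e-1))

samples = [sample(lr_space) for _ in range(10000)]
print(min(samples), max(samples))  # both fall within [1e-4, 1e-1]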

[Figures: histograms of the Hyperopt loguniform and lognormal distributions, each shown alongside its inverse]

Example: finding the minimum of f(x) = x^2 - x + 1

Since f'(x) = 2x - 1 vanishes at x = 0.5, the true minimum sits at x = 0.5, which is what the search should recover.

[Figure: plot of the quadratic function f(x) = x^2 - x + 1]
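A sketch of the kind of fmin call that produces the result printed below; the search interval and the choice of tpe.suggest are assumptions:

from hyperopt import fmin, tpe, hp

def f(x):
    # The quadratic from the figure above; its true minimum is at x = 0.5.
    return x ** 2 - x + 1

best = fmin(
    fn=f,
    space=hp.uniform('x', -5, 5),  # assumed search interval
    algo=tpe.suggest,
    max_evals=1000,                # matches the 1000 trials reported below
)
print('Found minimum after 1000 trials:')
print(best)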
Found minimum after 1000 trials:
{'x': 0.500084824485627}

Example with a dict hyperparameter space
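Hyperopt accepts a plain Python dict as a search space; the objective then receives a dict of sampled values keyed the same way. A minimal sketch with made-up parameter names:

from hyperopt import fmin, tpe, hp

# The space is an ordinary dict whose leaves are parameter expressions.
space = {
    'x': hp.uniform('x', -5, 5),
    'y': hp.uniform('y', -5, 5),
}

def objective(params):
    # params arrives as a dict, e.g. {'x': 0.42, 'y': -1.3}
    return params['x'] ** 2 + params['y'] ** 2

best = fmin(fn=objective, space=space, algo=tpe.suggest, max_evals=200)
print(best)  # a dict with 'x' and 'y' both near 0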

With hp.choice, Hyperopt hyperspaces can also be represented as nested data structures, as in the sketch below.
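Since hp.choice accepts arbitrary values as options, including other parameter expressions, whole branches of a space can be conditional: the inner parameters are only sampled when their branch is chosen. A hypothetical sketch:

from hyperopt import hp
from hyperopt.pyll.stochastic import sample

# A nested, conditional space: 'momentum' only exists when the
# 'sgd' branch is picked, 'beta1' only under the 'adam' branch.
space = hp.choice('optimizer', [
    {
        'type': 'sgd',
        'momentum': hp.uniform('momentum', 0.0, 0.99),
    },
    {
        'type': 'adam',
        'beta1': hp.uniform('beta1', 0.8, 0.999),
    },
    None,  # plain values are valid choices too
])

print(sample(space))  # a dict from one branch, or None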

Let’s now record the history of every trial
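Passing a hyperopt.Trials object to fmin makes it record every evaluated point, its loss, and its status, all of which can be inspected after the run. A sketch reusing the quadratic from earlier:

from hyperopt import fmin, tpe, hp, Trials, STATUS_OK

def objective(x):
    # Returning a dict lets each trial carry extra information.
    return {'loss': x ** 2 - x + 1, 'status': STATUS_OK}

trials = Trials()
best = fmin(
    fn=objective,
    space=hp.uniform('x', -5, 5),
    algo=tpe.suggest,
    max_evals=100,
    trials=trials,  # the full history ends up in here
)

print(trials.best_trial['result']['loss'])  # best loss found
print(trials.losses())                      # loss of every trial, in order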

Up next: saving results to disk while optimizing, so that a stopped hyperparameter search can be resumed later.

Hyperspace Scatterplot Matrix
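A scatterplot matrix like this can be drawn with pandas from a finished search. A sketch, assuming a flat (non-conditional) space and the trials object from the previous sketch; trials.vals maps each parameter label to the list of values it took across trials:

import pandas as pd
import matplotlib.pyplot as plt

# One column per hyperparameter, plus the loss of each trial.
df = pd.DataFrame(dict(trials.vals))
df['loss'] = trials.losses()

# One scatterplot per pair of dimensions, histograms on the diagonal.
pd.plotting.scatter_matrix(df, figsize=(8, 8), diagonal='hist')
plt.show()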
