Nevergrad, an evolutionary optimization platform, adds new key features

4/13/2020

What is it:

We have added a range of noteworthy new features to Nevergrad, Facebook AI’s open source Python 3 library for derivative-free and evolutionary optimization. These enhancements enable researchers and engineers to work with several objectives at once (multi-objective optimization) or with constraints. These uses are common in natural language processing, for example, where a translation model may be optimized on multiple metrics or benchmarks simultaneously. Because Nevergrad offers cutting-edge algorithms through an easy-to-use, open source Python library, anyone can use it to easily test and compare different approaches to a particular problem, or use well-known benchmarks to evaluate how a method compares with the current state of the art. To further improve Nevergrad, we have partnered with IOH Profiler to create the Open Optimization Competition. It is open to submissions for both new optimization algorithms and improvements to Nevergrad’s core tools. Entries must be submitted before September 30 to be eligible for prizes, and more information is available here.

What it does:

Nevergrad is an easy-to-use optimization toolbox for AI researchers, including those who aren’t Python geeks. Optimizing any function takes only a couple of lines of code:

import nevergrad as ng

def square(x):
    return sum((x - .5)**2)

optimizer = ng.optimizers.OnePlusOne(parametrization=2, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value)  # best parameter value found
>>> array([0.500, 0.499])

The platform provides a single, consistent interface to a wide range of derivative-free algorithms, including evolution strategies, differential evolution, particle swarm optimization, Cobyla, and Bayesian optimization. It also facilitates research on new derivative-free optimization methods, and novel algorithms can be easily incorporated.
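
As a hedged illustration of this single interface, the sketch below runs the same toy problem with several registered optimizers. It assumes a recent Nevergrad release, in which ng.optimizers.registry maps algorithm names to optimizer classes; the square function is purely illustrative.

import nevergrad as ng

def square(x):
    return sum((x - .5) ** 2)

# Every registered algorithm exposes the same construct-then-minimize interface,
# so swapping methods is a one-line change.
for name in ["OnePlusOne", "TwoPointsDE", "PSO", "CMA"]:
    optimizer = ng.optimizers.registry[name](parametrization=2, budget=100)
    recommendation = optimizer.minimize(square)
    print(name, recommendation.value)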

Through a joint effort with IOH and with input from researchers at the black-box optimization meeting at Dagstuhl, we have made several noteworthy improvements to Nevergrad:

  • Multi-objective optimization.

  • Constrained optimization (a minimal sketch follows this list).

  • Simplified problem parametrization. Specifying a log-distributed variable between 0.001 and 1.0 is just ng.p.Log(lower=0.001, upper=1.0); a fuller example follows this list.

  • Competence map optimizers. We provide algorithms to automatically select the best optimization method, taking into account your computational budget, the dimension, the type of variables, and the degree of parallelism.

This graphic shows results on a very diverse set of example problems. Numbers and colors correspond to the probability that an optimizer outperforms the other algorithms. Algorithms (NGO, Shiva, CMA, etc.) are ranked by performance, with the best on the left; the six best are also listed on the left. NGO and Shiva, the two highest-performing optimizers in this test bed, are handcrafted competence map optimization methods designed to be extremely versatile.

  • Tools for chaining optimization algorithms and for decomposing problems into several subproblems by assigning distinct variables to different optimizers.

  • Interface with HiPlot, Facebook AI’s lightweight interactive visualization tool. This allows researchers to easily explore the optimization process or to use an interactive plot in a Jupyter notebook to observe the behavior of very different algorithms. A logging sketch also follows this list.

Example of optimizing a 2D sum-of-absolute-values function with its optimum at (100, 100) and an initial guess around (0, 0) with standard deviation 1. Different algorithms adapt differently to reach an optimum far from the initial guess (blue for early iterations, red for later ones).
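
To make the parametrization bullet above concrete, here is a minimal sketch, assuming a recent Nevergrad release; the fake_training objective and its argument names are made up for illustration, while ng.p.Instrumentation, ng.p.Log, ng.p.Scalar, and ng.p.Choice are the building blocks described above.

import nevergrad as ng

# Hypothetical tuning objective: the argument names and scoring rule are made up.
def fake_training(learning_rate, batch_size, architecture):
    return learning_rate * batch_size + (0.0 if architecture == "conv" else 1.0)

parametrization = ng.p.Instrumentation(
    learning_rate=ng.p.Log(lower=0.001, upper=1.0),                     # log-distributed float
    batch_size=ng.p.Scalar(lower=8, upper=128).set_integer_casting(),   # bounded integer
    architecture=ng.p.Choice(["conv", "fc"]),                           # categorical choice
)
optimizer = ng.optimizers.OnePlusOne(parametrization=parametrization, budget=100)
recommendation = optimizer.minimize(fake_training)
print(recommendation.kwargs)  # best keyword arguments found

Constraints attach to the parametrization as well. The sketch below registers a cheap constraint on a plain array parameter; register_cheap_constraint is the documented hook, and the specific constraint is invented for the example.

import nevergrad as ng

def square(x):
    return sum((x - .5) ** 2)

# Candidates are considered feasible only if their two coordinates sum to at least 1.
parametrization = ng.p.Array(shape=(2,))
parametrization.register_cheap_constraint(lambda x: sum(x) >= 1)

optimizer = ng.optimizers.OnePlusOne(parametrization=parametrization, budget=100)
recommendation = optimizer.minimize(square)
print(recommendation.value)

Finally, a rough sketch of the HiPlot interface, assuming a recent Nevergrad release in which ng.callbacks.ParametersLogger records evaluated candidates and can export them as a HiPlot experiment; the file path and budget are arbitrary.

import nevergrad as ng

def square(x):
    return sum((x - .5) ** 2)

optimizer = ng.optimizers.OnePlusOne(parametrization=2, budget=100)
# Log every evaluated candidate, then browse the run interactively with HiPlot.
logger = ng.callbacks.ParametersLogger("./nevergrad_log.json")
optimizer.register_callback("tell", logger)
optimizer.minimize(square)

exp = logger.to_hiplot_experiment()
exp.display()  # renders an interactive parallel-coordinates plot in a Jupyter notebook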

As an additional experimental feature, we regularly compare optimizers’ performance and publish the results here. AI researchers can easily extend Nevergrad with new benchmarks or optimizers and run them locally, or open a pull request on GitHub to have their contribution merged and included in these automated tests.

Why it matters:

Most machine learning tasks — from natural language processing to image classification to translation and many others — rely on derivative-free optimization to tune parameters and/or hyperparameters in their models. Nevergrad makes it easy for researchers and engineers to find the best way to do this and to develop new and better techniques.

Multi-objective optimization (detailed in this example in Nevergrad) is common in everyday life. For instance, someone shopping for a product may want options that are simultaneously cheap, nearby, relevant, and high quality.
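
Here is a minimal sketch of how this looks in code, assuming a recent Nevergrad release in which returning a list of losses switches the optimizer into multi-objective mode and pareto_front() exposes the resulting trade-offs; the two toy losses are made up.

import nevergrad as ng

# Toy bi-objective problem: stay close to 0 on one loss and close to 1 on the other.
def multi_loss(x):
    return [float(sum(x ** 2)), float(sum((x - 1) ** 2))]

optimizer = ng.optimizers.DE(parametrization=2, budget=300)
optimizer.minimize(multi_loss)

# Inspect candidates currently on the Pareto front (different trade-offs between the two losses).
for candidate in optimizer.pareto_front():
    print(candidate.value)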

Since its initial release, Nevergrad has become a widely used research tool. The new features we are now sharing enable work on additional use cases, such as multi-agent power systems, physics (photonics or antireflective coatings), and control in games. Through the new parametrization system, Nevergrad’s generic algorithms can also adapt better to the structure of a particular problem, for example by using problem-specific mutations or recombinations in evolutionary algorithms.
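
As a hedged illustration of such structure-aware settings, the snippet below assigns different mutation scales to two variables through the parametrization; it assumes a recent Nevergrad release where set_mutation(sigma=...) is available on scalar parameters, and the toy objective is made up.

import nevergrad as ng

def toy_loss(x, y):
    return (x - 3.0) ** 2 + (y - 0.01) ** 2

parametrization = ng.p.Instrumentation(
    x=ng.p.Scalar(init=0.0).set_mutation(sigma=1.0),   # coarse variable: wide mutation steps
    y=ng.p.Scalar(init=0.0).set_mutation(sigma=0.01),  # sensitive variable: fine-grained steps
)
optimizer = ng.optimizers.OnePlusOne(parametrization=parametrization, budget=200)
recommendation = optimizer.minimize(toy_loss)
print(recommendation.kwargs)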

Get it here:

GitHub: https://github.com/facebookresearch/nevergrad

Documentation: https://facebookresearch.github.io/nevergrad/index.html

PyPI: pip install nevergrad