Find the best black-box optimizer for machine learning.

The submission site was open July 1, 2020 - October 15, 2020.


The competition has finished!

We were impressed by how many teams pushed the boundaries of state-of-the-art methods. Special congratulations go to: Huawei Noah’s Ark Lab, NVIDIA RAPIDS.AI, JetBrains Research, Duxiaoman DI, and Optuna Developers (Preferred Networks & CyberAgent).

Beat off-the-shelf black-box optimizers

The challenge will give participants 3 months to iterate on their algorithms. We will use a benchmark system built on top of the AutoML challenge workflow and the Bayesmark package, which evaluates black-box optimization algorithms on real-world objective functions. For example, it will include the tuning (validation set) performance of standard machine learning models on real data sets. This competition has widespread impact, as black-box optimization (e.g., Bayesian optimization) is relevant for hyper-parameter tuning in almost every machine learning project (especially deep learning), as well as in many applications outside of machine learning. The leaderboard will be determined by optimization performance on held-out (hidden) objective functions, where the optimizer must run without human intervention. Baselines will be set using the default settings of six open-source black-box optimization packages and random search.
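To make the evaluation setup concrete, benchmarks of this kind typically drive a submitted optimizer through an ask/tell (suggest/observe) loop: the harness asks for candidate configurations, evaluates the hidden objective, and reports the results back. The sketch below shows a minimal random-search optimizer written against such an interface; the class and method names are illustrative assumptions, not the official Bayesmark API.

```python
import random

class RandomOptimizer:
    """Minimal random-search optimizer with a suggest/observe interface.

    Illustrative sketch of the ask/tell style used by black-box
    optimization benchmarks; the competition's actual harness API
    may differ in names and details.
    """

    def __init__(self, bounds, seed=0):
        # bounds: dict mapping parameter name -> (low, high) for continuous params
        self.bounds = bounds
        self.rng = random.Random(seed)
        self.history = []  # list of (params, objective value) pairs

    def suggest(self, n_suggestions=1):
        # Propose candidate configurations uniformly at random within bounds.
        return [
            {name: self.rng.uniform(lo, hi) for name, (lo, hi) in self.bounds.items()}
            for _ in range(n_suggestions)
        ]

    def observe(self, params_list, values):
        # Record evaluated configurations; a smarter optimizer would fit a model here.
        self.history.extend(zip(params_list, values))

    def best(self):
        # Return the best (params, value) pair seen so far (minimization).
        return min(self.history, key=lambda pv: pv[1])


# Example: tune x to minimize a toy objective (x - 0.3)^2 on [0, 1].
opt = RandomOptimizer({"x": (0.0, 1.0)}, seed=42)
for _ in range(20):
    candidates = opt.suggest(2)
    values = [(c["x"] - 0.3) ** 2 for c in candidates]
    opt.observe(candidates, values)
best_params, best_val = opt.best()
```

Random search is exactly the kind of baseline the competition asks participants to beat: any model-based method (e.g., Bayesian optimization) can use the same `observe` history to propose better candidates.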


A sample-efficient approach for derivative-free optimization

Bayesian optimization is a popular sample-efficient approach for derivative-free optimization of objective functions that take several minutes or hours to evaluate. Bayesian optimization builds a surrogate model (often a Gaussian process) for the objective function that provides a measure of uncertainty. An acquisition function defined on this surrogate model then determines the most promising point to evaluate next.
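The loop described above can be sketched end to end: fit a Gaussian-process surrogate to the observations, score candidate points with an acquisition function (here, expected improvement), and evaluate the maximizer. This is a minimal NumPy implementation on a toy 1-D objective, with a fixed RBF kernel length scale chosen for illustration; real systems optimize the kernel hyperparameters as well.

```python
import math
import numpy as np

def rbf_kernel(A, B, length_scale=0.2):
    # Squared-exponential kernel between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # Standard GP regression: posterior mean and std at test points Xs.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(1.0 - (v * v).sum(0), 1e-12, None)  # prior variance is 1
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    # EI for minimization: E[max(best - f, 0)] under the GP posterior.
    z = (best - mu) / sigma
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    cdf = 0.5 * (1 + np.vectorize(math.erf)(z / math.sqrt(2)))
    return (best - mu) * cdf + sigma * pdf

# Toy run: minimize f(x) = (x - 0.3)^2 over a grid of candidate points.
f = lambda x: (x - 0.3) ** 2
grid = np.linspace(0, 1, 201)[:, None]
X = np.array([[0.0], [1.0]])          # two initial evaluations
y = f(X).ravel()
for _ in range(10):
    mu, sigma = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sigma, y.min()))]
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next)[0])
# y.min() now approximates the global minimum at x = 0.3.
```

The acquisition function trades off exploitation (low posterior mean) against exploration (high posterior uncertainty), which is why Bayesian optimization needs far fewer evaluations than random search on expensive objectives.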

Bayesian optimization has many applications, with hyperparameter tuning of machine learning models (e.g., deep neural networks) being one of the most popular. However, the choice of surrogate model and acquisition function is problem-dependent, and the goal of this challenge is to compare different approaches across a large number of problems. This challenge focuses on the application of Bayesian optimization to tuning the hyper-parameters of machine learning models.


Start July 1

  • Submission site open to public: July 1
  • Deadline for final submissions: October 15
  • Announcement of winners: November 15
  • Presentations, short papers, and code release from winners due: December 1
  • Competition track (virtual) award ceremony: December 2020


$15,000 (USD) Total

  • First place: $6,000
  • Second place: $4,000
  • Third place: $3,000
  • Fourth place: $1,000
  • Fifth place: $1,000

Contact:  info@bbochallenge.com