# TEST MARKDOWN

This tutorial helps you run your first example with OpenBox.

## test ref

This page uses sphinx.ext.autosectionlabel with autosectionlabel_prefix_document = True.
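A minimal conf.py sketch of that setup (these are documented Sphinx/myst-parser options, not OpenBox code):

```python
# conf.py -- minimal sketch of the setup this page assumes
extensions = [
    'myst_parser',                  # Markdown support
    'sphinx.ext.autosectionlabel',  # auto-generate a label for every section
]
# prefix each auto-generated label with its document path to avoid collisions
autosectionlabel_prefix_document = True
```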

- use myst-parser ref syntax: design principle
- use markdown link with autosectionlabel: Design Principle
- use markdown link with anchor (myst_heading_anchors = 3): Design Principle
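The anchor variant relies on the documented myst-parser option; a one-line conf.py sketch:

```python
# conf.py -- auto-generate slug-style anchors for headings up to level 3
myst_heading_anchors = 3
```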

- test modal: transfer learning
- test math: transfer learning
- test tabs: inline-tabs

## test tabs

https://sphinx-design.readthedocs.io/en/furo-theme/tabs.html

Source:

```markdown
Inline: {{ key1 }}

Block level:

{{ key2 }}

| col1     | col2     |
| -------- | -------- |
| {{key2}} | {{key3}} |
```

Rendered:

Inline: I’m a substitution

Block level:

> **Note:** I’m a substitution

| col1                         | col2  |
| ---------------------------- | ----- |
| **Note:** I’m a substitution | fishy |
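The substitution values live in conf.py; a hypothetical sketch reconstructed from the rendered output above (the exact key2/key3 definitions are assumptions):

```python
# conf.py -- hypothetical substitution definitions matching the output above
myst_enable_extensions = ["substitution"]
myst_substitutions = {
    "key1": "I'm a substitution",
    # key2/key3 are assumed to be block-level content
    # (e.g. a note admonition and an image with alt text "fishy")
}
```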

## test image

the bare markdown will show the image, but with no align or scale:

the html image outside source root is not shown:

try html_image in myst-parser (succeeds with align and scale):

(image: _images/ab_testing.png, rendered with align and scale)

this syntax extension in MyST will show the image correctly, but normal markdown users cannot see the picture (e.g. on GitHub):

(image: ab_test)

try another:

(figure: ab_test, caption: "This is a caption in Markdown")

try inline attributes (experimental; requires attrs_image in myst_enable_extensions):

(image: image attrs)

a reference to the image
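As noted above, inline attributes require the experimental attrs_image extension; a one-line conf.py sketch:

```python
# conf.py -- enable the experimental attrs_image extension used above
myst_enable_extensions = ["attrs_image"]
```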

## test formula

\[ f(x)=1 \]

Inline: \( h(x)=3 \)

\[ g(x)=2 \]

\[
\begin{gather*}
a_1 = b_1 + c_1 \\
a_2 = b_2 + c_2 - d_2 + e_2
\end{gather*}
\]

\[
\begin{align}
a_{11} &= b_{11} & a_{12} &= b_{12} \\
a_{21} &= b_{21} & a_{22} &= b_{22} + c_{22}
\end{align}
\tag{1}
\]

## test linkify

www.baidu.com
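Bare URLs like the one above are auto-linked by the linkify extension; a one-line conf.py sketch (in practice, merge all extensions into a single list):

```python
# conf.py -- auto-link bare URLs such as www.baidu.com
myst_enable_extensions = ["linkify"]
```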

## Define Configuration Space

First, define a configuration space using ConfigSpace for searching.

```python
from openbox.utils.config_space import ConfigurationSpace, UniformFloatHyperparameter

# Define Configuration Space
config_space = ConfigurationSpace()
x1 = UniformFloatHyperparameter("x1", -5, 10, default_value=0)
x2 = UniformFloatHyperparameter("x2", 0, 15, default_value=0)
config_space.add_hyperparameters([x1, x2])
```

In this example, we create a ConfigurationSpace and then add two UniformFloatHyperparameters to it. The parameter x1 ranges from -5 to 10, and x2 ranges from 0 to 15.
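To sanity-check the space, you can inspect its default configuration or draw a random one; a minimal sketch using standard ConfigSpace methods:

```python
# Illustrative checks of the space defined above
default_config = config_space.get_default_configuration()
print(default_config)  # x1 and x2 both at their default_value of 0
random_config = config_space.sample_configuration()  # a random point within the ranges
```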

Other types of hyperparameters are also supported in ConfigSpace. Here are examples of how to define Integer and Categorical hyperparameters:

```python
from openbox.utils.config_space import UniformIntegerHyperparameter, CategoricalHyperparameter

i = UniformIntegerHyperparameter("i", 0, 100)
kernel = CategoricalHyperparameter("kernel", ["rbf", "poly", "sigmoid"], default_value="rbf")
```

For advanced usage of ConfigSpace, please refer to ConfigSpace’s documentation.
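For example, ConfigSpace supports conditional hyperparameters. Below is a minimal sketch; the gamma parameter is illustrative, and EqualsCondition is imported from the underlying ConfigSpace package that OpenBox depends on:

```python
from ConfigSpace.conditions import EqualsCondition
from openbox.utils.config_space import (
    ConfigurationSpace, CategoricalHyperparameter, UniformFloatHyperparameter,
)

cs = ConfigurationSpace()
kernel = CategoricalHyperparameter("kernel", ["rbf", "poly", "sigmoid"], default_value="rbf")
gamma = UniformFloatHyperparameter("gamma", 1e-4, 10, log=True)  # illustrative child parameter
cs.add_hyperparameters([kernel, gamma])
# gamma is only active (and only sampled) when kernel == "rbf"
cs.add_condition(EqualsCondition(gamma, kernel, "rbf"))
```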

## Define Objective Function

Second, define the objective function to be optimized. Note that OpenBox aims to minimize the objective function. Here we use the Branin function.

```python
import numpy as np
from openbox.utils.config_space import Configuration

# Define Objective Function
def branin(config: Configuration):
    # convert Configuration to dict
    config_dict = config.get_dictionary()
    x1 = config_dict['x1']
    x2 = config_dict['x2']

    # calculate
    a = 1.
    b = 5.1 / (4. * np.pi ** 2)
    c = 5. / np.pi
    r = 6.
    s = 10.
    t = 1. / (8. * np.pi)
    y = a * (x2 - b * x1 ** 2 + c * x1 - r) ** 2 + s * (1 - t) * np.cos(x1) + s

    # return result dictionary
    ret = dict(
        objs=(y, )
    )
    return ret
```

The input of the objective function is a Configuration object sampled from the ConfigurationSpace we defined above. Call config.get_dictionary() to convert the Configuration to a Python dict.

After evaluation, the objective function should return a dict (recommended). The result dict should contain:

- 'objs': a list/tuple of objective values (to be minimized). In the example above, we have a single objective, so we return a tuple containing a single value.

- 'constraints': a list/tuple of constraint values. If the problem is unconstrained, return None or omit this key from the dict. Constraint values no greater than zero ("<= 0") imply feasibility.

In addition to the recommended usage, for a single-objective problem with no constraints, returning a single value is also supported.
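Before launching a full run, you can smoke-test the objective on one random configuration; a minimal sketch reusing the config_space and branin defined above:

```python
# Evaluate branin once on a random configuration (illustrative smoke test)
config = config_space.sample_configuration()
result = branin(config)
print(result)  # {'objs': (<float>,)} -- the exact value depends on the sampled point
```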

## Run Optimization

After defining the configuration space and the objective function, we can run the optimization process, searching over the configuration space for the minimum value of the objective.

```python
from openbox.optimizer.generic_smbo import SMBO

# Run Optimization
bo = SMBO(branin,
          config_space,
          num_objs=1,
          num_constraints=0,
          max_runs=50,
          surrogate_type='gp',
          time_limit_per_trial=180,
          task_id='quick_start')
history = bo.run()
```

Here we simply create a SMBO object, passing the objective function branin and the configuration space config_space to it.

- num_objs=1 and num_constraints=0 indicate that our branin function returns a single objective value with no constraint.

- max_runs=50 means the optimization will take 50 rounds (50 evaluations of the objective function).

- surrogate_type='gp': for mathematical problems, we suggest using Gaussian Process ('gp') as the Bayesian surrogate model. For practical problems such as hyperparameter optimization (HPO), we suggest using Random Forest ('prf'); a sketch follows below.

- time_limit_per_trial sets the time budget (in seconds) for each objective function evaluation. Once an evaluation exceeds this limit, it is treated as a failed trial.

- task_id is set to identify the optimization process.

Then, call bo.run() to start the optimization process and wait for the result.
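For comparison, an HPO-style setup keeps the same SMBO call and swaps the surrogate, as suggested above. The parameter values below are illustrative, and in practice branin would be replaced by your real training/validation function:

```python
# Hypothetical HPO-style setup: same SMBO API, Random Forest surrogate
bo_hpo = SMBO(branin,                    # placeholder: substitute your HPO objective
              config_space,
              num_objs=1,
              num_constraints=0,
              max_runs=100,              # illustrative evaluation budget
              surrogate_type='prf',      # Random Forest surrogate suggested for HPO
              time_limit_per_trial=600,  # illustrative per-trial budget (seconds)
              task_id='hpo_example')
```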

## Observe Optimization Results

At the end of the previous stage, bo.run() returns the optimization history. Calling bo.get_history() will also return it.

Call print(history) to see the result:

```python
print(history)
```

```
+---------------------------------------------+
| Parameters              | Optimal Value     |
+-------------------------+-------------------+
| x1                      | -3.138277         |
| x2                      | 12.254526         |
+-------------------------+-------------------+
| Optimal Objective Value | 0.398096578033325 |
+-------------------------+-------------------+
| Num Configs             | 50                |
+-------------------------+-------------------+
```

Call history.plot_convergence() to see the optimization progress (you may need to call plt.show() to display the graph):

```python
history.plot_convergence(true_minimum=0.397887)
```
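If you are running outside an interactive environment, a minimal sketch pairing the call with plt.show(), as noted above:

```python
import matplotlib.pyplot as plt

# Plot convergence against the known optimum of the Branin function
history.plot_convergence(true_minimum=0.397887)
plt.show()  # display the figure in non-interactive environments
```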

In a Jupyter Notebook environment, call history.visualize_hiplot() to visualize the trials using HiPlot:

```python
history.visualize_hiplot()
```
