In Optimizer mode, the Measurement program will try to minimize the value of an expression based on the measured channels instead of looping through the step channels in a pre-determined sequence. This can be useful when the goal of the experiment is to minimize a certain quantity as opposed to mapping out the value of the quantity over the full parameter space.
6.1. Optimizer operation¶
To enable the optimizer, double-click on one of the Step items in the Step sequence list in the Measurement program, switch to “Basic settings” (if not already in that mode), then click the “Optimize...” button to convert the step item to an optimizer parameter. Instead of sweeping over the parameter, Labber will try to optimize the cost function (see below) by varying the parameter over the range specified by the “Min value” and “Max value” controls in the dialog. The various options in the dialog are described in more detail in Section ParameterSettings below.
6.1.1. Cost function¶
The next step is to define the cost function and the general settings of the optimizer. These options are available by clicking the “Show Settings” toolbar icon in the top-left corner of the Measurement dialog, and clicking Optimizer in the section list in the left part of the dialog. The most important setting is the optimizer cost function, which is defined by the expression in the “Minimization function” control. The cost function takes the latest measured values of the log channels as inputs and must return a single scalar value. The optimizer algorithm will then try to minimize the value of the cost function by iteratively varying the various optimizer parameters.
The inputs available to the cost function are the latest values of the measured log channels, provided in the list y. Each element in the list corresponds to a channel, and the order of the elements is the same as the order in which the log channels appear in the Measurement Editor. If you are using a single log channel, its value can be accessed as y[0]. However, note that y may be scalar or vector-valued, depending on whether the particular log channel returns a trace or a single value. For the optimizer to work, the cost function must always return a scalar, so if your log channel is vector-valued you need to apply some operation to convert the vector to a scalar. For example, mean(y) would optimize with respect to the mean of the measured trace. In addition to y, the vector x with the latest values of the optimizer parameters is also available as an input to the minimization function. You can use any Python and numpy expression when defining the cost function.
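As an illustration of how such expressions reduce measured data to a scalar cost, the sketch below uses hypothetical example data: y[0] stands in for a scalar log channel and y[1] for a vector-valued (trace) channel.

```python
import numpy as np

# Hypothetical example data mimicking the inputs passed to the cost
# function: y[0] is a scalar log channel, y[1] a vector (trace) channel.
y = [0.42, np.linspace(-1.0, 1.0, 101)]
x = np.array([1.5, -0.3])  # latest optimizer parameter values

# Scalar channel: the value itself can serve as the cost
cost_scalar = y[0]

# Vector channel: reduce the trace to a single number first
cost_mean = np.mean(y[1])         # optimize the average of the trace
cost_peak = np.max(np.abs(y[1]))  # or its largest absolute value

print(cost_scalar, cost_mean, cost_peak)
```

Any of these reductions is a valid minimization-function expression, as long as the result is a single scalar.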
6.1.2. Termination and convergence criteria¶
There are three possible criteria for defining when the optimizer should terminate the optimization process.
- Absolute target reached:
- If the value of the cost function is less than the Target value, the optimizer will terminate.
- Relative tolerance reached:
- If the change in the cost function between calls is smaller than the value given by the “Relative tolerance” setting, AND if the change in each optimizer parameter value between calls is smaller than that parameter's “Precision” setting, the optimizer will terminate. Note that both criteria need to be fulfilled for termination.
- Max number of evaluations reached:
- The optimizer will automatically terminate after performing the number of measurements specified by “Max evaluations”.
By default, the “Target value” is set to minus infinity, which means that it will never terminate the optimizer. In addition, the “Relative tolerance” is set to infinity by default, which means that only the “Precision” of the individual optimizer parameters matter for relative convergence.
Note that the termination/convergence criteria may differ between optimizer algorithms; the description above only refers to the default Nelder-Mead optimizer provided by Labber.
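As a rough sketch, the three criteria above could be combined as follows. The function name and signature are illustrative, not Labber's internal API; the default values mirror the defaults described above (target of minus infinity, infinite relative tolerance).

```python
import numpy as np

def should_terminate(cost, prev_cost, x, prev_x, n_evals,
                     target=-np.inf, rel_tol=np.inf,
                     precision=None, max_evals=100):
    """Illustrative sketch of the three termination criteria."""
    # 1. Absolute target reached
    if cost < target:
        return True
    # 2. Relative tolerance AND per-parameter precision both satisfied
    if (abs(cost - prev_cost) < rel_tol and
            precision is not None and
            np.all(np.abs(np.asarray(x) - np.asarray(prev_x)) < precision)):
        return True
    # 3. Max number of evaluations reached
    if n_evals >= max_evals:
        return True
    return False
```

Note how the defaults reproduce the behavior described above: with rel_tol left at infinity, only the per-parameter precision governs relative convergence.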
6.1.3. Running an optimizer measurement¶
When running a measurement with the optimizer enabled, Labber will automatically add a step item named “Optimizer iteration” that handles the optimizer loop. Note that it is possible to run an experiment with a mix of optimized and non-optimized parameters, where the optimizer will run to find the optimal value of one parameter while stepping over different values of another parameter.
6.2. Optimizer settings¶
In order to use the optimizer, both the general optimization protocol and the individual optimization parameters must be configured. The various settings are described below.
6.2.1. General optimizer settings¶
These settings define the cost function and the algorithm-specific settings of the optimizer, and can be accessed by clicking the “Show Settings” toolbar icon in the top-left corner of the Measurement dialog. The settings are described in detail below.
- Method:
- Algorithm used for optimization.
- Max evaluations:
- Maximum number of function evaluations/measurements performed before terminating the optimization.
- Minimization function:
- Function for optimizer to minimize. The measured channels are available in the variable y, which is a list of log channel values. Each list item may be a number or a numpy array, depending on the channel datatype. Default is min(y), which will minimize the value of the first log channel.
- Target value:
- Absolute value of the minimization function at which the optimization will terminate. Default value is -inf, which will prevent the optimizer from terminating until the other optimization goals are met.
- Relative tolerance:
- Change in the minimization function between iterations that is acceptable for convergence. Default value is inf, which will make the optimizer run until the Precision value of all involved parameters is met.
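To make the structure concrete, here is a hypothetical sketch of how these general settings might look once collected into the Python dict that a custom optimize function receives (per the docs, the keys match the labels of the settings above; the values shown are only illustrative defaults).

```python
# Hypothetical sketch of the settings dict a custom "optimize" function
# receives; keys match the control labels described above, values are
# illustrative.
settings = {
    'Method': 'Nelder-Mead',
    'Max evaluations': 200,
    'Minimization function': 'mean(y)',
    'Target value': float('-inf'),
    'Relative tolerance': float('inf'),
}

# A custom optimizer would read its configuration like this:
max_evals = settings['Max evaluations']
```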
6.2.2. Individual parameter settings¶
These settings are individual to each optimization parameter, and can be accessed by double-clicking a channel in the Step sequence list and switching to “Optimize...” mode.
- Start value:
- Initial value for parameter.
- Initial step size:
- Initial step size for the parameter.
- Min value:
- Lowest parameter value allowed during the optimization procedure.
- Max value:
- Highest parameter value allowed during the optimization procedure.
- Precision:
- Target precision for the parameter that will trigger optimizer termination.
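For illustration, the per-parameter settings above could be represented as a dict like the following. The key names here are assumptions matching the control labels, not a documented data format.

```python
# Hypothetical per-parameter optimizer settings, with keys assumed to
# match the control labels above.
param = {
    'Start value': 0.5,
    'Initial step size': 0.1,
    'Min value': 0.0,
    'Max value': 1.0,
    'Precision': 1e-3,
}

# During optimization, candidate values are confined to the allowed
# range; e.g. clipping a trial value of 0.5 + 0.7 = 1.2:
value = min(max(0.5 + 0.7, param['Min value']), param['Max value'])
```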
6.3. Custom optimizers¶
It is possible to create custom optimizer modules to implement a specific optimization protocol. The sections below describe how to define and test a custom optimizer algorithm.
6.3.1. Defining custom optimizers¶
It is recommended to use one of the already present optimizer configuration files as a template. The custom optimizers should be contained in a single python .py file, which must contain a function optimize that takes exactly two parameters:
- Python dict with optimizer settings. The keys have the same names as the labels of the optimizer settings in the Measurement program. The individual parameter settings are stored as a list under a dedicated key in the same dictionary.
- Python callable that takes exactly one argument (x). The function will run the Labber measurement for the provided parameter values x, where each value in the vector x corresponds to an optimizer parameter. The function is typically passed directly to the scipy optimizer; see the provided optimizer Nelder-Mead for an example.
The function must return a Python dictionary with results in the scipy OptimizeResult format. The only necessary key is “x”, containing the final optimizer parameters.
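A minimal sketch of an optimize function matching this contract is shown below, using scipy's Nelder-Mead as the underlying algorithm. The settings keys 'Start values' and 'Max evaluations' are hypothetical names used for illustration; a real optimizer should read the actual keys from the settings dict as described above.

```python
import numpy as np
from scipy.optimize import minimize

def optimize(settings, measure):
    """Minimal sketch of a custom optimizer matching the contract above.

    settings : dict of optimizer settings (keys shown here are
        illustrative assumptions, not Labber's actual key names).
    measure : callable taking one argument x (vector of parameter
        values) and returning the cost; it runs the Labber measurement.
    """
    x0 = np.asarray(settings.get('Start values', [0.0]))
    max_evals = int(settings.get('Max evaluations', 100))

    # Pass the measurement callable directly to the scipy optimizer
    res = minimize(measure, x0, method='Nelder-Mead',
                   options={'maxfev': max_evals})

    # Return a dict in scipy OptimizeResult style; only "x" is required
    return {'x': res.x, 'fun': res.fun, 'nfev': res.nfev}
```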
When creating a new optimizer, the python file should be given a unique name and placed in the local optimizer folder (the folder named “Local optimizers” in the Preferences window), instead of the global one (“Optimizer functions” in Preferences). This allows the user’s own optimizers to be kept separate from the optimizers provided by Labber, and it also prevents optimizers written by the user from being deleted when updating the Labber program to a newer version.
Note that even when making additions/changes to an existing optimizer from the global folder, the best practice is to copy that optimizer file from the global folder to the local folder, and only make changes to the local version. If optimizers with the same names exist in both the local and the global optimizer folders, Labber will always use the optimizer in the “Local optimizers” folder.
6.3.2. Defining optimizer settings¶
For custom optimizers, it is possible to define optimizer-specific configuration parameters in addition to the general settings in Section OptimizerSettings above. The optimizer-specific settings are defined by adding a function define_optimizer_settings() to the same .py file that contains the optimizer code. The function should return a list of python dicts, where each dict represents a specific setting. The settings are defined in a similar way to quantities of an instrument driver (see Section Quantities), with the difference that the settings are specified in a python function instead of a .ini configuration file. Each setting must define the datatype parameter; all other parameters are optional.
The custom settings will show up in the Optimizer section of the Settings pane of the Labber Measurement dialog, allowing the user to change their values prior to running a measurement. The values of the custom parameters will then be accessible as entries in the settings dictionary passed as the first input to the optimize function defined above.
As an example, the code below will define custom settings with three parameters for the Bayesian-Gaussian-Process optimizer.
def define_optimizer_settings():
    """Define extra settings for optimizer

    Returns
    -------
    optimizer_cfg : list of dict
        List of configuration items for optimizer, each item is a dict.
        Necessary keys are "name" and "datatype".
    """
    # Bayesian optimization settings
    optimizer_cfg = [
        dict(name='Acquisition function',
             datatype='COMBO',
             combo_defs=['LCB', 'EI', 'PI', 'gp_hedge'],
             def_value='gp_hedge',
             tooltip=('See https://scikit-optimize.github.io/ for more info'),
             ),
        dict(name='kappa',
             datatype='DOUBLE',
             def_value=1.96,
             state_item='Acquisition function',
             state_values=['LCB', 'gp_hedge'],
             tooltip=('Controls how much of the variance in the predicted ' +
                      'values should be taken into account. Higher value ' +
                      'favours exploration over exploitation and vice versa'),
             ),
        dict(name='xi',
             datatype='DOUBLE',
             def_value=0.1,
             state_item='Acquisition function',
             state_values=['EI', 'PI', 'gp_hedge'],
             tooltip=('Controls how much improvement one wants over the ' +
                      'previous best values. Higher value ' +
                      'favours exploration over exploitation and vice versa'),
             ),
    ]
    return optimizer_cfg
6.3.3. Using custom optimizers¶
To make the new optimizer available to Labber, place it in the local optimizer folder and click the menu item “Tools/Reload Optimizers” in the Measurement Setup dialog. This will scan the optimizer folders and update the “Method” control in the general optimizer settings.
It is highly recommended to first test the optimizer in a pure Python environment. For an example of how to test the optimizer, see the code at the end of the file Nelder-Mead.py provided in the global optimizer folder.
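A standalone test typically replaces the Labber measurement with a mock callable whose minimum is known, then checks that the optimizer finds it. The harness below is a self-contained sketch: mock_measure and the simple coordinate-search optimize shown here are stand-ins (a real test would import the optimize function from the custom .py file instead).

```python
import numpy as np

# Mock measurement callable standing in for the one Labber passes to
# optimize(): a quadratic with a known minimum at x = [0.3, -0.7].
def mock_measure(x):
    return float(np.sum((np.asarray(x) - np.array([0.3, -0.7])) ** 2))

# Stand-in optimizer under test: a simple coordinate search with
# step-halving, just so this harness is self-contained.
def optimize(settings, measure):
    x = np.asarray(settings['Start values'], dtype=float)
    step = settings['Initial step size']
    for _ in range(settings['Max evaluations']):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                if measure(trial) < measure(x):
                    x = trial
                    improved = True
        if not improved:
            step /= 2.0
        if step < 1e-6:
            break
    return {'x': x}

result = optimize({'Start values': [0.0, 0.0],
                   'Initial step size': 0.5,
                   'Max evaluations': 200}, mock_measure)
print(result['x'])  # should end up close to [0.3, -0.7]
```

If the returned dict contains an "x" close to the known minimum, the optimizer fulfills the contract described above and is ready to be placed in the local optimizer folder.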