Asynchronous Successive Halving Algorithm

class orion.algo.asha.ASHA(space: Space, seed: int | Sequence[int] | None = None, num_rungs: int | None = None, num_brackets: int = 1, repetitions: int | float = inf)

Asynchronous Successive Halving Algorithm

A simple and robust hyperparameter tuning algorithm with solid theoretical underpinnings that exploits parallelism and aggressive early-stopping.

For more information on the algorithm, see original paper at https://arxiv.org/abs/1810.05934.

Li, Liam, et al. “Massively parallel hyperparameter tuning.” arXiv preprint arXiv:1810.05934 (2018)
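As a hedged illustration of how these parameters fit together (the keys mirror the constructor parameters documented below; the values and the surrounding configuration mechanism are assumptions, not Orion-specific API):

```python
# Illustrative ASHA configuration expressed as a plain dictionary.
# Keys mirror the ASHA constructor parameters; values are examples only.
asha_config = {
    "asha": {
        "seed": 1,          # reproducible sampling of new trials
        "num_rungs": None,  # None: derived from the fidelity dimension
        "num_brackets": 1,  # increase to reduce bias against stragglers
        "repetitions": 1,   # number of ASHA executions
    }
}
```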

Parameters
space: `orion.algo.space.Space`

Optimisation space with priors for each dimension.

seed: None, int or sequence of int

Seed for the random number generator used to sample new trials. Default: None

num_rungs: int, optional

Number of rungs for the largest bracket. If not defined, it is derived from the fidelity dimension as in the original paper: num_rungs = log(fidelity.high / fidelity.low) / log(fidelity.base) + 1. Default: None

num_brackets: int

Using a grace period that is too small may bias ASHA too strongly towards fast-converging trials, at the expense of slower-converging trials (stragglers) that would reach better results at convergence. To overcome this, you can increase the number of brackets, which increases the amount of resources required for optimisation but decreases the bias against stragglers. Default: 1

repetitions: int

Number of executions of ASHA. The default, np.inf, means ASHA runs until no new trials can be suggested.
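The num_rungs default above can be restated numerically. A minimal sketch (stdlib only, not Orion code), assuming a fidelity dimension with low=1, high=81 and base=3:

```python
import math

def default_num_rungs(low: float, high: float, base: float) -> int:
    # Default rung count per the docstring above:
    # log(high / low) / log(base) + 1.
    # round() guards against floating-point error on exact powers of base.
    return int(round(math.log(high / low) / math.log(base))) + 1

# A fidelity from 1 to 81 with base 3 yields rungs at budgets 1, 3, 9, 27, 81.
print(default_num_rungs(1, 81, 3))  # -> 5
```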

Methods

compute_bracket_idx(num)

suggest(num)

Suggest a number of new sets of parameters.

create_bracket

sample

suggest(num: int) → list[Trial]

Suggest a number of new sets of parameters.

Sample new points until first rung is filled. Afterwards waits for all trials to be completed before promoting trials to the next rung.

Parameters
num: int, optional

Number of points to suggest. Defaults to 1.

Returns
list of orion.core.worker.trial.Trial or None

A list of trials suggested by the algorithm. The algorithm may opt out if it cannot make a good suggestion at the moment (it may be waiting for other trials to complete), in which case it will return None.
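The decision described above (sample until the first rung is filled, then promote completed trials, otherwise opt out) can be sketched as follows. This is illustrative pseudologic only, not Orion internals; `brackets` and `sample_new` are hypothetical stand-ins:

```python
def suggest_one(brackets, sample_new):
    """Return a promoted trial if any bracket has a candidate ready,
    otherwise sample a new trial for the first rung. Returning None
    means the algorithm opts out and waits for running trials."""
    for bracket in brackets:
        candidate = bracket.get("promotable")  # completed and in top 1/eta
        if candidate is not None:
            return candidate
    return sample_new()  # may itself be None when nothing can be sampled

# No promotable candidate yet, so a new trial is sampled for the first rung.
print(suggest_one([{"promotable": None}], lambda: "trial-1"))  # -> trial-1
```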

class orion.algo.asha.ASHABracket(owner: ASHA, budgets: list[BudgetTuple], repetition_id: int)

Bracket of rungs for the algorithm ASHA.

Parameters
owner: `ASHA` algorithm

The ASHA algorithm object which this bracket will be part of.

budgets: list of tuple

Each tuple gives the (n_trials, resource_budget) for the respective rung.

repetition_id: int

The id of the hyperband execution this bracket belongs to.

Attributes
is_filled

ASHA’s first rung can always sample new trials

Methods

get_candidates(rung_id)

Get the candidates for promotion in a rung

is_ready([rung_id])

ASHA is always ready for promotions

promote(num)

Promote the first candidate that is found and return it

get_candidates(rung_id: int) → list[Trial]

Get the candidates for promotion in a rung

Raises
TypeError

If get_candidates is called before the entire rung is completed.

property is_filled: bool

ASHA’s first rung can always sample new trials

is_ready(rung_id: int | None = None) → bool

ASHA is always ready for promotions

promote(num: int) → list[Trial]

Promote the first candidate that is found and return it

The rungs are iterated over in reversed order, so that high rungs are prioritised for promotions. When a candidate is promoted, the loop is broken and the method returns the promoted trial.

Note

Trials of any status are part of the rungs. Only completed trials are eligible for promotion, i.e., only completed trials can be part of the top-k. The lookup for promotion in rung l + 1, however, considers trials of any status.
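One plausible reading of this rule as code (an illustration, not Orion's implementation; the trial records and the top-1/eta fraction are assumptions):

```python
def promotion_candidates(rung, next_rung, reduction_factor):
    # Only completed trials compete for the top 1/eta of the rung ...
    completed = [t for t in rung if t["status"] == "completed"]
    k = max(len(rung) // reduction_factor, 1)
    top_k = sorted(completed, key=lambda t: t["objective"])[:k]
    # ... but membership in rung l + 1 is checked for trials of any status,
    # so an already-promoted (even still-running) trial is not re-promoted.
    in_next_rung = {t["id"] for t in next_rung}
    return [t for t in top_k if t["id"] not in in_next_rung]

rung = [
    {"id": "a", "status": "completed", "objective": 0.1},
    {"id": "b", "status": "completed", "objective": 0.3},
    {"id": "c", "status": "running", "objective": None},
]
print([t["id"] for t in promotion_candidates(rung, [], reduction_factor=3)])
# -> ['a']
```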

orion.algo.asha.compute_budgets(min_resources: float, max_resources: float, reduction_factor: int, num_rungs: int, num_brackets: int)

Compute the budgets used for ASHA
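The schedule such a computation produces can be illustrated with the standard successive-halving geometry: each rung runs roughly 1/eta as many trials as the previous one, at eta times the budget. This is a sketch under that assumption; the actual return structure of compute_budgets may differ:

```python
def sketch_budgets(min_resources, max_resources, reduction_factor, num_rungs):
    # Rung i holds eta**(num_rungs - 1 - i) trials at budget
    # min_resources * eta**i, capped at max_resources.
    eta = reduction_factor
    return [
        (eta ** (num_rungs - 1 - i), min(min_resources * eta ** i, max_resources))
        for i in range(num_rungs)
    ]

print(sketch_budgets(1, 81, 3, 5))
# -> [(81, 1), (27, 3), (9, 9), (3, 27), (1, 81)]
```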