seldonian.hyperparam_search.HyperSchema¶
- class HyperSchema(hyper_dict)¶
Bases:
object
- __init__(hyper_dict)¶
Container for all hyperparameters one wants to tune and their possible values.
- Parameters:
hyper_dict –
Hyperparameter dictionary adhering to the following format. Keys are the names of the hyperparameters; values are dictionaries whose format depends on how you want that hyperparameter tuned. If you want to do a grid search over a hyperparameter, the required keys are:
["values", "hyper_type", "tuning_method"], where "values" is the list of values to search, "hyper_type" is one of ["optimization", "model", "SA"], specifying the type of hyperparameter, and "tuning_method" is "grid_search".
- If you want to do CMA-ES over this hyperparameter, the required keys are:
["initial_value", "min_val", "max_val", "hyper_type", "search_distribution", "tuning_method"], where "initial_value" is the starting value for this hyperparameter in CMA-ES, "min_val" is the minimum value you want to search for this hyperparameter, "max_val" is the maximum value you want to search for this hyperparameter, "hyper_type" is one of ["optimization", "model", "SA"], specifying the type of hyperparameter, and "search_distribution" is either "uniform" or "log-uniform". "uniform" searches over a uniform distribution between "min_val" and "max_val", whereas "log-uniform" searches over a log-uniform distribution between "min_val" and "max_val"; the latter is common for step size hyperparameters, for example. "tuning_method" is "CMA-ES".
Here is an example for tuning the number of iterations "num_iters" using grid search and the step size "alpha_theta" using CMA-ES:
- hyper_dict = {
- "num_iters": {
"values": [100, 500, 1000, 1500], "hyper_type": "optimization", "tuning_method": "grid_search"
}, "alpha_theta": {
"initial_value": 0.005, "min_val": 0.0001, "max_val": 0.1, "hyper_type": "optimization", "search_distribution": "log-uniform", "tuning_method": "CMA-ES"
}
}
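The format above implies a validation pass over each hyperparameter spec. Here is a minimal sketch of what that check could look like; `validate_hyper_dict` is a hypothetical stand-in for the private `_validate` method, with the required-key sets taken from the docstring:

```python
# Illustrative validation sketch -- NOT the actual seldonian implementation.
# Required key sets come from the hyper_dict format described above.
GRID_KEYS = {"values", "hyper_type", "tuning_method"}
CMAES_KEYS = {"initial_value", "min_val", "max_val", "hyper_type",
              "search_distribution", "tuning_method"}
HYPER_TYPES = {"optimization", "model", "SA"}

def validate_hyper_dict(hyper_dict):
    """Check each hyperparameter spec against the documented format."""
    for name, spec in hyper_dict.items():
        method = spec.get("tuning_method")
        if method == "grid_search":
            missing = GRID_KEYS - spec.keys()
        elif method == "CMA-ES":
            missing = CMAES_KEYS - spec.keys()
        else:
            raise ValueError(f"{name}: unknown tuning_method {method!r}")
        if missing:
            raise ValueError(f"{name}: missing keys {sorted(missing)}")
        if spec["hyper_type"] not in HYPER_TYPES:
            raise ValueError(f"{name}: invalid hyper_type {spec['hyper_type']!r}")
    return hyper_dict

# The docstring's example passes validation as-is:
hyper_dict = {
    "num_iters": {
        "values": [100, 500, 1000, 1500],
        "hyper_type": "optimization",
        "tuning_method": "grid_search",
    },
    "alpha_theta": {
        "initial_value": 0.005,
        "min_val": 0.0001,
        "max_val": 0.1,
        "hyper_type": "optimization",
        "search_distribution": "log-uniform",
        "tuning_method": "CMA-ES",
    },
}
validated = validate_hyper_dict(hyper_dict)
```

Note that, as stated below for `_validate`, model-specific hyperparameter names cannot be checked this way; only the structure of each spec can be validated up front.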
- __repr__()¶
Return repr(self).
Methods
- _validate(hyper_dict)¶
Check that the hyperparameter dictionary is formatted properly and contains valid hyperparameters. Model hyperparameters are specific to the model so we can’t know what they might be ahead of time. Errors regarding model hyperparameters will be caught elsewhere.
- Parameters:
hyper_dict – See __init__ docstring.
- Returns:
A validated hyper_dict