c3.optimizers.optimalcontrol_robust
Module Contents¶
- class c3.optimizers.optimalcontrol_robust.OptimalControlRobust(noise_map, **kwargs)[source]¶
Bases:
c3.optimizers.optimalcontrol.OptimalControl
Object that deals with open-loop optimal control, evaluating the goal over the configurations in noise_map for robustness.
- Parameters
dir_path (str) – Filepath to save results
fid_func (callable) – infidelity function to be minimized
fid_subspace (list) – Indices identifying the subspace to be compared
pmap (ParameterMap) – Identifiers for the parameter vector
callback_fids (list of callable) – Additional fidelity function to be evaluated and stored for reference
algorithm (callable) – From the algorithm library
plot_pulses (boolean) – Save plots of control signals
store_unitaries (boolean) – Store propagators as text and pickle
options (dict) – Options to be passed to the algorithm
run_name (str) – User specified name for the run, will be used as root folder
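The robust variant averages the infidelity over a set of noise configurations so the optimizer favors parameters that remain good under perturbation. A minimal standalone sketch of that averaging idea (plain Python, not the c3 implementation; the quadratic infidelity and scalar noise entries are hypothetical simplifications of c3's noise_map structure):

```python
def robust_goal(params, infidelity, noise_map):
    """Average the infidelity over every noise offset in noise_map."""
    values = [infidelity(params, noise) for noise in noise_map]
    return sum(values) / len(values)

# Hypothetical infidelity: squared distance of (param + noise) from 1.0.
def toy_infidelity(params, noise):
    return sum((p + noise - 1.0) ** 2 for p in params)

noise_map = [-0.1, 0.0, 0.1]
goal = robust_goal([1.0, 1.0], toy_infidelity, noise_map)
```

A parameter set that is optimal only at zero noise scores worse here than one that stays close to optimal across the whole noise range.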
- goal_run(current_params: tensorflow.Tensor) tensorflow.float64 [source]¶
Evaluate the goal function for current parameters.
- Parameters
current_params (tf.Tensor) – Vector representing the current parameter values.
- Returns
Value of the goal function
- Return type
tf.float64
- log_setup() None ¶
Create the folders to store data.
- optimize_controls(setup_log: bool = True) None ¶
Apply a search algorithm to your gateset given a fidelity function.
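optimize_controls hands the goal function to a search algorithm from the algorithm library. As an illustration of what such a search loop does, here is a generic random-search sketch (hypothetical names, not c3's algorithm library):

```python
import random

def random_search(goal, x0, n_iter=200, step=0.1, seed=0):
    """Hypothetical search loop: keep the best of random perturbations."""
    rng = random.Random(seed)
    best_x, best_f = list(x0), goal(x0)
    for _ in range(n_iter):
        candidate = [x + rng.uniform(-step, step) for x in best_x]
        f = goal(candidate)
        if f < best_f:  # accept only improvements to the goal
            best_x, best_f = candidate, f
    return best_x, best_f

# Toy goal: sum of squares, minimized at the origin.
x_opt, f_opt = random_search(lambda p: sum(v * v for v in p), [1.0, -1.0])
```

In c3 the algorithm is pluggable via the algorithm argument and receives fct_to_min rather than the raw goal.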
- goal_run_ode(current_params: tensorflow.Tensor) tensorflow.float64 ¶
Evaluate the goal function using the ODE solver for current parameters.
- Parameters
current_params (tf.Tensor) – Vector representing the current parameter values.
- Returns
Value of the goal function
- Return type
tf.float64
- goal_run_ode_only_final(current_params: tensorflow.Tensor) tensorflow.float64 ¶
Evaluate the goal function using the ODE solver for current parameters.
- Parameters
current_params (tf.Tensor) – Vector representing the current parameter values.
- Returns
Value of the goal function
- Return type
tf.float64
- replace_logdir(new_logdir)¶
Specify a new filepath to store the log.
- Parameters
new_logdir –
- set_created_by(config) None ¶
Store the config file location used to create this optimizer.
- load_best(init_point, extend_bounds=False) None ¶
Load a previous parameter point to start the optimization from. Legacy wrapper. Method moved to ParameterMap.
- Parameters
init_point (str) – File location of the initial point
extend_bounds (bool) – Whether or not to allow the loaded optimal parameters’ bounds to be extended if they exceed those specified.
- end_log() None ¶
Finish the log by recording current time and total runtime.
- log_best_unitary() None ¶
Save the best unitary in the log.
- log_parameters(params) None ¶
Log the current status. Write parameters to log. Update the current best parameters. Call plotting functions as set up.
- lookup_gradient(x)¶
Return the stored gradient for a given parameter set.
- Parameters
x (np.array) – Parameter set.
- Returns
Value of the gradient.
- Return type
np.array
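lookup_gradient retrieves a gradient that was computed and stored during an earlier goal evaluation. A standalone sketch of that store-then-lookup pattern (assumption: keying by the parameter vector; the dict-based class below is illustrative, not c3's internals):

```python
class GradientStore:
    """Cache gradients by parameter vector for later retrieval."""

    def __init__(self):
        self._grads = {}

    def store_gradient(self, x, grad):
        # Tuples are hashable, so they can serve as dict keys.
        self._grads[tuple(x)] = grad

    def lookup_gradient(self, x):
        """Return the stored gradient for a given parameter set."""
        return self._grads[tuple(x)]

store = GradientStore()
store.store_gradient([0.5, 1.5], [1.0, 3.0])
```

This pattern lets a gradient-based optimizer that queries value and gradient separately reuse the gradient computed alongside the goal evaluation.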
- fct_to_min(input_parameters: Union[numpy.ndarray, tensorflow.constant]) Union[numpy.ndarray, tensorflow.constant] ¶
Wrapper for the goal function.
- Parameters
input_parameters ([np.array, tf.constant]) – Vector of parameters in the optimizer-friendly way.
- Returns
Value of the goal function. Float if input is np.array else tf.constant
- Return type
[np.ndarray, tf.constant]
- fct_to_min_autograd(x)¶
Wrapper for the goal function, including evaluation and storage of the gradient.
- Parameters
x (np.array) – Vector of parameters in the optimizer-friendly way.
- Returns
Value of the goal function.
- Return type
float
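fct_to_min_autograd evaluates the goal and stores its gradient in the same pass. A standalone sketch of that evaluate-and-store step, using central finite differences in place of TensorFlow autodiff (all names hypothetical, not the c3 implementation):

```python
def fct_to_min_with_grad(goal, x, grad_store, eps=1e-6):
    """Evaluate goal(x) and stash a finite-difference gradient for later lookup."""
    value = goal(x)
    grad = []
    for i in range(len(x)):
        hi = list(x); hi[i] += eps
        lo = list(x); lo[i] -= eps
        # Central difference approximates the partial derivative in coordinate i.
        grad.append((goal(hi) - goal(lo)) / (2 * eps))
    grad_store[tuple(x)] = grad  # the optimizer retrieves this via a lookup
    return value

grads = {}
value = fct_to_min_with_grad(lambda p: p[0] ** 2 + 2 * p[1], [3.0, 1.0], grads)
```

c3 instead obtains the exact gradient from TensorFlow's automatic differentiation while the goal is evaluated, which avoids the extra goal calls a finite-difference scheme needs.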