c3.optimizers.optimalcontrol_robust¶
Module Contents¶
- class c3.optimizers.optimalcontrol_robust.OptimalControlRobust(noise_map, **kwargs)[source]¶
Bases:
c3.optimizers.optimalcontrol.OptimalControl
Object that deals with open-loop optimal control.
- Parameters
dir_path (str) – Filepath to save results
fid_func (callable) – infidelity function to be minimized
fid_subspace (list) – Indices identifying the subspace to be compared
pmap (ParameterMap) – Identifiers for the parameter vector
callback_fids (list of callable) – Additional fidelity functions to be evaluated and stored for reference
algorithm (callable) – From the algorithm library
plot_pulses (boolean) – Save plots of control signals
store_unitaries (boolean) – Store propagators as text and pickle
options (dict) – Options to be passed to the algorithm
run_name (str) – User specified name for the run, will be used as root folder
- log_setup(self) None ¶
Create the folders to store data.
- optimize_controls(self, setup_log: bool = True) None ¶
Apply a search algorithm to your gateset given a fidelity function.
- goal_run(self, current_params: tensorflow.Tensor) tensorflow.float64 ¶
Evaluate the goal function for current parameters.
- Parameters
current_params (tf.Tensor) – Vector representing the current parameter values.
- Returns
Value of the goal function
- Return type
tf.float64
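In the robust variant, the goal is evaluated not for a single model but averaged over a set of noise realizations taken from the noise_map. The idea can be sketched without the c3 library; all names below (robust_goal, the parameter-shift interpretation of noise) are illustrative, not the c3 API, which applies each noise entry to the model before simulating.

```python
def robust_goal(goal_fn, params, noise_values):
    """Average a goal function over a set of noise offsets.

    Illustrative only: here noise is modelled as a simple shift of the
    parameter vector, which stands in for c3 perturbing the model.
    """
    total = 0.0
    for noise in noise_values:
        total += goal_fn([p + noise for p in params])
    return total / len(noise_values)


# Toy infidelity: squared distance of the first parameter from 1.0
goal = lambda p: (p[0] - 1.0) ** 2

print(robust_goal(goal, [1.0], [-0.1, 0.0, 0.1]))
```

An optimizer minimizing this averaged goal is pushed toward parameter regions that stay good under all sampled noise values, which is the point of robust control.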
- replace_logdir(self, new_logdir)¶
Specify a new filepath to store the log.
- Parameters
new_logdir –
- set_created_by(self, config) None ¶
Store the config file location used to create this optimizer.
- load_best(self, init_point) None ¶
Load a previous parameter point to start the optimization from. Legacy wrapper; method moved to ParameterMap.
- Parameters
init_point (str) – File location of the initial point
- end_log(self) None ¶
Finish the log by recording current time and total runtime.
- log_best_unitary(self) None ¶
Save the best unitary in the log.
- log_parameters(self) None ¶
Log the current status. Write parameters to log. Update the current best parameters. Call plotting functions as set up.
- lookup_gradient(self, x)¶
Return the stored gradient for a given parameter set.
- Parameters
x (np.array) – Parameter set.
- Returns
Value of the gradient.
- Return type
np.array
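lookup_gradient retrieves a gradient that was computed and stored during an earlier goal evaluation, so gradient-based algorithms can request it separately without re-running the simulation. A minimal sketch of that store/lookup pairing, with names that are illustrative rather than the c3 internals:

```python
def make_gradient_store():
    """Cache gradients keyed by the parameter vector they belong to.

    Mirrors the pattern where one call evaluates the goal and stores
    the gradient, and lookup_gradient later retrieves it.
    """
    store = {}

    def remember(x, grad):
        # Lists are unhashable, so key on an immutable tuple copy.
        store[tuple(x)] = list(grad)

    def lookup(x):
        return store[tuple(x)]

    return remember, lookup


remember, lookup = make_gradient_store()
remember([0.5, 1.5], [2.0, -1.0])
print(lookup([0.5, 1.5]))
```

This separation matches optimizer interfaces (e.g. SciPy-style) that call the objective and its gradient as two distinct functions at the same point.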
- fct_to_min(self, input_parameters: Union[numpy.ndarray, tensorflow.constant]) Union[numpy.ndarray, tensorflow.constant] ¶
Wrapper for the goal function.
- Parameters
input_parameters ([np.array, tf.constant]) – Vector of parameters in the optimizer-friendly way.
- Returns
Value of the goal function. A float if the input is an np.array, otherwise a tf.constant.
- Return type
[np.ndarray, tf.constant]
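The documented behaviour of fct_to_min is type-preserving: NumPy input yields a plain float, tensor input passes a tensor back. A dependency-light sketch of that wrapper shape, assuming only that goal is some callable mapping a parameter vector to a scalar (fct_to_min_sketch is an illustrative name, not the c3 implementation):

```python
import numpy as np


def fct_to_min_sketch(goal, x):
    """Return a Python float for NumPy input; pass other scalars through.

    Stand-in for the documented fct_to_min behaviour, where TensorFlow
    inputs would come back as tensors.
    """
    value = goal(x)
    if isinstance(x, np.ndarray):
        return float(value)
    return value


print(fct_to_min_sketch(lambda v: sum(v) ** 2, np.array([1.0, 2.0])))
```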
- fct_to_min_autograd(self, x)¶
Wrapper for the goal function, including evaluation and storage of the gradient.
- Parameters
x (np.array) – Vector of parameters in the optimizer-friendly way.
- Returns
Value of the goal function.
- Return type
float
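fct_to_min_autograd evaluates the goal and stores its gradient in the same pass, so that a later lookup_gradient call can serve it to the optimizer. c3 obtains the gradient via TensorFlow automatic differentiation; the sketch below substitutes a central finite difference so it runs with no dependencies, and all names in it are illustrative.

```python
def fct_to_min_autograd_sketch(goal, x, store, eps=1e-6):
    """Evaluate goal(x) and store a gradient keyed by the point x.

    A central difference stands in for the automatic differentiation
    used in the real method.
    """
    grad = []
    for i in range(len(x)):
        hi = list(x)
        hi[i] += eps
        lo = list(x)
        lo[i] -= eps
        grad.append((goal(hi) - goal(lo)) / (2 * eps))
    store[tuple(x)] = grad  # later retrievable, lookup_gradient style
    return goal(x)


store = {}
value = fct_to_min_autograd_sketch(lambda v: v[0] ** 2, [3.0], store)
print(value, store[(3.0,)])
```

Evaluating and caching in one pass avoids simulating the system twice when the optimizer asks for the value and the gradient at the same point.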