Optimizers

C1 - Optimal control

Object that deals with the open-loop optimal control.

class c3.optimizers.optimalcontrol.OptimalControl(fid_func, fid_subspace, pmap, dir_path=None, callback_fids=None, algorithm=None, initial_point: str = '', store_unitaries=False, options={}, run_name=None, interactive=True, include_model=False, logger=None, fid_func_kwargs={})[source]

Bases: Optimizer

Object that deals with the open-loop optimal control.

Parameters
  • dir_path (str) – Filepath to save results

  • fid_func (callable) – Infidelity function to be minimized

  • fid_subspace (list) – Indices identifying the subspace to be compared

  • pmap (ParameterMap) – Identifiers for the parameter vector

  • callback_fids (list of callable) – Additional fidelity function to be evaluated and stored for reference

  • algorithm (callable) – From the algorithm library

  • store_unitaries (boolean) – Store propagators as text and pickle

  • options (dict) – Options to be passed to the algorithm

  • run_name (str) – User specified name for the run, will be used as root folder

  • fid_func_kwargs (dict) – Additional kwargs to be passed to the main fidelity function.
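The division of labour between fid_func, algorithm, and pmap can be sketched with plain-Python stand-ins. All names below are illustrative, not part of the c3 API: the optimizer hands a scalar infidelity function to an algorithm from the library, which searches over the parameter vector held by the parameter map.

```python
import numpy as np

# Stand-in for a ParameterMap: holds the optimizable parameter vector.
class ToyParameterMap:
    def __init__(self, values):
        self.values = np.asarray(values, dtype=float)

# Stand-in infidelity function (fid_func role): zero at the optimum.
def toy_infidelity(params):
    return float(np.sum((params - 0.5) ** 2))

# Stand-in for an entry of the algorithm library: a crude coordinate search.
def toy_algorithm(x_init, fun, options):
    x = np.array(x_init, dtype=float)
    step = options.get("step", 0.1)
    for _ in range(options.get("maxiter", 100)):
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                if fun(trial) < fun(x):
                    x = trial
    return x

pmap = ToyParameterMap([0.0, 1.0])
best = toy_algorithm(pmap.values, toy_infidelity, {"step": 0.05, "maxiter": 50})
```

In the real class, optimize_controls plays the role of the driver: it wraps fid_func in fct_to_min and passes it to the chosen algorithm together with the options dict.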

goal_run(current_params: Tensor) → tf.float64[source]

Evaluate the goal function for current parameters.

Parameters

current_params (tf.Tensor) – Vector representing the current parameter values.

Returns

Value of the goal function

Return type

tf.float64

load_model_parameters(adjust_exp: str) → None[source]
log_setup() → None[source]

Create the folders to store data.

optimize_controls(setup_log: bool = True) → None[source]

Apply a search algorithm to your gateset given a fidelity function.

set_callback_fids(callback_fids) → None[source]
set_fid_func(fid_func) → None[source]

C2 - Calibration

Object that deals with the closed-loop optimal control.

class c3.optimizers.calibration.Calibration(eval_func, pmap, algorithm, dir_path=None, exp_type=None, exp_right=None, options={}, run_name=None, logger: Optional[List] = None)[source]

Bases: Optimizer

Object that deals with the closed-loop optimal control.

Parameters
  • dir_path (str) – Filepath to save results

  • eval_func (callable) – Infidelity function to be minimized

  • pmap (ParameterMap) – Identifiers for the parameter vector

  • algorithm (callable) – From the algorithm library

  • options (dict) – Options to be passed to the algorithm

  • run_name (str) – User specified name for the run, will be used as root folder
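In the closed-loop setting, eval_func does not simulate anything: it sends parameters to an experiment and returns a measured figure of merit, typically subject to shot noise. A minimal stand-in (all names here are hypothetical, not c3 functions) shows why repeated measurements are averaged:

```python
import random

random.seed(0)

# Stand-in for the hardware: each measurement of the goal is noisy.
def noisy_experiment(params):
    true_infidelity = sum((p - 1.0) ** 2 for p in params)
    return true_infidelity + random.gauss(0.0, 0.01)

# eval_func role: average repeated measurements to suppress shot noise.
def eval_func(params, shots=50):
    return sum(noisy_experiment(params) for _ in range(shots)) / shots

measured = eval_func([1.0, 1.0])  # true infidelity is 0 at this point
```

Because the algorithm only ever sees these noisy averages, gradient-free search methods are the usual choice in calibration.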

goal_run(current_params)[source]

Evaluate the goal function for current parameters.

Parameters

current_params (tf.Tensor) – Vector representing the current parameter values.

Returns

Value of the goal function

Return type

tf.float64

log_pickle(params, seqs, results, results_std, shots)[source]

Save a pickled version of the performed experiment, suitable for model learning.

Parameters
  • params (tf.Tensor) – Vector of parameter values

  • seqs (list) – Strings identifying the performed instructions

  • results (list) – Values of the goal function

  • results_std (list) – Standard deviation of the results, in the case of noisy data

  • shots (list) – Number of repetitions used in averaging noisy data
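The pickled record bundles everything that model learning later reads back. A sketch of such a record using only the standard library; the field names and the instruction string are illustrative, not the exact keys c3 writes:

```python
import os
import pickle
import tempfile

# Illustrative record of one calibration evaluation.
record = {
    "params": [0.1, 0.2],          # parameter values at this evaluation
    "seqs": [["rx90p[0]"] * 3],    # gate sequences that were run
    "results": [0.97],             # measured goal values
    "results_std": [0.01],         # standard deviation of the results
    "shots": [1000],               # repetitions used in averaging
}

path = os.path.join(tempfile.mkdtemp(), "learn_from.pickle")
with open(path, "wb") as f:
    pickle.dump(record, f)

# Model learning later reloads the same structure.
with open(path, "rb") as f:
    loaded = pickle.load(f)
```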

log_setup() → None[source]

Create the folders to store data.

Parameters
  • dir_path (str) – Filepath

  • run_name (str) – User specified name for the run

optimize_controls() → None[source]

Apply a search algorithm to your gateset given a fidelity function.

set_eval_func(eval_func, exp_type)[source]

Setter for the eval function.

Parameters

eval_func (callable) – Function to be evaluated

C3 - Characterization

Object that deals with the model learning.

class c3.optimizers.modellearning.ModelLearning(sampling, batch_sizes, pmap, datafiles, dir_path=None, estimator=None, seqs_per_point=None, state_labels=None, callback_foms=[], algorithm=None, run_name=None, options={}, logger: Optional[List] = None)[source]

Bases: Optimizer

Object that deals with the model learning.

Parameters
  • dir_path (str) – Filepath to save results

  • sampling (str) – Sampling method from the sampling library

  • batch_sizes (list) – Number of points to select from each dataset

  • seqs_per_point (int) – Number of sequences that use the same parameter set

  • pmap (ParameterMap) – Identifiers for the parameter vector

  • state_labels (list) – Identifiers for the qubit subspaces

  • callback_foms (list) – Figures of merit to additionally compute and store

  • algorithm (callable) – From the algorithm library

  • run_name (str) – User specified name for the run, will be used as root folder

  • options (dict) – Options to be passed to the algorithm

confirm() → None[source]

Compute the validation set, i.e. the value of the goal function on all points of the dataset that were not used for learning.
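Conceptually, the validation step evaluates the same goal on the complement of the indices used for learning. A self-contained sketch of that split (function and names are illustrative, not the c3 implementation):

```python
def validation_goal(all_points, learn_indices, goal):
    # Evaluate the goal only on points that were NOT used for learning.
    used = set(learn_indices)
    held_out = [p for i, p in enumerate(all_points) if i not in used]
    return sum(goal(p) for p in held_out) / len(held_out)

points = [0.0, 1.0, 2.0, 3.0]
# Points 0 and 2 were used for learning; validate on 1 and 3.
val = validation_goal(points, learn_indices=[0, 2], goal=lambda p: p * p)
```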

goal_run(current_params: constant) → tf.float64[source]

Evaluate the figure of merit for the current model parameters.

Parameters

current_params (tf.Tensor) – Current model parameters

Returns

Figure of merit

Return type

tf.float64

goal_run_with_grad(current_params)[source]

Same as goal_run but with gradient. Very resource intensive. Unoptimized at the moment.

learn_model() → None[source]

Perform the model learning by minimizing the figure of merit.

log_setup() → None[source]

Create the folders to store data.

Parameters
  • dir_path (str) – Filepath

  • run_name (str) – User specified name for the run

read_data(datafiles: Dict[str, str]) → None[source]

Open data files and read in experiment results.

Parameters

datafiles (dict) – Paths of the files that contain the learning data.

select_from_data(batch_size) → List[int][source]

Select a subset of each dataset to compute the goal function on.

Parameters

batch_size (int) – Number of points to select

Returns

Indices of the selected data points.

Return type

list
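The sampling methods in the library differ in which indices they pick; the contract is simply that batch_size indices into the dataset come back. A minimal stand-in for one such method (uniform random sampling; the function name and seeding are illustrative):

```python
import random

def select_from_data(dataset_size, batch_size, seed=1234):
    # Uniformly sample batch_size distinct indices into the dataset.
    rng = random.Random(seed)
    return rng.sample(range(dataset_size), batch_size)

indices = select_from_data(dataset_size=100, batch_size=10)
```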

Optimizer module

Optimizer object, where the optimal control is done.

class c3.optimizers.optimizer.BaseLogger[source]

Bases: object

end_log(opt, logdir)[source]
log_parameters(evaluation, optim_status)[source]
start_log(opt, logdir)[source]
class c3.optimizers.optimizer.BestPointLogger[source]

Bases: BaseLogger

class c3.optimizers.optimizer.Optimizer(pmap: ParameterMap, initial_point: str = '', algorithm: Optional[Callable] = None, store_unitaries: bool = False, logger: Optional[List] = None)[source]

Bases: object

General optimizer class from which specific classes are inherited.

Parameters
  • algorithm (callable) – From the algorithm library

  • store_unitaries (boolean) – Store propagators as text and pickle

  • logger (List) – Logging classes

end_log() → None[source]

Finish the log by recording current time and total runtime.

fct_to_min(input_parameters: Union[ndarray, constant]) → Union[ndarray, constant][source]

Wrapper for the goal function.

Parameters

input_parameters ([np.array, tf.constant]) – Vector of parameters in optimizer-friendly form.

Returns

Value of the goal function. Float if input is np.array else tf.constant

Return type

[np.ndarray, tf.constant]

fct_to_min_autograd(x)[source]

Wrapper for the goal function, including evaluation and storage of the gradient.

Parameters

x (np.array) – Vector of parameters in optimizer-friendly form.

Returns

Value of the goal function.

Return type

float
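Evaluating value and gradient together, then serving the gradient from a cache when the optimizer asks for it separately, is the pattern behind the pairing of fct_to_min_autograd and lookup_gradient. A self-contained sketch of that pattern (not the actual c3 code; names and keying are illustrative):

```python
import numpy as np

class GradCachingGoal:
    """Compute value and gradient in one pass; replay the gradient on demand."""

    def __init__(self, goal_and_grad):
        self.goal_and_grad = goal_and_grad
        self.gradients = {}

    def fct_to_min_autograd(self, x):
        x = np.asarray(x, dtype=float)
        value, grad = self.goal_and_grad(x)
        # Cache the gradient keyed by the parameter vector, so a
        # SciPy-style optimizer can retrieve it via its jac= callback.
        self.gradients[x.tobytes()] = grad
        return value

    def lookup_gradient(self, x):
        return self.gradients[np.asarray(x, dtype=float).tobytes()]

def quadratic(x):
    # f(x) = x.x with analytic gradient 2x.
    return float(x @ x), 2.0 * x

opt = GradCachingGoal(quadratic)
val = opt.fct_to_min_autograd(np.array([1.0, 2.0]))
grad = opt.lookup_gradient(np.array([1.0, 2.0]))
```

The point of the cache is to avoid a second expensive evaluation when the search algorithm requests the goal value and its gradient through two separate callbacks at the same point.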

goal_run(current_params: Union[ndarray, constant]) → Union[ndarray, constant][source]

Placeholder for the goal function. To be implemented by inherited classes.

goal_run_with_grad(current_params)[source]
load_best(init_point) → None[source]

Load a previous parameter point to start the optimization from. Legacy wrapper; the method moved to ParameterMap.

Parameters

init_point (str) – File location of the initial point

log_best_unitary() → None[source]

Save the best unitary in the log.

log_parameters(params) → None[source]

Log the current status. Write parameters to log. Update the current best parameters. Call plotting functions as set up.

lookup_gradient(x)[source]

Return the stored gradient for a given parameter set.

Parameters

x (np.array) – Parameter set.

Returns

Value of the gradient.

Return type

np.array

replace_logdir(new_logdir)[source]

Specify a new filepath to store the log.

Parameters

new_logdir (str) – New filepath to store the log.

set_algorithm(algorithm: Optional[Callable]) → None[source]
set_created_by(config) → None[source]

Store the config file location used to create this optimizer.

set_exp(exp: Experiment) → None[source]
start_log() → None[source]

Initialize the log with current time.

class c3.optimizers.optimizer.TensorBoardLogger[source]

Bases: BaseLogger

log_parameters(evaluation, optim_status)[source]
set_logdir(logdir)[source]
start_log(opt, logdir)[source]
write_params(params, step=0)[source]

Sensitivity analysis

Module for Sensitivity Analysis. This allows sweeping the goal function over a given range of parameters, to ascertain whether the dataset in use is sensitive to changes in the parameters of interest.

class c3.optimizers.sensitivity.Sensitivity(sampling: str, batch_sizes: Dict[str, int], pmap: ParameterMap, datafiles: Dict[str, str], state_labels: Dict[str, List[Any]], sweep_map: List[List[Tuple[str]]], sweep_bounds: List[List[int]], algorithm: str, estimator: Optional[str] = None, estimator_list: Optional[List[str]] = None, dir_path: Optional[str] = None, run_name: Optional[str] = None, options={})[source]

Bases: ModelLearning

Class for Sensitivity Analysis, subclassed from ModelLearning.

Parameters
  • sampling (str) – Name of the sampling method from library

  • batch_sizes (Dict[str, int]) – Number of points to select from the dataset

  • pmap (ParameterMap) – Model parameter map

  • datafiles (Dict[str, str]) – The datafiles for each of the learning datasets

  • state_labels (Dict[str, List[Any]]) – The labels for the excited states of the system

  • sweep_map (List[List[Tuple[str]]]) – Map of variables to be swept, in exp_opt_map format

  • sweep_bounds (List[List[int]]) – List of upper and lower bounds for each sweeping variable

  • algorithm (str) – Name of the sweeping algorithm from the library

  • estimator (str, optional) – Name of estimator method from library, by default None

  • estimator_list (List[str], optional) – List of different estimators to be used, by default None

  • dir_path (str, optional) – Path to save sensitivity logs, by default None

  • run_name (str, optional) – Name of the experiment run, by default None

  • options (dict, optional) – Options for the sweeping algorithm, by default {}

Raises

NotImplementedError – When trying to set the estimator or estimator_list

sensitivity()[source]

Run the sensitivity analysis.
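At its core, the analysis evaluates the goal pointwise over each pair of sweep bounds. A self-contained sketch of that idea (c3 delegates the actual sweep to its algorithm library; the grid sweep below is an illustrative stand-in):

```python
import numpy as np

def sweep_goal(goal, lower, upper, points=11):
    # Evaluate the goal on a uniform grid between the sweep bounds.
    grid = np.linspace(lower, upper, points)
    values = [goal(g) for g in grid]
    return grid, values

# Toy goal: sensitive to a "frequency" parameter with optimum at 5.0.
grid, values = sweep_goal(lambda f: (f - 5.0) ** 2, lower=4.0, upper=6.0)
best = grid[int(np.argmin(values))]
```

A flat sweep curve would indicate that the dataset carries little information about the swept parameter; pronounced structure, as in this toy goal, indicates sensitivity.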
