iqpopt

Fast optimization of instantaneous quantum polynomial circuits in JAX

https://github.com/xanaduai/iqpopt

Science Score: 54.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (11.1%) to scientific vocabulary
Last synced: 6 months ago

Repository

Fast optimization of instantaneous quantum polynomial circuits in JAX

Basic Info
Statistics
  • Stars: 11
  • Watchers: 5
  • Forks: 5
  • Open Issues: 0
  • Releases: 1
Created about 1 year ago · Last pushed 8 months ago
Metadata Files
Readme License Citation

README.md

IQPopt - IQP circuit optimization with JAX

IQPopt is a package designed for fast optimization of parameterized instantaneous quantum polynomial (IQP) circuits using JAX. The accompanying research paper is available at arXiv:2501.04776.

Installation

Install with

```
pip install .
```

or in editable mode with

```
pip install -e .
```

Creating a circuit

The package can be used to optimize parameterized IQP circuits. These are circuits composed of gates $\exp(i\theta_j X_j)$, where the generator $X_j$ is a tensor product of Pauli X operators acting on some subset of qubits and $\theta_j$ is a trainable parameter. Input states and measurements are diagonal in the computational (Z) basis.

To define such a circuit (with input state $\vert 0 \rangle$) we need to specify the number of qubits and the parameterized gates

```python
import iqpopt as iqp
from iqpopt.utils import local_gates

n_qubits = 2
gates = local_gates(n_qubits, 2)

circuit = iqp.IqpSimulator(n_qubits, gates)
```

Each element of `gates` corresponds to a trainable parameter and is given by a list of lists that specifies the generators of that parameter.

For example, in the above the function `local_gates` returns `gates = [[[0]], [[1]], [[0,1]]]`, which specifies three trainable parameters with gate generators $X_0$, $X_1$ and $X_0X_1$.
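To illustrate the format, the pattern above can be reproduced with a few lines of plain Python. This is an illustrative sketch matching the `n_qubits = 2` output quoted above, not the package's actual `local_gates` implementation:

```python
from itertools import combinations

def local_gates_sketch(n_qubits, max_weight):
    # One trainable parameter per X-generator acting on up to
    # max_weight qubits; each parameter has a single generator,
    # written as a list of qubit indices inside a one-element list.
    gates = []
    for weight in range(1, max_weight + 1):
        for qubits in combinations(range(n_qubits), weight):
            gates.append([list(qubits)])
    return gates

print(local_gates_sketch(2, 2))  # [[[0]], [[1]], [[0, 1]]]
```

For two qubits this reproduces the three generators $X_0$, $X_1$ and $X_0X_1$ listed above.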

Note: For very large problems it can be useful to initialize the circuit with the option `sparse=True`. This uses SciPy sparse matrix multiplication in place of JAX and can be significantly more memory efficient.

Expectation values

IQPopt has been designed for fast evaluation of expectation values of Pauli Z tensors.

To estimate the expectation value of a Pauli Z tensor, we represent the operator as a binary string. The estimation uses a Monte Carlo method whose precision is controlled by `n_samples`.

```python
import jax
import jax.numpy as jnp

op = jnp.array([0, 1])  # binary array representing Z_1
params = jnp.ones(len(circuit.gates))
n_samples = 1000
key = jax.random.PRNGKey(42)

expval, std = circuit.op_expval(params, op, n_samples, key)
```

This returns an estimate of $\langle Z_1 \rangle$ as well as the standard deviation of the estimator.
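Such estimators exploit the fact that Z expectation values of IQP circuits admit a closed form that can be sampled: writing $S$ for the set of gates whose generator $g_j$ overlaps the measured qubits an odd number of times, one finds $\langle Z_a \rangle = \mathbb{E}_z[\cos(2\sum_{j\in S}\theta_j(-1)^{g_j\cdot z})]$ with $z$ uniform over bitstrings. The following plain-NumPy sketch of such a Monte Carlo estimator is a reconstruction for illustration, not the package's implementation:

```python
import numpy as np

def iqp_expval_sketch(gens, thetas, op, n_samples, rng):
    """Monte Carlo estimate of <Z_op> for an IQP circuit on |0...0>.

    gens:   (n_gates, n_qubits) binary array; row j is the generator g_j
            of the gate exp(i * theta_j * X_{g_j}).
    op:     (n_qubits,) binary array selecting the measured Z tensor.
    """
    gens = np.asarray(gens)
    thetas = np.asarray(thetas)
    op = np.asarray(op)
    # Gates whose generator anticommutes with Z_op (odd overlap with op).
    odd = (gens @ op) % 2 == 1
    # Sample uniform bitstrings z and evaluate
    # cos(2 * sum_{j in S} theta_j * (-1)^(g_j . z)) for each z.
    z = rng.integers(0, 2, size=(n_samples, gens.shape[1]))
    signs = (-1.0) ** ((z @ gens[odd].T) % 2)   # (n_samples, |S|)
    samples = np.cos(2.0 * (signs @ thetas[odd]))
    return samples.mean(), samples.std() / np.sqrt(n_samples)

# Single gate exp(i*theta*X_0) on one qubit: the exact value is cos(2*theta).
rng = np.random.default_rng(0)
est, _ = iqp_expval_sketch([[1]], [0.3], [1], 2000, rng)
```

For this one-qubit case every sample equals $\cos(2\theta)$, so the estimate is exact; in general the variance shrinks as `n_samples` grows.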

The package also allows for fast batch evaluation of expectation values. If we specify a batch of Z operators by an array

```python
ops = jnp.array([[0,1], [1,0], [1,1]])  # Z_1, Z_0, Z_0Z_1
```

we can also batch evaluate the expectation values in parallel:

```python
expvals, stds = circuit.op_expval(params, ops, n_samples, key)
```

Note: The estimation of each expectation value in the batch is unbiased; however, the estimators may be correlated. This effect can be reduced by increasing `n_samples` in order to reduce the variance of each estimator, or by using the option `indep_estimates=True` to return uncorrelated estimates (at the cost of longer runtime).

Training

We can train our circuit with built-in methods. We first define a loss function:

```python
def loss(params, circuit, ops, n_samples, key):
    expvals = circuit.op_expval(params, ops, n_samples, key)[0]
    return jnp.sum(expvals)
```

The first argument must be named `params` and corresponds to the trainable parameters. We can then train the circuit as follows:

```python
import numpy as np
import matplotlib.pyplot as plt

optimizer = "Adam"
stepsize = 0.001
n_iters = 1000
params_init = np.random.normal(0, 1/np.sqrt(n_qubits), len(circuit.gates))
ops = np.array([[1,1], [1,0], [0,1]])
n_samples = 1000

loss_kwargs = {
    "params": params_init,
    "circuit": circuit,
    "ops": ops,
    "n_samples": n_samples,
}

trainer = iqp.Trainer(optimizer, loss, stepsize)
trainer.train(n_iters, loss_kwargs)

params = trainer.final_params
plt.plot(trainer.losses)
```

Automatic stopping of training is possible using the `convergence_interval` option of `train`; see the docstring for more info.

Stochastic bitflip model

One can replace the quantum circuit by an analogous bitflip model described in arXiv:2501.04776 by initializing the circuit with the `bitflip=True` option:

```python
circuit = iqp.IqpSimulator(n_qubits, gates, bitflip=True)
```

This can be useful to judge whether the IQP model is making use of interference. Since the bitflip model is classical, one can also sample from it for large values of `n_qubits`.
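To make the classical analogue concrete, here is a minimal sampler sketch. It assumes that each gate $\exp(i\theta_j X_{g_j})$ is replaced by a stochastic flip of the bits in $g_j$ with probability $\sin^2\theta_j$, which is one natural classical reading; see the paper for the precise model used by the package:

```python
import numpy as np

def bitflip_sample_sketch(gens, thetas, n_shots, rng):
    """Sample bitstrings from a classical bitflip analogue of an IQP circuit.

    Assumption (illustrative): gate exp(i * theta_j * X_{g_j}) becomes a
    stochastic flip of the bits in g_j with probability sin(theta_j)^2,
    applied independently to the all-zeros input string.
    """
    gens = np.asarray(gens)
    thetas = np.asarray(thetas)
    flip_prob = np.sin(thetas) ** 2
    # For each shot, decide independently which gates fire ...
    fires = rng.random((n_shots, gens.shape[0])) < flip_prob
    # ... and XOR the fired generators into the all-zeros string.
    return (fires @ gens) % 2

rng = np.random.default_rng(0)
# theta = pi/2 flips with probability 1, so every shot returns 11.
samples = bitflip_sample_sketch([[1, 1]], [np.pi / 2], 5, rng)
```

Because the sampler only manipulates bitstrings, its cost grows polynomially with the number of qubits, which is why sampling remains tractable at scales where quantum IQP sampling is not.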

Generative machine learning with IQP circuits

Training for generative machine learning tasks

We can also view the circuit as a generative model and train it using the maximum mean discrepancy (MMD) distance as a loss function.

```python
import iqpopt.gen_qml as gen
from iqpopt.gen_qml.utils import median_heuristic

n_qubits = 10

# toy dataset of low weight bitstrings
X_train = np.random.binomial(1, 0.5, size=(1000, n_qubits))
X_train = X_train[np.where(X_train.sum(axis=1) < 5)]

gates = local_gates(n_qubits, 2)
circuit = iqp.IqpSimulator(n_qubits, gates)
params_init = np.random.normal(0, 1/np.sqrt(n_qubits), len(gates))

loss = gen.mmd_loss_iqp  # MMD loss
sigma = median_heuristic(X_train)  # bandwidth for MMD

loss_kwargs = {
    "params": params_init,
    "iqp_circuit": circuit,
    "ground_truth": X_train,
    "sigma": sigma,
    "n_ops": 1000,
    "n_samples": 1000,
}

trainer = iqp.Trainer("Adam", loss, stepsize=0.01)
trainer.train(n_iters=500, loss_kwargs=loss_kwargs)

params = trainer.final_params
plt.plot(trainer.losses)
```

The MMD loss is estimated using a Monte Carlo method; larger values of `n_ops` and `n_samples` result in more precise estimates. For small circuits, we can generate new samples:

```python
samples = circuit.sample(params, shots=100)
```

For large circuits this is not tractable due to the complexity of sampling from IQP distributions.
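For small sample sets, the MMD distance itself is straightforward to compute exactly. The following self-contained NumPy sketch shows the standard squared-MMD estimate with a Gaussian kernel, together with one common variant of the median heuristic for choosing the bandwidth; it illustrates the quantity being minimized, not the package's Monte Carlo estimator:

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    # k(x, y) = exp(-||x - y||^2 / (2 sigma^2)) for all pairs
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

def mmd2(X, Y, sigma):
    # Biased estimate of squared MMD between sample sets X and Y:
    # E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)]
    return (gaussian_kernel(X, X, sigma).mean()
            + gaussian_kernel(Y, Y, sigma).mean()
            - 2 * gaussian_kernel(X, Y, sigma).mean())

def median_heuristic_sketch(X):
    # Common bandwidth choice: median pairwise distance of the data
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.sqrt(np.median(d2[d2 > 0]))

rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(50, 10)).astype(float)
sigma = median_heuristic_sketch(X)
print(mmd2(X, X, sigma))  # identical sample sets give MMD^2 = 0
```

The trained generative model is good when the MMD between its samples and held-out data is close to the MMD between two independent draws from the data itself.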

Evaluating the generative model

To evaluate the model, we can use the MMD distance to a test set, or the Kernel Generalized Empirical Likelihood (KGEL); see Suman Ravuri et al. in Understanding Deep Generative Models with Generalized Empirical Likelihoods.

Kernel Generalized Empirical Likelihood (KGEL)

```python
# test points from same distribution
X_test = np.random.binomial(1, 0.5, size=(1000, n_qubits))
X_test = X_test[np.where(X_test.sum(axis=1) < 5)]

n_witness = 10
witness_points = X_test[-n_witness:]  # witness points for KGEL
test_data = X_test[:-n_witness]  # test data for KGEL

kgel, p_kgel = gen.kgel_opt_iqp(circuit, params, witness_points, test_data,
                                sigma, n_ops=1000, n_samples=1000,
                                key=jax.random.PRNGKey(42))
```

`p_kgel` is the optimal probability distribution returned from the convex optimization.

Owner

  • Name: Xanadu
  • Login: XanaduAI
  • Kind: organization
  • Email: hello@xanadu.ai
  • Location: Toronto, ON

Quantum Computing Powered by Light

Citation (CITATION.cff)

cff-version: 1.2.0
title: Fast optimization of IQP circuits with JAX
message: >-
  If you use this software, please cite it using the
  metadata from this file.
type: software
authors:
  - given-names: Erik
    family-names: Recio-Armengol
    email: erik.recio@icfo.eu
    affiliation: ICFO (Institute of Photonic Sciences)
  - given-names: Joseph
    family-names: Bowles
    email: joseph@xanadu.ai
    affiliation: Xanadu
identifiers:
  - type: url
    value: https://arxiv.org/abs/2501.04776
    description: Paper's arxiv
repository-code: https://github.com/XanaduAI/iqpopt
keywords:
  - Generative Modeling
  - Quantum Machine Learning

GitHub Events

Total
  • Watch event: 4
  • Delete event: 1
  • Push event: 8
  • Public event: 1
  • Pull request event: 7
  • Fork event: 5
  • Create event: 6
Last Year
  • Watch event: 4
  • Delete event: 1
  • Push event: 8
  • Public event: 1
  • Pull request event: 7
  • Fork event: 5
  • Create event: 6

Issues and Pull Requests

Last synced: 8 months ago

All Time
  • Total issues: 0
  • Total pull requests: 2
  • Average time to close issues: N/A
  • Average time to close pull requests: 7 days
  • Total issue authors: 0
  • Total pull request authors: 2
  • Average comments per issue: 0
  • Average comments per pull request: 0.0
  • Merged pull requests: 1
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 0
  • Pull requests: 2
  • Average time to close issues: N/A
  • Average time to close pull requests: 7 days
  • Issue authors: 0
  • Pull request authors: 2
  • Average comments per issue: 0
  • Average comments per pull request: 0.0
  • Merged pull requests: 1
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
Pull Request Authors
  • josephbowles (4)
  • erikrecio (2)
  • JerryChen97 (1)

Dependencies

pyproject.toml pypi
  • cvxpy *
  • jax *
  • jaxopt *
  • matplotlib *
  • numpy *
  • numpyro *
  • optax *
  • pandas *
  • pennylane *
  • scipy *
  • tqdm *