Science Score: 54.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
    Found CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Academic email domains
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (12.7%) to scientific vocabulary
Last synced: 6 months ago

Repository

Basic Info
  • Host: GitHub
  • Owner: lfd
  • License: MIT
  • Language: TeX
  • Default Branch: main
  • Size: 10.7 MB
Statistics
  • Stars: 0
  • Watchers: 2
  • Forks: 0
  • Open Issues: 4
  • Releases: 0
Created over 1 year ago · Last pushed about 1 year ago
Metadata Files
Readme Contributing License Citation

README.md

From Hope To Heuristics

Realistic Runtime Estimates for Quantum Optimisation in NHEP

:book: Project Description

This repository contains our contribution to CHEP24, which consists of two key aspects. First, we estimate runtimes and scalability for common NHEP problems addressed via QUBO formulations by identifying minimum-energy solutions of the intermediate Hamiltonian operators encountered during the annealing process. Second, we investigate how the classical parameter space in the QAOA, together with approximation techniques such as the Fourier-analysis-based heuristic proposed by Zhou et al. (2018), can help to achieve (future) quantum advantage, considering the trade-off between computational complexity and solution quality. These approaches are evaluated on two benchmark problems: the Maxcut problem and the track reconstruction problem.
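As a rough illustration of the Fourier-based heuristic of Zhou et al. (2018): the 2p QAOA angles are generated from a small number q of Fourier amplitudes, so the classical optimization runs in a lower-dimensional space. This is a sketch of the parameterization only, not the code in this repository:

```python
import numpy as np

def fourier_to_angles(u, v, p):
    """Map q Fourier amplitudes (u, v) to QAOA angles (gamma, beta) at depth p,
    following the FOURIER parameterization of Zhou et al. (2018)."""
    q = len(u)
    i = np.arange(1, p + 1)  # layer indices 1..p
    k = np.arange(1, q + 1)  # frequency indices 1..q
    # gamma_i = sum_k u_k * sin((k - 1/2)(i - 1/2) * pi / p)
    gamma = np.sin(np.outer(i - 0.5, k - 0.5) * np.pi / p) @ u
    # beta_i  = sum_k v_k * cos((k - 1/2)(i - 1/2) * pi / p)
    beta = np.cos(np.outer(i - 0.5, k - 0.5) * np.pi / p) @ v
    return gamma, beta

# A fixed number of amplitudes q can generate schedules for any depth p >= q,
# which is how the heuristic trades optimization cost against solution quality.
# The amplitude values below are arbitrary, for illustration only.
u = np.array([0.8, 0.1])
v = np.array([0.6, -0.05])
gamma, beta = fourier_to_angles(u, v, p=5)
```

Optimizing over (u, v) instead of (gamma, beta) keeps the number of classical parameters constant as p grows.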

Approach

For the QUBO formulation of the track reconstruction problem, we build on the HEPQPR.Qallse project. To work with smaller QUBOs, we focus only on hit triplets within a specified angle, similar to the approach presented by Schwägerl et al.

:rocket: Getting Started

Setup

When cloning, make sure to get the submodules:

```shell
git clone --recurse-submodules git@github.com:lfd/spectral_gap_nhep.git
```

This clones our fork of hepqpr-qallse recursively.

If you have poetry installed, run `poetry install`. With pip, make sure to also install the dependencies of the submodules hepqpr-qallse (pandas, numpy, plotly) and trackml (pandas, numpy).

Quickstart

After installing all the dependencies, simply execute

```shell
kedro run
```

This will run the default pipeline, which consists of all individual pipelines described in the following section. You can get an interactive overview of the pipeline in your browser by running

```shell
kedro viz
```

Pipelines

QAOA Maxcut

Solving the Maxcut problem using the Quantum Approximate Optimization Algorithm (QAOA). It also calculates the annealing schedules for this problem.
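For reference, the Maxcut objective that this pipeline approximates can be stated in a few lines. This is illustrative only; the graph and helper names are hypothetical, not the pipeline's code:

```python
from itertools import product

def cut_value(edges, assignment):
    """Number of edges crossing the partition (the Maxcut objective)."""
    return sum(1 for u, v in edges if assignment[u] != assignment[v])

def maxcut_bruteforce(n, edges):
    """Exact Maxcut by enumerating all 2^n partitions; QAOA approximates
    this maximum for graphs where enumeration is infeasible."""
    return max(cut_value(edges, bits) for bits in product((0, 1), repeat=n))

# 4-cycle: the maximum cut separates alternating vertices and uses all 4 edges
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
best = maxcut_bruteforce(4, edges)
```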

Adiabatic Maxcut

Solving the Maxcut problem using adiabatic quantum computing, i.e., quantum annealing.
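The hardness of the adiabatic approach is governed by the minimum spectral gap along the interpolation H(s) = (1 - s) * H_mixer + s * H_problem. A minimal dense-matrix sketch, illustrative only and not the repository's implementation (the small symmetry-breaking field is an assumption to keep the ground state unique):

```python
import numpy as np

def min_spectral_gap(h_problem, steps=101):
    """Minimum gap between ground and first excited state along the linear
    schedule H(s) = (1 - s) * H_mixer + s * H_problem (sketch only)."""
    n = int(np.log2(h_problem.shape[0]))
    sx = np.array([[0.0, 1.0], [1.0, 0.0]])
    # transverse-field mixer: -sum_i X_i
    h_mixer = sum(
        -np.kron(np.kron(np.eye(2 ** i), sx), np.eye(2 ** (n - i - 1)))
        for i in range(n)
    )
    gaps = []
    for s in np.linspace(0.0, 1.0, steps):
        evals = np.linalg.eigvalsh((1 - s) * h_mixer + s * h_problem)
        gaps.append(evals[1] - evals[0])  # eigvalsh returns ascending order
    return min(gaps)

# 2-qubit Ising problem Hamiltonian Z_0 Z_1 plus a small field on qubit 0,
# which breaks the Z2 symmetry so the final ground state is non-degenerate
z = np.diag([1.0, -1.0])
h_problem = np.kron(z, z) + 0.5 * np.kron(z, np.eye(2))
gap = min_spectral_gap(h_problem)
```

By the adiabatic theorem, the required annealing time scales inversely with (a power of) this minimum gap, which is why gap estimates feed into realistic runtime estimates.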

QUBO Formulation

This pipeline loads event data and prepares a QUBO for the two subsequent track reconstruction pipelines.
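For intuition, a QUBO assigns the energy x^T Q x to each binary vector x, and solving it means finding a minimum-energy assignment. The brute-force search below works for tiny instances only and is not the pipeline's code:

```python
import numpy as np

def qubo_energy(Q, x):
    """Energy x^T Q x of binary vector x under QUBO matrix Q."""
    x = np.asarray(x)
    return float(x @ Q @ x)

def solve_qubo_bruteforce(Q):
    """Exhaustive minimum-energy search over all 2^n binary vectors
    (feasible only for very small n)."""
    n = Q.shape[0]
    best = min(range(2 ** n),
               key=lambda b: qubo_energy(Q, [(b >> i) & 1 for i in range(n)]))
    return [(best >> i) & 1 for i in range(n)]

# Toy QUBO: each variable alone lowers the energy, activating both is penalized
Q = np.array([[-1.0, 2.0],
              [0.0, -1.0]])
solution = solve_qubo_bruteforce(Q)
```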

QAOA Track Reconstruction

Solving the track reconstruction problem using QAOA. It also calculates the annealing schedules for this problem.

Adiabatic Track Reconstruction

Solving the track reconstruction problem using quantum annealing.

Project Structure

The following list gives a brief explanation of the most important locations in our repository:

- conf/base/parameters.yml: Parameters used in the experiments
- conf/base/catalog.yml: Description of datasets and models
- data/**: Contains all initial, intermediate and final data(sets)
- src/fromhopetoheuristics: Main code, divided into pipelines and utilities (shared among several pipelines)
- tests: All the tests for verification

Besides that, we make use of two submodules:

- hepqpr-qallse: Currently, all data loading and QUBO formulation is done using this submodule
- trackml: Utilities for the TrackML particle tracking challenge dataset

Hyperparameter Optimization

This project uses Optuna for hyperparameter optimization. A dedicated Kedro pipeline takes care of the hyperparameter optimization and of submitting jobs to a SLURM cluster:

```shell
kedro run --pipeline hyperparameter_study
```

If you don't have a SLURM cluster available, head to pipelines/hyperparameter_study/nodes.py and switch the subprocess command so that it spawns a single Kedro job instead of submitting to the cluster.
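The switch described above might look roughly like the following. This is a hypothetical sketch: the function name, pipeline name and submission script are placeholders, not the actual contents of nodes.py:

```python
import subprocess

def build_trial_command(params: dict, use_slurm: bool):
    """Build the command for one hyperparameter trial: either an sbatch
    submission (cluster) or a direct local kedro run. Sketch only; the
    pipeline name and run_trial.sh wrapper are hypothetical."""
    kedro_cmd = ["kedro", "run", "--pipeline", "qaoa_maxcut"]
    for key, value in params.items():
        kedro_cmd += ["--params", f"{key}={value}"]
    if use_slurm:
        # hypothetical SLURM submission script wrapping the kedro call
        return ["sbatch", "run_trial.sh"] + kedro_cmd
    return kedro_cmd

cmd = build_trial_command({"q": 3}, use_slurm=False)
# subprocess.run(cmd, check=True)  # spawn the single local job
```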

If everything goes well, you can inspect the experiments by running

```shell
optuna-dashboard sqlite:///studies/fhth.db
```

assuming that the SQLite database where Optuna stores its results is located at studies/fhth.db.

Reproduction

The numerical results in our study can be reproduced using the reproduction.sh script, which executes all runs sequentially. Feel free to adapt the script for parallelisation, depending on your system's resources.

Numerical data

All results are stored in the data/ folder. The subfolders 04_adiabatic, 05_qaoa and 06_schedules can contain results in CSV format; the corresponding run configuration is stored as JSON in 00_parameters.
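Pairing a result CSV with its run configuration can be done along these lines. This is a sketch with hypothetical file names; only the folder names follow the layout above:

```python
import csv
import json
import tempfile
from pathlib import Path

def load_run(data_dir: Path, run_id: str):
    """Pair a result CSV from e.g. data/05_qaoa with its run configuration
    JSON from data/00_parameters (the per-run file naming is hypothetical)."""
    with open(data_dir / "00_parameters" / f"{run_id}.json") as f:
        params = json.load(f)
    with open(data_dir / "05_qaoa" / f"{run_id}.csv", newline="") as f:
        rows = list(csv.DictReader(f))
    return params, rows

# Build a miniature data/ layout just to demonstrate the pairing
root = Path(tempfile.mkdtemp())
(root / "00_parameters").mkdir()
(root / "05_qaoa").mkdir()
(root / "00_parameters" / "run1.json").write_text(json.dumps({"p": 5}))
(root / "05_qaoa" / "run1.csv").write_text("layer,energy\n1,-0.5\n")
params, rows = load_run(root, "run1")
```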

Proceedings results

We have copied the results we obtained to analysis/proceedings_results/.

Data visualisation

The data for the proceedings article in analysis/proceedings_results/ can be plotted using R with ggplot2. The following R libraries are required:

- tidyverse
- ggh4x
- stringr
- tikzDevice
- patchwork
- rjson

If you have R installed on your system, the libraries can be installed via:

```shell
R -e "install.packages('<library>', dependencies=TRUE, repos='http://cran.rstudio.com/')"
```

To tikz or not to tikz

For the plots in the article, we used the tikzDevice export, which can be enabled by setting the variable tikz in the plot script analysis/plot.r to TRUE. If you are fine with plain PDF output, keep it as it is.

Plotting

Once everything is set up, we can run the following to obtain the plots:

```shell
cd analysis
Rscript plot.r
```

Output plots can then be found in either analysis/img-tikz or analysis/img-pdf.

🚧 Contributing

Contributions are highly welcome! Take a look at our Contribution Guidelines.



Owner

  • Name: Digitalisation Lab
  • Login: lfd
  • Kind: organization
  • Email: wolfgang.mauerer@othr.de
  • Location: Regensburg

Laboratory for Digitalisation (Labor für Digitalisierung) of the Ostbayerische Technische Hochschule Regensburg

Citation (CITATION.cff)

cff-version: 1.2.0
message: "If you use this software, please cite it as below."
authors:
  - family-names: "Franz"
    given-names: "Maja"
    orcid: "https://orcid.org/0000-0002-2801-7192"
  - family-names: "Schönberger"
    given-names: "Manuel"
    orcid: "https://orcid.org/0000-0002-6939-7582"
  - family-names: "Strobl"
    given-names: "Melvin"
    orcid: "https://orcid.org/0000-0003-0229-9897"
  - family-names: "Kuehn"
    given-names: "Eileen"
    orcid: "https://orcid.org/0000-0002-8034-8837"
  - family-names: "Streit"
    given-names: "Achim"
    orcid: "https://orcid.org/0000-0002-5065-469X"
  - family-names: "Zurita"
    given-names: "Pia"
    orcid: "https://orcid.org/0000-0002-2756-9550"
  - family-names: "Diefenthaler"
    given-names: "Markus"
    orcid: "https://orcid.org/0000-0002-4717-4484"
  - family-names: "Mauerer"
    given-names: "Wolfgang"
    orcid: "https://orcid.org/0000-0002-9765-8313"
title: "From Hope to Heuristic: Realistic Runtime Estimates for Quantum Optimisation in NHEP"
version: 0.1.0
doi: 10.5281/zenodo.14921650
date-released: 2024-10-21
url: "https://indico.cern.ch/event/1338689/contributions/6010081/"

GitHub Events

Total
  • Push event: 2
Last Year
  • Push event: 2