https://github.com/lockwo/rl_qvc_opt
Code for "Optimizing Quantum Variational Circuits with Deep Reinforcement Learning"
- Host: GitHub
- URL: https://github.com/lockwo/rl_qvc_opt
- Owner: lockwo
- License: apache-2.0
- Created: 2021-09-05T03:42:55.000Z (over 4 years ago)
- Default Branch: main
- Last Pushed: 2024-05-10T05:18:54.000Z (almost 2 years ago)
- Last Synced: 2025-04-02T20:45:11.517Z (12 months ago)
- Topics: deep-learning, quantum-computing, quantum-machine-learning, reinforcement-learning
- Language: Python
- Homepage:
- Size: 74.2 KB
- Stars: 19
- Watchers: 1
- Forks: 5
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
[Unitary Fund](http://unitary.fund)
# Reinforcement Learning for Quantum Variational Circuit Optimization
Code for [Optimizing Quantum Variational Circuits with Deep Reinforcement Learning](https://arxiv.org/abs/2109.03188)
To use the pretrained models, no training or testing is needed: download the deploy folder and use the `mixed` function in `augment.py` with your circuit. See that file for the required inputs. Note that deployment requires stable_baselines3 (which requires PyTorch) and NumPy.
To recreate the results from the paper, use the testing folder. Each file runs an experiment and outputs the results as presented in the paper's tables. Note that this step depends on TensorFlow, TensorFlow Quantum, scikit-learn, and stable_baselines3 (which requires PyTorch).
To train your own agent, go to the training folder. Specify the maximum circuit sizes and training durations for the agent, then run the code.
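The training code itself isn't shown here, but the core idea — an agent is rewarded for proposing parameter updates that lower a variational circuit's cost — can be sketched with a toy stand-in. The repository's agents are deep RL models trained with stable_baselines3; the random-search "agent" and the one-parameter, one-qubit circuit below are purely illustrative assumptions, not the paper's setup.

```python
import numpy as np

def cost(params):
    # Toy one-qubit "circuit": <Z> after RY(theta) applied to |0> is
    # cos(theta), minimized (value -1) at theta = pi.
    return np.cos(params[0])

rng = np.random.default_rng(0)
params = np.array([0.1])
best = cost(params)

# Stand-in "agent": propose a random perturbation of the circuit
# parameters and keep it only if the cost decreases. A trained RL
# policy would instead learn which updates to propose.
for _ in range(500):
    proposal = params + rng.normal(scale=0.3, size=params.shape)
    c = cost(proposal)
    if c < best:
        params, best = proposal, c

print(best)  # approaches -1, the circuit's minimum cost
```

The same loop structure generalizes to many-parameter circuits; the learned policy simply replaces the blind random proposal.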
# Examples for Deployment
The example (written in PennyLane) is similar to the one used to generate the barren plateau figures in the paper.
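The repository's PennyLane example isn't reproduced here, but the phenomenon behind those figures can be sketched with a plain-NumPy statevector simulator: for random parameterized circuits, the variance of a cost gradient shrinks rapidly as qubits are added (a barren plateau). The circuit structure, depth, and sample counts below are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(42)

def apply_1q(state, q, n, mat):
    # Apply a 2x2 single-qubit gate `mat` to qubit q of an n-qubit statevector.
    psi = state.reshape([2] * n)
    a, b = np.take(psi, 0, axis=q), np.take(psi, 1, axis=q)
    new = np.stack([mat[0, 0] * a + mat[0, 1] * b,
                    mat[1, 0] * a + mat[1, 1] * b], axis=q)
    return new.reshape(-1)

def rot(axis, theta):
    # Pauli rotation gates RX (axis 0), RY (axis 1), RZ (axis 2).
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    if axis == 0:
        return np.array([[c, -1j * s], [-1j * s, c]])
    if axis == 1:
        return np.array([[c, -s], [s, c]])
    return np.array([[np.exp(-1j * theta / 2), 0],
                     [0, np.exp(1j * theta / 2)]])

def apply_cz(state, q1, q2, n):
    # CZ flips the sign of amplitudes where both qubits are |1>.
    psi = state.reshape([2] * n).copy()
    idx = [slice(None)] * n
    idx[q1], idx[q2] = 1, 1
    psi[tuple(idx)] *= -1
    return psi.reshape(-1)

def cost(thetas, axes, n):
    # Random-structure ansatz: layers of random Pauli rotations plus a CZ
    # chain; the cost is the expectation value of Z on qubit 0.
    psi = np.zeros(2 ** n, dtype=complex)
    psi[0] = 1.0
    for l in range(thetas.shape[0]):
        for q in range(n):
            psi = apply_1q(psi, q, n, rot(axes[l, q], thetas[l, q]))
        for q in range(n - 1):
            psi = apply_cz(psi, q, q + 1, n)
    probs = np.abs(psi.reshape([2] * n)) ** 2
    return probs.take(0, axis=0).sum() - probs.take(1, axis=0).sum()

def grad_var(n, layers=15, samples=100):
    # Variance, over random circuits, of dC/dtheta[0, 0] computed with
    # the parameter-shift rule.
    grads = []
    for _ in range(samples):
        thetas = rng.uniform(0, 2 * np.pi, size=(layers, n))
        axes = rng.integers(0, 3, size=(layers, n))
        plus, minus = thetas.copy(), thetas.copy()
        plus[0, 0] += np.pi / 2
        minus[0, 0] -= np.pi / 2
        grads.append((cost(plus, axes, n) - cost(minus, axes, n)) / 2)
    return np.var(grads)

variances = {n: grad_var(n) for n in (2, 4, 6)}
for n, v in variances.items():
    print(n, v)  # the variance shrinks sharply as qubits are added
```

Gradient-based optimizers stall on these vanishing gradients, which is part of the motivation for the gradient-free, RL-based optimization studied in the paper.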
# Questions?
Join the rl_opt channel in the Unitary Fund Discord.