My PhD journey ends today🧑‍🎓
I will continue my quest to study how quantum machines can learn new facets of our universe as a scientist
@GoogleQuantumAI
& a visiting researcher
@MIT
@Stanford
@UCBerkeley
🌈
In 2025, I will be back
@Caltech
as an Assistant Professor of Physics🔭
☃️Is it easy to find local minima?☃️
When a physical system is cooled❄️, Nature doesn't always find the ground state but finds a local minimum
In , we prove that a machine that can cool things to local minima is a universal quantum computer (1/9)
Are there exponential advantages in using quantum machines to learn about the physical world?🎯
We establish the affirmative through rigorous proofs + physical experiments on the Sycamore quantum processor in . [📜1/10]
Just attended the
#Caltech2024
commencement for my PhD🎓
The Clauser prize recipient was kept secret until the very last moment. I was stunned when the president announced my name.
Heartfelt thanks to my advisor
@preskill
, friends, and family for their unwavering support!
I’m so proud of
@RobertHuangHY
, who received the Clauser Prize
@Caltech
for his PhD thesis “Learning in the Quantum Universe,” which was judged “to have the greatest potential to open new avenues of human thought and endeavor.”
#Caltech2024
🤖 Can a machine *efficiently* learn and predict quantum dynamics with arbitrarily high complexity (e.g., exponentially high)?
In our new paper with
@sitanch
and
@preskill
, we give an ML algorithm and prove that it accomplishes this wild task (📜1/7)
This is my first time publishing in
@ScienceMagazine
!🔬 We show that quantum machines🤖 can learn exponentially faster about the physical world🔭. Coincidentally, today happens to be my birthday😋 and this is one of the coolest presents I've received🎁
❓Can classical machine learning models solve challenging problems in physics that no classical algorithms could solve?
We prove the affirmative (for quantum many-body problems) in
with
@RichardKueng
,
@giactorlai
,
@victorvalbert
,
@preskill
[🧵1/13]
❓How to learn/train quantum circuits?
Shallow quantum circuits are classically hard to simulate⚡️Despite extensive study, no efficient algorithm for learning/training them is known💀
In , we show how to learn any shallow quantum circuit efficiently💡
Continuing our tradition of supporting outstanding graduate students in their pursuit of research in computer science and related fields, we congratulate our 13th annual PhD Fellowship Program recipients! See the list of 2021 Fellowship recipients below:
Last summer,
@preskill
and I worked with an amazing undergrad, Laura Lewis.
She proved that an ML algo in our recent paper in
@ScienceMagazine
can be exponentially improved:
Reducing the sample complexity from a large poly(n) to only log(n)!
See 🧵1/7
Can we tell apart short-time & exponential-time quantum dynamics?
In , we found the answer to be "No" even though it seems easy.
This discovery leads to new implications for quantum advantage, faster learning, the hardness of recognizing phases of matter ...
Our proof that a classical ML model🤖 can learn to solve quantum problems⚛️ better than any classical algorithm🖥️ is now published in
@ScienceMagazine
🔬
It is unbelievable to have two papers📜 out in Science in less than 4 months🤯🤯 This is surreal...
Q: Can we see highly complex entanglement from a few single-qubit measurements?
I always assumed this is impossible as entanglement can be too nonlocal for local measurements to see
In , we prove that it can actually be done for almost all quantum states
What can we ultimately learn about our quantum world?👁️ If we lost all knowledge, how should we re-build our knowledge about quantum systems?📚🔬
With
@S_Flammia
and
@preskill
, we formalize these questions using learning theory and make progress towards answering them (🧵1/8)
An efficient method to learn an interacting system with many particles is central to building better quantum computers and sensors📡⚛️
The physical law sets a limit on how fast we can learn. In , we give the first protocol to reach this limit (📜1/10)
In today’s Quantum Colloquium at Berkeley hosted by Umesh Vazirani, we demystify our recent results with
@sitanch
and
@preskill
proving that ML models can efficiently learn and predict arbitrarily complex (even exponentially complex) quantum dynamics🕵️
My first quantum information paper is published in
@NaturePhysics
!
We combine quantum scrambling and statistical learning to design a theoretically optimal procedure for constructing an efficient classical representation of a quantum many-body system.
With
@preskill
and
@KuengRichard
, we prove that "on average, learning from classical and quantum data are comparable, but for accurate predictions on all inputs, there is an exponential quantum advantage." Now out as an Editors' Suggestion in PRL. 😊😊
Laura Lewis' new work on rigorous ML algorithms for solving quantum many-body problems has just been covered by
@QuantaMagazine
!
Really happy to see that classical shadow🔦 has made its way to Quanta🌟
Understanding the power and limitation of what we can do in the NISQ era has been a big question❓for me since the beginning of my Ph.D.
I am really happy that we finally have something rigorous and general to say about NISQ💡
What are noisy intermediate-scale quantum devices good for? In a new paper joint with
@JordanCotler
,
@RobertHuangHY
, and
@jerryzli
, we define and study a new complexity class, NISQ, that captures the computational power of these devices 🧵 (1/n)
My first IQIM blog is out! I talk about my dream of a world where intelligent beings🧠 are enhanced by quantum sensors📡, quantum memories💾, and quantum computers⚛️, and how one can use math to peek into such a fantastic world.
Can quantum ML models learn from simple input data and predict accurately on *much* harder data? Most existing theories say no. Here, we prove that after learning quantum dynamics on a few simple unentangled states, one can predict accurately on arbitrarily highly entangled states
I gave a talk at Simons Institute in Berkeley on using quantum information, statistical learning, and random matrix theory to prove that classical computers could learn to solve challenging quantum many-body problems.
The recording is out:
Is randomness necessary to estimate M observables from only log M quantum measurements, e.g., as in ? In , we show that randomness could be removed to yield even better performance (with application to quantum chemistry).
I will be playing devil's advocate😈 and speaking about how powerful classical AI could be (based on recent theoretical developments) and whether quantum AI should be worried about the power of classical AI.
#QHack2022
Speaker 🎤
Join Hsin-Yuan Huang (Robert —
@RobertHuangHY
) for a talk on:
"How powerful is classical AI from the standpoint of quantum AI?"
Tuesday Feb 15 at 12:00 Noon EST
I had great fun🥳at the Quantum Colloquium hosted by Umesh Vazirani, where I talked about making predictions in a quantum world. The renown of the previous speakers (Scott Aaronson, John Preskill, Ewin Tang, Mikhail Lukin...) is insane😮
@NatRevPhys
recently invited me to write a short article introducing classical shadows: a technique for learning a succinct classical representation of a quantum state. This technique has undergone many remarkable generalizations from the quantum community in the past year🫂👨‍🔬👩‍🔬
Tools of the Trade: Learning quantum states from their classical shadows.
Hsin-Yuan Huang explains how to characterize quantum states using very few measurements, thanks to classical shadow tomography
Neural network🤖 designs a new functional for DFT that improves some existing functionals and comes close to one of the best human-designed functionals. It seems plausible that AI could outperform the best human-designed functionals in the future✨
Can machine learning explain the world at the quantum level? New in
@ScienceMagazine
, with the code also released, our team shows that neural networks can radically improve density functionals with
#DM21
.
Read more:
Code: 1/
I am curious to know what people think about the new perspective by Maria Schuld and Nathan Killoran😃 I think our ultimate goal is still quantum advantage (else we can just use classical computers) but we need some near-term goals to measure progress.
I will be giving a talk at QIP 2021 () combining "Information-theoretic bounds on quantum advantage in machine learning" with Richard Kueng and
@preskill
, and "Power of data in quantum machine learning" with
@JarrodMcclean
and other Google friends! 😆😆
@preskill
@sitanch
I squeezed in some time to conduct numerical experiments for learning quantum processes with up to 50 qubits and 10^6 evolution time; see the newest version of .
The open-source code👩‍💻🧑‍💻 is at (📜8/7)
"To light a quantum candle is to cast a classical shadow"
-- Ursula K. Le Guin, probably
Check out our newest demo, and learn all about the idea of classical shadows 💡⚛️🌒
Based on the paper by
@RobertHuangHY
, Richard Kueng, &
@preskill
Credit to Chi-Yun Cheng🌟 for an artist's impression🎨 of me studying a machine🤖⚛️ living in the quantum universe (together with a beaver)
--- inspired by my previous IQIM post:
One of the coolest parts😎 is that we demonstrated the significant quantum advantage on a real quantum computer🖥️⚛️: Sycamore superconducting processor built by
@GoogleQuantumAI
. Kudos🙏 to Michael Broughton on fighting with the spiteful quantum noise in a real device. [📜6/10]
Implication for quantum machine learners:
People have begun to consider quantum ML models for quantum problems (instead of classical). This work shows that classical ML models can also solve quantum problems well and gives a strong baseline that QML needs to surpass. [🧵12/13]
We believe this work presents a new possibility illustrating how the combination of quantum technology⚛️ and machine learning🤖 could enable powerful new strategies to learn about nature🍀. [📜9/10]
In the 1st experiment, we consider a hybrid quantum-classical ML model (quantum-enhanced experiments🔬 + classical recurrent neural network🤖) for predicting observables. We observe >1000x improvement over the best classical strategy using the hybrid QML model. [📜7/10]
The results are particularly surprising when we compare them with classical dynamics.
In order for 1D classical circuits to look random, a linear depth is required (see Fig.).
Strangely, 1D quantum dynamics only need log depth to look random!
Hi! So I was coerced into writing this thread---er I mean let me tell you a little bit about quantum learning and testing, and about some of the papers
@sitanch
, Jordan Cotler,
@RobertHuangHY
and I just posted on arxiv:
1/
Since I started my Ph.D. at Caltech in 2018, almost all of my main projects are driven by my aspiration to unravel this problem. It is really nice to find a concrete answer after three years!😊😊 [🧵2/13]
In the 2nd experiment, we consider the hybrid of quantum-enhanced experiments and unsupervised classical ML for uncovering symmetry♓️♑️ in 1D and 2D dynamics. The best known classical strategy failed❌ but the hybrid quantum-classical ML model succeeded⭐️. [📜8/10]
It was great fun to mentor Laura during the summer. Every week there was new and exciting progress!
She is going to talk about this work at QIP 2023
For QIP folks interested in rigorous ML for quantum physics, you should check out her talk! 🧵7/7
Finding ground states is known to be hard for both classical and quantum computers🤖
As a result, ground states are not always "physical," as in they are not always physically observable
When Nature cools a system in a cold bath❄️, she finds a local minimum of energy (2/9)
While the paper in its current form is purely theoretical📜📜📜, our preliminary experiments on quantum computers show that the algorithm is amenable to training large-scale shallow quantum circuits🦣🦣 and many applications follow🌈
So, stay tuned for Version 2! 🌟🌟
Implications for physicists:
While ML has gathered a lot of attention recently, it has been unclear if ML can tackle challenging problems in physics where traditional algorithms fail. This work provides a rigorous foundation for future prospects of ML in physics. [🧵11/13]
In conventional experiments, classical agents (scientists👨‍🔬👩‍🔬, ML models🤖, etc.) retrieve/store/process classical information0⃣1⃣ from experiments to learn about the physical world (using physical measurements🔭, classical memory📔& classical computers🖥️). [📜2/10]
As quantum technology advances, we can begin to consider quantum agents/machines🦾 that retrieve/store/process quantum information⚛️ from experiments🧪 to learn about the world🌏 (using quantum sensors, quantum memory & quantum computers). [📜3/10]
The measurement protocol is extremely simple.
Every time, pick a random qubit and measure it in a random X/Y/Z basis; measure all the rest of the qubits in a fixed basis (e.g., Z).
For an n-qubit system, repeat this poly(n) times.
That's it!
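The protocol in the tweet above (one random qubit in a random X/Y/Z basis, the rest in a fixed Z basis, repeated poly(n) times) is simple enough to sketch in code. Below is a minimal numpy simulation of the data-collection step on a small state vector; the function names (`apply_1q`, `sample_protocol`) are illustrative, not from any released code.

```python
import numpy as np

# Single-qubit rotations that map the X/Y/Z eigenbases to the computational (Z) basis.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)        # measure X
HSdg = np.array([[1, -1j], [1, 1j]]) / np.sqrt(2)   # measure Y (H·S†)
BASIS_ROT = {"X": H, "Y": HSdg, "Z": np.eye(2)}

def apply_1q(psi, U, qubit, n):
    """Apply a single-qubit unitary U to `qubit` of an n-qubit state vector."""
    psi = psi.reshape([2] * n)
    psi = np.tensordot(U, psi, axes=([1], [qubit]))
    return np.moveaxis(psi, 0, qubit).reshape(-1)

def sample_protocol(psi, n, rounds, rng):
    """Each round: one random qubit in a random X/Y/Z basis,
    all remaining qubits measured in the fixed Z basis."""
    data = []
    for _ in range(rounds):
        q = rng.integers(n)
        basis = rng.choice(["X", "Y", "Z"])
        rotated = apply_1q(psi, BASIS_ROT[basis], q, n)
        probs = np.abs(rotated) ** 2
        outcome = rng.choice(2 ** n, p=probs / probs.sum())
        bits = [(outcome >> (n - 1 - k)) & 1 for k in range(n)]
        data.append((q, basis, bits))
    return data

rng = np.random.default_rng(0)
n = 3
psi = np.zeros(2 ** n); psi[0] = 1.0  # |000>
data = sample_protocol(psi, n, rounds=20, rng=rng)
```

Each record stores which qubit was randomized, in which basis, and the full measured bitstring — the kind of dataset the rest of the procedure consumes.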
The key discovery💡 in this work is that:
Given a quantum unitary/process with arbitrarily high complexity, a low-complexity model🎟️ for predicting the outcome of the quantum process always exists.
The ML model only has to discover that low-complexity model (📜4/7)
One may wonder how ML algorithms can be computationally more powerful than non-ML algorithms 🧐. This is the topic of study in my prior work . Short answer: the computational power is in the data ML learns from. [🧵10/13]
We rigorously established the exponential quantum advantage🌟🔯💫 for predicting non-commuting observables, performing quantum PCA, and learning quantum dynamics. We utilized mathematical tools in with some new proof techniques. [📜5/10]
An independent work by
@felix_led
and
@NDLaRacuente
focusing on designs provides another very interesting perspective via communication:
Two parties can produce unitary designs by exchanging only O(1) qubits.
This is also not true in a classical world!
In this work, we compare the ability of classical and quantum machines🤖⚛️ to learn about the physical world. In particular, we prove that in various tasks, quantum machines can learn from exponentially fewer experiments than classical machines. [📜4/10]
We then prove that our ML model can learn the low-complexity model🎟️ using the classical shadows of the unknown quantum unitary/process *extremely efficiently*.
A key step in the proof resolves a conjecture on the quantum Bohnenblust–Hille inequality (📜5/7)
@HyperboIeva
I am here! I would also like to see our community continue to strive here.
There is more random stuff on the feed nowadays, and it seems harder to find posts I missed.
Our results resolved many problems in ML + quantum:
(1) Quantum advantage
showed exp. advantage in learning systems with highly nonlocal correlations.
Our results show superpolynomial advantages in learning low-complexity systems with only local correlations.
Finally, big thanks to
@giactorlai
and
@awscloud
for the large-scale numerical experiments,
@victorvalbert
for insights into quantum phases of matter. And a huge toast 🍻🍻 to my best quantum friends
@RichardKueng
and
@preskill
for the extremely fun journey! 🥳🥳 [🧵13/13]
I really enjoyed this collaboration with Chi-Fang Chen,
@preskill
, and Leo Zhou🍻
It would have been impossible to prove the result without the unique expertise each of them brought
I have wondered about the question for a long time. Really happy that it is now resolved!😌 (9/9)
For local unitary perturbations, we show that the optimization (energy) landscape🌆 over the entire quantum state space has a large barren plateau🙁
This allows us to prove that finding local minima under local unitary perturbation is easy with a classical computer💻 (5/9)
Big thanks🫂 to all of my collaborators for realizing the fantastic results🌟! I really love this eclectic team, ranging from classical learning theorists and theoretical physicists to quantum computing engineers, who have all contributed greatly. [📜10/10]
While the main theorem is purely conceptual, we feel the protocol may be practically relevant. So, we spent a long time coding things up numerically.
I'm a bit rusty on coding now👨💻 but I had a lot of fun.
Thank you,
@MSoleimanifar
and
@preskill
, for working on this together!
This motivates the following question:
How easy is it to cool physical systems to a local minimum of energy?☃️❄️
To answer this question, we develop new concepts/tools by building on
optimization theory📊
quantum thermodynamics🔥
and Hamiltonian complexity🪢 (3/9)
In contrast, a quantum computer can always find a local minimum efficiently
This is shown using a thermal gradient descent algorithm📉 that mimics cooling in Nature☃️
Together, this proved that finding local minima is classically hard and quantumly easy (8/9)
Our results on PRUs rely on an upcoming paper by
@fermi_ma
and me.
While PRUs have been conjectured to exist, there was no known proof. This conjecture is resolved in the upcoming paper.
The proof is presented in these talks at
@SimonsInstitute
:
@letonyo
Yes! This is state certification, which one can do by running a deep circuit that inverts the target state
The main goal of this work is to understand whether single-qubit measurements could certify efficiently (yes for almost all states; still open if we want all states)
@preskill
I share a similar sentiment. But I think a better way of stating this is: To summarize, we conclude that "currently", there is no indication that "existing" quantum ML will improve on data sets, "which classical ML already performs well".
The picture changes drastically when we consider thermal perturbations❄️🌡️
We develop a set of new techniques to analyze energy landscapes🔋 in thermodynamics
Using these techniques, we prove that in many 2D systems, the landscape is nice & has no suboptimal local minima (6/9)
Our proof is eclectic 🛠️🪝🗝️📐
We give a better quantum optimization algo.💻 leading to new ineq., including quantum Bohnenblust-Hille. We then design an ML model🤖 using the ineq. to learn the low-complexity model🎟️ from the classical shadow🔦 of the quantum dynamics (📜6/7)
Shallow quantum circuits are quantum circuits with a constant depth⚡️⚛️
Despite being constant depth, these quantum circuits can generate classical distributions that are hard to sample from🎇 using classical computers🖥️⌨️
See the proofs in .
We prove that any shallow quantum circuit⚡️⚛️ (with arbitrary unknown structure) can be efficiently learned/trained from a classical dataset🔖
This result provides a new hope✨for the negative results / dark vibes👿 around QML and variational quantum algorithms in recent years
Thanks to Jarrod McClean, Michael Broughton, and all the awesome collaborators for helping me realize this work!
I had great fun at
@Google
during the summer (virtually)!
It's gonna be tough to unravel even a fraction of the cool results in our recent paper on understanding quantum advantage in the presence of data - spearheaded by the amazing
@MoMoRobertHuang
but I'll try! (1/n)
While our first proof is long, it has now become a very simple few-page proof. So check it out if interested!
It has been an exciting journey with Tommy Schuster and Jonas Haferkamp
@haferjonas
!
I am really happy that many open problems I care deeply about are now resolved! 💖
I like to think of shadow overlap as the combination/enhancement of XEB (cross entropy benchmark) and classical shadow
In many cases, shadow overlap behaves like a Hamming distance version of fidelity, similar to local fidelity
A surprising aspect🪄🦄 is that
Learning can be done efficiently on a normal computer, i.e., training the quantum circuit is classically easy.
However, a quantum computer⚛️ is needed to run and make predictions using the trained quantum circuit because of classical hardness.
We prove that in n-qubit systems,
> k-designs form in roughly k log(n) time on any geometry, including 1D.
> PRUs form in poly log(n) time on 1D and poly log log(n) time on all-to-all geometry.
The n scaling improves exponentially over what was previously known.
We define a local minimum of an n-qubit system to be any state that has the minimum energy under small perturbations
A natural choice from quantum computing (e.g., VQE) is local unitary perturbations🔢
However, physically, such perturbations should be thermal❄️🌡️ (4/9)
The class of ML algorithms includes training neural networks that output an exponentially large density matrix. Interestingly, all of these can be done efficiently on a classical computer by using classical shadows of quantum states [🧵6/13]
For example, shadow overlap can accurately benchmark a quantum device when the popular metric XEB fails.
We also show that by combining shadow overlap with neural networks, we can rigorously predict properties of quantum systems that would normally require exp(n) measurements
It has always been extremely fun to work with
@preskill
and
@sitanch
!!🤩
Thank you both for all the exciting discussions that led to this unbelievable result 🙏🙏 (📜7/7)
This problem has been on my mind for 1.5+ years
After meeting with
@YT59529321
4-5 months ago, and with help from Di Fang and Yuan Su, I am thrilled that we have resolved this problem🌟 (📜10/10)
Our paper contains many other applications, e.g., the power of time reversal, anti-concentration of 2D random circuits, etc.
There are likely more to be discovered! Rumor has it that our result can be used to show that spacetime can hide black holes (!)
This applies to a class of 2D systems where the ground states are highly entangled🧶 and are classically hard to find
When the landscape is nice, cooling to a local minimum☃️ is the same as finding ground states
So finding local minima is classically hard (7/9)
We design a class of ML algorithms that can predict ground states after learning from examples of some physical systems. Thm. 1 proves that these ML algorithms can learn/predict accurately and efficiently. [🧵5/13]
The exponential complexity🕸️ makes learning very challenging ⚔️
All known algorithms for learning a general unitary, e.g., our previous analysis in & , require exponential data size or runtime🫠 (📜3/7)
The certification is done by computing a number E[ω] between 0 and 1 from these randomized measurements.
We call E[ω] the "shadow overlap". If the measured state is close to Ψ, E[ω] is close to 1, and vice versa.
So, we can perform certification by looking at E[ω].
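For intuition, here is a toy numpy sketch of certifying with such single-qubit settings. Caveat: it scores the average total-variation overlap between the target's and the lab state's outcome distributions under random settings — a simplified stand-in for the shadow overlap E[ω] described above, not its actual definition; all names are illustrative.

```python
import numpy as np

# Rotations mapping the X/Y/Z eigenbases to the computational (Z) basis.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
HSdg = np.array([[1, -1j], [1, 1j]]) / np.sqrt(2)
ROT = {"X": H, "Y": HSdg, "Z": np.eye(2)}

def apply_1q(psi, U, qubit, n):
    """Apply a single-qubit unitary U to `qubit` of an n-qubit state vector."""
    psi = psi.reshape([2] * n)
    psi = np.tensordot(U, psi, axes=([1], [qubit]))
    return np.moveaxis(psi, 0, qubit).reshape(-1)

def overlap_score(target, lab, n, rounds, rng):
    """Average, over random settings (one random qubit in a random X/Y/Z
    basis, the rest in Z), of the total-variation overlap between the
    measurement-outcome distributions of `target` and `lab`."""
    total = 0.0
    for _ in range(rounds):
        q = rng.integers(n)
        U = ROT[rng.choice(["X", "Y", "Z"])]
        p_target = np.abs(apply_1q(target, U, q, n)) ** 2
        p_lab = np.abs(apply_1q(lab, U, q, n)) ** 2
        total += np.minimum(p_target, p_lab).sum()  # = 1 - TV distance
    return total / rounds
```

The score equals 1 when the two states are indistinguishable under every such setting, and drops toward 0 as local measurements can tell them apart.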
Now, say you think of an n-qubit state Ψ and ask:
Is the state I just measured close to Ψ? (Certification)
Maybe you are trying to synthesize a state in the lab to be as close to Ψ as possible.
Or maybe you have created the model Ψ by training neural nets or tensor networks.
Our main theorem states that for almost all Ψ, the data allows us to correctly answer the question.
In particular, we can accurately certify any state Ψ that is a superposition over "well-connected" classical states/bitstrings.