Hsin-Yuan (Robert) Huang

@RobertHuangHY

6,151
Followers
250
Following
57
Media
396
Statuses

Thinking about how machines could discover new physics in our quantum universe 🎲 🎲 🎲

Cambridge, MA
Joined September 2011
Pinned Tweet
@RobertHuangHY
Hsin-Yuan (Robert) Huang
1 year
My PhD journey ends today🧑‍🎓 I will continue my quest to study how quantum machines can learn new facets of our universe as a scientist @GoogleQuantumAI & a visiting researcher @MIT @Stanford @UCBerkeley 🌈 In 2025, I will be back @Caltech as an Assistant Professor of Physics🔭
Tweet media one
40
32
1K
@RobertHuangHY
Hsin-Yuan (Robert) Huang
10 months
☃️Is it easy to find local minima?☃️ When a physical system is cooled❄️, Nature doesn't always find the ground state but instead finds a local minimum. In , we prove that a machine that can cool things to local minima is a universal quantum computer (1/9)
Tweet media one
8
110
720
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
Are there exponential advantages in using quantum machines to learn about the physical world?🎯 We establish the affirmative through rigorous proofs + physical experiments on the Sycamore quantum processor in . [📜1/10]
Tweet media one
11
121
535
@RobertHuangHY
Hsin-Yuan (Robert) Huang
1 month
Just attended the #Caltech2024 commencement for my PhD🎓 The Clauser Prize recipient was kept secret until the very last moment. I was stunned when the president announced my name. Heartfelt thanks to my advisor @preskill , friends, and family for their unwavering support!
@preskill
John Preskill
1 month
I’m so proud of ⁦ @RobertHuangHY ⁩, who received the Clauser Prize ⁦ @Caltech ⁩ for his PhD thesis “Learning in the Quantum Universe,” which was judged “to have the greatest potential to open new avenues of human thought and endeavor.” #Caltech2024
Tweet media one
6
21
513
31
8
497
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
🤖 Can a machine *efficiently* learn and predict quantum dynamics with arbitrarily high complexity (e.g., exponentially high)? In our new paper with @sitanch and @preskill , we give an ML algorithm and prove that it accomplishes this wild task (📜1/7)
Tweet media one
9
86
477
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
This is my first time publishing in @ScienceMagazine !🔬 We show that quantum machines🤖 can learn exponentially faster about the physical world🔭. Coincidentally, today happens to be my birthday😋 and this is one of the coolest presents I've gotten🎁
14
48
441
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
❓Can classical machine learning models solve challenging problems in physics that no classical algorithms could solve? We prove the affirmative (for quantum many-body problems) in with @RichardKueng , @giactorlai , @victorvalbert , @preskill [🧵1/13]
Tweet media one
11
110
423
@RobertHuangHY
Hsin-Yuan (Robert) Huang
6 months
❓How to learn/train quantum circuits? Shallow quantum circuits are classically hard to simulate⚡️ Despite extensive study, no efficient algorithm for learning/training them is known💀 In , we show how to learn any shallow quantum circuit efficiently💡
Tweet media one
3
61
391
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
I am really happy to receive the 2021 Google PhD Fellowship in Quantum Computing! 🤩
@GoogleAI
Google AI
3 years
Continuing our tradition of supporting outstanding graduate students in their pursuit of research in computer science and related fields, we congratulate our 13th annual PhD Fellowship Program recipients! See the list of 2021 Fellowship recipients below:
15
52
418
21
13
377
@RobertHuangHY
Hsin-Yuan (Robert) Huang
1 year
Last summer, @preskill and I worked with an amazing undergrad, Laura Lewis. She proved that an ML algorithm in our recent paper in @ScienceMagazine can be exponentially improved: reducing the sample complexity from a large poly(n) to only log(n)! See 🧵1/7
Tweet media one
2
51
375
@RobertHuangHY
Hsin-Yuan (Robert) Huang
17 days
Can we tell apart short-time & exponential-time quantum dynamics? In , we found the answer to be "No", even though the task seems easy. This discovery leads to new implications for quantum advantage, faster learning, the hardness of recognizing phases of matter ...
Tweet media one
9
63
346
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
Our proof that a classical ML model🤖 can learn to solve quantum problems⚛️ better than any classical algorithm🖥️ is now published in @ScienceMagazine 🔬 It is unbelievable to have two papers📜 out in Science in less than 4 months🤯🤯 This is surreal...
10
53
332
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 months
Q: Can we see highly complex entanglement from a few single-qubit measurements? I always assumed this is impossible, as entanglement can be too nonlocal for local measurements to see. In , we prove that it can actually be done for almost all quantum states
Tweet media one
10
48
295
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
What can we ultimately learn about our quantum world?👁️ If we lost all knowledge, how should we re-build our knowledge about quantum systems?📚🔬 With @S_Flammia and @preskill , we formalize these questions using learning theory and make progress towards answering them (🧵1/8)
Tweet media one
3
34
245
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
An efficient method to learn an interacting system with many particles is central to building better quantum computers and sensors📡⚛️ The laws of physics set a limit on how fast we can learn. In , we give the first protocol to reach this limit (📜1/10)
Tweet media one
2
35
228
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
In today’s Quantum Colloquium at Berkeley hosted by Umesh Vazirani, we demystify our recent results with @sitanch and @preskill proving that ML models can efficiently learn and predict arbitrarily complex (even exponentially complex) quantum dynamics🕵️
4
28
186
@RobertHuangHY
Hsin-Yuan (Robert) Huang
4 years
My first quantum information paper is published in @NaturePhysics ! We combine quantum scrambling and statistical learning to design a theoretically optimal procedure for constructing an efficient classical representation of a quantum many-body system.
4
27
162
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
With @preskill and @KuengRichard , we prove that "on average, learning from classical and quantum data are comparable, but for accurate predictions on all inputs, there is an exponential quantum advantage." Now out as an Editors' Suggestion in PRL. 😊😊
6
26
155
@RobertHuangHY
Hsin-Yuan (Robert) Huang
11 months
Laura Lewis' new work on rigorous ML algorithms for solving quantum many-body problems has just been covered by @QuantaMagazine ! Really happy to see that classical shadow🔦 has made its way to Quanta🌟
1
26
147
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
Understanding the power and limitations of what we can do in the NISQ era has been a big question❓for me since the beginning of my Ph.D. I am really happy that we finally have something rigorous and general to say about NISQ💡
@sitanch
Sitan Chen
2 years
What are noisy intermediate-scale quantum devices good for? In a new paper joint with @JordanCotler , @RobertHuangHY , and @jerryzli , we define and study a new complexity class, NISQ, that captures the computational power of these devices 🧵 (1/n)
Tweet media one
7
43
287
2
15
127
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
My first IQIM blog is out! I talk about my dream of a world where intelligent beings🧠 are enhanced by quantum sensors📡, quantum memories💾, and quantum computers⚛️, and how one can use math to peek into such a fantastic world.
Tweet media one
0
27
107
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
Can quantum ML models learn from simple input data and predict accurately on *much* harder data? Most existing theories say no. Here, we prove that after learning quantum dynamics on a few simple unentangled states, one can predict accurately on arbitrarily highly entangled states
@IMathYou2
Matthias C. Caro
2 years
Here goes my first #QML paper tweet in 2022: "Out-of-distribution generalization for learning quantum dynamics" ()!
3
18
91
4
16
104
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
I gave a talk at Simons Institute in Berkeley on using quantum information, statistical learning, and random matrix theory to prove that classical computers could learn to solve challenging quantum many-body problems. The recording is out:
2
10
100
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
Is randomness necessary to estimate M observables from only log M quantum measurements, e.g., as in ? In , we show that randomness could be removed to yield even better performance (with application to quantum chemistry).
1
9
99
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
I will be playing devil's advocate😈 and speaking about how powerful classical AI could be (based on recent theoretical developments) and whether quantum AI should be agitated by the power of classical AI.
@PennyLaneAI
PennyLane
2 years
#QHack2022 Speaker 🎤 Join Hsin-Yuan Huang (Robert — @RobertHuangHY ) for a talk on: "How powerful is classical AI from the standpoint of quantum AI?" Tuesday Feb 15 at 12:00 Noon EST
Tweet media one
0
10
29
4
6
91
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
I had great fun🥳at the Quantum Colloquium hosted by Umesh Vazirani, where I talked about making predictions in a quantum world. The renown of the previous speakers (Scott Aaronson, John Preskill, Ewin Tang, Mikhail Lukin...) is insane😮
1
11
91
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
@NatRevPhys recently invited me to write a short article introducing classical shadows: a technique for learning a succinct classical representation of a quantum state. This technique has undergone many remarkable generalizations from the quantum community in the past year🫂👨‍🔬👩‍🔬
@NatRevPhys
Nature Reviews Physics
3 years
Tools of the Trade: Learning quantum states from their classical shadows. Hsin-Yuan Huang explains how to characterize quantum states using very few measurements, thanks to classical shadow tomography
Tweet media one
0
9
42
1
12
78
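For readers who have not seen classical shadows before, here is a minimal NumPy sketch of the random-Pauli-basis variant described in the article above: each qubit is measured in a random X/Y/Z basis, and the per-qubit snapshot is 3 U†|b⟩⟨b|U − I. The function names and the sample measurement data are illustrative only, and real outcomes would come from a device or simulator.

```python
import numpy as np

# Basis-change unitaries: measuring a qubit in the X, Y, or Z basis is the
# same as applying U and then measuring in the computational (Z) basis.
U_BASIS = {
    "X": np.array([[1, 1], [1, -1]]) / np.sqrt(2),
    "Y": np.array([[1, -1j], [1, 1j]]) / np.sqrt(2),
    "Z": np.eye(2),
}

def single_qubit_snapshot(basis, outcome):
    """Classical-shadow snapshot 3 U^dag |b><b| U - I for one qubit,
    given the randomly chosen Pauli basis and the observed bit b."""
    U = U_BASIS[basis]
    ket = np.zeros(2)
    ket[outcome] = 1.0
    projector = U.conj().T @ np.outer(ket, ket) @ U
    return 3 * projector - np.eye(2)

def estimate_product_observable(rounds, factors):
    """Estimate Tr(O rho) for a tensor-product observable O = O_1 x ... x O_n
    by averaging the product of single-qubit traces over measurement rounds."""
    estimates = []
    for bases, outcomes in rounds:
        value = 1.0
        for basis, bit, O_i in zip(bases, outcomes, factors):
            value *= np.trace(O_i @ single_qubit_snapshot(basis, bit)).real
        estimates.append(value)
    return float(np.mean(estimates))

# Illustrative usage on made-up data: estimate <Z x Z> on two qubits.
# In practice, `rounds` would record the random bases chosen and the bits
# returned by the quantum device in each measurement round.
Z = np.diag([1.0, -1.0])
rounds = [(["Z", "X"], [0, 1]), (["Y", "Z"], [1, 0])]
print(estimate_product_observable(rounds, [Z, Z]))
```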
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
A neural network🤖 designs a new functional for DFT that improves on some existing functionals and comes close to one of the best human-designed functionals. It seems plausible that AI could outperform the best human-designed functionals in the future✨
@GoogleDeepMind
Google DeepMind
3 years
Can machine learning explain the world at the quantum level? New in @ScienceMagazine , with the code also released, our team shows that neural networks can radically improve density functionals with #DM21 . Read more: Code: 1/
Tweet media one
14
203
622
2
5
66
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
I am curious to know what people think about the new perspective by Maria Schuld and Nathan Killoran😃 I think our ultimate goal is still quantum advantage (else we can just use classical computers) but we need some near-term goals to measure progress.
3
9
63
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
I will be giving a talk at QIP 2021 () combining "Information-theoretic bounds on quantum advantage in machine learning" with Richard Kueng and @preskill , and "Power of data in quantum machine learning" with @JarrodMcclean and other Google friends! 😆😆
2
6
60
@RobertHuangHY
Hsin-Yuan (Robert) Huang
1 year
@preskill @sitanch I squeezed in some time to conduct numerical experiments for learning quantum processes with up to 50 qubits and 10^6 evolution time; see the newest version of . The open-source code👩‍💻🧑‍💻is at (📜8/7)
1
9
56
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
Thrilled to see that classical shadow is now on PennyLane!
@PennyLaneAI
PennyLane
3 years
"To light a quantum candle is to cast a classical shadow" -- Ursula K. Le Guin, probably Check out our newest demo, and learn all about the idea of classical shadows 💡⚛️🌒 Based on the paper by @RobertHuangHY , Richard Kueng, & @preskill
2
16
63
0
4
52
@RobertHuangHY
Hsin-Yuan (Robert) Huang
1 year
Credit to Chi-Yun Cheng🌟 for an artist's impression🎨 of me studying a machine🤖⚛️ living in the quantum universe (together with a beaver) --- inspired by my previous IQIM post:
0
4
40
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
One of the coolest parts😎 is that we demonstrated the significant quantum advantage on a real quantum computer🖥️⚛️: Sycamore superconducting processor built by @GoogleQuantumAI . Kudos🙏 to Michael Broughton on fighting with the spiteful quantum noise in a real device. [📜6/10]
Tweet media one
1
4
41
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
Implication for quantum machine learners: People have begun to consider quantum ML models for quantum problems (instead of classical). This work shows that classical ML models can also solve quantum problems well and gives a strong baseline that QML needs to surpass. [🧵12/13]
2
2
38
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
We believe this work presents a new possibility illustrating how the combination of quantum technology⚛️ and machine learning🤖 could enable powerful new strategies to learn about nature🍀. [📜9/10]
1
3
37
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
In the 1st experiment, we consider a hybrid quantum-classical ML model (quantum-enhanced experiments🔬 + classical recurrent neural network🤖) for predicting observables. We observe >1000x improvement over the best classical strategy using the hybrid QML model. [📜7/10]
Tweet media one
1
4
37
@RobertHuangHY
Hsin-Yuan (Robert) Huang
1 year
@preskill Thank you, @preskill ! I had great fun in my PhD 🥳 It’s amazing to have you and Thomas Vidick as my PhD advisors!
1
0
34
@RobertHuangHY
Hsin-Yuan (Robert) Huang
17 days
The results are particularly surprising when we compare them with classical dynamics. In order for 1D classical circuits to look random, a linear depth is required (see Fig.). Strangely, 1D quantum dynamics only need log depth to look random!
Tweet media one
1
3
35
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
A nice thread by Jerry Li about our new works, viewed from the perspective of a classical (as opposed to quantum) CS theorist. One of them is my first FOCS paper!🦊
@jerryzli
Jerry Li
3 years
Hi! So I was coerced into writing this thread---er I mean let me tell you a little bit about quantum learning and testing, and about some of the papers @sitanch , Jordan Cotler, @RobertHuangHY and I just posted on arxiv: 1/
3
5
79
0
2
35
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
Since I started my Ph.D. at Caltech in 2018, almost all of my main projects have been driven by my aspiration to unravel this problem. It is really nice to find a concrete answer after three years!😊😊 [🧵2/13]
1
0
35
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
In the 2nd experiment, we consider the hybrid of quantum-enhanced experiments and unsupervised classical ML for uncovering symmetry♓️♑️ in 1D and 2D dynamics. The best known classical strategy failed❌ but the hybrid quantum-classical ML model succeeded⭐️. [📜8/10]
Tweet media one
1
4
34
@RobertHuangHY
Hsin-Yuan (Robert) Huang
1 year
It was great fun to mentor Laura during the summer. Every week there was new and exciting progress! She is going to talk about this work at QIP 2023. For QIP folks interested in rigorous ML for quantum physics, you should check out her talk! 🧵7/7
3
1
33
@RobertHuangHY
Hsin-Yuan (Robert) Huang
10 months
Finding ground states is known to be hard for both classical and quantum computers🤖 As a result, ground states are not always "physical," in the sense that they are not always physically observable. When Nature cools a system in a cold bath❄️, she finds a local minimum of energy (2/9)
1
1
32
@RobertHuangHY
Hsin-Yuan (Robert) Huang
6 months
While the paper in its current form is purely theoretical📜📜📜, our preliminary experiments on quantum computers show that the algorithm is amenable to training large-scale shallow quantum circuits🦣🦣 and many applications follow🌈 So, stay tuned for Version 2! 🌟🌟
1
1
32
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
Implications for physicists: While ML has gathered a lot of attention recently, it has been unclear whether ML can tackle challenging problems in physics where traditional algorithms fail. This work provides a rigorous foundation for future prospects of ML in physics. [🧵11/13]
1
2
30
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
In conventional experiments, classical agents (scientists👨‍🔬👩‍🔬, ML models🤖, etc.) retrieve/store/process classical information0⃣1⃣ from experiments to learn about the physical world (using physical measurements🔭, classical memory📔& classical computers🖥️). [📜2/10]
Tweet media one
1
1
30
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
As quantum technology advances, we can begin to consider quantum agents/machines🦾 that retrieve/store/process quantum information⚛️ from experiments🧪 to learn about the world🌏 (using quantum sensors, quantum memory & quantum computers). [📜3/10]
Tweet media one
1
2
29
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 months
The measurement protocol is extremely simple. Each time, pick a random qubit and measure it in a random X/Y/Z basis; measure all the rest of the qubits in a fixed basis (e.g., Z). For an n-qubit system, do this poly(n) times. That's it!
Tweet media one
1
0
28
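A minimal Python sketch of how one might generate the measurement settings described in the tweet above. The number of rounds (n² as a stand-in for "poly(n)") and the backend call mentioned in the comment are illustrative assumptions, not values or APIs taken from the paper.

```python
import random

def sample_measurement_settings(n_qubits, num_rounds):
    """Measurement settings for the protocol in the tweet above: each round,
    one randomly chosen qubit gets a random X/Y/Z basis and every other
    qubit is measured in the fixed Z basis."""
    settings = []
    for _ in range(num_rounds):
        bases = ["Z"] * n_qubits                      # fixed basis everywhere
        special = random.randrange(n_qubits)          # the one random qubit
        bases[special] = random.choice(["X", "Y", "Z"])
        settings.append(bases)
    return settings

# The tweet only says poly(n) rounds suffice; n**2 here is an arbitrary
# placeholder for "poly(n)", not a value from the paper.
n = 8
for bases in sample_measurement_settings(n, num_rounds=n**2):
    # Running the circuit and collecting the bitstring is device-specific,
    # e.g. outcomes = my_backend.run_and_measure(bases)   # hypothetical API
    pass
```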
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
The key discovery💡 in this work is this: given a quantum unitary/process with arbitrarily high complexity, a low-complexity model🎟️ for predicting the outcome of the quantum process always exists. The ML model only has to discover that low-complexity model (📜4/7)
2
7
29
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
One may wonder how ML algorithms can be computationally more powerful than non-ML algorithms 🧐. This is the topic of study in my prior work . Short answer: the computational power is in the data ML learns from. [🧵10/13]
2
0
29
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
We rigorously established the exponential quantum advantage🌟🔯💫 for predicting non-commuting observables, performing quantum PCA, and learning quantum dynamics. We utilized mathematical tools in with some new proof techniques. [📜5/10]
Tweet media one
Tweet media two
Tweet media three
1
2
29
@RobertHuangHY
Hsin-Yuan (Robert) Huang
17 days
An independent work by @felix_led and @NDLaRacuente focusing on designs provides another very interesting perspective via communication : two parties can produce unitary designs by exchanging only O(1) qubits. This is also not true in a classical world!
1
4
28
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
In this work, we compare the ability of classical and quantum machines🤖⚛️ to learn about the physical world. In particular, we prove that in various tasks, quantum machines can learn from exponentially fewer experiments than classical machines. [📜4/10]
Tweet media one
1
2
28
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
We then prove that our ML model can learn the low-complexity model🎟️ using the classical shadows of the unknown quantum unitary/process *extremely efficiently*. A key step in the proof resolves a conjecture on quantum Bohnenblust-Hille inequality (📜5/7)
1
3
27
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
Congrats to @AnuragAnshu4 , @Qsrinivasan_1 , Tomotaka Kuwahara, @MSoleimanifar ! It is one of my favorite papers last year 🤩🤩
@NaturePhysics
Nature Physics
3 years
Sample-efficient learning of interacting quantum systems
2
15
60
1
0
26
@RobertHuangHY
Hsin-Yuan (Robert) Huang
10 months
@HyperboIeva I am here! I would also like to see our community continue to strive here. There is more random stuff on the feed nowadays, and it seems harder to find posts I missed.
3
0
26
@RobertHuangHY
Hsin-Yuan (Robert) Huang
17 days
Our results resolved many problems in ML + quantum: (1) Prior demonstrations of quantum advantage showed exponential advantages in learning systems with highly nonlocal correlations. Our results show superpolynomial advantages in learning low-complexity systems with only local correlations.
1
3
26
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
@jenseisert @preskill I thought I misheard! 🤣🤣
0
0
25
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
Finally, big thanks to @giactorlai and @awscloud for the large-scale numerical experiments, @victorvalbert for insights into quantum phases of matter. And a huge toast 🍻🍻 to my best quantum friends @RichardKueng and @preskill for the extremely fun journey! 🥳🥳 [🧵13/13]
2
0
25
@RobertHuangHY
Hsin-Yuan (Robert) Huang
10 months
I really enjoyed this collaboration with Chi-Fang Chen, @preskill , and Leo Zhou🍻 It would have been impossible to prove the result without the unique expertise each of them brings. I have wondered about this question for a long time. Really happy that it is now resolved!😌 (9/9)
1
0
25
@RobertHuangHY
Hsin-Yuan (Robert) Huang
10 months
For local unitary perturbations, we show that the optimization (energy) landscape🌆 over the entire quantum state space has a large barren plateau🙁 This allows us to prove that finding local minima under local unitary perturbation is easy with a classical computer💻 (5/9)
Tweet media one
1
0
25
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
Big thanks🫂 to all of my collaborators for realizing the fantastic results🌟! I really love this eclectic team ranging from classical learning theorists, theoretical physicists to quantum computing engineers, who have all contributed greatly. [📜10/10]
3
2
25
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 months
While the main theorem is purely conceptual, we feel the protocol may be practically relevant. So, we spent a long time coding things up numerically. I'm a bit rusty on coding now👨‍💻 but I had a lot of fun. Thank you, @MSoleimanifar and @preskill , for working on this together!
1
0
24
@RobertHuangHY
Hsin-Yuan (Robert) Huang
10 months
This motivates the following question: How easy is it to cool physical systems to a local minimum of energy?☃️❄️ To answer this question, we develop new concepts/tools by building on optimization theory📊 quantum thermodynamics🔥 and Hamiltonian complexity🪢 (3/9)
1
1
24
@RobertHuangHY
Hsin-Yuan (Robert) Huang
10 months
In contrast, a quantum computer can always find a local minimum efficiently. This is shown using a thermal gradient descent algorithm📉 that mimics cooling in Nature☃️ Together, this proves that finding local minima is classically hard and quantumly easy (8/9)
3
4
24
@RobertHuangHY
Hsin-Yuan (Robert) Huang
17 days
Our results on PRUs rely on an upcoming paper by @fermi_ma and me. While PRUs have been conjectured to exist, there was no known proof. This conjecture is resolved in the upcoming paper. The proof is presented in these talks at @SimonsInstitute :
1
1
23
@RobertHuangHY
Hsin-Yuan (Robert) Huang
4 months
@letonyo Yes! This is state certification, which one can do by running a deep circuit that inverts the target state. The main goal of this work is to understand whether single-qubit measurements can certify efficiently (yes for almost all states; still open if we want all states)
0
0
22
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
I am super thankful to my amazing collaborators🌟 @RichardKueng , @giactorlai , @victorvalbert , and @preskill for making this happen!🙏
0
0
21
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
@preskill I share a similar sentiment. But I think a better way of stating this is: To summarize, we conclude that "currently", there is no indication that "existing" quantum ML will improve on data sets, "which classical ML already performs well".
1
1
22
@RobertHuangHY
Hsin-Yuan (Robert) Huang
10 months
The picture changes drastically when we consider thermal perturbations❄️🌡️ We develop a set of new techniques to analyze energy landscapes🔋 in thermodynamics. Using these techniques, we prove that in many 2D systems, the landscape is nice & has no suboptimal local minima (6/9)
Tweet media one
1
1
21
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
Our proof is eclectic 🛠️🪝🗝️📐 We give a better quantum optimization algo.💻 leading to new ineq., including quantum Bohnenblust-Hille. We then design an ML model🤖 using the ineq. to learn the low-complexity model🎟️ from the classical shadow🔦 of the quantum dynamics (📜6/7)
1
4
21
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
We focus on two important kinds of quantum many-body problems: predicting ground states and classifying phases of matter [🧵4/13]
Tweet media one
1
0
20
@RobertHuangHY
Hsin-Yuan (Robert) Huang
6 months
Shallow quantum circuits are quantum circuits with a constant depth⚡️⚛️ Despite being constant depth, these quantum circuits can generate classical distributions that are hard to sample from🎇 using classical computers🖥️⌨️ See the proofs in .
Tweet media one
1
1
21
@RobertHuangHY
Hsin-Yuan (Robert) Huang
6 months
We prove that any shallow quantum circuit⚡️⚛️ (with arbitrary unknown structure) can be efficiently learned/trained from a classical dataset🔖 This result offers new hope✨ in the face of the negative results / dark vibes👿 around QML and variational quantum algorithms in recent years
1
2
21
@RobertHuangHY
Hsin-Yuan (Robert) Huang
4 years
Thanks to Jarrod McClean, Michael Broughton, and all the awesome collaborators for helping me realize this work! I had great fun at @Google during the summer (virtually)!
@JarrodMcclean
Jarrod McClean
4 years
It's gonna be tough to unravel even a fraction of the cool results in our recent paper in understanding quantum advantage in the presence of data - spearheaded by the amazing @MoMoRobertHuang but I'll try! (1/n)
Tweet media one
3
13
105
0
1
20
@RobertHuangHY
Hsin-Yuan (Robert) Huang
17 days
While our first proof was long, it has now been simplified to just a few pages, so check it out if interested! It has been an exciting journey with Tommy Schuster and Jonas Haferkamp @haferjonas ! I am really happy that many open problems I care deeply about are now resolved! 💖
1
0
20
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 months
I like to think of shadow overlap as a combination/enhancement of XEB (the cross-entropy benchmark) and classical shadows. In many cases, shadow overlap behaves like a Hamming-distance version of fidelity, similar to local fidelity
1
2
20
@RobertHuangHY
Hsin-Yuan (Robert) Huang
6 months
A surprising aspect🪄🦄 is that learning can be done efficiently on a normal computer, i.e., training the quantum circuit is classically easy. However, a quantum computer⚛️ is needed to run and make predictions using the trained quantum circuit because of the classical hardness.
1
2
20
@RobertHuangHY
Hsin-Yuan (Robert) Huang
17 days
We prove that in n-qubit systems, > k-designs form in roughly k log(n) time on any geometry, including 1D. > PRUs form in poly log(n) time in 1D and poly log log(n) time on all-to-all geometry. The n scaling improves exponentially over what was previously known.
Tweet media one
1
0
20
@RobertHuangHY
Hsin-Yuan (Robert) Huang
10 months
We define a local minimum of an n-qubit system to be any state that has the minimum energy under small perturbations. A natural choice from quantum computing (e.g., VQE) is local unitary perturbations🔢 However, physically, such perturbations should be thermal❄️🌡️ (4/9)
1
0
20
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
The class of ML algorithms includes training neural networks that output an exponentially large density matrix. Interestingly, all of these can be done efficiently on a classical computer by using classical shadows of quantum states [🧵6/13]
1
1
19
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
For those who cannot go through the paywall, here is a thread and the arXiv link to our latest paper on @ScienceMagazine .
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
Are there exponential advantages in using quantum machines to learn about the physical world?🎯 We establish the affirmative through rigorous proofs + physical experiments on the Sycamore quantum processor in . [📜1/10]
Tweet media one
11
121
535
0
1
19
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 months
For example, shadow overlap can accurately benchmark a quantum device when the popular metric XEB fails. We also show that by combining shadow overlap with neural networks, we can rigorously predict properties of quantum systems that would normally require exp(n) measurements
Tweet media one
1
0
19
@RobertHuangHY
Hsin-Yuan (Robert) Huang
6 months
I want to thank my collaborators @lyc1178 , @JarrodMcclean , @Isaac__kim , @AnuragAnshu4 , Michael Broughton, and Zeph Landau for working together to resolve this big open question in my heart💖💖💖
1
0
19
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
It has always been extremely fun to work with @preskill and @sitanch !!🤩 Thank you both for all the exciting discussions that led to this unbelievable result 🙏🙏 (📜7/7)
2
3
19
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
This problem has been on my mind for 1.5+ years After meeting with @YT59529321 4-5 months ago, and with the help from Di Fang and Yuan Su, I am thrilled that we have resolved this problem🌟 (📜10/10)
1
0
17
@RobertHuangHY
Hsin-Yuan (Robert) Huang
17 days
Our paper contains many other applications, e.g., the power of time reversal, anti-concentration of 2D random circuits, etc. There are likely more to be discovered! Rumor has it that our result can be used to show that spacetime can hide blackholes (!)
Tweet media one
3
1
19
@RobertHuangHY
Hsin-Yuan (Robert) Huang
10 months
This applies to a class of 2D systems where the ground states are highly entangled🧶 and are classically hard to find. When the landscape is nice, cooling to a local minimum☃️ is the same as finding ground states. So finding local minima is classically hard (7/9)
2
0
18
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
Some interesting generalizations: by Zhao, Rubin, Miyake. by Chen, Yu, Zeng, Flammia. by Hu, Choi, You. by Helson et al. from Jens Eisert's group.
0
0
18
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 years
We design a class of ML algorithms that can predict ground states after learning from examples of some physical systems. Thm. 1 proves that these ML algorithms can learn/predict accurately and efficiently. [🧵5/13]
Tweet media one
1
0
17
@RobertHuangHY
Hsin-Yuan (Robert) Huang
2 years
The exponential complexity🕸️ makes learning very challenging ⚔️ All known algorithms for learning a general unitary, e.g., our previous analysis in & , require exponential data size or runtime🫠 (📜3/7)
1
3
18
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 months
The certification is done by computing a number E[ω] between 0 and 1 from these randomized measurements. We call E[ω] the "shadow overlap". If the measured state is close to Ψ, E[ω] is close to 1, and vice versa. So, we can perform certification by looking at E[ω].
Tweet media one
1
0
16
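A minimal sketch of the certification decision described in the tweet above: average the per-round values ω into an empirical estimate of E[ω], the "shadow overlap", and accept when the estimate is close to 1. The acceptance threshold, the function name, and the sample values are hypothetical; how each ω is computed from Ψ and a measurement outcome is left to the paper.

```python
from statistics import mean

def certify_by_shadow_overlap(omega_samples, threshold=0.99):
    """Average the per-round values omega (each between 0 and 1) to get an
    empirical estimate of the shadow overlap E[omega], and accept the state
    if the estimate is close to 1. How each omega is computed from the
    target state Psi and a measurement outcome is assumed to happen
    upstream of this function."""
    estimate = mean(omega_samples)
    return estimate, estimate >= threshold

# Hypothetical usage with made-up per-round values; the 0.99 acceptance
# threshold is an illustrative choice, not one taken from the paper.
estimate, accepted = certify_by_shadow_overlap([0.97, 1.0, 0.98, 0.99])
print(f"estimated shadow overlap = {estimate:.3f}, certified = {accepted}")
```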
@RobertHuangHY
Hsin-Yuan (Robert) Huang
6 months
@Qottmann Was recovering from the extraordinarily fun and intense week from QIP xD TLDR is in the cooking now!
0
0
17
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 months
Now, say you think of an n-qubit state Ψ and ask: Is the state I just measured close to Ψ? (Certification) Maybe you are trying to synthesize a state in the lab to be as close to Ψ as possible. Or maybe you have created the model Ψ by training neural nets or tensor networks.
Tweet media one
1
0
17
@RobertHuangHY
Hsin-Yuan (Robert) Huang
3 months
Our main theorem states that for almost all Ψ, the data allows us to correctly answer the question. In particular, we can accurately certify any state Ψ that is a superposition over "well-connected" classical states/bitstrings.
Tweet media one
1
1
16