Beware the (log)logjam: Quantum error mitigation becomes hard at polyloglog(n) depth

APA

Quek, Y. (2023). Beware the (log)logjam: Quantum error mitigation becomes hard at polyloglog(n) depth. Perimeter Institute for Theoretical Physics. https://pirsa.org/23030084

MLA

Quek, Yihui. Beware the (log)logjam: Quantum error mitigation becomes hard at polyloglog(n) depth. Perimeter Institute for Theoretical Physics, 15 Mar. 2023, https://pirsa.org/23030084.

BibTeX

@misc{scivideos_PIRSA:23030084,
  doi       = {10.48660/23030084},
  url       = {https://pirsa.org/23030084},
  author    = {Quek, Yihui},
  keywords  = {Quantum Information},
  language  = {en},
  title     = {Beware the (log)logjam: Quantum error mitigation becomes hard at polyloglog(n) depth},
  publisher = {Perimeter Institute for Theoretical Physics},
  year      = {2023},
  month     = {mar},
  note      = {PIRSA:23030084; see \url{https://scivideos.org/index.php/pirsa/23030084}}
}

Yihui Quek, Freie Universität Berlin

Source Repository: PIRSA

Abstract

Quantum error mitigation has been proposed as a means to combat unwanted and unavoidable errors in near-term quantum computing using no or few additional quantum resources, in contrast to fault-tolerant schemes that come with heavy overheads. Error mitigation has been successfully applied to reduce noise in near-term applications.
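
For readers unfamiliar with what such a scheme looks like in practice, below is a minimal, hypothetical sketch of zero-noise (Richardson) extrapolation, one canonical error-mitigation technique of the kind that frameworks like the one in this talk aim to capture. The toy single-qubit model and all function names are illustrative assumptions, not taken from the talk.

import numpy as np

def noisy_expectation(theta, p):
    """Exact <Z> after Ry(theta)|0>, followed by single-qubit depolarizing noise of strength p."""
    ideal = np.cos(theta)      # <Z> of Ry(theta)|0> is cos(theta)
    return (1.0 - p) * ideal   # depolarizing noise shrinks <Z> by a factor (1 - p)

def zero_noise_extrapolate(theta, base_p, scales=(1, 2, 3), shots=100_000, seed=0):
    """Estimate the noiseless <Z> by sampling at amplified noise levels and
    fitting a line back to zero noise (degree-1 Richardson extrapolation)."""
    rng = np.random.default_rng(seed)
    noise_levels, estimates = [], []
    for s in scales:
        p = base_p * s
        exact = noisy_expectation(theta, p)
        prob_plus = (1.0 + exact) / 2.0                    # Pr[measuring +1]
        outcomes = rng.binomial(1, prob_plus, shots) * 2 - 1
        noise_levels.append(p)
        estimates.append(outcomes.mean())
    coeffs = np.polyfit(noise_levels, estimates, deg=1)
    return np.polyval(coeffs, 0.0)                         # extrapolate to p = 0

if __name__ == "__main__":
    theta, p = 0.7, 0.05
    print("ideal <Z>:    ", np.cos(theta))
    print("noisy <Z>:    ", noisy_expectation(theta, p))
    print("mitigated <Z>:", zero_noise_extrapolate(theta, p))

The sample cost of such schemes is exactly what the result below bounds: as circuits grow, the shots needed to recover the noiseless value can blow up super-polynomially.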

In this work, however, we identify strong limitations to the degree to which quantum noise can be effectively 'undone' for larger system sizes. We set up a framework that rigorously captures large classes of error mitigation schemes in use today. The core of our argument combines fundamental limits of statistical inference with a construction of families of random circuits that are highly sensitive to noise.

We show that even at polyloglog(n) depth, a super-polynomial number of samples is needed in the worst case to estimate the expectation values of noiseless observables, the principal task of error mitigation. Notably, our construction implies that scrambling due to noise can kick in at exponentially smaller depths than previously thought. Our results also constrain other near-term applications: they limit kernel estimation in quantum machine learning, cause an earlier emergence of noise-induced barren plateaus in variational quantum algorithms, and rule out exponential quantum speed-ups for estimating expectation values in the presence of noise or for preparing the ground state of a Hamiltonian.
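
In notation of my own choosing (not the talk's), the mitigation task and the worst-case bound stated in the abstract can be summarised schematically as follows:

\[
\text{Given copies of } \rho_{\text{noisy}} = (\Lambda \circ \mathcal{U})\big(|0^n\rangle\langle 0^n|\big),
\quad \text{output an estimate of } \operatorname{Tr}\!\big[\,O\, U|0^n\rangle\langle 0^n|U^\dagger\,\big] \text{ to small additive error,}
\]
where $U$ is the ideal circuit, $\mathcal{U}$ its noisy implementation under local noise $\Lambda$, and $O$ an observable. The claim is that there exist families of circuits of depth $d = \mathrm{polyloglog}(n)$ for which any mitigation strategy requires
\[
N_{\text{samples}} = n^{\omega(1)}
\]
copies of $\rho_{\text{noisy}}$ in the worst case, i.e.\ super-polynomially many in the number of qubits $n$.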

Zoom link:  https://pitp.zoom.us/j/95736148335?pwd=akZLaHE5aStNQVZOeVFETlltNzVwdz09