I will present some recent work on the interplay between contextuality, entanglement, and magic in multiqubit systems. Taking a foundational inquiry into entanglement in the Kochen-Specker theorem as our point of departure, I will proceed to outline some questions this raises about the role of these resources in models of multiqubit quantum computation. The purpose of this talk is to raise questions that can hopefully feed into the discussion sessions.
A universal and well-motivated notion of classicality for an operational theory is explainability by a generalized-noncontextual ontological model. Here, I will explain what notion of classicality this implies within the framework of generalized probabilistic theories. I then prove that for any locally tomographic theory, every such classical model is given by a complete frame representation. Using this powerful constraint on the space of possible classical representations, I will then prove that the stabilizer subtheory has a unique classical representation—namely, Gross's discrete Wigner function. This provides deep insight into the relevance of Gross's representation within quantum computation. It also implies that generalized contextuality is a necessary resource for universal quantum computation in the state injection model.
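As a concrete reference point, the following sketch computes Gross's discrete Wigner function for a single qutrit, using one standard phase convention for the displacement operators (D(a,b) = ω^(2⁻¹ab) X^a Z^b, where 2⁻¹ is the inverse of 2 mod 3); conventions vary across the literature, so treat the phases here as an assumption. On the stabilizer state |0⟩ the Wigner function is nonnegative, as one expects of a classical representation of the stabilizer subtheory.

```python
import numpy as np

d = 3
w = np.exp(2j * np.pi / d)
X = np.roll(np.eye(d), 1, axis=0)   # shift: X|j> = |j+1 mod d>
Z = np.diag(w ** np.arange(d))      # clock: Z|j> = w^j |j>
inv2 = 2                            # inverse of 2 mod 3

def D(a, b):
    """Displacement operator with Gross-style phase convention."""
    return w ** (inv2 * a * b) * (
        np.linalg.matrix_power(X, a) @ np.linalg.matrix_power(Z, b))

# Phase-point operator at the origin (acts as parity |j> -> |-j mod d>)
A0 = sum(D(a, b) for a in range(d) for b in range(d)) / d

def wigner(rho):
    """W(q,p) = (1/d) Tr[A(q,p) rho], with A(q,p) = D(q,p) A0 D(q,p)^dag."""
    W = np.zeros((d, d))
    for q in range(d):
        for p in range(d):
            A = D(q, p) @ A0 @ D(q, p).conj().T
            W[q, p] = np.real(np.trace(A @ rho)) / d
    return W

rho = np.zeros((d, d)); rho[0, 0] = 1.0   # stabilizer state |0><0|
W = wigner(rho)                           # nonnegative, supported on q = 0
```

With these conventions, W is 1/3 on the three phase-space points with q = 0 and zero elsewhere, summing to 1.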
Quantum Darwinism proposes that the proliferation of redundant information plays a major role in the emergence of objectivity out of the quantum world. Is this kind of objectivity necessarily classical? We show that if one takes Spekkens's notion of noncontextuality as the notion of classicality and the approach of Brandão, Piani, and Horodecki to quantum Darwinism, the answer to the above question is “yes”, provided the environment encodes the proliferated information sufficiently well. Moreover, we propose a threshold on this encoding, above which one can unambiguously say that classical objectivity has emerged under quantum Darwinism.
We give a simple description of the rectangular matrices that can be implemented by a post-selected stabilizer circuit. Given a matrix with entries in the dyadic cyclotomic number field $\mathbb{Q}(\exp(i\frac{2\pi}{2^m}))$, we show that it can be implemented by a post-selected stabilizer circuit if it has entries in $\mathbb{Z}[\exp(i\frac{2\pi}{2^m})]$ when expressed in a certain non-orthogonal basis. This basis is related to Barnes-Wall lattices. Our result generalizes a well-known connection between Clifford groups and Barnes-Wall lattices. We also show that minimal vectors of Barnes-Wall lattices are stabilizer states, which may be of independent interest. Finally, we provide a few examples of generalizations beyond standard Clifford groups.
Joint work with Sebastian Schonnenbeck
This talk will present work-in-progress towards a new programming methodology for Cliffords, where n-ary Clifford unitaries over qudits can be expressed as functions on Paulis. Inspired by the fact that projective Cliffords correspond to center-fixing automorphisms on the Pauli group, we develop a type system where well-typed expressions correspond to symplectic morphisms---that is, linear transformations that respect the symplectic form. This language is backed by a robust categorical and operational semantics, and well-typed functions can be efficiently simulated and synthesized into circuits via Pauli tableaus.
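The correspondence at the heart of this approach, a Clifford acting on Pauli labels as a symplectic map, can be illustrated for a single qubit (a minimal sketch; the talk's language works with general qudits and tracks phases, which this toy example ignores):

```python
import numpy as np

# Single-qubit Paulis and the Hadamard (a Clifford unitary)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Conjugation by H permutes Pauli labels (a, b) for X^a Z^b:
assert np.allclose(H @ X @ H.conj().T, Z)   # X -> Z, i.e. (1,0) -> (0,1)
assert np.allclose(H @ Z @ H.conj().T, X)   # Z -> X, i.e. (0,1) -> (1,0)

# The induced map on F_2^2 is the matrix S below, and it preserves
# the symplectic form J, i.e. S^T J S = J (mod 2).
S = np.array([[0, 1], [1, 0]])
J = np.array([[0, 1], [1, 0]])
assert np.array_equal((S.T @ J @ S) % 2, J)
```

The type system described in the talk is designed so that exactly such symplectic-form-preserving maps are the well-typed programs.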
Fault-tolerant protocols and quantum error correction (QEC) are essential to building reliable quantum computers from imperfect components that are vulnerable to errors. Optimizing the resource and time overheads needed to implement QEC is one of the most pressing challenges that will facilitate a transition from the NISQ era to the fault-tolerance era. In this talk, I will discuss two intriguing ideas that can significantly reduce these overheads. The first idea, erasure qubits, relies on an efficient conversion of the dominant noise into erasure errors at known locations, greatly enhancing the performance of QEC protocols. The second idea, single-shot QEC, guarantees that even in the presence of measurement errors one can perform reliable QEC without repeating measurements, incurring only constant time overhead.
Whilst tomography has dominated the theory behind reconstructing or approximating quantum objects, such as states or channels, conducting full tomography is often not necessary in practice. If one is interested only in learning properties of a quantum system, it is then possible to side-step the exponential lower bounds of tomography. In this talk, we will introduce various learning models for approximating quantum objects, survey the literature of quantum learning theory, and explore instances where learning can be fully time- and sample-efficient.
Particle physics underpins our understanding of the world at a fundamental level by describing the interplay of matter and forces through gauge theories. Yet, despite their unmatched success, the intrinsic quantum mechanical nature of gauge theories makes important problem classes notoriously difficult to address with classical computational techniques. A promising way to overcome these roadblocks is offered by quantum computers, which are based on the same laws that make the classical computations so difficult. Here, we present a quantum computation of the properties of the basic building block of two-dimensional lattice quantum electrodynamics, involving both gauge fields and matter. This computation is made possible by the use of a trapped-ion qudit quantum processor, where quantum information is encoded in d different states per ion, rather than in two states as in qubits. Qudits are ideally suited for describing gauge fields, which are naturally high-dimensional, leading to a dramatic reduction in the quantum register size and circuit complexity. Using a variational quantum eigensolver we find the ground state of the model and observe the interplay between virtual pair creation and quantized magnetic field effects. The qudit approach further allows us to seamlessly observe the effect of different gauge field truncations by controlling the qudit dimension. Our results open the door for hardware-efficient quantum simulations with qudits in near-term quantum devices.
IN PERSON - Lorenzo Catani, Matthew Fox, Hlér Kristjánsson, Gabrielle Tournaire
VIRTUAL - Jonte Hance, Sidiney Montanhano, Shiroman Prakash, Amr Sabry
BosonSampling is one of the leading candidate models for a demonstration of quantum computational advantage. However, there are still important gaps between our best theoretical results and what can be implemented realistically in the laboratory. One of the largest gaps concerns the scaling between the number of modes (m) and the number of photons (n) in the experiment. The original proposal by Aaronson and Arkhipov, as well as all subsequent improvements, required m to scale as n^2, whereas most state-of-the-art experiments typically operate in a regime where m is linear in n. In this talk, I will describe how our recent work bridges this gap by providing evidence that BosonSampling remains hard even for m as low as 2n. I will review the template for proofs of computational advantage used in BosonSampling and other proposals, and discuss how we solved the new challenges that appear in this regime.
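For background, BosonSampling output amplitudes are permanents of n x n submatrices of the m x m interferometer unitary, and the permanent's #P-hardness is what drives these advantage arguments. A minimal sketch of Ryser's inclusion-exclusion formula, the standard exact algorithm (background material, not the new work described in the talk):

```python
from itertools import combinations

def permanent(A):
    """Permanent via Ryser's formula:
    perm(A) = (-1)^n * sum over nonempty column subsets S of
              (-1)^|S| * prod_i sum_{j in S} A[i][j].
    Runs in O(2^n * n^2), versus O(n!) for the naive expansion."""
    n = len(A)
    total = 0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1
            for row in A:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total

assert permanent([[1, 2], [3, 4]]) == 10      # 1*4 + 2*3
assert permanent([[1, 1, 1]] * 3) == 6        # all-ones 3x3: 3! = 6
```

Unlike the determinant, no polynomial-time algorithm for the permanent is known, which is precisely why sampling from these distributions is believed classically hard.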
It is known that measurement-based quantum computations (MBQCs) which compute a non-linear Boolean function with sufficiently high probability of success are contextual, i.e., they cannot be described by a non-contextual hidden variable model. It is also known that contextuality has descriptions in terms of cohomology [1,2]. And so it seems within reach to obtain a cohomological description of MBQC. And yet, the two connections mentioned above are not easily strung together. In a previous work [3], the cohomological description of MBQC was provided for the temporally flat case. Here we present the extension to the general temporally ordered case.
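A standard illustration of the first fact is the GHZ-based MBQC in the style of Anders and Browne: measuring each qubit of a GHZ state in X or Y according to the input bits yields an outcome parity that deterministically computes the non-linear function OR, which the Mermin argument shows is impossible non-contextually. A minimal statevector sketch (conventions, such as the 0 -> X, 1 -> Y setting map, are one common choice):

```python
import numpy as np
from itertools import product

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

# GHZ state (|000> + |111>)/sqrt(2)
ghz = np.zeros(8, dtype=complex)
ghz[0] = ghz[7] = 1 / np.sqrt(2)

def expect(settings):
    """<GHZ| M1 (x) M2 (x) M3 |GHZ>, with setting 0 -> X, 1 -> Y."""
    ops = [X if s == 0 else Y for s in settings]
    M = np.kron(np.kron(ops[0], ops[1]), ops[2])
    return np.real(ghz.conj() @ M @ ghz)

# Inputs (a, b); qubit settings (a, b, a XOR b).  All four expectation
# values are +/-1, so the outcome parity is deterministic and equals a OR b,
# a non-linear Boolean function.
for a, b in product((0, 1), repeat=2):
    parity = 0 if expect((a, b, a ^ b)) > 0 else 1
    assert parity == (a | b)
```

The temporal structure this example lacks (later measurement settings depending on earlier outcomes) is exactly what the extension presented in this talk accommodates.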
[1] S. Abramsky, R. Barbosa, S. Mansfield, The Cohomology of Non-Locality and Contextuality, EPTCS 95, 1-14 (2012).
[2] C. Okay, S. Roberts, S. D. Bartlett, R. Raussendorf, Topological proofs of contextuality in quantum mechanics, Quant. Inf. Comp. 17, 1135-1166 (2017).
[3] R. Raussendorf, Cohomological framework for contextual quantum computations, Quant. Inf. Comp. 19, 1141-1170 (2019).
This is joint work with Polina Feldmann and Cihan Okay.