Privacy-preserving machine learning with tensor networks

APA

Pozas Kerstjens, A. (2024). Privacy-preserving machine learning with tensor networks. Perimeter Institute for Theoretical Physics. https://pirsa.org/24030105

MLA

Pozas Kerstjens, Alejandro. Privacy-preserving machine learning with tensor networks. Perimeter Institute for Theoretical Physics, Mar. 04, 2024, https://pirsa.org/24030105

BibTex

          @misc{ scivideos_PIRSA:24030105,
            doi = {10.48660/24030105},
            url = {https://pirsa.org/24030105},
            author = {Pozas Kerstjens, Alejandro},
            keywords = {Quantum Foundations},
            language = {en},
            title = {Privacy-preserving machine learning with tensor networks},
            publisher = {Perimeter Institute for Theoretical Physics},
            year = {2024},
            month = {mar},
            note = {PIRSA:24030105, see \url{https://scivideos.org/pirsa/24030105}}
          }
          

Alejandro Pozas Kerstjens, University of Geneva (UNIGE)

Source Repository: PIRSA

Abstract

In this talk, I will argue, and practically illustrate, that insights from quantum information, concretely from the tensor-network representations of quantum many-body states, can help in devising better privacy-preserving machine learning algorithms. In the first part, I will show that standard neural networks are vulnerable to a type of privacy leak that involves global properties of the data used for training, and that therefore evades standard protection mechanisms a priori. In the second part, I will show that tensor networks, when used as machine learning architectures, are immune to this leak. The proof of this resilience rests on the existence of canonical forms for such architectures. Given the growing expertise in training tensor networks and the recent interest in tensor-based reformulations of popular machine learning architectures, these results imply that one need not choose between prediction accuracy and the privacy of the information processed when applying machine learning to sensitive data.
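The role of canonical forms can be sketched concretely. A matrix product state (MPS) has gauge freedom: many different parameter sets represent the same function, and that redundancy is where extra (potentially leaky) information about the training data could hide. A canonical form fixes the gauge. The following is a minimal, hedged illustration (not the speaker's actual construction): it left-canonicalizes a random MPS with QR decompositions and checks that the represented vector is unchanged, so the canonicalized parameters carry only the information needed to define the model's function.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random MPS: 4 sites, physical dimension 2, bond dimension 3.
# Tensor shapes are (left_bond, physical, right_bond); boundary bonds are 1.
bond = [1, 3, 3, 3, 1]
mps = [rng.normal(size=(bond[i], 2, bond[i + 1])) for i in range(4)]

def contract(tensors):
    """Contract an MPS into the full vector of 2**n entries."""
    result = tensors[0]                     # shape (1, d, D)
    for t in tensors[1:]:
        # (1, d, D) x (D, 2, D') -> (1, d, 2, D') -> (1, 2*d, D')
        result = np.einsum('apb,bqc->apqc', result, t)
        result = result.reshape(result.shape[0], -1, result.shape[-1])
    return result.reshape(-1)

def left_canonicalize(tensors):
    """Sweep left-to-right, fixing the gauge with QR decompositions."""
    out = []
    carry = np.eye(tensors[0].shape[0])     # gauge matrix pushed rightward
    for t in tensors:
        t = np.einsum('ab,bpc->apc', carry, t)
        l, p, r = t.shape
        q, carry = np.linalg.qr(t.reshape(l * p, r))
        out.append(q.reshape(l, p, q.shape[1]))
    # Absorb the final (1x1) remainder into the last tensor.
    out[-1] = np.einsum('apc,cb->apb', out[-1], carry)
    return out

canonical = left_canonicalize(mps)
# Different parameters, identical function: the gauge redundancy that
# canonical forms eliminate is exactly the freedom that could store
# extra information about the training data.
print(np.allclose(contract(mps), contract(canonical)))  # True
```

The check confirms that canonicalization is a lossless reparameterization of the model, which is the structural property the talk's resilience argument builds on.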
