Attention is all you get

APA

Ginsparg, P. (2019). Attention is all you get. Perimeter Institute for Theoretical Physics. https://pirsa.org/19070013

MLA

Ginsparg, Paul. Attention is all you get. Perimeter Institute for Theoretical Physics, 11 Jul. 2019, https://pirsa.org/19070013.

BibTeX

@misc{scivideos_PIRSA:19070013,
  doi = {10.48660/19070013},
  url = {https://pirsa.org/19070013},
  author = {Ginsparg, Paul},
  keywords = {Quantum Matter},
  language = {en},
  title = {Attention is all you get},
  publisher = {Perimeter Institute for Theoretical Physics},
  year = {2019},
  month = {jul},
  note = {PIRSA:19070013, see \url{https://scivideos.org/index.php/pirsa/19070013}}
}

Paul Ginsparg, Cornell University

Source Repository: PIRSA
Talk Type: Conference

Abstract

For the past decade, there has been a major new architectural fad in deep learning every year or two. One such fad for the past two years has been the transformer model, an implementation of the attention mechanism, which has superseded RNNs in most sequence-learning applications. I'll give an overview of the model, with some discussion of non-physics applications, and intimate some possibilities for physics.
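
The attention mechanism the abstract refers to is, at its core, scaled dot-product attention: each query forms a softmax-weighted average of value vectors, with weights given by query–key similarity. The sketch below is an illustrative example only (not material from the talk); the function name, array shapes, and the choice of self-attention on random toy data are assumptions made for demonstration.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal attention sketch: each query attends to all keys, and the
    output is a softmax-weighted sum of the value vectors."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                   # (n_queries, n_keys) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over the key axis
    return weights @ V                                # weighted sum of values

# Toy self-attention: 4 tokens with 8-dimensional embeddings, using the
# same embeddings as queries, keys, and values (shapes are hypothetical).
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (4, 8): one contextualized vector per token
```

A full transformer stacks many such attention layers (with learned query/key/value projections, multiple heads, and feed-forward sublayers); the sketch above shows only the core weighting step.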