This paper presents a text-generation approach that composes output by copying and pasting text segments from an existing collection, achieving better generation quality with inference efficiency comparable to autoregressive models. Gains under domain adaptation are also observed.

00:00 Section: 1 Introduction
03:09 Section: 2 Background: Neural Text Generation
05:40 Section: 3 Copy-Generator
07:59 Section: Ethical Consideration
11:14 Section: Context-Independent Token Embeddings
14:07 Section: 4 Experimental Setup
17:41 Section: 4.3 Automatic Evaluation Metrics
20:33 Section: Results
23:24 Section: Case Study
26:27 Section: Results
28:56 Section: Dense Retrieval

YouTube: @ArxivPapers
PODCASTS:
Apple Podcasts:
Spotify:
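To make the summarized approach concrete, here is a minimal sketch of copy-based decoding: at each step the current context is embedded and the highest-scoring text segment from the collection is copied onto the output. The function names (`encode_context`), the phrase index, and the greedy inner-product scoring are all illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of copy-based generation; `encode_context`,
# `phrases`, and `phrase_embs` are assumed inputs, not the paper's API.
import numpy as np

def copy_generate(prompt, encode_context, phrases, phrase_embs, max_steps=10):
    """Greedily extend `prompt` by copying whole segments from a collection.

    encode_context: hypothetical function mapping text -> 1-D embedding.
    phrases: list of candidate text segments from the source collection.
    phrase_embs: (num_phrases, dim) matrix of precomputed segment embeddings.
    """
    text = prompt
    for _ in range(max_steps):
        ctx = encode_context(text)        # embed the current context
        scores = phrase_embs @ ctx        # inner-product relevance scores
        best = int(np.argmax(scores))     # pick the best-matching segment
        text += " " + phrases[best]       # paste the copied segment
    return text
```

The video's "Dense Retrieval" chapter suggests the real system would retrieve segments from a large index with approximate maximum-inner-product search rather than the exhaustive matrix product used in this toy loop.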