Natural Language Processing

MASS: Masked Sequence to Sequence Pre-training for Language Generation

xyz1 2023. 6. 8. 13:31

Background

  • Encoder, Decoder
  • Auto Encoder Pretraining
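As background, a minimal sketch of the token-level masked (denoising autoencoder) pretraining scheme that MASS is contrasted with: each token is independently corrupted with some probability, and the model is trained to recover the corrupted positions. The function name and `[MASK]` token here are illustrative assumptions, not code from the paper.

```python
import random

MASK = "[MASK]"

def token_mask(tokens, p=0.15, seed=None):
    """Token-level masking (BERT/denoising-AE style, illustrative sketch):
    independently replace each token with [MASK] with probability p.

    Returns (corrupted_input, targets), where targets lists the
    (position, original_token) pairs the model must predict.
    """
    rng = random.Random(seed)
    corrupted, targets = [], []
    for i, t in enumerate(tokens):
        if rng.random() < p:
            corrupted.append(MASK)
            targets.append((i, t))  # this position must be reconstructed
        else:
            corrupted.append(t)
    return corrupted, targets
```

Because each token is masked independently, the masked positions are scattered rather than contiguous, which is the key difference from the segment-level masking described below.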

Abstract

  • Unlike prior work that masks individual tokens, MASS masks and trains on a contiguous segment of the sequence.
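The segment-level masking above can be sketched as follows. This is a hypothetical illustration, not the paper's code: one contiguous span of the encoder input is replaced by `[MASK]` tokens, and the decoder is trained to reconstruct exactly that span (the function name, `MASK` token, and `span_frac` parameter are assumptions).

```python
import random

MASK = "[MASK]"

def mass_mask(tokens, span_frac=0.5, seed=None):
    """MASS-style segment masking (illustrative sketch):
    mask one contiguous span covering span_frac of the sequence.

    Returns (encoder_input, decoder_target), where encoder_input has the
    span replaced by [MASK] tokens and decoder_target is the span the
    decoder must reconstruct.
    """
    rng = random.Random(seed)
    n = len(tokens)
    span_len = max(1, int(n * span_frac))
    start = rng.randint(0, n - span_len)  # random span start position
    target = tokens[start:start + span_len]
    enc_input = tokens[:start] + [MASK] * span_len + tokens[start + span_len:]
    return enc_input, target

tokens = ["x1", "x2", "x3", "x4", "x5", "x6", "x7", "x8"]
enc_input, target = mass_mask(tokens, span_frac=0.5, seed=0)
```

Masking a contiguous segment forces the decoder to generate multiple consecutive tokens, which matches the sequence-generation setting better than predicting isolated tokens.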


Methods


Reference

https://www.youtube.com/watch?v=v7diENO2mEA