Tags
Datasets, NLP, GSG
Provider
Abstract
Pegasus (Pre-training with Extracted Gap-sentences for Abstractive Summarization Sequence-to-sequence models) uses a self-supervised objective, Gap Sentences Generation (GSG), to train a transformer encoder-decoder model.
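To make the GSG objective concrete, below is a minimal, illustrative sketch (not the authors' implementation): whole "principal" sentences are removed from the input and the model is trained to generate them from the remaining text. The helper name `gsg_mask`, the `<mask_1>` token string, and the unigram-overlap score (a rough stand-in for the ROUGE-based sentence selection described in the PEGASUS paper) are assumptions made for illustration.

```python
import re


def gsg_mask(document, mask_ratio=0.3, mask_token="<mask_1>"):
    """Toy Gap Sentences Generation (GSG) masking.

    Scores each sentence by unigram overlap with the rest of the document
    (a crude proxy for ROUGE-based "principal" sentence selection), masks
    the top-scoring sentences in the encoder input, and returns them as the
    decoder target, so pre-training resembles abstractive summarization.
    """
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", document) if s.strip()]
    n_mask = max(1, int(len(sentences) * mask_ratio))

    def overlap(i):
        sent = set(sentences[i].lower().split())
        rest = set(" ".join(sentences[:i] + sentences[i + 1:]).lower().split())
        return len(sent & rest) / max(len(sent), 1)

    # Sentences that best "summarize" the rest of the document are masked.
    selected = set(sorted(range(len(sentences)), key=overlap, reverse=True)[:n_mask])
    source = " ".join(mask_token if i in selected else s for i, s in enumerate(sentences))
    target = " ".join(sentences[i] for i in sorted(selected))
    return source, target


doc = ("Pegasus is pre-trained on large text corpora. Important sentences are "
       "removed from the input document. The model learns to generate them from "
       "the remaining sentences. This makes pre-training resemble summarization.")
src, tgt = gsg_mask(doc)
print("encoder input :", src)
print("decoder target:", tgt)
```

The masked sentences act as a pseudo-summary target, which is what ties the GSG pre-training objective to downstream abstractive summarization.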