Tags
Datasets, NLP, GSG
Provider
Google Research
URL
https://github.com/google-research/pegasus
Abstract
PEGASUS (Pre-training with Extracted Gap-sentences for Abstractive Summarization Sequence-to-sequence models) uses the self-supervised objective Gap Sentences Generation (GSG) to train a Transformer encoder-decoder model.
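To illustrate the GSG objective described above, here is a minimal, self-contained Python sketch of building one pre-training pair: whole sentences judged important relative to the rest of the document are removed from the input and become the generation target. The function names and the simple unigram-overlap scorer are assumptions for illustration; they stand in for the paper's ROUGE-based principal-sentence selection and are not the repository's implementation.

```python
import re
from collections import Counter


def unigram_f1(candidate, reference):
    """Unigram-overlap F1: a rough stand-in for the ROUGE scoring
    PEGASUS uses to rank sentence importance (assumption)."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)


def make_gsg_example(document, gap_ratio=0.3, mask_token="[MASK1]"):
    """Build one Gap Sentences Generation training pair:
    the sentences that score highest against the rest of the document
    are replaced by a mask token in the source, and the model learns
    to generate them as the target."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", document.strip()) if s.strip()]
    n_gaps = max(1, int(len(sentences) * gap_ratio))

    # Score each sentence against the remainder of the document.
    scores = []
    for i, sent in enumerate(sentences):
        rest = " ".join(sentences[:i] + sentences[i + 1:])
        scores.append((unigram_f1(sent, rest), i))

    # Mask the top-scoring sentences; keep original order in the source.
    selected = sorted(i for _, i in sorted(scores, reverse=True)[:n_gaps])
    source = " ".join(mask_token if i in selected else s for i, s in enumerate(sentences))
    target = " ".join(sentences[i] for i in selected)
    return source, target


if __name__ == "__main__":
    doc = ("Pegasus is a transformer encoder-decoder model. "
           "It is pre-trained by masking whole sentences. "
           "The masked sentences become the generation target. "
           "Fine-tuning then adapts it to downstream summarization datasets.")
    src, tgt = make_gsg_example(doc)
    print("SOURCE:", src)
    print("TARGET:", tgt)
```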
Paper
PEGASUS: Pre-training with Extracted Gap-sentences for Abstractive Summarization