Pegasus

Tags
Datasets, NLP, GSG
Provider

Google

URL
https://github.com/google-research/pegasus
Abstract

Pre-training with Extracted Gap-sentences for Abstractive Summarization (Pegasus) is a sequence-to-sequence model that uses the self-supervised Gap Sentences Generation (GSG) objective to train a transformer encoder-decoder: whole sentences are masked from the input document, and the model is trained to generate them.
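
The GSG objective can be sketched roughly as follows. This is an illustrative toy, not the Pegasus implementation: the real model scores sentence importance with ROUGE against the rest of the document, whereas here sentence length is used as a crude importance proxy, and the function name, mask token, and `gap_ratio` parameter are assumptions for illustration.

```python
import re

def gap_sentence_mask(text, gap_ratio=0.3, mask_token="<mask_1>"):
    """Toy sketch of Gap Sentences Generation (GSG): select a fraction
    of sentences, replace each with a mask token in the source, and
    concatenate the removed sentences as the generation target."""
    # Naive sentence split on ., !, or ? followed by whitespace.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    n_gaps = max(1, int(len(sentences) * gap_ratio))
    # Assumption: rank sentences by length as a stand-in for the
    # ROUGE-based "principal sentence" selection used by Pegasus.
    ranked = sorted(range(len(sentences)), key=lambda i: -len(sentences[i]))
    gaps = set(ranked[:n_gaps])
    source = " ".join(mask_token if i in gaps else s
                      for i, s in enumerate(sentences))
    target = " ".join(sentences[i] for i in sorted(gaps))
    return source, target

src, tgt = gap_sentence_mask(
    "Pegasus is a transformer. It is pre-trained with GSG. "
    "Selected sentences are masked and must be generated."
)
```

The encoder sees `src` (with masked gaps) and the decoder learns to produce `tgt`, which makes the pre-training task resemble abstractive summarization.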

Paper

Pegasus: Pre-Training with Extracted Gap-Sentences for Abstractive Summarization
