Description
“BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension” BART (Bidirectional and Auto-Regressive Transformers) is a sequence-to-sequence language model introduced by Facebook AI in 2019. Built on the transformer architecture, it is designed to handle a wide range of natural language processing tasks such as text generation, summarization, and machine translation.
This pre-trained model is DistilBART, a distilled variant of BART with 12 encoder layers and 6 decoder layers, fine-tuned on the Extreme Summarization (XSum) dataset.
How to use
# Python: load the pretrained model and configure it for summarization
from sparknlp.annotator import BartTransformer

bart = BartTransformer.pretrained("distilbart_xsum_12_6") \
    .setTask("summarize:") \
    .setMaxOutputLength(200) \
    .setInputCols(["documents"]) \
    .setOutputCol("summaries")
// Scala: the equivalent setup
import com.johnsnowlabs.nlp.annotators.seq2seq.BartTransformer

val bart = BartTransformer.pretrained("distilbart_xsum_12_6")
  .setTask("summarize:")
  .setMaxOutputLength(200)
  .setInputCols("documents")
  .setOutputCol("summaries")
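The snippet above only configures the annotator. Below is a minimal end-to-end sketch in Python showing it inside a complete Spark NLP pipeline; the Spark session setup and the example sentence are illustrative assumptions, not part of the original card.

import sparknlp
from sparknlp.base import DocumentAssembler
from sparknlp.annotator import BartTransformer
from pyspark.ml import Pipeline

spark = sparknlp.start()

# Wrap raw text into the document annotation type the transformer expects
document_assembler = DocumentAssembler() \
    .setInputCol("text") \
    .setOutputCol("documents")

bart = BartTransformer.pretrained("distilbart_xsum_12_6") \
    .setTask("summarize:") \
    .setMaxOutputLength(200) \
    .setInputCols(["documents"]) \
    .setOutputCol("summaries")

pipeline = Pipeline().setStages([document_assembler, bart])

# Hypothetical input text, used here only for illustration
data = spark.createDataFrame(
    [["PG&E stated it scheduled the blackouts in response to forecasts for "
      "high winds amid dry conditions across much of Northern California."]]
).toDF("text")

result = pipeline.fit(data).transform(data)
result.select("summaries.result").show(truncate=False)

The generated summaries appear in the summaries.result column of the output DataFrame.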
Model Information
| Model Name: | distilbart_xsum_12_6 |
| Compatibility: | Spark NLP 5.5.0+ |
| License: | Open Source |
| Edition: | Official |
| Input Labels: | [documents] |
| Output Labels: | [generation] |
| Language: | en |
| Size: | 853.7 MB |
References
https://huggingface.co/sshleifer/distilbart-xsum-12-6