DistilBART Model

From GM-RKB

A DistilBART Model is a transformer-based text summarization model that applies knowledge distillation to a BART architecture, yielding a smaller, faster abstractive summarization system that retains most of the teacher model's summarization quality.
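The soft-target objective at the core of knowledge distillation can be sketched as follows. This is a minimal NumPy illustration of the temperature-scaled teacher/student KL loss, not DistilBART's actual training code; the function names and the temperature value are illustrative assumptions.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence from the softened teacher distribution to the softened
    # student distribution, scaled by T^2 so gradient magnitudes stay
    # comparable across temperatures (the standard soft-target loss).
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kl = np.sum(
        p_teacher * (np.log(p_teacher + 1e-12) - np.log(p_student + 1e-12)),
        axis=-1,
    )
    return float(np.mean(kl) * temperature ** 2)
```

In practice this soft-target term is combined with the usual cross-entropy on the reference summaries; the student (e.g. a BART model with fewer decoder layers) is trained to match the teacher's output distribution over the vocabulary at each decoding step.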