DistilBART Model

From GM-RKB

A DistilBART Model is a transformer-based text summarization model that applies knowledge distillation to a BART architecture, producing a smaller, faster student model for abstractive summarization while retaining most of the teacher's output quality.
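One common way to build such a student is to shrink the teacher by copying a subset of its decoder layers (e.g. 12 teacher layers down to 6) and then fine-tuning. A minimal sketch of an evenly spaced layer-selection heuristic is below; the function name and the exact index mapping are illustrative assumptions, and published DistilBART checkpoints may use a different selection rule.

```python
def pick_student_layers(n_teacher: int, n_student: int) -> list[int]:
    """Pick evenly spaced teacher layer indices to initialize a student.

    Illustrative sketch only: the precise mapping used when training
    released DistilBART checkpoints may differ.
    """
    if n_student == 1:
        return [0]
    # Spread n_student picks across the [0, n_teacher - 1] index range.
    step = (n_teacher - 1) / (n_student - 1)
    return [round(i * step) for i in range(n_student)]
```

For a 12-layer teacher and a 6-layer student this yields the indices `[0, 2, 4, 7, 9, 11]`, keeping the first and last layers and spacing the rest evenly.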