Paper Title: Multi Document Text Summarization using Distilled Transformers
Authors: Akshay Mahale, Priyanka Mishra, Sai Lakshmi Reddi, Selvakuberan Karuppasamy, Subhashini Lakshminarayanan
Article Citation: Akshay Mahale, Priyanka Mishra, Sai Lakshmi Reddi, Selvakuberan Karuppasamy, Subhashini Lakshminarayanan (2022), "Multi Document Text Summarization using Distilled Transformers", International Journal of Electrical, Electronics and Data Communication (IJEEDC), Volume-10, Issue-7, pp. 27-30.
Abstract: With the increase in the amount of unstructured data, the need to extract meaningful and precise insights has become equally critical. Text data is often stored in different file formats, ranging from PDFs and DOCX files to images. With the help of advanced abstractive summarization techniques (Transformers), the time and effort spent extracting useful insights from lengthy and varied documents can be reduced. In our approach, we have identified distilled transformers to solve our problem in a faster and better way. Distillation is a compression technique that involves training a small model to mimic the behavior of a larger model. This yields better performance than existing Transformer models, with the additional benefits of being lightweight, responsive, and energy efficient. We have tested our hypothesis with distilBART and distilPEGASUS and obtained promising results on metrics such as ROUGE scores.
Keywords - Document Summarization, Distil Transformers, Abstractive Summarization, distilBART, distilPEGASUS, NLP.
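The abstract reports results on ROUGE scores, the standard n-gram-overlap metric for summarization. As an illustration only (the paper does not specify its evaluation code), a minimal sketch of ROUGE-1 F1 in plain Python might look like this; the function name and example sentences are hypothetical:

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """ROUGE-1 F1: unigram overlap between a reference and a candidate summary."""
    ref_counts = Counter(reference.lower().split())
    cand_counts = Counter(candidate.lower().split())
    # Clipped overlap: each candidate unigram counts at most as often
    # as it appears in the reference.
    overlap = sum(min(c, ref_counts[w]) for w, c in cand_counts.items())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)

score = rouge1_f1("the cat sat on the mat", "the cat lay on the mat")
print(round(score, 3))  # → 0.833 (5 of 6 unigrams overlap in both directions)
```

In practice, published ROUGE numbers are usually computed with a standard package (e.g. the `rouge-score` library) rather than hand-rolled code, so figures reported here should be compared only against scores produced by the same tooling.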
Type: Research paper
Published: Volume-10, Issue-7
DOIONLINE NO - IJEEDC-IRAJ-DOIONLINE-18914
Copyright: © Institute of Research and Journals
Published on 2022-11-17