DOIONLINE NO - IJEEDC-IRAJ-DOIONLINE-18914

Published In
International Journal of Electrical, Electronics and Data Communication (IJEEDC)
Volume / Issue
Volume-10, Issue-7 (Jul, 2022)
Paper Title
Multi Document Text Summarization using Distilled Transformers
Author Name
Akshay Mahale, Priyanka Mishra, Sai Lakshmi Reddi, Selvakuberan Karuppasamy, Subhashini Lakshminarayanan
Affiliation
Pages
27-30
Abstract
Abstract - With the increase in the amount of unstructured data, the need to extract meaningful and precise insights has become equally critical. Text data is often stored in different file formats, ranging from PDFs and DOCX files to images. With the help of advanced abstractive summarization techniques (Transformers), the time and effort spent on extracting useful insights from lengthy and varied documents can be reduced. In our approach, we have identified distilled transformers to solve our problem in a much faster and better way. Distillation is a compression technique that involves training a small model to mimic the behavior of a larger model. This helps achieve better performance than existing Transformer models, with the additional benefits of being lightweight, responsive, and energy efficient. We have tested our hypothesis with distilBART and distilPEGASUS and obtained promising results on metrics such as ROUGE scores.
Keywords - Document Summarization, Distil Transformers, Abstractive Summarization, distilBART, distilPEGASUS, NLP.
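
A minimal sketch of the kind of pipeline the abstract describes: abstractive summarization with a distilled BART checkpoint, scored with ROUGE. This is not the authors' implementation; the model name, input documents, and reference summary are illustrative assumptions.

# Sketch: summarize concatenated multi-document text with a distilled transformer
# and evaluate with ROUGE. Requires: transformers, rouge-score.
from transformers import pipeline
from rouge_score import rouge_scorer

# Publicly available distilled BART checkpoint fine-tuned on CNN/DailyMail
# (illustrative choice; the paper also evaluates a distilled PEGASUS model).
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

# Text extracted from multiple source files (PDF, DOCX, OCR'd images), joined
# into a single input string (placeholder content).
document_text = " ".join([
    "Text extracted from the first document ...",
    "Text extracted from the second document ...",
])

# Generate an abstractive summary.
summary = summarizer(document_text, max_length=130, min_length=30, do_sample=False)[0]["summary_text"]

# Score against a human-written reference summary (placeholder).
scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)
scores = scorer.score("Reference summary written by a human.", summary)
print(summary)
print(scores)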