Comprehensive Survey of Abstractive Text Summarization Techniques

Authors

  • Kundan Chaudhari, DBATU, Nutan College of Engineering and Research, Vishnupuri, Talegaon Dabhade, India
  • Raj Mahale, DBATU, Nutan College of Engineering and Research, Vishnupuri, Talegaon Dabhade, India
  • Fardeen Khan, DBATU, Nutan College of Engineering and Research, Vishnupuri, Talegaon Dabhade, India
  • Shradha Gaikwad, DBATU, Nutan College of Engineering and Research, Vishnupuri, Talegaon Dabhade, India
  • Kavita Jadhav, DBATU, Nutan College of Engineering and Research, Vishnupuri, Talegaon Dabhade, India

DOI:

https://doi.org/10.47392/IRJAEM.2024.0323

Keywords:

Courtesy, Drill, Interpreter, Encoder, Abstractive Text Summarization

Abstract

Text summarization using pre-trained encoders has become a crucial technique for efficiently managing large volumes of text data. The rise of automatic summarization systems addresses the need to process ever-increasing data while meeting user-specific requirements. Recent research highlights significant advances in abstractive summarization, with a particular focus on neural network-based methods. A detailed review of neural network models for abstractive summarization identifies five key components essential to their design: encoder-decoder architecture, attention mechanisms, training strategies and optimization algorithms, dataset selection, and evaluation metrics. Each of these elements is pivotal in enhancing the summarization process. This study aims to provide a thorough understanding of the latest developments in neural network-based abstractive summarization models, offering insights into the evolving field and underscoring the associated challenges. Qualitative analysis using a concept matrix reveals common design trends in contemporary neural abstractive summarization systems. Notably, BERT-based encoder-decoder models have emerged as leading innovations, representing the most recent progress in the field. Based on the insights from this review, the study recommends integrating pre-trained language models with neural network techniques to achieve optimal performance in abstractive summarization tasks. As the volume of online information continues to surge, automatic text summarization has garnered significant attention within the Natural Language Processing (NLP) community. Over more than five decades, researchers have approached this problem from diverse angles, exploring various domains and employing a multitude of paradigms.
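The attention mechanism named among the five components above can be illustrated with a minimal sketch (not drawn from the paper; the function name and toy inputs are the author's own): scaled dot-product attention, the core operation inside Transformer- and BERT-based encoder-decoder models, where decoder queries weigh encoder states to produce a context vector.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Toy scaled dot-product attention.

    Q: (n_queries, d)  -- e.g. decoder states
    K: (n_keys, d)     -- e.g. encoder states
    V: (n_keys, d_v)   -- values attached to the encoder states
    Returns the attended output and the attention weights.
    """
    # Similarity scores, scaled by sqrt(d) for numerical stability
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    # Numerically stable softmax over the keys
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted average of the value rows
    return weights @ V, weights
```

Each row of `weights` sums to 1, so the output is a convex combination of encoder value vectors; real models add learned projections and multiple heads on top of this primitive.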
This survey aims to delve into some of the most pertinent methodologies, focusing on both single-document and multiple-document summarization techniques, with a particular emphasis on empirical methods and extractive approaches. Additionally, the survey explores promising strategies that target specific intricacies of the summarization task. Notably, considerable attention is dedicated to the automatic evaluation of summarization systems, recognizing its pivotal role in guiding future research endeavors.
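Since the survey stresses automatic evaluation as pivotal, a simple worked example helps fix ideas: ROUGE-1, the unigram-overlap metric most commonly reported for summarization systems. This is an illustrative re-implementation, not code from the paper; production work would use an established ROUGE package.

```python
from collections import Counter

def rouge_1(candidate, reference):
    """Compute ROUGE-1 precision, recall, and F1 via unigram overlap."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    # Clipped overlap: each word counts at most as often as in the reference
    overlap = sum((cand & ref).values())
    precision = overlap / max(sum(cand.values()), 1)
    recall = overlap / max(sum(ref.values()), 1)
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return {"precision": precision, "recall": recall, "f1": f1}
```

For example, scoring the candidate summary "the cat sat" against the reference "the cat sat on the mat" yields perfect precision but partial recall, which is why abstractive systems are usually compared on F1.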


Published

2024-07-15