Abstractive method-based Text Summarization using Bidirectional Long Short-Term Memory and Pointer Generator Model

Saroj Anand Tripathy
Sharmila Ashok

Abstract

With the rise of the Internet, we now have a vast amount of information at our disposal. We're swamped by many sources: news, social media, and office emails, to name a few. This paper addresses the problem of reading through such extensive information by summarizing it with an abstractive text summarizer based on deep learning models, namely a bidirectional Long Short-Term Memory (LSTM) network and a pointer-generator model. The LSTM model (a modification of the recurrent neural network) is trained and tested on the Amazon Fine Food Reviews dataset with a Bahdanau attention decoder, using ConceptNet Numberbatch embeddings, which are similar to GloVe embeddings but perform better. The pointer-generator model is trained and tested on the CNN/Daily Mail dataset, and it combines the decoder's vocabulary distribution with the attention distribution over the source text. Because of two major problems with the LSTM model, its inability to copy facts from the source and its tendency to repeat words, the second method, the pointer-generator model, is used. This paper provides an analysis of both models to give a better understanding of how they work and to enable the construction of a strong text summarizer. The main purpose is to provide reliable summaries of datasets or uploaded files, depending on the user's choice. Unnecessary sentences are rejected in order to retain the most important ones.
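As a concrete illustration of the first architecture, the sketch below pairs a bidirectional LSTM encoder with a Bahdanau-style additive attention scorer. This is a minimal sketch in PyTorch (the framework choice is ours, not necessarily the paper's); the class names, layer sizes, and the note about initializing the embedding layer with ConceptNet Numberbatch vectors are illustrative assumptions, not the published implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class BiLSTMEncoder(nn.Module):
        def __init__(self, vocab_size, emb_dim=300, hidden_dim=256):
            super().__init__()
            # In the paper's setup this embedding would be initialized with
            # 300-d ConceptNet Numberbatch vectors rather than random weights.
            self.embedding = nn.Embedding(vocab_size, emb_dim)
            self.lstm = nn.LSTM(emb_dim, hidden_dim,
                                batch_first=True, bidirectional=True)

        def forward(self, src_ids):
            # src_ids: (batch, src_len) -> (batch, src_len, 2 * hidden_dim)
            return self.lstm(self.embedding(src_ids))[0]

    class BahdanauAttention(nn.Module):
        # Additive attention: e_i = v^T tanh(W_h h_i + W_s s_t)
        def __init__(self, enc_dim, dec_dim, attn_dim=256):
            super().__init__()
            self.W_h = nn.Linear(enc_dim, attn_dim, bias=False)
            self.W_s = nn.Linear(dec_dim, attn_dim, bias=False)
            self.v = nn.Linear(attn_dim, 1, bias=False)

        def forward(self, enc_outputs, dec_state):
            # enc_outputs: (batch, src_len, enc_dim); dec_state: (batch, dec_dim)
            scores = self.v(torch.tanh(self.W_h(enc_outputs) +
                                       self.W_s(dec_state).unsqueeze(1)))
            attn = F.softmax(scores.squeeze(-1), dim=-1)   # (batch, src_len)
            context = torch.bmm(attn.unsqueeze(1), enc_outputs).squeeze(1)
            return context, attn

At each decoding step the context vector is concatenated with the decoder state to predict the next summary token.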
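The pointer-generator's distinguishing step is the mixture of a generation distribution and a copy distribution, P(w) = p_gen * P_vocab(w) + (1 - p_gen) * sum of the attention weights a_i over source positions i where w_i = w, in See et al.'s (2017) formulation. This is what lets the network copy facts verbatim from the source, addressing the LSTM model's first weakness. A minimal sketch of that mixing step follows; the function name and tensor shapes are illustrative assumptions, and the extended-vocabulary bookkeeping for out-of-vocabulary words is omitted.

    import torch

    def final_distribution(p_vocab, attn, src_ids, p_gen):
        # p_vocab: (batch, vocab)   decoder's softmax over the vocabulary
        # attn:    (batch, src_len) attention weights over source tokens
        # src_ids: (batch, src_len) int64 vocabulary ids of the source tokens
        # p_gen:   (batch, 1)       generation probability, a sigmoid of a
        #          learned function of the context vector, decoder state,
        #          and decoder input
        gen_dist = p_gen * p_vocab
        copy_dist = torch.zeros_like(p_vocab)
        # Scatter attention mass onto the source tokens' vocabulary ids,
        # letting the decoder copy words directly from the input text.
        copy_dist.scatter_add_(1, src_ids, (1.0 - p_gen) * attn)
        return gen_dist + copy_dist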

Article Details

How to Cite
Tripathy, S. A., & Ashok, S. (2023). Abstractive method-based Text Summarization using Bidirectional Long Short-Term Memory and Pointer Generator Model. Journal of Applied Research and Technology, 21(1), 73–86. https://doi.org/10.22201/icat.24486736e.2023.21.1.1446
Section
Articles