DAY 94 – 100 DAYS OF ML CODE: Text Summarization using Sequence-To-Sequence Models – Part 2
In the previous blog, we discussed the Text Summarization paper
Get To The Point: Summarization with Pointer-Generator Networks.
The code for the paper can be found here. I'm going to work through it and try to run it in Google Colab.
Let's start by downloading the data from the source and running the instructions to generate the binary train and validation files.
Once we have all the data, let's train the system. The dataset is large, so training may take a long time.
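One detail worth understanding before training is the coverage mechanism the paper uses to discourage repetition: at each decoder step the model keeps a running sum of past attention distributions and penalizes re-attending to already-covered source positions. Here is a minimal numpy sketch of that coverage loss; the function name and the `attentions` array layout (decoder steps × source positions) are illustrative, not taken from the repository.

```python
import numpy as np

def coverage_loss(attentions):
    """Sketch of the paper's coverage loss.

    `attentions` is a (num_decoder_steps, source_len) array of attention
    distributions. At each step we penalize the overlap between the current
    attention and the accumulated coverage of previous steps.
    """
    coverage = np.zeros(attentions.shape[1])  # running sum of past attention
    loss = 0.0
    for a_t in attentions:
        # min(attention, coverage) is large only where we attend again
        # to a source position that was already heavily attended
        loss += np.minimum(a_t, coverage).sum()
        coverage += a_t
    return loss
```

Attending to the same source token twice incurs a positive loss, while spreading attention over new positions costs nothing, which is exactly the behavior that suppresses repeated phrases in the generated summaries.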
The text summarization approach of this paper is also explained nicely here.
As per the above blog, though the model produces abstractive summaries, the wording is usually quite close to the original text. Higher-level abstraction, such as more powerful, compressive paraphrasing, remains unsolved.
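This closeness to the source wording is easy to see in the pointer-generator formulation itself: the final output distribution is a mixture of a generation distribution over the vocabulary and a copy distribution over the source tokens, weighted by a learned scalar p_gen. When p_gen is low, the model mostly copies. A minimal numpy sketch of that mixture, with illustrative names and a toy extended vocabulary (not code from the repository):

```python
import numpy as np

def final_distribution(p_gen, vocab_dist, attn_dist, src_ids, extended_size):
    """Mix generation and copy distributions, pointer-generator style.

    p_gen:         scalar in [0, 1], weight on generating from the vocab
    vocab_dist:    distribution over the fixed vocabulary
    attn_dist:     attention distribution over source positions
    src_ids:       extended-vocab id of each source position (OOV words of
                   the source get ids beyond the fixed vocabulary)
    extended_size: fixed vocab size + number of source OOV words
    """
    final = np.zeros(extended_size)
    final[:len(vocab_dist)] = p_gen * vocab_dist
    # Scatter the copy probability mass onto the source tokens' ids
    for pos, word_id in enumerate(src_ids):
        final[word_id] += (1 - p_gen) * attn_dist[pos]
    return final
```

Because the copy term can place probability on source words outside the fixed vocabulary, the model can reproduce rare names verbatim, which is a strength for factual accuracy but also part of why the summaries stay so close to the original text.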
References:
Original paper:
Get To The Point: Summarization with Pointer-Generator Networks.