Publications
CONFERENCE (INTERNATIONAL) Incorporating Topic Sentence on Neural News Headline Generation
Jan Wira Gotama Putra (Tokyo Tech), Hayato Kobayashi, Nobuyuki Shimizu
The 19th International Conference on Computational Linguistics and Intelligent Text Processing (CICLing 2018)
March 18, 2018
Most past studies on neural news headline generation train
the encoder-decoder model using the first sentence of a document aligned
with its headline. However, the first sentence may not provide sufficient
information. This study proposes using a topic sentence as the input
instead of the first sentence for the neural news headline generation
task. The topic sentence, defined as the most newsworthy sentence, has
been studied in prior work. Experimental results show that the model
trained on topic sentences generalizes better than the model trained on
first sentences. Training the model on both the first and topic sentences
improves performance even further in certain cases. We conclude that
using the topic sentence is a strategy for feeding more informative input
to the neural network than using the first sentence, while keeping the
input length as short as possible.
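The input-selection strategy described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the newsworthiness scorer here is a hypothetical stand-in (the paper uses a separately studied topic-sentence model), and the toy scorer below simply prefers longer sentences.

```python
def select_topic_sentence(sentences, score):
    """Pick the most newsworthy sentence as the topic sentence.

    `score` is a hypothetical newsworthiness scorer; the paper's actual
    topic-sentence model is not reproduced here.
    """
    return max(sentences, key=score)

def build_training_pairs(document_sentences, headline, score, use_both=False):
    """Pair the selected input sentence(s) with the headline for training."""
    topic = select_topic_sentence(document_sentences, score)
    first = document_sentences[0]
    if use_both and topic != first:
        # The combined setting trains on both first and topic sentences.
        return [(first, headline), (topic, headline)]
    return [(topic, headline)]

# Toy scorer: longer sentences score higher (a stand-in, not a real model).
toy_score = len
sents = ["Short intro.", "The key newsworthy event happened today in detail."]
pairs = build_training_pairs(sents, "Headline", toy_score, use_both=True)
```

Each resulting (sentence, headline) pair would then be fed to the encoder-decoder model in place of the usual (first sentence, headline) pair.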
Slides Download (556KB)
PDF : Incorporating Topic Sentence on Neural News Headline Generation