See-Liu-Manning Text Summarization System
A See-Liu-Manning Text Summarization System is a Text Summarization System that can solve a See-Liu-Manning Text Summarization Task by implementing See-Liu-Manning Text Summarization Algorithms.
- Context:
- It was developed by See et al. (2017).
- Resource(s): Software repository is available at https://github.com/abisee/pointer-generator
- System's Architecture:
- It is based on Pointer-Generator Neural Network architecture.
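The pointer-generator architecture combines a generation distribution over a fixed vocabulary with a copy distribution given by the attention weights over the source tokens, weighted by a generation probability p_gen. A minimal NumPy sketch of that mixing step (the function name is ours, and for simplicity it assumes all source tokens are in-vocabulary, whereas the paper's extended vocabulary also covers source OOV words):

```python
import numpy as np

def final_distribution(p_gen, p_vocab, attention, src_ids):
    """Mix the generation and copy distributions of a pointer-generator.

    p_gen:     scalar in [0, 1], probability of generating from the vocabulary.
    p_vocab:   (vocab_size,) softmax distribution over the fixed vocabulary.
    attention: (src_len,) attention weights over the source tokens.
    src_ids:   (src_len,) vocabulary ids of the source tokens (assumed in-vocab).
    """
    p_final = p_gen * np.asarray(p_vocab, dtype=float)
    # Scatter-add the copy probabilities onto the source tokens' vocabulary ids;
    # np.add.at handles repeated ids correctly (unbuffered in-place addition).
    np.add.at(p_final, np.asarray(src_ids), (1.0 - p_gen) * np.asarray(attention, dtype=float))
    return p_final
```

Because both input distributions sum to one, the mixture is itself a valid probability distribution over the vocabulary.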
- Training Systems and Tools:
- Baseline models are trained using the following tools and settings:
- Adagrad (Duchi et al., 2011) with a learning rate of 0.15 and an initial accumulator value of 0.1;
- Beam Search with beam size 4;
- Gradient Clipping with a maximum gradient norm of 2;
- See-Liu-Manning Coverage Loss Function.
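The coverage loss penalizes the model for repeatedly attending to the same source positions: at each decoder step it adds the element-wise minimum of the current attention distribution and the coverage vector (the sum of attention distributions from all previous steps). A minimal NumPy sketch of this computation (unweighted, i.e. without the paper's loss-weighting hyperparameter):

```python
import numpy as np

def coverage_loss(attentions):
    """Coverage loss: sum over steps t of sum_i min(a_t[i], c_t[i]),
    where c_t accumulates the attention distributions of steps < t.

    attentions: (T, src_len) array, one attention distribution per decoder step.
    """
    attentions = np.asarray(attentions, dtype=float)
    coverage = np.zeros(attentions.shape[1])
    loss = 0.0
    for a_t in attentions:
        # Overlap between current attention and what was already attended to.
        loss += np.minimum(a_t, coverage).sum()
        coverage += a_t
    return loss
```

Attending to a fresh position contributes nothing, while re-attending to an already-covered position is penalized, which discourages the repetition artifacts common in sequence-to-sequence summarizers.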
- Example(s):
- Training a model:
python run_summarization.py --mode=train --data_path=/path/to/chunked/train_* --vocab_path=/path/to/vocab --log_root=/path/to/a/log/directory --exp_name=myexperiment
- Running a model on evaluation set:
python run_summarization.py --mode=eval --data_path=/path/to/chunked/val_* --vocab_path=/path/to/vocab --log_root=/path/to/a/log/directory --exp_name=myexperiment
- Running beam search decoding:
python run_summarization.py --mode=decode --data_path=/path/to/chunked/val_* --vocab_path=/path/to/vocab --log_root=/path/to/a/log/directory --exp_name=myexperiment
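The decode mode above uses beam search with beam size 4. As a rough illustration of that procedure (not the repository's implementation: `step_fn` is a hypothetical helper standing in for one decoder step, and scores are summed log-probabilities):

```python
import math

def beam_search(step_fn, start_token, end_token, beam_size=4, max_len=20):
    """Minimal beam search sketch with the paper's beam size of 4.

    step_fn(prefix) -> {next_token: probability} is an assumed helper
    wrapping one decoder step. Returns the best-scoring finished sequence.
    """
    beams = [([start_token], 0.0)]  # (sequence, summed log-probability)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, p in step_fn(seq).items():
                candidates.append((seq + [tok], score + math.log(p)))
        # Keep only the top beam_size extensions, by score.
        candidates.sort(key=lambda c: c[1], reverse=True)
        beams = []
        for seq, score in candidates[:beam_size]:
            # Sequences that emitted the end token are set aside as finished.
            (finished if seq[-1] == end_token else beams).append((seq, score))
        if not beams:
            break
    finished = finished or beams
    return max(finished, key=lambda c: c[1])[0]
```

A usage sketch: with a toy `step_fn` that assigns probability 0.6 to token "a" and 0.4 to token "b" after the start token, and then always emits the end token, the search returns the "a" path.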
- Counter-Examples:
- See: Natural Language Generation System, Natural Language Processing System, Machine Translation System, ROUGE Metric, Meteor Universal Metric.
References
2017
- (See et al., 2017) ⇒ Abigail See, Peter J. Liu, and Christopher D. Manning. (2017). “Get To The Point: Summarization with Pointer-Generator Networks.” In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers). DOI:10.18653/v1/P17-1099.
2011
- (Duchi et al., 2011) ⇒ John Duchi, Elad Hazan, and Yoram Singer. (2011). “Adaptive Subgradient Methods for Online Learning and Stochastic Optimization.” In: The Journal of Machine Learning Research, 12.