Transformer-based Neural Language Model
A Transformer-based Neural Language Model is a deep neural language model whose underlying architecture is a Transformer-based deep neural network.
- Context:
- It can be produced by a Transformer-based Neural Language Modeling System (that can solve a Transformer-based Neural Network-based Language Modeling Task).
- It can range from being a Character-Level Transformer-based Neural Network-based Language Model to being a Word/Token-Level Transformer-based Neural Network-based Language Model.
- It can range from being a Forward Transformer-based Neural Network-based Language Model to being a Backward Transformer-based Neural Network-based Language Model to being a Bi-Directional Transformer-based Neural Network-based Language Model.
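The forward (left-to-right) case above can be illustrated with a minimal sketch: a toy character-level Transformer LM in PyTorch, where a causal attention mask restricts each position to the current and preceding characters. All names and sizes here (`TinyCharLM`, `d_model=32`, etc.) are illustrative assumptions, not from this article.

```python
import torch
import torch.nn as nn

class TinyCharLM(nn.Module):
    """Sketch of a character-level, forward (causal) Transformer LM.
    Hypothetical class; sizes are illustrative assumptions."""
    def __init__(self, vocab_size, d_model=32, nhead=4, nlayers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, nhead, dim_feedforward=64, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, nlayers)
        self.head = nn.Linear(d_model, vocab_size)

    def forward(self, x):
        # Causal mask: each position attends only to itself and earlier
        # positions -- this is what makes the model a *forward* LM.
        # A backward LM would mirror the mask; a bidirectional LM
        # (e.g. BERT-style) would drop it and train with masked tokens.
        seq_len = x.size(1)
        mask = nn.Transformer.generate_square_subsequent_mask(seq_len)
        h = self.encoder(self.embed(x), mask=mask)
        return self.head(h)  # logits over the next character per position

# Character-level vocabulary built from a toy corpus.
vocab = sorted(set("hello world"))
stoi = {c: i for i, c in enumerate(vocab)}
ids = torch.tensor([[stoi[c] for c in "hello"]])
logits = TinyCharLM(len(vocab))(ids)
print(logits.shape)  # (batch=1, seq_len=5, vocab_size)
```

Swapping the character vocabulary for a subword tokenizer's vocabulary would turn this into the word/token-level variant from the same Context range.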
- Example(s):
- BART, BERT, GPT-2, Turing-NLG.
- See: Neural NLG.