natural language processing model
C25414
concept
A natural language processing model is a computational system designed to understand, interpret, generate, and manipulate human language in a meaningful way.
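As a concrete illustration of the definition, here is a minimal sketch of one of the simplest model families listed below, a word-bigram language model that predicts the next word from counts. This is a toy example written for this entry, not the implementation of any instance listed here; the corpus and function names are invented.

```python
from collections import Counter, defaultdict

def train_bigram_lm(corpus):
    """Count word-bigram frequencies over a list of sentences."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        tokens = ["<s>"] + sentence.lower().split() + ["</s>"]
        for prev, nxt in zip(tokens, tokens[1:]):
            counts[prev][nxt] += 1
    return counts

def most_likely_next(counts, word):
    """Return the highest-frequency continuation of `word`."""
    return counts[word.lower()].most_common(1)[0][0]

# Toy corpus (invented for illustration)
corpus = ["the cat sat on the mat", "the cat ate the fish"]
model = train_bigram_lm(corpus)
print(most_likely_next(model, "the"))  # "cat" ("cat" follows "the" twice)
```

Modern instances in this entry replace the count table with a neural network (typically a Transformer), but the underlying task of modeling the next-token distribution is the same.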
Observed surface forms (16)
- masked language model ×2
- multilingual language model ×2
- natural language processing system ×2
- transformer-based language model ×2
- Transformer model ×1
- autoregressive transformer model ×1
- bag-of-words model ×1
- document understanding model ×1
- language model architecture ×1
- multilingual sequence-to-sequence model ×1
- permutation language model ×1
- sequence-to-sequence transformer model ×1
- text classification model ×1
- text-to-text model ×1
- topic model ×1
- transformer-based neural network architecture ×1
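One surface form above, "bag-of-words model", can be sketched in a few lines: each document is reduced to term counts and projected onto a fixed vocabulary vector, discarding word order. This is a generic illustration of the technique, not code from any listed instance; the example documents are invented.

```python
from collections import Counter

def bag_of_words(docs):
    """Map each document to a dict of term counts."""
    return [Counter(doc.lower().split()) for doc in docs]

def to_vector(counts, vocab):
    """Project a count dict onto a fixed vocabulary ordering."""
    return [counts.get(term, 0) for term in vocab]

# Invented example documents
docs = ["the cat sat", "the dog and the cat"]
bows = bag_of_words(docs)
vocab = sorted({term for bow in bows for term in bow})
# vocab == ['and', 'cat', 'dog', 'sat', 'the']
print(to_vector(bows[1], vocab))  # [1, 1, 1, 0, 2]
```

Topic models such as Latent Dirichlet Allocation (listed under Instances) also start from exactly this order-free count representation.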
Instances (17)
- ELMo
- Aristo project via concept surface "natural language processing system"
- "A Question-Answering System for High School Algebra Word Problems" via concept surface "natural language processing system"
- RoBERTa via concept surface "masked language model"
- DistilBERT via concept surface "transformer-based language model"
- XLNet
- T5 via concept surface "text-to-text model"
- BART via concept surface "sequence-to-sequence transformer model"
- DeBERTa via concept surface "transformer-based language model"
- BLOOM via concept surface "multilingual language model"
- XLM-R via concept surface "multilingual language model"
- mBART via concept surface "multilingual sequence-to-sequence model"
- Longformer via concept surface "transformer-based neural network architecture"
- LayoutLM via concept surface "document understanding model"
- ARC2 via concept surface "text classification model"
- Transformer-XL via concept surface "language model architecture"
- Latent Dirichlet Allocation via concept surface "topic model"