Transformer
E102296
UNEXPLORED
The Transformer is a neural network architecture built on self-attention mechanisms; it has become the foundation of modern large language models and many state-of-the-art systems in natural language processing.
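The self-attention operation at the core of the architecture can be sketched as scaled dot-product attention, softmax(QK^T / sqrt(d_k))V. A minimal NumPy illustration (the function name and toy shapes are illustrative, not from this entry):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Core self-attention step: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # weighted sum of values

# Toy self-attention: 3 tokens, dimension 4, with Q = K = V = x
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4)
```

Each output row is a convex combination of the value rows, so every token's representation mixes in information from every other token in one step.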
Aliases (1)
- BERT ×2
Referenced by (4)
| Subject (surface form when different) | Predicate |
|---|---|
| GPT-3 | architecture |
| GPT-4 | architecture |
| Hugging Face Transformers ("BERT") | supportsModelType |
| Google Search ("BERT") | usesAlgorithm |