BERT variant
C39886
concept
A BERT variant is a transformer-based language model derived from the original BERT architecture, modified in aspects such as pretraining objectives, architecture, or domain specialization to improve performance on specific tasks or datasets.
Observed surface forms (1)
- deep contextual word representation ×1
Instances (3)
- ALBERT
- BERT (surface form: BERT_BASE)
- Embeddings from Language Models (ELMo), linked via the concept surface form "deep contextual word representation"
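A card like this (label, concept ID, definition, counted surface forms, and linked instances) can be sketched as a simple record; the class and field names below are illustrative assumptions, not a schema defined by the source.

```python
from dataclasses import dataclass, field

@dataclass
class ConceptEntry:
    """Hypothetical record mirroring the concept card above."""
    label: str
    concept_id: str
    definition: str
    surface_forms: dict[str, int] = field(default_factory=dict)  # form -> observation count
    instances: list[str] = field(default_factory=list)

entry = ConceptEntry(
    label="BERT variant",
    concept_id="C39886",
    definition=(
        "A transformer-based language model derived from the original BERT "
        "architecture, modified in pretraining objectives, architecture, or "
        "domain specialization."
    ),
    surface_forms={"deep contextual word representation": 1},
    instances=["ALBERT", "BERT", "Embeddings from Language Models"],
)

# The card lists 1 observed surface form and 3 instances.
print(len(entry.surface_forms), len(entry.instances))
```

This keeps the surface-form counts separate from the instance links, matching the card's distinction between how the concept is mentioned in text and which entities instantiate it.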