DeBERTa

E435870

DeBERTa is a transformer-based language model developed by Microsoft that improves on BERT and RoBERTa with two techniques: a disentangled attention mechanism, which scores attention using separate content and relative-position representations, and an enhanced mask decoder, yielding stronger natural language understanding performance.
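To make the disentangled attention idea concrete, here is a minimal NumPy sketch of the three-term attention score from the DeBERTa paper (content-to-content, content-to-position, and position-to-content), using toy dimensions and randomly initialized weights. All variable names are illustrative; this is not Microsoft's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
L, d = 5, 8            # sequence length, head dimension (toy sizes)
k = 3                  # maximum relative distance considered

H = rng.normal(size=(L, d))        # content hidden states
P = rng.normal(size=(2 * k, d))    # shared relative-position embeddings

# Separate projections for the content (c) and relative-position (r) streams
Wq_c, Wk_c = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wq_r, Wk_r = rng.normal(size=(d, d)), rng.normal(size=(d, d))

Qc, Kc = H @ Wq_c, H @ Wk_c        # content queries/keys
Qr, Kr = P @ Wq_r, P @ Wk_r        # position queries/keys

def delta(i, j):
    # Clamp the relative distance i - j into the index range [0, 2k - 1]
    return int(np.clip(i - j + k, 0, 2 * k - 1))

A = np.zeros((L, L))
for i in range(L):
    for j in range(L):
        c2c = Qc[i] @ Kc[j]              # content-to-content
        c2p = Qc[i] @ Kr[delta(i, j)]    # content-to-position
        p2c = Kc[j] @ Qr[delta(j, i)]    # position-to-content
        # The paper scales the summed score by 1/sqrt(3d)
        A[i, j] = (c2c + c2p + p2c) / np.sqrt(3 * d)

# Row-wise softmax, then attend over the content states
weights = np.exp(A - A.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
out = weights @ H
```

The key design point is that relative positions are not baked into the input embeddings (as in BERT); they enter the attention score directly through the two cross terms, so content and position information stay disentangled.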


Referenced by (1)

Subject (surface form when different)    Predicate
Hugging Face Transformers                supportsModelType
