DistilBERT


DistilBERT is a smaller, faster, and lighter distilled version of the BERT language model, designed to retain most of BERT's performance while being more efficient for practical NLP applications.


Referenced by (1)

Subject (surface form when different) | Predicate
Hugging Face Transformers | supportsModelType
