DistilBERT-Base-Uncased-HF

Language model for masked language modeling and general‑purpose NLP tasks.

DistilBERT is a lightweight transformer obtained by knowledge-distilling BERT, pretrained with self-supervised masked language modeling. It retains most of BERT's language-understanding ability at a fraction of the size and latency, and can be used for masked language modeling directly or as a backbone for downstream NLP tasks.
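For masked language modeling, the checkpoint can be loaded straight from the Hugging Face Hub. A minimal sketch, assuming the `transformers` package and the public `distilbert-base-uncased` weights (downloaded on first use):

```python
from transformers import pipeline

# Fill-mask pipeline wrapping the DistilBERT checkpoint.
fill_mask = pipeline("fill-mask", model="distilbert-base-uncased")

# The model scores candidate tokens for the [MASK] position.
for pred in fill_mask("The capital of France is [MASK]."):
    print(pred["token_str"], round(pred["score"], 3))
```

Each prediction carries the candidate token string and its softmax score.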


Technical Details

Model checkpoint: distil_bert_base_uncased_hf
Input resolution: 1x384
Number of parameters: 11.3M
Model size (float): 43.3 MB
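The fixed 1x384 input resolution means every token sequence must be padded or truncated to exactly 384 IDs before inference. A minimal sketch of that preparation step, using a hypothetical pad ID of 0 rather than the real WordPiece tokenizer:

```python
def prepare_input(token_ids, seq_len=384, pad_id=0):
    """Pad or truncate a list of token IDs to a fixed length,
    returning the IDs and a matching attention mask."""
    ids = list(token_ids[:seq_len])      # truncate inputs longer than seq_len
    mask = [1] * len(ids)                # 1 = real token
    padding = seq_len - len(ids)
    ids += [pad_id] * padding            # pad_id = hypothetical [PAD] token ID
    mask += [0] * padding                # 0 = position ignored by attention
    return ids, mask

# Example: a short "sentence" of 5 token IDs padded out to 384.
ids, mask = prepare_input([101, 7592, 2088, 999, 102])
print(len(ids), len(mask), sum(mask))  # 384 384 5
```

The attention mask lets the model distinguish real tokens from padding, so short inputs and long inputs both map to the same 1x384 shape.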

Applicable Scenarios

  • Text Classification
  • Sentiment Analysis
  • Named Entity Recognition

Supported Mobile Form Factors

  • Phone
  • Tablet

License

Tags

  • backbone

Supported Mobile Devices

  • Samsung Galaxy S21
  • Samsung Galaxy S21 Ultra
  • Samsung Galaxy S22 5G
  • Samsung Galaxy S22 Ultra 5G
  • Samsung Galaxy S22+ 5G
  • Samsung Galaxy S23
  • Samsung Galaxy S23 Ultra
  • Samsung Galaxy S23+
  • Samsung Galaxy S24
  • Samsung Galaxy S24 Ultra
  • Samsung Galaxy S24+
  • Samsung Galaxy S25
  • Samsung Galaxy S25 Ultra
  • Samsung Galaxy S25+
  • Samsung Galaxy Tab S8
  • Snapdragon 7 Gen 4 QRD
  • Snapdragon 8 Elite Gen 5 QRD
  • Xiaomi 12
  • Xiaomi 12 Pro

Supported Mobile Chipsets

  • Snapdragon® 7 Gen 4 Mobile
  • Snapdragon® 8 Elite Mobile
  • Snapdragon® 8 Elite Gen 5 Mobile
  • Snapdragon® 8 Gen 1 Mobile
  • Snapdragon® 8 Gen 2 Mobile
  • Snapdragon® 8 Gen 3 Mobile
  • Snapdragon® 888 Mobile
