MobileBert-Uncased-Google

Language model for masked language modeling and general‑purpose NLP tasks.

MobileBERT is a lightweight BERT variant designed for efficient self-supervised learning of language representations. It can be used for masked language modeling and as a backbone for downstream NLP tasks.
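Masked language modeling trains the model to predict tokens that have been hidden from the input. As a rough illustration (not the model's actual preprocessing pipeline), a BERT-style masking step can be sketched like this; `mask_tokens` and its parameters are hypothetical names for this sketch:

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", mask_prob=0.15, seed=0):
    """BERT-style masking sketch: hide roughly 15% of tokens and
    keep the originals as the targets the model must predict."""
    rng = random.Random(seed)
    masked = list(tokens)
    targets = {}  # position -> original token to predict
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            masked[i] = mask_token
            targets[i] = tok
    return masked, targets

masked, targets = mask_tokens("the cat sat on the mat".split())
```

During pretraining, the model is scored only on the masked positions; at inference time the same mechanism lets it fill in blanks in user-supplied text.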


Technical Details

Model checkpoint: mobile_bert_uncased_google
Input resolution (sequence length): 1x384
Number of parameters: 25.3M
Model size (float): 130 MB
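Because the input resolution is fixed at 1x384, every token ID sequence must be padded or truncated to exactly 384 positions before it is fed to the model. A minimal sketch of that step follows; the function name is hypothetical, and `pad_id=0` assumes the BERT-uncased vocabulary, where `[PAD]` is token 0:

```python
MAX_LEN = 384  # fixed sequence length implied by the 1x384 input resolution

def pad_or_truncate(token_ids, max_len=MAX_LEN, pad_id=0):
    """Return (ids, attention_mask), each exactly max_len long.
    The attention mask is 1 for real tokens and 0 for padding, so
    the model can ignore the padded positions."""
    ids = list(token_ids)[:max_len]          # truncate if too long
    mask = [1] * len(ids)
    pad = max_len - len(ids)                 # pad if too short
    return ids + [pad_id] * pad, mask + [0] * pad

ids, mask = pad_or_truncate([101, 7592, 2088, 102])
```

Runtimes that compile the model for a fixed input shape (as the resolution above suggests) typically require this shape for every request, regardless of the actual text length.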

Applicable Scenarios

  • Text Classification
  • Sentiment Analysis
  • Named Entity Recognition

License

Tags

  • backbone

Supported IoT Devices

  • Dragonwing IQ-9075 EVK
  • Dragonwing IQ-X5121
  • Dragonwing IQ-X7181
  • Dragonwing Q-8750
  • QCS8275 (Proxy)
  • QCS8550 (Proxy)

Supported IoT Chipsets

  • Qualcomm® QCS8275 (Proxy)
  • Qualcomm® QCS8550 (Proxy)
  • Qualcomm® QCS9075
