MobileBert-Uncased-Google

Language model for masked language modeling and general‑purpose NLP tasks.

MobileBERT is a lightweight BERT variant designed for efficient self‑supervised learning of language representations. It can be used directly for masked language modeling and as a backbone for downstream NLP tasks.

Technical Details

Model checkpoint: mobile_bert_uncased_google
Input shape: 1x384 (batch size x sequence length)
Number of parameters: 25.3M
Model size (float): 130 MB

Applicable Scenarios

  • Text Classification
  • Sentiment Analysis
  • Named Entity Recognition

License

Tags

  • backbone

Supported Automotive Devices

  • SA7255P ADP
  • SA8295P ADP
  • SA8775P ADP

Supported Automotive Chipsets

  • Qualcomm® SA7255P
  • Qualcomm® SA8295P
  • Qualcomm® SA8775P
