MobileBERT-Uncased-Google
Language model for masked language modeling and general‑purpose NLP tasks.
MobileBERT is a lightweight BERT variant designed for efficient self-supervised learning of language representations. It can be used directly for masked language modeling and as a backbone for a range of downstream NLP tasks.
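As a minimal sketch of the masked language modeling use case, the snippet below queries a MobileBERT checkpoint through the Hugging Face `transformers` fill-mask pipeline. It assumes the `transformers` package is installed and uses the `google/mobilebert-uncased` checkpoint from the Hugging Face Hub (downloaded on first use), which may differ from the checkpoint packaged here.

```python
# Sketch: masked-token prediction with MobileBERT.
# Assumes the Hugging Face `transformers` package and the
# `google/mobilebert-uncased` Hub checkpoint (fetched on first use).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="google/mobilebert-uncased")

# The pipeline returns the top candidates for the [MASK] position,
# each with a token string and a softmax score.
for candidate in fill_mask("The capital of France is [MASK]."):
    print(f"{candidate['token_str']}: {candidate['score']:.3f}")
```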
Technical Details
Model checkpoint: mobile_bert_uncased_google
Input resolution: 1x384
Number of parameters: 25.3M
Model size (float): 130 MB
Applicable Scenarios
- Text Classification
- Sentiment Analysis
- Named Entity Recognition
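For scenarios like the ones above, the backbone is typically extended with a task head and fine-tuned. The sketch below sets up MobileBERT for binary text classification via Hugging Face `transformers`; the `google/mobilebert-uncased` checkpoint, the two-label setup, and the example sentence are illustrative assumptions, and the 384-token padded input mirrors the 1x384 input resolution listed above.

```python
# Sketch: text-classification setup on the MobileBERT backbone.
# Assumes Hugging Face `transformers` and PyTorch; the label count
# and input text are illustrative only. The classification head is
# randomly initialized and would need fine-tuning before use.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("google/mobilebert-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "google/mobilebert-uncased", num_labels=2  # e.g. negative / positive
)

# Pad to the 384-token sequence length listed under Technical Details.
inputs = tokenizer(
    "A surprisingly good movie.",
    return_tensors="pt",
    padding="max_length",
    max_length=384,
    truncation=True,
)
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels)
print(logits.shape)
```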
License
Model: Apache-2.0
Tags
- backbone
Supported Automotive Devices
- SA7255P ADP
- SA8255P ADP
- SA8295P ADP
- SA8650P ADP
- SA8775P ADP
Supported Automotive Chipsets
- Qualcomm® SA7255P
- Qualcomm® SA8295P
- Qualcomm® SA8775P