Bert-Base-Uncased-Hf
Language model for masked language modeling and general‑purpose NLP tasks.
BERT is a transformer-based language model pretrained with self-supervised learning of language representations on large text corpora. It can be used for masked language modeling and as a backbone for various NLP tasks.
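BERT's self-supervised pretraining objective can be illustrated without the model itself. The sketch below implements the masked-language-modeling corruption rule described in the BERT paper (mask roughly 15% of tokens; of those, 80% become `[MASK]`, 10% a random token, 10% stay unchanged); the toy vocabulary and function name are illustrative, not part of this model card.

```python
import random

MASK = "[MASK]"
TOY_VOCAB = ["dog", "cat", "run", "eat"]  # toy stand-in for BERT's WordPiece vocabulary

def mask_tokens(tokens, p=0.15, seed=0):
    """BERT-style MLM corruption: select ~p of the positions as prediction
    targets; replace 80% of them with [MASK], 10% with a random token,
    and leave 10% unchanged."""
    rng = random.Random(seed)
    out, targets = list(tokens), {}
    for i, tok in enumerate(tokens):
        if rng.random() >= p:
            continue
        targets[i] = tok              # the model must predict the original token here
        roll = rng.random()
        if roll < 0.8:
            out[i] = MASK
        elif roll < 0.9:
            out[i] = rng.choice(TOY_VOCAB)
        # else: keep the original token (the 10% "unchanged" case)
    return out, targets
```

During pretraining the model is trained to recover `targets` from the corrupted sequence, which is what makes the objective self-supervised: the labels come from the text itself.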
Technical Details
Model checkpoint: google-bert/bert-base-uncased
Input resolution: 1x384 (sequence length 384 tokens)
Number of parameters: 110M
Model size (float): 418 MB
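The listed size follows directly from the parameter count at 4 bytes per float32 weight. The exact count below (109,482,240, the commonly reported figure for bert-base-uncased, rounded to 110M above) is an assumption, not stated in this card:

```python
# Sanity-check the float32 model size from the parameter count.
PARAMS = 109_482_240      # commonly reported count for bert-base-uncased (assumption)
BYTES_PER_FLOAT32 = 4

size_mib = PARAMS * BYTES_PER_FLOAT32 / 2**20
print(f"{size_mib:.1f} MiB")  # ~418, matching the size listed above
```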
Applicable Scenarios
- Text Classification
- Sentiment Analysis
- Named Entity Recognition
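For scenarios like sentiment analysis, BERT is typically used as a backbone with a small task head on top: a linear layer plus softmax over the pooled `[CLS]` embedding. The dependency-free sketch below shows only that head; the embedding, weights, and two-class setup are random stand-ins, since in a real pipeline the embedding comes from the BERT backbone and the weights are learned during fine-tuning.

```python
import math
import random

HIDDEN = 768       # BERT-base hidden size
NUM_CLASSES = 2    # e.g. positive/negative sentiment (illustrative)

def classify(cls_embedding, weights, bias):
    """Linear head + softmax over the pooled [CLS] embedding,
    as in typical BERT fine-tuning for text classification."""
    logits = [sum(w * x for w, x in zip(row, cls_embedding)) + b
              for row, b in zip(weights, bias)]
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]     # subtract max for numerical stability
    total = sum(exps)
    return [e / total for e in exps]

# Demo with random stand-ins for the backbone output and learned weights.
rng = random.Random(0)
emb = [rng.gauss(0, 1) for _ in range(HIDDEN)]
W = [[rng.gauss(0, 0.02) for _ in range(HIDDEN)] for _ in range(NUM_CLASSES)]
b = [0.0] * NUM_CLASSES
probs = classify(emb, W, b)
```

Token-level tasks such as named entity recognition apply the same idea per token rather than to the pooled embedding.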
License
Model: Apache-2.0
Tags
- backbone
Supported Automotive Devices
- SA7255P ADP
- SA8295P ADP
- SA8775P ADP
Supported Automotive Chipsets
- Qualcomm® SA7255P
- Qualcomm® SA8295P
- Qualcomm® SA8775P