BERT-Base-Uncased-HF

Language model for masked language modeling and general‑purpose NLP tasks.

BERT is a transformer-based language model pretrained with a masked language modeling objective on a large English corpus. It can be used for masked language modeling directly and as a backbone for a variety of downstream NLP tasks.
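
As a quick sanity check, the checkpoint can be exercised for masked language modeling through the Hugging Face `transformers` fill-mask pipeline. This is a minimal sketch; it assumes `transformers` and a PyTorch backend are installed.

```python
from transformers import pipeline

# Load the same checkpoint listed under Technical Details.
fill_mask = pipeline("fill-mask", model="google-bert/bert-base-uncased")

# BERT's uncased tokenizer uses [MASK] as the blank token.
for prediction in fill_mask("The capital of France is [MASK]."):
    print(f"{prediction['token_str']:>10}  score={prediction['score']:.3f}")
```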

Technical Details

Model checkpoint: google-bert/bert-base-uncased
Input shape: 1x384 (batch size x sequence length)
Number of parameters: 110M
Model size (float): 418 MB
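
The parameter count and float model size above can be cross-checked directly from the checkpoint. The sketch below counts the float32 weights and converts them to megabytes (assuming `transformers` with a PyTorch backend, and 1 MB = 1024 * 1024 bytes).

```python
from transformers import AutoModel

model = AutoModel.from_pretrained("google-bert/bert-base-uncased")

# Sum the element counts of all weight tensors.
num_params = sum(p.numel() for p in model.parameters())
size_mb = num_params * 4 / (1024 ** 2)  # 4 bytes per float32 weight

print(f"Parameters:   {num_params / 1e6:.1f}M")  # ~110M
print(f"Float32 size: {size_mb:.0f} MB")         # ~418 MB
```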

Applicable Scenarios

  • Text Classification
  • Sentiment Analysis
  • Named Entity Recognition
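
As a sketch of one of these scenarios (sentiment analysis), the snippet below uses the checkpoint as a backbone and attaches a two-label classification head via `transformers`. The head is freshly initialized here and hypothetical; its logits are only meaningful after fine-tuning on labeled data.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google-bert/bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "google-bert/bert-base-uncased",
    num_labels=2,  # e.g. negative/positive for sentiment analysis
)

inputs = tokenizer("A surprisingly pleasant experience.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 2); untrained head

print(logits)
```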

License

The source checkpoint (google-bert/bert-base-uncased) is distributed under the Apache-2.0 license.

Tags

  • backbone

Supported Compute Devices

  • Snapdragon X Elite CRD
  • Snapdragon X Plus 8-Core CRD

Supported Compute Chipsets

  • Snapdragon® X Elite
  • Snapdragon® X Plus 8-Core
