Albert-Base-V2-Hf

Language model for masked language modeling and general‑purpose NLP tasks.

ALBERT is a lightweight BERT model designed for efficient self‑supervised learning of language representations. It can be used for masked language modeling and as a backbone for various NLP tasks.
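As a minimal sketch of the masked language modeling use case, the `albert/albert-base-v2` checkpoint named below can be loaded through the Hugging Face `transformers` `fill-mask` pipeline (this assumes `transformers` and a PyTorch backend are installed; it is an illustration, not part of this model card's tooling):

```python
# Minimal sketch: masked language modeling with albert/albert-base-v2
# via the Hugging Face transformers fill-mask pipeline.
# Assumes `pip install transformers torch`; downloads weights on first run.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="albert/albert-base-v2")

# ALBERT uses "[MASK]" as its mask token.
results = fill_mask("The capital of France is [MASK].")
for candidate in results:
    print(candidate["token_str"], round(candidate["score"], 3))
```

The pipeline returns the top candidate tokens for the masked position, ranked by score.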

Technical Details

Model checkpoint: albert/albert-base-v2
Input resolution: 1x384
Number of parameters: 11.8M
Model size (float): 43.9 MB

Applicable Scenarios

  • Text Classification
  • Sentiment Analysis
  • Named Entity Recognition
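For scenarios like the ones above, ALBERT is typically used as a backbone with a task-specific head. A hedged sketch of wiring the checkpoint into a text classification head follows (assumes `transformers` and `torch` are installed; the two-label head and the sample sentence are illustrative, and the head is randomly initialized, so it would need fine-tuning before the logits are meaningful):

```python
# Hedged sketch: albert/albert-base-v2 as a backbone for text
# classification. The 2-label classification head added here is
# randomly initialized and must be fine-tuned on task data.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("albert/albert-base-v2")
model = AutoModelForSequenceClassification.from_pretrained(
    "albert/albert-base-v2", num_labels=2
)

inputs = tokenizer("Great phone, battery lasts all day.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # one logit per label

print(logits.shape)  # (batch_size, num_labels)
```

The same pattern applies to token-level tasks such as named entity recognition by swapping in a token classification head.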

Supported Form Factors

  • Phone
  • Tablet
  • IoT
  • XR

License

Tags

  • backbone

Supported Devices

  • QCS8275 (Proxy)
  • QCS8450 (Proxy)
  • QCS8550 (Proxy)
  • QCS9075 (Proxy)
  • SA7255P ADP
  • SA8295P ADP
  • SA8775P ADP
  • Samsung Galaxy S21
  • Samsung Galaxy S21 Ultra
  • Samsung Galaxy S21+
  • Samsung Galaxy S22 5G
  • Samsung Galaxy S22 Ultra 5G
  • Samsung Galaxy S22+ 5G
  • Samsung Galaxy S23
  • Samsung Galaxy S23 Ultra
  • Samsung Galaxy S23+
  • Samsung Galaxy S24
  • Samsung Galaxy S24 Ultra
  • Samsung Galaxy S24+
  • Samsung Galaxy S25
  • Samsung Galaxy S25 Ultra
  • Samsung Galaxy S25+
  • Samsung Galaxy Tab S8
  • Snapdragon 8 Elite Gen 5 QRD
  • Snapdragon X Elite CRD
  • Snapdragon X Plus 8-Core CRD
  • Xiaomi 12
  • Xiaomi 12 Pro
  • XR2 Gen 2 (Proxy)

Supported Chipsets

  • Qualcomm® QCS8275 (Proxy)
  • Qualcomm® QCS8550 (Proxy)
  • Qualcomm® QCS9075 (Proxy)
  • Qualcomm® SA7255P
  • Qualcomm® SA8295P
  • Qualcomm® SA8775P
  • Snapdragon® 8 Elite Mobile
  • Snapdragon® 8 Elite Gen 5 Mobile
  • Snapdragon® 8 Gen 1 Mobile
  • Snapdragon® 8 Gen 2 Mobile
  • Snapdragon® 8 Gen 3 Mobile
  • Snapdragon® 888 Mobile
  • Snapdragon® X Elite
  • Snapdragon® X Plus 8-Core
