Albert-Base-V2-Hf
Language model for masked language modeling and general‑purpose NLP tasks.
ALBERT is a lightweight variant of BERT designed for efficient self-supervised learning of language representations. It can be used for masked language modeling and as a backbone for various NLP tasks.
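As a quick sanity check, the checkpoint can be exercised through a fill-mask pipeline. The following is a minimal sketch, assuming the Hugging Face transformers library (with torch and sentencepiece) is installed; the checkpoint name is the one listed under Technical Details below.

```python
# Minimal masked language modeling sketch, assuming the Hugging Face
# `transformers` library (plus `torch` and `sentencepiece`) is installed.
from transformers import pipeline

# Checkpoint name as listed under Technical Details.
unmasker = pipeline("fill-mask", model="albert/albert-base-v2")

# ALBERT's tokenizer uses the literal [MASK] token for the blank.
for prediction in unmasker("The capital of France is [MASK]."):
    print(f"{prediction['token_str']!r}: {prediction['score']:.3f}")
```

Each prediction is a dictionary containing the proposed token and its score, so the loop prints the highest-probability completions in descending order.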
Technical Details
Model checkpoint: albert/albert-base-v2
Input resolution: 1x384 (batch size x sequence length)
Number of parameters: 11.8M
Model size (float): 43.9 MB
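The parameter count and float size above can be roughly reproduced from the checkpoint itself. A minimal sketch, assuming transformers and torch are installed; float32 weights take about 4 bytes per parameter, which approximately accounts for the listed model size.

```python
# Sketch: reproduce the parameter count from the checkpoint
# (assumes `transformers` and `torch` are installed).
from transformers import AlbertModel

model = AlbertModel.from_pretrained("albert/albert-base-v2")
num_params = sum(p.numel() for p in model.parameters())

# float32 weights take 4 bytes each, so the on-disk float size
# is roughly 4 * num_params.
print(f"parameters: {num_params / 1e6:.1f}M")
print(f"approx. float size: {num_params * 4 / 1024**2:.1f} MB")
```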
Applicable Scenarios
- Text Classification (see the backbone sketch after this list)
- Sentiment Analysis
- Named Entity Recognition
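All of these scenarios start from the same backbone with a task-specific head on top. Below is a hypothetical sketch for the text-classification case, assuming transformers and torch are installed; the two-label setup is illustrative and not part of the checkpoint.

```python
# Hypothetical sketch: ALBERT as a text-classification backbone
# (assumes `transformers` and `torch`; the label count is illustrative).
import torch
from transformers import AlbertForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("albert/albert-base-v2")
model = AlbertForSequenceClassification.from_pretrained(
    "albert/albert-base-v2", num_labels=2  # e.g. positive/negative sentiment
)

inputs = tokenizer("A sentence to classify.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# The classification head is randomly initialized until fine-tuned,
# so these probabilities are only meaningful after training on task data.
print(logits.softmax(dim=-1))
```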
License
Model: Apache-2.0
Tags
- backbone
Supported Compute Devices
- Snapdragon X Elite CRD
- Snapdragon X Plus 8-Core CRD
Supported Compute Chipsets
- Snapdragon® X Elite
- Snapdragon® X Plus 8-Core