ELECTRA-Base-Discriminator (Google)
Language model for replaced-token detection and general-purpose NLP tasks.
ELECTRA is a BERT-like transformer designed for efficient self-supervised learning of language representations. Instead of masked language modeling, its discriminator is pretrained to detect tokens that were replaced by a small generator network. The pretrained discriminator can be used to identify unnatural or artificially modified text and as a backbone for various NLP tasks.
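The discriminator emits one logit per input token, and a positive logit (sigmoid probability above 0.5) marks that token as replaced. A minimal pure-Python sketch of this decision rule, using illustrative logits rather than real model output:

```python
import math

def token_predictions(logits, threshold=0.5):
    """Map per-token discriminator logits to replaced (True) / original (False).

    A positive logit corresponds to sigmoid(logit) > 0.5, i.e. "replaced".
    """
    probs = [1 / (1 + math.exp(-l)) for l in logits]
    return [p > threshold for p in probs]

# Hypothetical tokenized sentence where "ate" was swapped in by a generator.
tokens = ["the", "chef", "ate", "the", "meal"]
logits = [-2.1, -1.8, 3.2, -2.5, -1.9]  # illustrative values, not model output
flags = token_predictions(logits)
# Only "ate" carries a positive logit, so only it is flagged as replaced.
```

In practice the logits would come from running the checkpoint above on tokenized text; the thresholding step is the same.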
Technical Details
Model checkpoint: google/electra-base-discriminator
Input resolution: 1x384
Number of parameters: 109M
Model size (float): 417 MB
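The float model size is consistent with the parameter count: at 4 bytes per float32 weight, 109M parameters come to roughly 416 MiB. A quick sanity check:

```python
params = 109_000_000        # parameter count from the table above
bytes_fp32 = params * 4     # 4 bytes per float32 weight
size_mib = bytes_fp32 / 2**20
# ~415.8 MiB, in line with the ~417 MB listed above
```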
Applicable Scenarios
- Text Classification
- Sentiment Analysis
- Named Entity Recognition
Licenses
Source Model: Apache-2.0
Deployable Model: AI-HUB-MODELS-LICENSE
Tags
- backbone
Supported IoT Devices
- QCS8275 (Proxy)
- QCS8550 (Proxy)
- QCS9075 (Proxy)
Supported IoT Chipsets
- Qualcomm® QCS8275 (Proxy)
- Qualcomm® QCS8550 (Proxy)
- Qualcomm® QCS9075 (Proxy)