PLaMo-1B
State‑of‑the‑art language model useful for a variety of language understanding and generation tasks.
PLaMo‑1B is the first small language model (SLM) in the PLaMo™ Lite series from Preferred Networks (PFN), designed to power AI applications on edge devices, including mobile, automotive, and robotics platforms across various industrial sectors. The model builds on PLaMo‑100B, a 100‑billion‑parameter large language model (LLM) developed from the ground up by PFN’s subsidiary Preferred Elements (PFE). Leveraging high‑quality Japanese and English text data generated by PLaMo‑100B, PLaMo‑1B was pre‑trained on a total of 4 trillion tokens. As a result, it delivers strong performance on Japanese benchmarks, outperforming other SLMs of similar parameter size. In evaluations such as Jaster 0‑shot and 4‑shot, PLaMo‑1B has demonstrated performance on par with larger LLMs, making it an efficient solution for edge‑based AI tasks.
Technical Details
Applicable Scenarios
- Dialogue
- Content Generation
- Customer Support
Supported Form Factors
- Phone
- Tablet
Licenses
Tags
- llm
- generative-ai
- quantized
Supported Devices
- Snapdragon 8 Elite QRD
Supported Chipsets
- Snapdragon® 8 Elite Mobile