Qualcomm® AI Hub

DeepLabV3-Plus-MobileNet-Quantized

Quantized Deep Convolutional Neural Network model for semantic segmentation.

DeepLabV3-Plus-MobileNet-Quantized performs semantic segmentation at multiple scales, using MobileNet as its backbone. The provided checkpoint is trained on the VOC2012 dataset.
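A semantic-segmentation model maps every pixel to a class label: its raw output is a per-class score map, and the final mask comes from an argmax over the class axis. A minimal post-processing sketch in NumPy, with shapes assumed from the 513x513 input resolution and the 21 PASCAL VOC classes (this is illustrative, not the AI Hub runtime API):

```python
import numpy as np

NUM_CLASSES = 21   # PASCAL VOC: 20 object classes + background
H = W = 513        # model input/output resolution

# Stand-in for the model's output: per-class logits of shape (C, H, W).
rng = np.random.default_rng(0)
logits = rng.standard_normal((NUM_CLASSES, H, W)).astype(np.float32)

# Per-pixel class prediction: argmax over the class axis.
mask = logits.argmax(axis=0).astype(np.uint8)   # shape (H, W)

print(mask.shape)   # (513, 513)
```

Each value in `mask` is a class index, which is typically mapped to a color palette for visualization.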

Snapdragon® X Elite

Inference Time: 5.24 ms
Memory Usage: 1 MB
NPU Layers: 100

Technical Details

Model checkpoint: VOC2012
Input resolution: 513x513
Number of parameters: 5.80M
Model size: 6.04 MB
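The model size is consistent with 8-bit weights: at roughly one byte per parameter, 5.80M parameters come to about 5.8 MB, and the reported 6.04 MB leaves a small margin for graph structure and quantization metadata (scales and zero points). A quick back-of-the-envelope check:

```python
params = 5.80e6

int8_mb = params * 1 / 1e6   # ~1 byte per weight at int8
fp32_mb = params * 4 / 1e6   # ~4 bytes per weight at float32

print(int8_mb)   # 5.8  -> close to the reported 6.04 MB
print(fp32_mb)   # 23.2 -> what an unquantized fp32 model would need
```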

Applicable Scenarios

  • Anomaly Detection
  • Inventory Management

Licenses

Source Model: MIT
Deployable Model: AI Model Hub License

Tags

  • quantized
    A “quantized” model can run in low or mixed precision, which can substantially reduce inference latency.
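Low-precision inference generally relies on an affine mapping between float values and 8-bit integers, defined by a scale and a zero point. A minimal quantize/dequantize sketch in NumPy (illustrative of the general scheme, not the exact method used by the Qualcomm toolchain):

```python
import numpy as np

def quantize(x, num_bits=8):
    """Affine (asymmetric) quantization of a float array to uint8."""
    qmin, qmax = 0, 2 ** num_bits - 1
    scale = (x.max() - x.min()) / (qmax - qmin)
    zero_point = int(round(float(qmin - x.min() / scale)))
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax).astype(np.uint8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Map uint8 values back to floats with the stored scale/zero point."""
    return (q.astype(np.float32) - zero_point) * scale

x = np.linspace(-1.0, 1.0, 11).astype(np.float32)
q, scale, zp = quantize(x)
x_hat = dequantize(q, scale, zp)

print(np.abs(x - x_hat).max())  # reconstruction error bounded by ~scale
```

Integer arithmetic in this form is what lets NPUs execute the layers at lower latency and memory cost than float32.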

Supported Compute Chipsets

  • Snapdragon® X Elite