
    DeepLabV3-Plus-MobileNet-Quantized

    Quantized Deep Convolutional Neural Network model for semantic segmentation.

    DeepLabV3+ Quantized is designed for semantic segmentation at multiple scales and has been trained on various datasets. It uses MobileNet as its backbone.
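    For context, the sketch below shows a standard DeepLabV3-with-MobileNet segmentation workflow using torchvision's floating-point variant. It is an illustrative stand-in, not the quantized deployable model described on this page, and the image path is a placeholder.

        import torch
        from PIL import Image
        from torchvision.models.segmentation import (
            DeepLabV3_MobileNet_V3_Large_Weights,
            deeplabv3_mobilenet_v3_large,
        )

        # Floating-point DeepLabV3 + MobileNetV3 from torchvision, used here only
        # to illustrate the workflow: load, preprocess, forward pass, argmax.
        weights = DeepLabV3_MobileNet_V3_Large_Weights.DEFAULT
        model = deeplabv3_mobilenet_v3_large(weights=weights).eval()
        preprocess = weights.transforms()

        image = Image.open("street_scene.jpg").convert("RGB")  # placeholder path
        batch = preprocess(image).unsqueeze(0)                 # [1, 3, H, W]

        with torch.no_grad():
            logits = model(batch)["out"]                       # [1, num_classes, H, W]

        # Per-pixel class index (e.g., the 21 Pascal VOC classes)
        mask = logits.argmax(dim=1).squeeze(0)                 # [H, W]
        print(mask.shape, mask.unique())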

    Inference Time: 2.62 ms
    Memory Usage: 0-55 MB
    Layers: 99 (NPU)

    Technical Details

    Model checkpoint: VOC2012
    Input resolution: 513x513
    Number of parameters: 5.80M
    Model size: 6.04 MB
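
    Input preprocessing is not spelled out on this page. The sketch below is a minimal example of preparing an image at the 513x513 input resolution listed above, assuming RGB input in NCHW layout and the common ImageNet normalization constants (an assumption, not a value from this page).

        import numpy as np
        from PIL import Image

        INPUT_SIZE = (513, 513)
        MEAN = np.array([0.485, 0.456, 0.406], dtype=np.float32)  # assumed ImageNet stats
        STD = np.array([0.229, 0.224, 0.225], dtype=np.float32)

        def preprocess(path: str) -> np.ndarray:
            """Load an image and return a [1, 3, 513, 513] float32 array (NCHW)."""
            img = Image.open(path).convert("RGB").resize(INPUT_SIZE, Image.BILINEAR)
            arr = np.asarray(img, dtype=np.float32) / 255.0       # HWC in [0, 1]
            arr = (arr - MEAN) / STD                              # channel-wise normalize
            return arr.transpose(2, 0, 1)[np.newaxis, ...]        # add batch dim -> NCHW

        batch = preprocess("street_scene.jpg")  # placeholder path
        print(batch.shape)  # (1, 3, 513, 513)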

    Applicable Scenarios

    • Anomaly Detection
    • Inventory Management

    Supported Form Factors

    • Phone
    • Tablet
    • IoT

    Licenses

    Source Model: MIT
    Deployable Model: AI Model Hub License

    Tags

    • quantized
      A “quantized” model can run in low or mixed precision, which can substantially reduce inference latency.
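      As a generic illustration of what quantization means (not the exact pipeline used to produce this model), the sketch below applies PyTorch post-training dynamic quantization to a small toy network, storing weights as int8 instead of float32.

        import torch
        import torch.nn as nn

        # Toy float32 network; the 21-way output loosely mirrors a Pascal VOC head.
        float_model = nn.Sequential(
            nn.Linear(256, 128),
            nn.ReLU(),
            nn.Linear(128, 21),
        ).eval()

        # Post-training dynamic quantization: Linear weights are stored as int8
        # and activations are quantized on the fly at inference time.
        quantized_model = torch.ao.quantization.quantize_dynamic(
            float_model, {nn.Linear}, dtype=torch.qint8
        )

        x = torch.randn(1, 256)
        print(float_model(x).shape)      # torch.Size([1, 21])
        print(quantized_model(x).shape)  # torch.Size([1, 21])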

    Supported Devices

    • Google Pixel 3
    • Google Pixel 3a
    • Google Pixel 3a XL
    • Google Pixel 4
    • Google Pixel 4a
    • Google Pixel 5a 5G
    • QCS6490 (Proxy)
    • QCS8250 (Proxy)
    • QCS8550 (Proxy)
    • RB3 Gen 2 (Proxy)
    • RB5 (Proxy)
    • Samsung Galaxy S21
    • Samsung Galaxy S21 Ultra
    • Samsung Galaxy S21+
    • Samsung Galaxy S22 5G
    • Samsung Galaxy S22 Ultra 5G
    • Samsung Galaxy S22+ 5G
    • Samsung Galaxy S23
    • Samsung Galaxy S23 Ultra
    • Samsung Galaxy S23+
    • Samsung Galaxy S24
    • Samsung Galaxy S24 Ultra
    • Samsung Galaxy S24+
    • Samsung Galaxy Tab S8
    • Xiaomi 12
    • Xiaomi 12 Pro

    Supported Chipsets

    • Qualcomm® QCS6490
    • Qualcomm® QCS8250
    • Qualcomm® QCS8550
    • Snapdragon® 8 Gen 1 Mobile
    • Snapdragon® 8 Gen 2 Mobile
    • Snapdragon® 8 Gen 3 Mobile
    • Snapdragon® 888 Mobile