Issue Description
Hi, I am trying to run TinyYOLO object detection in real time on Android. My inference code block takes around 2800 ms per inference, which works out to only about 0.35 FPS.
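For reference, the timed block looks roughly like the sketch below (not the exact code; the full activity is in the gist linked further down, and `model` is the pretrained TinyYOLO `ComputationGraph` loaded once at startup):

```java
import java.io.IOException;
import java.io.InputStream;
import org.datavec.image.loader.NativeImageLoader;
import org.deeplearning4j.nn.graph.ComputationGraph;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.dataset.api.preprocessor.ImagePreProcessingScaler;

// Sketch of the per-frame inference block that is being timed.
INDArray runTimedInference(ComputationGraph model, InputStream frame) throws IOException {
    long start = System.currentTimeMillis();

    // Resize the camera frame to TinyYOLO's 416x416x3 input and scale pixels to [0, 1]
    INDArray input = new NativeImageLoader(416, 416, 3).asMatrix(frame);
    new ImagePreProcessingScaler(0, 1).transform(input);

    // Single forward pass -- this call accounts for most of the ~2800 ms
    INDArray output = model.outputSingle(input);

    android.util.Log.d("TinyYOLO", "inference took " + (System.currentTimeMillis() - start) + " ms");
    return output;
}
```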
A few methods I have tried:
- Increased the Android VM heap size from 256 MB to 512 MB; memory usage while the app is running floats around 300 MB, but performance did not improve significantly.
- Considered running inference on a background thread, but then the predictions would not be synced with the actual frame (rough sketch of what I mean below).
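What I had in mind for the threading approach was something like the following sketch, where a single worker thread always consumes the newest frame and the overlay just draws the latest available detections. Names like `detect()`, `redrawOverlay()` and `latestFrame` are placeholders for the corresponding pieces in my activity:

```java
import android.graphics.Bitmap;
import org.deeplearning4j.nn.layers.objdetect.DetectedObject;

import java.util.Collections;
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicReference;

// Inside the detection activity; the camera callback only publishes the newest frame:
//   latestFrame.set(frameBitmap);
private final AtomicReference<Bitmap> latestFrame = new AtomicReference<>();
private volatile List<DetectedObject> latestDetections = Collections.emptyList();
private volatile boolean running = true;

void startInferenceThread() {
    Executors.newSingleThreadExecutor().execute(() -> {
        while (running) {
            Bitmap frame = latestFrame.getAndSet(null); // take the newest frame, drop stale ones
            if (frame == null) {
                continue; // a real version would block on a queue instead of spinning
            }
            latestDetections = detect(frame);   // detect() wraps the inference block from the gist
            runOnUiThread(this::redrawOverlay); // overlay draws whatever latestDetections holds
        }
    });
}
```

The downside is exactly the sync issue mentioned above: the drawn boxes correspond to an earlier frame, which is why I have not gone down this route yet.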
Can you share some ideas on how to further improve the inference time?
The object detection activity code: https://gist.github.com/yptheangel/f1e3c3dfd64c470d151890be00465c7a
Gradle dependencies: https://gist.github.com/yptheangel/6af7ed1febd21b5b4d2339eb8ce985b5
Version Information
Phone: Huawei P9+
OS: Android 7
CPU: arm64 octa-core (4x 2.5 GHz Cortex-A72 & 4x 1.8 GHz Cortex-A53)
Model used: TinyYOLOv2 from the DL4J model zoo (59 MB)