This repository has been archived by the owner on Oct 16, 2019. It is now read-only.

Have you tried using the quantized MobileNet from Caffe 2? Does it work? #5

Open
yxchng opened this issue Mar 14, 2019 · 1 comment
Labels
question Further information is requested

Comments


yxchng commented Mar 14, 2019

https://github.com/caffe2/models/tree/master/mobilenet_v2_quantized#model

@yxchng yxchng changed the title Have you tried using the quantized mobilenet from caffe2? Have you tried using the quantized MobileNet from Caffe 2? Does it work? Mar 14, 2019
@cedrickchee cedrickchee added the question Further information is requested label Apr 12, 2019
cedrickchee (Owner) commented May 11, 2019

I have tried running the quantized MobileNet v2 from your link. Per the docs there, I made sure my Caffe2 build includes the QNNPACK-based Int8 operators before running inference.

But the Android app crashed with the following error:

A/native: [F int8_fc_op.h:43] Check failed: K == W.t.size(1) 
    terminating.
A/libc: Fatal signal 6 (SIGABRT), code -6 in tid 7829 (Camera Backgrou), pid 7790 (facebook.f8demo)

I fixed that by modifying native-lib.cpp to normalize the input images the way the pre-trained model expects. This time the app didn't crash, but it got stuck at the "LOADING" view, and Logcat showed:

2019-05-11 14:07:09.378 10561-10594/facebook.f8demo I/Adreno: Invalid colorspace 1

I guess I still have to adjust the image preprocessing in native-lib.cpp to use this model's mean and std.
