# MLModelScope mobile Predictor (mPredictor) for Qualcomm Neural Processing Engine (SNPE)
Go binding for the Qualcomm Neural Processing Engine (SNPE) C++ API, also referred to as the MLModelScope SNPE mobile Predictor (SNPE mPredictor). It is used to perform model inference on mobile devices. It is used by the Qualcomm SNPE agent in MLModelScope to perform model inference in Go. More importantly, it can be used as a standalone predictor in any given Android application. Refer to [Usage Modes](#usage-modes) for further details.
Download and install the SNPE mPredictor:

```
go get -v github.com/abhiutd/snpe-predictor
```
The binding requires Qualcomm SNPE, Gomobile and other Go packages.
The Qualcomm SNPE C++ library is expected to be under `/opt/snpe`.

Download pre-built binaries as instructed in the SNPE documentation.
If you get an error about not being able to write to `/opt`, then perform the following:

```
sudo mkdir -p /opt/snpe
sudo chown -R `whoami` /opt/snpe
```
If you are using a custom path for the build files, change the CGO_CFLAGS, CGO_CXXFLAGS and CGO_LDFLAGS environment variables. Refer to Using cgo with the go command.

For example,

```
export CGO_CFLAGS="${CGO_CFLAGS} -I/tmp/snpe/include"
export CGO_CXXFLAGS="${CGO_CXXFLAGS} -I/tmp/snpe/include"
export CGO_LDFLAGS="${CGO_LDFLAGS} -L/tmp/snpe/lib"
```
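These environment variables are appended to the `#cgo` directives compiled into the binding. The snippet below is only a rough sketch of what such directives look like; the actual directives, package name and linked library names live in cbits.go and may differ.

```go
// Hypothetical sketch of cgo directives pointing at the SNPE install.
// CGO_CFLAGS, CGO_CXXFLAGS and CGO_LDFLAGS from the environment are
// appended to these per-file directives at build time.
package snpepredictor

/*
#cgo CXXFLAGS: -I/opt/snpe/include
#cgo LDFLAGS: -L/opt/snpe/lib -lSNPE
*/
import "C"
```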
You can install the dependencies through `go get`.

```
cd $GOPATH/src/github.com/abhiutd/snpe-predictor
go get -u -v ./...
```
Or use Dep.

```
dep ensure -v
```

This installs the dependencies in `vendor/`. It is the preferred option.
Also, one needs to install `gomobile` to be able to generate Java/Objective-C bindings of the mPredictor.

```
go get golang.org/x/mobile/cmd/gomobile
gomobile init
```
Configure the linker environment variables, since the Qualcomm SNPE C++ library is under a non-system directory. Place the following in either your `~/.bashrc` or `~/.zshrc` file.
Linux

```
export LIBRARY_PATH=$LIBRARY_PATH:/opt/snpe/lib
export LD_LIBRARY_PATH=/opt/snpe/lib:$LD_LIBRARY_PATH
```
macOS

```
export LIBRARY_PATH=$LIBRARY_PATH:/opt/snpe/lib
export DYLD_LIBRARY_PATH=/opt/snpe/lib:$DYLD_LIBRARY_PATH
```
SNPE mPredictor is written in Go and bound to the SNPE C++ API. To be able to use it in a mobile application, you have to generate appropriate bindings (Java for Android). We provide off-the-shelf bindings in `bindings/`, but you can generate your own with the following command.

```
gomobile bind -o bindings/android/snpe-predictor.aar -target=android/arm64 -v github.com/abhiutd/snpe-predictor
```
This command builds the `snpe-predictor.aar` binary for Android with the `arm64` ISA.
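Note that `gomobile bind` only exposes exported Go identifiers whose types it supports (booleans, strings, signed integers and floats, `[]byte`, `error`, and exported structs via pointers). The sketch below is purely illustrative of how such an exported Go type surfaces as a Java class after binding; the names and signatures are hypothetical, not this package's actual API.

```go
// Illustrative only: a gomobile-friendly wrapper shape. The real exported
// surface of github.com/abhiutd/snpe-predictor is defined by the package
// itself; see cbits.go.
package snpepredictor

// Predictor would surface in Java as a class with the methods below.
type Predictor struct {
	// a native SNPE handle would be kept here
}

// NewPredictor uses only gomobile-supported parameter types, so it becomes
// a factory function on the Java side.
func NewPredictor(modelPath string) *Predictor {
	return &Predictor{}
}

// Predict takes []byte because it maps directly to Java byte[].
func (p *Predictor) Predict(input []byte) error {
	return nil
}

// Close releases native resources.
func (p *Predictor) Close() {}
```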
## Usage Modes

One can employ the SNPE mPredictor to perform model inference in multiple ways, which are listed below.
- Standalone Predictor (mPredictor)
There are four main API calls to be used for performing model inference in a given mobile application.
```
// create a SNPE mPredictor
New()

// perform inference on given input data
Predict()

// generate output predictions
ReadPredictedOutputFeatures()

// delete the SNPE mPredictor
Close()
```
Refer to `cbits.go` for details on the inputs/outputs of each API call.
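As a hedged illustration of how these calls compose, the sketch below assumes package-level functions operating on an opaque predictor handle; the argument lists, return types, import alias and model path are all hypothetical, so consult `cbits.go` for the real signatures.

```go
// Sketch of the mPredictor call sequence. Arguments and return values
// shown here are hypothetical; the authoritative signatures are in cbits.go.
package main

import (
	"fmt"

	snpe "github.com/abhiutd/snpe-predictor" // hypothetical import alias
)

func main() {
	// create a SNPE mPredictor from a DLC model file (path is illustrative)
	pred := snpe.New("mobilenet_v1.dlc")

	// perform inference on preprocessed input data (e.g. a flattened image tensor)
	input := make([]float32, 1*3*224*224)
	snpe.Predict(pred, input)

	// generate (read back) the output predictions
	features := snpe.ReadPredictedOutputFeatures(pred)
	fmt.Println("number of output features:", len(features))

	// delete the SNPE mPredictor and free native resources
	snpe.Close(pred)
}
```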
- MLModelScope Mobile Agent
Download the MLModelScope mobile agent from the agent repository. It has TensorFlow Lite and Qualcomm SNPE mPredictors built in. Refer to its documentation to understand its usage.
- MLModelScope web UI
Choose Qualcomm SNPE as the framework and one of the available mobile devices as the hardware backend to perform model inference through the web interface.