# WebNN-native

🧠⚙️ Standalone native implementation of the Web Neural Network API
WebNN-native is a native implementation of the Web Neural Network API.
It provides several building blocks:
- WebNN C/C++ headers that applications and other building blocks use:
  - The `webnn.h` header, which is a one-to-one mapping with the WebNN IDL.
  - A C++ wrapper for `webnn.h` (see the sketch after this list).
- Backend implementations that use the platforms' ML APIs:
  - DirectML on Windows 10
  - DirectMLX on Windows 10
  - OpenVINO on Windows 10 and Linux
  - oneDNN on Windows 10 and Linux
  - XNNPACK on Windows 10 and Linux
  - MLAS on Windows 10 and Linux
  - Other backends are to be added.
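As a rough illustration of how the C++ wrapper is meant to be used, here is a minimal sketch that builds a graph computing `c = a + b`. It assumes the wrapper mirrors the WebNN IDL (`MLGraphBuilder.input/add/build`); the type and method names below are illustrative rather than verbatim, so consult `webnn_cpp.h` for the exact spelling.

```cpp
// Illustrative sketch only: identifiers mirror the WebNN IDL and may not
// match webnn_cpp.h exactly.
#include <webnn/webnn_cpp.h>

ml::Graph BuildAddGraph(ml::Context context) {
    ml::GraphBuilder builder = ml::CreateGraphBuilder(context);

    // Describe a 2x2 float32 operand (WebNN IDL: MLOperandDescriptor).
    const int32_t dims[2] = {2, 2};
    ml::OperandDescriptor desc = {};
    desc.type = ml::OperandType::Float32;
    desc.dimensions = dims;
    desc.dimensionsCount = 2;

    // Declare two named graph inputs and wire up c = a + b.
    ml::Operand a = builder.Input("a", &desc);
    ml::Operand b = builder.Input("b", &desc);
    ml::Operand c = builder.Add(a, b);

    // Compile the graph with "c" as its named output.
    ml::NamedOperands outputs = ml::CreateNamedOperands();
    outputs.Set("c", c);
    return builder.Build(outputs);
}
```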
WebNN-native uses the code of other open source projects:
- The code generator and infrastructure code of the Dawn project.
- The DirectMLX and device wrapper of the DirectML project.
- The XNNPACK project.
- The oneDNN project.
- The MLAS project.
WebNN-native uses the Chromium build system and dependency management, so you need to install depot_tools and add it to the PATH.
Notes:
- On Windows, you'll need to set the environment variable `DEPOT_TOOLS_WIN_TOOLCHAIN=0`. This tells depot_tools to use your locally installed version of Visual Studio (by default, depot_tools will try to download a Google-internal version).
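For example, in a Windows Command Prompt (the PowerShell equivalent is `$env:DEPOT_TOOLS_WIN_TOOLCHAIN = "0"`):

```sh
# Tell depot_tools to use the local Visual Studio toolchain (cmd syntax)
> set DEPOT_TOOLS_WIN_TOOLCHAIN=0
```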
Get the source code as follows:
```sh
# Clone the repo as "webnn-native"
> git clone https://github.com/webmachinelearning/webnn-native.git webnn-native && cd webnn-native

# Bootstrap the gclient configuration
> cp scripts/standalone.gclient .gclient

# Fetch external dependencies and toolchains with gclient
> gclient sync
```
Generate build files using `gn args out/Debug` or `gn args out/Release`.
A text editor will open for you to set build options; the most common one is `is_debug=true/false`. Running `gn args out/Release --list` shows all the possible options.
To build with a backend, set the corresponding option from the following table (an example follows it).
| Backend | Option |
|---|---|
| DirectML | `webnn_enable_dml=true` |
| DirectMLX | `webnn_enable_dmlx=true` |
| OpenVINO | `webnn_enable_openvino=true` |
| XNNPACK | `webnn_enable_xnnpack=true` |
| oneDNN | `webnn_enable_onednn=true` |
| MLAS | `webnn_enable_mlas=true` |
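For instance, to generate a release build with the DirectML backend enabled without going through the editor, you can pass the options directly via gn's standard `--args` flag:

```sh
# Non-interactive equivalent of editing the args file by hand
> gn gen out/Release --args="is_debug=false webnn_enable_dml=true"
```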
Then use `ninja -C out/Release` or `ninja -C out/Debug` to build WebNN-native.
Notes:
- To build with the XNNPACK backend, please build XNNPACK first, e.g. by `./scripts/build-local.sh`. For a Windows build, supply `-DCMAKE_MSVC_RUNTIME_LIBRARY="MultiThreaded$<$<CONFIG:Debug>:Debug>"` to select the MSVC static runtime library (a sketch follows this list).
- To build with the oneDNN backend, please build oneDNN first by following its build-from-source instructions.
- To build with the MLAS backend, please build MLAS (part of ONNX Runtime) first by following the "Build ONNX Runtime for inferencing" instructions, e.g. by `.\build.bat --config Release --parallel --enable_msvc_static_runtime` for a Windows build.
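As a sketch of that Windows XNNPACK step, assuming a plain out-of-source CMake build run from the XNNPACK checkout (paths and any additional XNNPACK options are up to you):

```sh
# Configure XNNPACK with the MSVC static runtime, then build Release
> cmake -S . -B build -DCMAKE_MSVC_RUNTIME_LIBRARY="MultiThreaded$<$<CONFIG:Debug>:Debug>"
> cmake --build build --config Release
```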
Run unit tests:
> ./out/Release/webnn_unittestsRun end2end tests on a default device:
> ./out/Release/webnn_end2end_testsYou can also specify a device to run end2end tests using "-d" option, for example:
> ./out/Release/webnn_end2end_tests -d gpuCurrently "cpu", "gpu" and "default" are supported, more devices are to be supported in the future.
Notes:
- For the OpenVINO backend, please install the 2021.4 version and set the environment variables before running the end2end tests (see the snippet after this list).
- The current implementation of the oneDNN and MLAS backends is mainly for investigating the WebNN Operation Level Execution use case, so only a limited set of tests (such as those for conv2d) is expected to pass.
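On Linux, setting the OpenVINO environment variables typically means sourcing the `setupvars.sh` script; the path below assumes the default OpenVINO 2021.4 install location:

```sh
# Export the OpenVINO library paths for this shell session
> source /opt/intel/openvino_2021/bin/setupvars.sh
```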
Apache 2.0 Public License, please see LICENSE.