Overview

Arm NN is an inference engine for CPUs, GPUs, and NPUs. It bridges the gap between existing neural network frameworks and the underlying hardware IP, translating networks from frameworks such as TensorFlow and Caffe so that they run efficiently, and without modification, on Arm Cortex CPUs and Arm Mali GPUs.

Arm NN now supports networks defined using the Open Neural Network Exchange (ONNX) format.

This guide shows you how to set up and configure your Arm NN build environment so that you can use the ONNX format with Arm NN. It also shows you how to test that your build completed successfully.
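
To give a flavour of what an ONNX-enabled build lets you do, the sketch below parses an ONNX model with the Arm NN ONNX parser and loads it onto the CPU reference backend. It is illustrative only: the file name model.onnx is a placeholder, and the exact headers and calls shown (armnnOnnxParser::IOnnxParser, CreateNetworkFromBinaryFile, armnn::Optimize) may vary slightly between Arm NN versions.

```cpp
// Minimal sketch: parse an ONNX model and load it for inference with Arm NN.
// Assumes Arm NN was built with the ONNX parser enabled; "model.onnx" is a
// placeholder for your own model file.
#include <armnn/ArmNN.hpp>
#include <armnnOnnxParser/IOnnxParser.hpp>

#include <utility>

int main()
{
    // Parse the ONNX file into an Arm NN network graph.
    armnnOnnxParser::IOnnxParserPtr parser = armnnOnnxParser::IOnnxParser::Create();
    armnn::INetworkPtr network = parser->CreateNetworkFromBinaryFile("model.onnx");

    // Create a runtime and optimize the network for the chosen backend
    // (CpuRef is the portable reference backend; CpuAcc/GpuAcc need Compute Library).
    armnn::IRuntime::CreationOptions options;
    armnn::IRuntimePtr runtime = armnn::IRuntime::Create(options);
    armnn::IOptimizedNetworkPtr optNet =
        armnn::Optimize(*network, {armnn::Compute::CpuRef}, runtime->GetDeviceSpec());

    // Load the optimized network; it is then ready for EnqueueWorkload() calls.
    armnn::NetworkId networkId;
    runtime->LoadNetwork(networkId, std::move(optNet));

    return 0;
}
```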
