Overview

Arm NN is inference middleware for CPUs, GPUs, and NPUs. It bridges the gap between existing neural network frameworks and the underlying hardware IP by efficiently translating networks from frameworks such as TensorFlow and Caffe, allowing them to run without modification across Arm Cortex-A CPUs, Arm Mali GPUs, and the Arm Machine Learning processor (NPU).

Arm NN provides backends that allow workloads to run on Cortex-A CPUs, Mali GPUs, and the Arm ML processor.
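
An application chooses between these backends by listing backend IDs in order of preference when it optimizes a network. The following is a minimal sketch, assuming the standard Arm NN C++ API and the built-in backend IDs CpuAcc (Neon-accelerated CPU), GpuAcc (Mali GPU via the Arm Compute Library), and CpuRef (reference implementation); the network it builds is deliberately trivial and exists only to illustrate backend selection.

// Minimal sketch: selecting backends by preference when optimizing a network.
// Assumes the standard Arm NN C++ API; the network here is purely illustrative.
#include <armnn/ArmNN.hpp>
#include <vector>

int main()
{
    using namespace armnn;

    // Create the runtime that will eventually execute the network.
    IRuntime::CreationOptions options;
    IRuntimePtr runtime = IRuntime::Create(options);

    // Build a trivial network: input -> ReLU -> output.
    INetworkPtr network = INetwork::Create();
    IConnectableLayer* input  = network->AddInputLayer(0);
    ActivationDescriptor reluDesc;
    reluDesc.m_Function = ActivationFunction::ReLu;
    IConnectableLayer* relu   = network->AddActivationLayer(reluDesc, "relu");
    IConnectableLayer* output = network->AddOutputLayer(0);

    input->GetOutputSlot(0).Connect(relu->GetInputSlot(0));
    relu->GetOutputSlot(0).Connect(output->GetInputSlot(0));

    unsigned int dims[] = { 1, 4 };
    TensorInfo info(TensorShape(2, dims), DataType::Float32);
    input->GetOutputSlot(0).SetTensorInfo(info);
    relu->GetOutputSlot(0).SetTensorInfo(info);

    // Backend preferences: try the accelerated CPU backend first, then the
    // GPU backend, and fall back to the always-available reference backend.
    std::vector<BackendId> backends = { "CpuAcc", "GpuAcc", "CpuRef" };

    // Optimize() assigns each layer to the first preferred backend that supports it.
    IOptimizedNetworkPtr optNet = Optimize(*network, backends, runtime->GetDeviceSpec());

    // Load the optimized network into the runtime, ready for inference.
    NetworkId networkId;
    runtime->LoadNetwork(networkId, std::move(optNet));
    return 0;
}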

Arm NN also lets you write your own custom backends to interface with third-party devices, as shown in the following diagram:

Diagram: writing your own custom backends to interface with third-party devices
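
A custom backend registers itself with Arm NN under its own backend ID and can then be requested just like the built-in backends; the rest of this guide covers how such a backend is implemented. As a rough sketch of how an application sees this (assuming a recent Arm NN release; the backend ID "CustomAcc" here is purely hypothetical), the runtime's device spec can be queried to find out which backends, including custom ones, are available:

// Minimal sketch: discovering registered backends, including custom plugins.
// "CustomAcc" is a hypothetical custom backend ID used only for illustration.
#include <armnn/ArmNN.hpp>
#include <iostream>
#include <vector>

int main()
{
    // Create a runtime; its device spec reports every registered backend,
    // including any custom backend plugins that were built into Arm NN.
    armnn::IRuntime::CreationOptions options;
    armnn::IRuntimePtr runtime = armnn::IRuntime::Create(options);

    const armnn::BackendIdSet& supported = runtime->GetDeviceSpec().GetSupportedBackends();
    for (const armnn::BackendId& id : supported)
    {
        std::cout << "Registered backend: " << id.Get() << std::endl;
    }

    // Prefer the hypothetical custom backend when it is present, and always
    // keep the reference backend as a fallback.
    std::vector<armnn::BackendId> preferences;
    if (supported.count(armnn::BackendId("CustomAcc")) > 0)
    {
        preferences.push_back("CustomAcc");
    }
    preferences.push_back("CpuRef");
    return 0;
}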

This guide shows you how to write a custom backend for Arm NN, using an example backend to illustrate the process. First, the guide takes you through the steps that are required to compile the custom plugin with Arm NN. Next, it explains how to run the tests that check the plugin is working correctly. Finally, it explores the example backend and shows how to write your own plugin.
