ML on the Edge with Fluent Bit TensorFlow plugin
The CNCF project Fluent Bit is a popular open-source log processor with a pluggable architecture. Its low memory and CPU footprint makes it suitable for both cloud environments and low-resource devices. Combining its plugins with the integrated stream processor, Fluent Bit acts as a data collector, transformer, and router with extensive capabilities, as the configuration sketch below illustrates.
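For context, a Fluent Bit pipeline is declared in a configuration file that wires inputs, filters, and outputs together. The following is a minimal, illustrative sketch using the standard cpu input, grep filter, and stdout output plugins; the tags, match rules, and regex are placeholder values, not part of the session material.

```
[SERVICE]
    Flush     1
    Log_Level info

[INPUT]
    # collector: sample CPU usage metrics
    Name cpu
    Tag  metrics.cpu

[FILTER]
    # transformer: keep only records whose cpu_p field matches the regex
    Name  grep
    Match metrics.*
    Regex cpu_p ^[0-9]

[OUTPUT]
    # router: forward matching records to stdout
    Name  stdout
    Match *
```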
Intelligence on resource-constrained edge devices is one of the hot topics in IoT data analytics. In this session, we present the Fluent Bit TensorFlow plugin, which runs TensorFlow Lite inference (optimized for Arm processors) on a stream of input data and routes the results to another processing component or to an output. We will demonstrate an edge ML case study on a Jetson Nano board with an Arm Cortex-A57 processor, running TensorFlow Lite inference on a stream of input images using the CPU or GPU as the computation device.
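As a rough sketch of how the plugin slots into a pipeline, the configuration below places a tensorflow filter between an input and an output. Parameter names such as input_field, model_file, and normalization_value follow the Fluent Bit Tensorflow filter documentation but should be checked against the version in use; the mqtt input, tags, and model path are placeholder assumptions rather than the exact setup used in the demo.

```
[INPUT]
    # placeholder source of records carrying image data
    Name mqtt
    Tag  images.raw

[FILTER]
    # run TensorFlow Lite inference on the "image" field of each record
    Name                 tensorflow
    Match                images.*
    input_field          image
    model_file           /models/mobilenet_v1_1.0_224.tflite
    include_input_fields false
    normalization_value  255

[OUTPUT]
    # route the inference results onward (stdout here for illustration)
    Name  stdout
    Match *
```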