Deploy the network with actions

Here, you will deploy the network and add actions, so that gestures can trigger audio file playback and control lighting.

First, copy the trained model.h5 file back over to the Raspberry Pi 3 from the PC where you ran the train.py script.

The run.py script will execute the model using the following code:

Code excerpt of the run.py script
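If you do not have the repository code to hand, a minimal sketch of the idea is shown below. It assumes the camera is read with the picamera module and that the model expects raw 128x128 RGB frames; the exact resolution, framerate and preprocessing are assumptions here and must match whatever train.py used.

import sys
import numpy as np
from tensorflow.keras.models import load_model
from picamera import PiCamera
from picamera.array import PiRGBArray

# Load the trained network passed on the command line, e.g. model.h5.
model = load_model(sys.argv[1])

# Capture low-resolution frames continuously from the Pi camera.
camera = PiCamera(resolution=(128, 128), framerate=8)
raw = PiRGBArray(camera, size=(128, 128))

for frame in camera.capture_continuous(raw, format='rgb', use_video_port=True):
    # Scale pixels to [0, 1] and add a batch dimension before predicting.
    pixels = frame.array.astype('float32') / 255.0
    predictions = model.predict(pixels[np.newaxis, ...])
    predicted_class = int(np.argmax(predictions[0]))
    print(predicted_class, file=sys.stderr)
    raw.truncate(0)  # reuse the buffer for the next frame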

Note: This is slightly simpler than in Episode 1 as we do not need to convert the pixels into features with MobileNet. Instead, the ConvNet directly converts pixels into predictions.

You can run this with the following command:

python run.py model.h5

The predicted class is printed continuously to stderr, but no actions are taken yet. This is a useful baseline for your own projects.

For the demo video, the file story.py was created. This uses the same code but performs different actions based on the predicted class.

To recap, the classes used in this demo are:

Class     Action
0  None   Moving around, sitting, putting on and taking off a coat
1  Door   Coming in or leaving through the door
2  Light  Pointing at the main light to turn it on/off
3  Music  Holding both hands up to start the music playing
4  Stop   Holding up one hand to dim the volume of the music

There is only one action for Class 1 (Door) because the network was not trained to distinguish between somebody coming into the room and somebody leaving it; doing so would have made recording and handling the data more complex. Instead, the Room class tracks the state of the room, assuming that only one person ever uses it: the first time the door opens it must be somebody coming into the room, and the next time it must be somebody going out.

This is an oversimplification of the real world that only works for a demo. A more robust alternative would be to train the same network, or a separate one, to independently detect whether the room is empty. However, that is outside the scope of this guide, which keeps the training section as straightforward as possible.

One of the benefits of running machine learning directly on devices, known as running "on the edge", is how easy it is to combine it with regular programming to build more complex behaviors. The simple state machine controlling the demo in story.py looks like this:

Conditional statements in story.py that control the demo
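A minimal sketch of that state machine follows. Only self.ready, the door-toggling behavior of the Room class and the class numbers come from this guide; the overall layout and the method names (actions, update, toggle_light, play_music, duck_music) are assumptions for illustration.

class Room:
    """Track room state and turn class predictions into one-off actions."""

    def __init__(self, actions):
        self.actions = actions   # object providing door(), toggle_light(), play_music(), duck_music()
        self.ready = True        # ready for a new action once the prediction returns to class 0
        self.occupied = False    # toggled each time the door opens (one-person assumption)

    def update(self, predicted_class):
        if predicted_class == 0:
            # Back to "None": the previous gesture has finished.
            self.ready = True
            return
        if not self.ready:
            # Ignore repeated predictions while the same gesture is still being made.
            return
        self.ready = False
        if predicted_class == 1:        # Door
            self.occupied = not self.occupied
            self.actions.door(self.occupied)
        elif predicted_class == 2:      # Light
            self.actions.toggle_light()
        elif predicted_class == 3:      # Music
            self.actions.play_music()
        elif predicted_class == 4:      # Stop
            self.actions.duck_music()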

Part of the power of training neural networks is that they provide a great API to the real world. In this case, the Python code can tell whether someone is opening the door or pointing at a light simply by checking a variable!

The only complexity above is the use of self.ready: the state machine is only ready for a new action once the previous action has finished and the prediction has returned to class 0. This is a simple way to move from predicting states (is someone pointing at the light?) to defining actions (toggle the light when someone points at it, but do not toggle it on and off continuously while they are pointing at it).

The Raspberry Pi sits at a great intersection of the physical and information worlds. It can send email and push notifications; it can turn home electrical appliances and lights on and off; and it can play and record video and sound. Here, just a few simple actions are used:

  • Playing sounds and music.
  • Locking and unlocking a remote computer.
  • Turning a standing lamp on and off.

The entire code for these actions is:

Code mapping the predicted classes to these actions
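A minimal sketch of what that action code could look like is shown below, paired with the Room sketch above. The gpiozero Energenie class, the audio/epic.mp3 path and the "ssh zero" commands come from this guide; the use of pygame for playback, the remote lock/unlock command names and the socket number are assumptions.

import subprocess
import pygame
from gpiozero import Energenie

class Actions:
    def __init__(self):
        self.light = Energenie(1)                   # Energenie Pi-Mote driving socket 1 (assumed number)
        self.light_on = False
        pygame.mixer.init()
        pygame.mixer.music.load('audio/epic.mp3')   # supply your own audio file

    def door(self, occupied):
        # The demo locked and unlocked a screen driven by a Pi Zero over passwordless SSH;
        # the remote command names here are placeholders.
        subprocess.call(['ssh', 'zero', 'unlock-screen' if occupied else 'lock-screen'])

    def toggle_light(self):
        # Energenie provides on() and off(); track the state to toggle the lamp.
        if self.light_on:
            self.light.off()
        else:
            self.light.on()
        self.light_on = not self.light_on

    def play_music(self):
        pygame.mixer.music.play()

    def duck_music(self):
        pygame.mixer.music.set_volume(0.2)          # dim the volume rather than stopping the music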

You can write any code you want here to adapt this to your own system. At a minimum, you will need to provide an audio/epic.mp3 file, as the one used in the demo cannot be redistributed in the GitHub repository. You will also need to change or remove the "ssh zero ..." commands that lock and unlock the screen: in the demo video, the screen was driven by a Raspberry Pi Zero, with passwordless SSH configured so that the Pi 3 running the network could execute a command to lock or unlock the screen.

The self.light.on() and self.light.off() commands are provided by the Energenie class in the gpiozero Python module. The demo was run with an Energenie Pi-Mote and socket, which use a radio link between the Raspberry Pi and the socket so that it can be turned on and off from software.
