This entry is the next part of a series in which I share my experience building a solar-powered camera node for a ZigBee mesh network. These posts are not intended to be a tutorial on ZigBee; there are lots of places to find that information, one of the most digestible being Building Wireless Sensor Networks by Robert Faludi.
In the last episode, I described my first attempt at getting the ZigBee modules exchanging messages, using a JSON format. It worked, but the approach was not very efficient, for a couple of reasons:
- It increased the bandwidth required to send each image segment by a factor of two.
- It required a Base64 encoding and decoding step for each image segment.
Another shortcoming of this approach is that it used the XBee’s transparent serial or ‘AT’ mode, in which the connection at both ends looks like a serial cable between the remote node and the Coordinator. This was not going to suit my ultimate objective of having multiple remote nodes, all sending images to a central repository through the Coordinator, because I would need to reconfigure the Coordinator each time I wanted to receive data from a different remote node. Transparent AT mode is great for one-to-one communication between nodes, but that’s not my real goal here.
The XBees have another mode, called API mode, in which a client can send and receive low-level data packets and individually address each one. I will use this mode to resolve the shortcomings of my JSON-based approach.
There are a number of client libraries that handle the marshaling and unmarshaling of the underlying ZigBee data packets. For the Arduino there’s a library called xbee-arduino, which appears to be well-supported and mature.
For the server-side, in the last episode I used a Ruby script running on my Mac Pro. This was easy to get going and worked well for early development. For the future, though, I want the Coordinator to use a server running on a Raspberry Pi. This is of course because Raspberry Pis are cool, and a Pi can be running 24×7 much more cheaply and easily under a pile of papers on my desk than running my Mac Pro all the time.
Although the Pi is essentially a general-purpose Linux machine and I can install almost any language, Python seems to be the language with the most support in the Raspberry Pi community. This is a happy coincidence, as Python is a language I’m interested in learning; my DiUS colleague Fred Rotbart recently used it on a data visualisation and prediction project, and I was really impressed by its speed and power. The Coordinator server is surely a perfect opportunity to hone my (presently non-existent) Python skills. Hence I needed a Python version of the XBee API-mode library, and one already exists: python-xbee.
With the pieces in place for API mode, I needed a new protocol, this time based on binary messages. I don’t have much space to play with; for a Series 2 XBee with ZB Pro firmware like the modules I’m using, the maximum payload size for each packet is 84 bytes. Using encryption reduces the size further. So far I’ve been using 32-byte segments for the images, mainly so the very small serial buffers on the camera and Arduino don’t get overwhelmed; I’ll stick with that size to make sure the segments can fit into a packet even if I want to use ZigBee encryption later.
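To make the arithmetic concrete, here’s a small sketch in Python (the Coordinator server’s language) of how an image breaks into 32-byte segments; `segment_image` is just a name I’ve made up for illustration, not part of any library:

```python
SEGMENT_SIZE = 32   # bytes of image data per segment
MAX_PAYLOAD = 84    # ZB Pro maximum payload per packet, without encryption

def segment_image(image_bytes, segment_size=SEGMENT_SIZE):
    """Split raw image bytes into fixed-size segments (the last may be shorter)."""
    return [image_bytes[i:i + segment_size]
            for i in range(0, len(image_bytes), segment_size)]

# Even with a type byte and a little header, a 32-byte segment
# fits comfortably inside an 84-byte payload.
segments = segment_image(b'\x01' * 100)
# 100 bytes -> four segments of 32, 32, 32 and 4 bytes
```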
At this stage I want to send two types of messages from remote nodes, for images and information, but there may be more in the future. So I’ll use the first byte of the payload to indicate the message type. Just for fun, I will choose arbitrary values: 0xf3 in the first byte of the payload to indicate a packet contains an information message, and 0xf4 for image messages.
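In sketch form, the framing is just a one-byte prefix (the 0xf3/0xf4 values are the ones chosen above; the helper names here are mine):

```python
MSG_INFO = 0xF3    # payload carries an information message
MSG_IMAGE = 0xF4   # payload carries an image segment

def frame_payload(msg_type, body):
    """Prefix the message body with its one-byte type marker."""
    return bytes([msg_type]) + body

def parse_payload(payload):
    """Split a received payload back into (type, body)."""
    return payload[0], payload[1:]

msg = frame_payload(MSG_IMAGE, b'\x10\x20\x30')
msg_type, body = parse_payload(msg)
# msg_type == 0xF4, body == b'\x10\x20\x30'
```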
Creating an information message on the Arduino is easy with the PString class.
Having built the payload, the request is assembled by creating the destination address and pointing to the payload buffer. In a ZigBee network, the address of the Coordinator is always 0x0 (and remember there is always only one Coordinator in a PAN). Therefore if I address the packet with the Coordinator address, the router nodes in the mesh network will make sure the packet is routed to the right node along the best network path, a judgement each router makes based on the number of hops required to get the packet to its destination and the signal strength of each hop.
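On the Python side, the equivalent addressing looks something like this with python-xbee: the all-zeroes 64-bit address means “the Coordinator”, and 0xFFFE is the conventional “unknown” placeholder for the 16-bit network address, which the mesh resolves en route. (The actual send obviously needs real hardware attached, so it’s shown here but not run.)

```python
# 64-bit address of the Coordinator: always all zeroes in a ZigBee PAN.
COORDINATOR_ADDR_LONG = bytes(8)
# 16-bit "address unknown" placeholder; the mesh resolves the real one.
ADDR_UNKNOWN = b'\xff\xfe'

def send_to_coordinator(zb, payload):
    """Transmit a payload to the Coordinator via a python-xbee ZigBee object."""
    zb.send('tx',
            dest_addr_long=COORDINATOR_ADDR_LONG,
            dest_addr=ADDR_UNKNOWN,
            data=payload)

# With real hardware it would be driven like this:
# import serial
# from xbee import ZigBee
# zb = ZigBee(serial.Serial('/dev/ttyUSB0', 9600))
# send_to_coordinator(zb, b'\xf3hello')
```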
The complete Arduino code now looks like this:
At the end of the setup() function you will see that I’m now putting the camera into motion-detection mode, so it takes a photo when it detects movement rather than as quickly as possible. This seems the best option for when I deploy it in the field, as a photo will likely be most interesting when something moves. In the loop() function, the sketch checks whether motion has been detected; if so, it temporarily turns off motion detection while it takes a photo and sends it to the Coordinator, then turns detection back on.
On the Coordinator server-side, as before, the program will continuously listen for messages and handle them according to their type. Now that I’m using API mode, messages can arrive at the Coordinator from any remote node, so image segment messages need to be matched up with their source. To get things going, though, I’ll continue to assume a single remote node for now.
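In sketch form, the dispatch can key image segments on the packet’s 64-bit source address, so that adding more nodes later is just more dictionary entries (the structure and names here are my simplification, not the script verbatim):

```python
MSG_INFO = 0xF3
MSG_IMAGE = 0xF4

class Dispatcher:
    """Route incoming payloads by type, grouping image data per source node."""

    def __init__(self):
        self.images = {}   # 64-bit source address -> accumulated image bytes
        self.info = []     # (source address, body) information messages

    def handle(self, source_addr, payload):
        msg_type, body = payload[0], payload[1:]
        if msg_type == MSG_INFO:
            self.info.append((source_addr, body))
        elif msg_type == MSG_IMAGE:
            self.images.setdefault(source_addr, bytearray()).extend(body)

d = Dispatcher()
node = b'\x00\x13\xa2\x00\x40\x0a\x01\x23'   # an example 64-bit address
d.handle(node, bytes([MSG_IMAGE]) + b'\xaa\xbb')
d.handle(node, bytes([MSG_IMAGE]) + b'\xcc')
# d.images[node] now holds b'\xaa\xbb\xcc'
```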
The eventual goal is to be able to view the images from any client, anywhere. Rather than store the messages on the local server, I decided the Python script should push the images to Amazon Web Services (AWS) S3 storage. Later on, I’ll be able to grab the images from each remote node and display them on a web page or iPhone client.
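For the S3 side, the interesting design decision is the object key: something like node address plus timestamp keeps each node’s images in its own “folder” and sorts them chronologically. A sketch of that idea (the key scheme and function name are my choice; the upload itself is shown with boto3’s put_object, which needs real AWS credentials, so it’s commented rather than run):

```python
from datetime import datetime, timezone

def image_key(source_addr, taken_at=None):
    """Build an S3 object key: one prefix per node, timestamped filename."""
    taken_at = taken_at or datetime.now(timezone.utc)
    node = source_addr.hex()
    return "{}/{}.jpg".format(node, taken_at.strftime('%Y%m%dT%H%M%SZ'))

# The upload step would look something like:
# import boto3
# s3 = boto3.client('s3')
# s3.put_object(Bucket='my-camera-images',
#               Key=image_key(source_addr),
#               Body=jpeg_bytes)

key = image_key(b'\x00\x13\xa2\x00\x40\x0a\x01\x23',
                datetime(2013, 5, 1, 12, 0, 0, tzinfo=timezone.utc))
# key == '0013a200400a0123/20130501T120000Z.jpg'
```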
The Python script for the server looks like this:
As a test, I left it running for a full day. It works well, although packets are dropped now and then, something I’ll need to investigate when the distances between nodes get more realistic. I may need to add some intelligence to the protocol to resend lost packets. But in the main, I’m happy with how it’s working.
Here’s what the camera snapped when I was out of the room and the dog somehow got off her leash. She did what Labradoodles do: go looking for food in the kitchen.