Hands-on with the Internet of Things – Part 4

This entry is the next part of a series in which I share my experience building a solar-powered camera node for a ZigBee mesh network. These posts are not intended to be a tutorial on ZigBee; there are lots of places to find that information, one of the most digestible being Building Wireless Sensor Networks by Robert Faludi.

In the last episode, I described my first attempt at getting the ZigBee modules communicating with messages, using a JSON format. It worked, but this approach was not very efficient for a couple of reasons:

  • It increased the bandwidth required to send each image segment by a factor of two.
  • It required a Base64 encoding and decoding step for each image segment.
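To see concretely why, consider how Base64 inflates a 32-byte segment even before any JSON wrapping is added. A quick sketch (the JSON field names here are just illustrative, not the format I actually used):

```python
import base64
import json

segment = bytes(range(32))           # a 32-byte image segment
encoded = base64.b64encode(segment)  # Base64 expands data by roughly a third
print(len(segment), len(encoded))    # 32 -> 44 bytes

# Wrapping it in a JSON message (illustrative field names) adds still more:
message = json.dumps({"type": "image", "data": encoded.decode("ascii")})
print(len(message))
```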

Another shortcoming of this approach is that it used the XBee’s transparent serial or ‘AT’ mode, in which the connection at both ends looks like a serial cable between the remote node and the Coordinator. This was not going to suit my ultimate objective of having multiple remote nodes, all sending images to a central repository through the Coordinator, because I would need to reconfigure the Coordinator each time I wanted to receive data from a different remote node. Transparent AT mode is great for one-to-one node communications, but that’s not my real goal here.

The XBees have another mode, called API mode, in which a client can send and receive low-level data packets and individually address each one. I will use this mode to resolve the shortcomings of my JSON-based approach.

There are a number of client libraries that handle the marshaling and unmarshaling of the underlying ZigBee data packets. For the Arduino there’s a library called xbee-arduino, which I could see is well-supported and mature.

For the server side, in the last episode I used a Ruby script running on my Mac Pro. This was easy to get going and worked well for early development. Going forward, though, I want the Coordinator to use a server running on a Raspberry Pi. This is of course because Raspberry Pis are cool, and a Pi can run 24×7 under a pile of papers on my desk far more cheaply and easily than my Mac Pro can.

Although the Pi is essentially a general-purpose Linux machine on which I can install any language, the language with the most support in the Raspberry Pi community seems to be Python. This is a happy coincidence, as Python is a language I’m interested in learning; my DiUS colleague Fred Rotbart recently used it on a data visualisation and prediction project, and I was really impressed by its speed and power. Surely the Coordinator server is a perfect opportunity to hone my (presently non-existent) Python skills. Hence I needed a Python version of the XBee API-mode library, which happily also already exists: it’s called python-xbee.

With the pieces in place for API mode, I needed a new protocol, this time based on binary messages. I don’t have much space to play with; for a Series 2 XBee with ZB Pro firmware like the modules I’m using, the maximum payload size for each packet is 84 bytes. Using encryption reduces the size further. So far I’ve been using 32-byte segments for the images, mainly so the very small serial buffers on the camera and Arduino don’t get overwhelmed; I’ll stick with that size to make sure the segments can fit into a packet even if I want to use ZigBee encryption later.

At this stage I want to send two types of messages from remote nodes, for images and information, but there may be more in the future. So I’ll use the first byte of the payload to indicate the message type. Just for fun, I will choose arbitrary values: 0xf3 in the first byte of the payload to indicate a packet contains an information message, and 0xf4 for image messages.
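The resulting payload layout is easy to sanity-check on the server side with Python’s struct module. A minimal sketch of packing and unpacking an image-message payload as described below (type byte, 16-bit big-endian bytesRemaining, segment length, then the data):

```python
import struct

IMAGE_MSG = 0xf4
segment = bytes(range(32))   # one 32-byte image segment
bytes_remaining = 1204       # bytes still to come after this segment

# Pack: type byte, 16-bit big-endian bytesRemaining, 8-bit length, then the data
payload = struct.pack(">BHB", IMAGE_MSG, bytes_remaining, len(segment)) + segment
assert len(payload) <= 84    # fits comfortably in one ZigBee packet

# Unpack on the receiving side
msg_type, remaining, length = struct.unpack(">BHB", payload[:4])
data = payload[4:4 + length]
print(hex(msg_type), remaining, length)   # 0xf4 1204 32
```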

Creating an information message on the Arduino is easy with the PString class.

char payload[2+info.length()];
payload[0] = 0xf3; // payload type
PString infoString(&payload[1], sizeof(payload)-1);
infoString.print(info); // copy the message text into the payload buffer


Having built the payload, I assemble the request by creating the destination address and pointing to the payload buffer. In a ZigBee network, the address of the Coordinator is always 0x0 (and remember there is always only one Coordinator in a PAN). Therefore if I address the packet with the Coordinator address, the router nodes in the mesh network will make sure the packet is routed to the right node along the best network path, a judgement each router makes based on the number of hops required to reach the destination and the signal strength of each hop.

// Address the packet to the Coordinator (always address 0x0)
XBeeAddress64 addr64 = XBeeAddress64(0x0, 0x0); // Coordinator address
// Create a TX Request
ZBTxRequest zbTx = ZBTxRequest(addr64, (uint8_t *)payload, sizeof(payload));
// Send the request
xbee.send(zbTx);


The complete Arduino code now looks like this:

// This program takes a snapshot and sends it to the XBee module
#include <Adafruit_VC0706.h>
#include <SoftwareSerial.h>
#include <XBee.h>
#include <PString.h>

// On Uno: camera TX connected to pin 2, camera RX to pin 3:
SoftwareSerial cameraconnection = SoftwareSerial(2, 3); // Arduino RX, TX
Adafruit_VC0706 cam = Adafruit_VC0706(&cameraconnection);
XBee xbee = XBee();

void sendImagePayload(uint16_t bytesRemaining, uint8_t bufferLength, uint8_t *buffer) {
  uint8_t payload[bufferLength+4];
  payload[0] = 0xf4; // payload type
  payload[1] = highByte(bytesRemaining);
  payload[2] = lowByte(bytesRemaining);
  payload[3] = bufferLength;
  for (uint8_t i=0; i<bufferLength; i++) {
    payload[i+4] = buffer[i];
  }
  // Address the packet to the Coordinator (always address 0x0)
  XBeeAddress64 addr64 = XBeeAddress64(0x0, 0x0); // Coordinator address
  // Create a TX Request
  ZBTxRequest zbTx = ZBTxRequest(addr64, payload, sizeof(payload));
  // Send the request
  xbee.send(zbTx);
}

void sendInfoPayload(String info) {
  info.replace('\n', ',');
  info.replace('\r', ' ');
  char payload[2+info.length()];
  payload[0] = 0xf3; // payload type
  PString infoString(&payload[1], sizeof(payload)-1);
  infoString.print(info); // copy the message text into the payload buffer
  // Address the packet to the Coordinator (always address 0x0)
  XBeeAddress64 addr64 = XBeeAddress64(0x0, 0x0); // Coordinator address
  // Create a TX Request
  ZBTxRequest zbTx = ZBTxRequest(addr64, (uint8_t *)payload, sizeof(payload));
  // Send the request
  xbee.send(zbTx);
}

void setup() {
  // The XBee is on the hardware serial port
  Serial.begin(57600);
  xbee.begin(Serial);
  sendInfoPayload("VC0706 Camera snapshot test");
  // Try to locate the camera
  if (cam.begin()) {
    sendInfoPayload("Camera found.");
  } else {
    sendInfoPayload("No camera found?");
    return;
  }
  // Get the camera version
  char *reply = cam.getVersion();
  if (reply == 0) {
    sendInfoPayload("Failed to get version");
  } else {
    sendInfoPayload(reply);
  }
  // Set the picture size
  cam.setImageSize(VC0706_640x480); // biggest
  // Turn on motion detection
  boolean flag = cam.setMotionDetect(true); // turn it on
  if (cam.getMotionDetect())
    sendInfoPayload("Motion detection is ON.");
  else
    sendInfoPayload("Motion detection is OFF.");
}

void loop() {
  if (cam.motionDetected()) {
    // Temporarily turn off motion detection while taking and sending the photo
    cam.setMotionDetect(false);
    sendInfoPayload("Motion detected - taking snap now...");
    if (! cam.takePicture()) {
      sendInfoPayload("Failed to snap!");
    } else {
      sendInfoPayload("Picture taken!");
      // Get the size of the image (frame) taken
      uint16_t jpglen = cam.frameLength();
      sendInfoPayload("Sending " + String(jpglen, DEC) + " byte image.");
      int32_t time = millis();
      // Read all the data, one segment at a time
      while (jpglen > 0) {
        // read 32 bytes at a time
        uint8_t *buffer;
        uint8_t bytesToRead = min(32, jpglen); // change 32 to 64 for a speedup but may not work with all setups!
        buffer = cam.readPicture(bytesToRead);
        jpglen -= bytesToRead;
        sendImagePayload(jpglen, bytesToRead, buffer);
      }
      time = millis() - time;
      sendInfoPayload(String(time) + " ms elapsed");
      cam.resumeVideo();
    }
    cam.setMotionDetect(true);
  }
}


You will see at the end of the setup() function that now I’m putting the camera in motion detection mode, so it will take a photo when it detects movement, rather than taking a photo as quickly as possible. This seems to be the best option for when I deploy it in the field, as a photo will likely be most interesting when something moves. In the loop() function, it checks whether motion has been detected; if so, it temporarily turns off motion detection while it takes a photo and sends it to the Coordinator, and then turns it back on.

On the Coordinator server-side, as before, the program will continuously listen for messages and handle them according to their type. Now that I’m using API mode, messages can arrive at the Coordinator from any remote node, and so image segment messages need to be matched up with their source; to get this going, though, for now we’ll continue to assume a single source remote node.
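A sketch of how that matching might eventually look, keying per-node segment buffers on the sender’s 64-bit address. This is a hypothetical extension (the function name and structure are my own), not what the script below does:

```python
from array import array

# Buffer image segments per remote node, keyed by the sender's 64-bit address
buffers = {}

def handle_image_segment(source_addr_long, segment_bytes, bytes_remaining):
    # Append this segment to the sending node's buffer
    buf = buffers.setdefault(source_addr_long, array('B'))
    buf.extend(segment_bytes)
    if bytes_remaining <= 0:
        image = buf.tobytes()          # complete image from this node
        del buffers[source_addr_long]  # reset for the node's next image
        return image
    return None

# Usage: two nodes interleaving segments without clobbering each other
assert handle_image_segment(b'\x00\x01', b'abc', 3) is None
assert handle_image_segment(b'\x00\x02', b'xyz', 0) == b'xyz'
assert handle_image_segment(b'\x00\x01', b'def', 0) == b'abcdef'
```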

The eventual goal is to be able to view the images from any client, anywhere. Rather than store the messages on the local server, I decided the Python script should push the images to Amazon Web Services (AWS) S3 storage. Later on, I’ll be able to grab the images from each remote node and display them on a web page or iPhone client.

The Python script for the server looks like this:

# Continuously read the serial port and process IO data received from a remote XBee.
from xbee import ZigBee
import serial
import struct
from array import array
import boto
import boto.s3
from datetime import datetime, date, time
import sys
from boto.s3.key import Key

AWS_ACCESS_KEY_ID = <your access key here>
bucketName = AWS_ACCESS_KEY_ID.lower() + '-farmcam-bucket'
conn = boto.connect_s3()
bucket = conn.lookup(bucketName)
if bucket is None:
    bucket = conn.create_bucket(bucketName, location=boto.s3.connection.Location.APSoutheast2)

def ByteToHex(byteStr):
    # Convert a byte string to its hex string representation e.g. for output.
    return ''.join(["%02X" % ord(x) for x in byteStr]).strip()

def percent_cb(complete, total):
    sys.stdout.write('.')
    sys.stdout.flush()

ser = serial.Serial('/dev/tty.usbserial-A901JYV9', 57600)
xbee = ZigBee(ser, escaped=True)
imageBytes = array('B')
byteCount = 0

# Continuously read and print packets
print 'Continuously read and print packets'
while True:
    try:
        response = xbee.wait_read_frame()
        msgType = struct.unpack("B", response['rf_data'][0])[0]
        if msgType == 0xf3:
            print 'info received from ' + ByteToHex(response['source_addr_long']) + ': ' + response['rf_data'][1:]
        elif msgType == 0xf4:
            bytesRemaining = struct.unpack("H", response['rf_data'][1:3][::-1])[0]
            numberOfBytesRxd = struct.unpack("B", response['rf_data'][3])[0]
            byteCount += numberOfBytesRxd
            for i in range(numberOfBytesRxd):
                imageBytes.append(struct.unpack("B", response['rf_data'][4+i])[0])
            if bytesRemaining <= 0:
                # Write the image to a file
                dt = datetime.now()
                fileName = ByteToHex(response['source_addr_long']) + '-' + dt.strftime("%Y-%m-%d-%H-%M-%S-%f") + '.jpg'
                output_file = open('../' + fileName, 'wb')
                imageBytes.tofile(output_file)
                output_file.close()
                # Write the file to S3
                print 'Uploading %s to Amazon S3 bucket %s' % (fileName, bucketName)
                k = Key(bucket)
                k.key = fileName
                k.set_contents_from_filename('../' + fileName, cb=percent_cb, num_cb=10)
                # reset for the next one
                imageBytes = array('B')
                byteCount = 0
        else:
            print 'not a recognised message type.'
    except KeyboardInterrupt:
        break

ser.close()


As a test, I left it running over the length of a day. It works well, although there are packets dropped now and then, which is something I’ll need to check when distances between nodes get more realistic. I may need to add some intelligence into the protocol to implement resending lost packets. But in the main, I’m happy with how it’s working.
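One cheap check that needs no protocol change: since each image message already carries bytesRemaining and the segment length, the Coordinator can detect a gap whenever the running count doesn’t line up. A sketch of the idea (the function is hypothetical, not part of the script above):

```python
def segment_lost(prev_remaining, bytes_remaining, segment_length):
    """Return True if a segment appears to have been dropped.

    After a segment arrives, the previous bytesRemaining should have
    shrunk by exactly this segment's length. If it shrank by more,
    one or more intervening segments were lost.
    """
    if prev_remaining is None:   # first segment of a new image
        return False
    return prev_remaining - bytes_remaining != segment_length

# Example: 1204 -> 1172 is exactly 32 bytes, so nothing was lost...
assert segment_lost(1204, 1172, 32) is False
# ...but 1204 -> 1140 means a 32-byte segment went missing in between.
assert segment_lost(1204, 1140, 32) is True
```

Detecting the gap is the easy half; actually requesting a resend would need a return message type from the Coordinator to the remote node, which I’ll leave for a future episode.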

Here’s what the camera snapped when I was out of the room and the dog somehow got off her leash. She did what Labradoodles do: go looking for food in the kitchen.
