Baby Monitor using BeagleBone and MIMO Smart Baby Monitor sensor (part 1)

I’m a new dad and consider myself a technology innovator, so it is probably to be expected that I’d look for high-tech ways to monitor my baby, beyond the standard remote cameras. Of course, not all the ways I want to monitor are really easy to reproduce, but the folks at Rest Devices have done something scalable. They created the MIMO Smart Baby Monitor. It uses Bluetooth Low Energy to send data back from the sensor, making it very easy to connect to a BeagleBone Black (or SeeedStudio BeagleBone Green) using a BLE dongle (such as the Plugable adapter at http://www.amazon.com/Plugable-Bluetooth-Adapter-Raspberry-Compatible/dp/B009ZIILLI, as we did in https://jkridner.wordpress.com/2014/07/08/ibm-tutorial-on-using-ti-sensortag-with-beaglebone-black/).
The most critical reason I chose to do this is that I don’t want to rely on a cloud service to monitor my child. While I appreciate having the phone app and the relatively reliable proxy of the Rest Devices-provided cloud service, I really wanted to make sure I had reliable monitoring and notifications within the home, depending on no network other than the one between my BeagleBone and the sensor itself. With this sort of setup, I am also looking at eventually automating the bottle warmer so that I have a warm bottle available right as he’s starting to stir. Of course, I still want to be able to visualize things on a web page, but this time just one served up on my private network by a BeagleBone.
For this part 1 article, I’m just going to show you how I started to extract the data. I’ll start looking at analyzing real-world data in part 2.
For my setup, I opted to use an inexpensive Chromebook so that I could hack and monitor via a local web interface. Eventually, I’ll plug the BeagleBone into wall-wart power and wire it via Ethernet to my local LAN.
[Photo: mimo_connect_1]
I started with the ‘bluez’ tools, and ‘hcitool’ in particular. It took me a second to note that scanning for BLE devices means using the ‘lescan’ argument.
root@beaglebone:/var/lib/cloud9# hcitool lescan
LE Scan ...
9C:20:7B:A2:1A:5F (unknown)
9C:20:7B:A2:1A:5F (unknown)
68:D9:3C:91:EA:80 (unknown)
7C:D1:C3:00:DA:1E (unknown)
68:D9:3C:91:EA:80 (unknown)
00:07:80:77:C4:5A (unknown)
00:07:80:77:C4:5A
7C:D1:C3:00:DA:1E (unknown)
C1:87:44:21:74:CD (unknown)
C1:87:44:21:74:CD fA7

To figure out which of these devices is actually the MIMO sensor, I opted to use a Python library called ‘bluepy’, which I discovered has some simple tools for reading BLE endpoints. Installing ‘bluepy’ was as trivial as ‘pip install bluepy’. The other devices tended to show some identifying information, and some guesswork led me to believe that 00:07:80:77:C4:5A was the right sensor.
root@beaglebone:/var/lib/cloud9# python /usr/local/lib/python2.7/dist-packages/bluepy/btle.py 00:07:80:77:C4:5A
Connecting to: 00:07:80:77:C4:5A, address type: public
Service :
Characteristic , supports NOTIFY READ
-> '\xaa\xed\x18\x01\x01\x00\x00\x9d\x00R\x86Y'
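
As an aside, newer versions of ‘bluepy’ also include a Scanner class that will print whatever name and advertising data each device broadcasts, which takes some of the guesswork out of matching addresses to devices. A rough sketch of that approach (it needs to run as root, just like ‘hcitool lescan’, and the ten-second timeout is arbitrary):
# LE scan from Python, printing the advertising data for each device found
from bluepy.btle import Scanner

scanner = Scanner()
for dev in scanner.scan(10.0):
    print dev.addr, dev.addrType, "RSSI:", dev.rssi
    for (adtype, desc, value) in dev.getScanData():
        print "   ", desc, "=", value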

At this point, I needed to adapt an example for ‘bluepy’ to read this data repeatedly. I called the Python code ‘readturtle.py’:
import binascii
import struct
import time
from bluepy.btle import UUID, Peripheral

# characteristic UUID noted while exploring (kept for reference; not used below)
uuid = UUID("d96a513d-a6d8-4f89-9895-ca131a0935cb")

# connect to the address found during the LE scan
p = Peripheral("00:07:80:77:C4:5A", "public")

try:
    # grab the first characteristic and poll it once a second
    ch = p.getCharacteristics()[0]
    if (ch.supportsRead()):
        while 1:
            val = binascii.b2a_hex(ch.read())
            print str(val)
            time.sleep(1)

finally:
    p.disconnect()

Running ‘python readturtle.py’ gave me some results.
root@beaglebone:/var/lib/cloud9# python readturtle.py

aaed1800ab00000000000000000000d8e2ca9258
aaed1800a600000000000000000000d7f9ca9558
aaed1800a1000000000000000000001d380a9558

This looked interesting, but it wasn’t obvious to me what the data meant. I figured I could try to reverse-engineer it by applying various stimuli and looking at the changes, but I’ve also been monitoring the Quantified Self forums off and on for a while, so I figured it wouldn’t hurt to ask if anyone was trying the same thing. Amazingly, an individual from Rest Devices hopped on and gave me a sufficient breakdown of the data format (respiration, motion, temperature, etc.) that I should be able to move forward quickly now.
https://forum.quantifiedself.com/t/tapping-into-mimo-smart-baby-monitor/1758
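Before that answer showed up, my plan was simply to watch which bytes changed between readings while poking at the sensor. Here is a quick sketch of that sort of byte-level comparison, using plain Python on the hex strings printed by ‘readturtle.py’ (nothing MIMO-specific about it):
# report which byte positions differ between two hex-string readings
def diff_readings(a, b):
    abytes = [a[i:i+2] for i in range(0, len(a), 2)]
    bbytes = [b[i:i+2] for i in range(0, len(b), 2)]
    return [(i, x, y) for i, (x, y) in enumerate(zip(abytes, bbytes)) if x != y]

print diff_readings("aaed1800ab00000000000000000000d8e2ca9258",
                    "aaed1800a600000000000000000000d7f9ca9558")
# [(4, 'ab', 'a6'), (15, 'd8', 'd7'), (16, 'e2', 'f9'), (18, '92', '95')]
That at least narrows down which offsets are worth watching while applying a known stimulus.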
For the ongoing work, keep an eye out for my next blog post, where I actually visualize some respiratory data from my baby. Also, check out my notes and code in my GitHub gist at:
https://gist.github.com/jadonk/f2323348eb7706889f88
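In the meantime, a small variation on ‘readturtle.py’ is one simple way to start accumulating data for that visualization: append each reading, with a timestamp, to a CSV file. A sketch (the filename and one-second interval are just examples):
# log timestamped readings to a CSV file for later graphing
import binascii
import csv
import time
from bluepy.btle import Peripheral

p = Peripheral("00:07:80:77:C4:5A", "public")
try:
    ch = p.getCharacteristics()[0]
    with open("readings.csv", "a") as f:
        writer = csv.writer(f)
        while 1:
            val = binascii.b2a_hex(ch.read())
            writer.writerow([time.time(), val])
            f.flush()  # keep the file current in case the script dies
            time.sleep(1)
finally:
    p.disconnect()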