Installing Ubuntu Core and Ubuntu 16.04 on Intel NUC

I have an Intel® NUC Kit DE3815TYKHE and I finally got some time to reinstall the OS on it. I had also installed an SSD drive which I had lying on the top shelf of my study room. My intention is to install Ubuntu Core on the eMMC and Ubuntu 16.04 on the SSD: Ubuntu Core to try out snap packages, including the snap package for Azure IoT Edge, and Ubuntu 16.04 so that I have a local dev environment instead of having to constantly spin up my Linux VM in Azure just to try things out.

  1. Upgrade BIOS. At the time of writing this post, the latest BIOS update for this NUC is version 0060.
  2. Follow the steps of flashing Ubuntu Core on the Intel NUC at the Ubuntu developer page.
  3. Instead of a standard Ubuntu image, I used the Linux Lite 64bit image. Follow the instructions here. This is based on an Ubuntu 16.04 xenial image.
  4. Installing the grub bootloader failed for me. Instead I manually installed grub on my Linux Lite drive.
  5. Open up a shell window. Credit goes to the person who posted on this forum thread. It works like a charm. [Note: anywhere you see “XY” or “X”, change it to the correct drive letter (“X”) and partition number (“Y”) for your Linux Lite root partition. To list your partitions, just run the lsblk command.]
    sudo mount /dev/sdXY /mnt
    for i in /dev /dev/pts /proc /sys; do sudo mount -B $i /mnt$i; done
    sudo chroot /mnt
    grub-install /dev/sdX
    update-grub
    exit
    for i in /sys /proc /dev/pts /dev; do sudo umount /mnt$i; done
    sudo umount /mnt
  6. I turned off UEFI boot and just stuck with Legacy boot in the BIOS. Works for me.

Running IoT Edge on Raspbian/arm32/arm-hf

I reckon I’ll always try to start my technical articles with a TL;DR like the one below, to give a summary of a lengthy post up front.

TL;DR – Azure IoT Edge is a project which enables edge processing and analytics in IoT solutions. The modules within the IoT Edge gateway can be written in different programming languages (native C, as well as the different module language bindings available such as Node.js, .NET, .NET Core and Java) and can run on platforms such as Windows and various Linux distros. As part of my day job, I work with customers and partners as they build their edge modules. One of the key asks is to be able to write modules in .NET Core and deploy them on a Raspberry Pi, given its ease of use and popularity for PoC and prototyping purposes. This article explains how to run modules written for .NET Core within the same Azure IoT Edge framework.

This post is timely, as the .NET Core engineering team announced less than a week ago that .NET Core Runtime ARM32 builds are now available. More details about this announcement are available here. Please do take note that these builds are community supported, not yet supported by Microsoft, and have a preview status. For prototyping purposes, I wouldn’t complain too much about this. My plan was to get the existing .NET Core modules cross-compiled on my dev machine for linux-arm and .NET Core runtime 2.0.0. This cannot be performed on Raspbian because only the .NET Core runtime is available there, not the SDK, and there are native runtime shared libraries which must be cross-compiled for Raspbian. I will demonstrate how you can pull a Docker container with the right cmake toolchain to cross-compile for Raspbian.

According to the documentation in the .NET Core Sample for Azure IoT Edge, the current version of the .NET Core binding and sample modules were written and tested against .NET Core v1.1.1. However, I have also verified that this works with .NET Core 2.0.0.

Cross-compiling Azure IoT Edge native runtime for ARM32

If you do a search on NuGet, you will find a number of NuGet packages for Azure IoT Edge which contain native runtime libraries for a number of platforms, namely Windows x64, Ubuntu 16.04 LTS x64, Debian 8 x64 and .NET Standard. How about Raspbian, which is a flavour of Debian 8 on ARM32? Instead of waiting for this to be released, you can cross-compile the runtime libraries yourself. After all, this is one of the benefits of the open-source nature of the Azure IoT Edge project.

Within the jenkins folder of the Azure IoT Edge repo, there is a build script just for cross-compiling for Raspberry Pi. This script is called raspberrypi_c.sh and its output is a cmake toolchain file called toolchain-rpi.cmake. To make things simpler, the Azure IoT engineering team created a Docker image. There is no guarantee that this image will be kept on Docker Hub at all times, but for now you can find it here. There are heaps of other Docker images available as well.

On your dev machine, which has the Docker daemon installed (not your Raspberry Pi), pull down the image:

docker pull aziotbld/raspberrypi-c

Then run the Docker container:

docker run -it aziotbld/raspberrypi-c

Inside the Docker container, perform the following steps:

1. Update and install all the prerequisite libraries.
apt-get update 

apt-get install -y libunwind8 libunwind8-dev gettext libicu-dev liblttng-ust-dev libcurl4-openssl-dev libssl-dev uuid-dev apt-transport-https

2. Install the .NET Core SDK in this Docker container. The image is based on Ubuntu 14.04.

curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.gpg

mv microsoft.gpg /etc/apt/trusted.gpg.d/microsoft.gpg 

sh -c 'echo "deb [arch=amd64] https://packages.microsoft.com/repos/microsoft-ubuntu-trusty-prod trusty main" > /etc/apt/sources.list.d/dotnetdev.list'

apt-get update

apt-get install dotnet-sdk-2.0.0

3. Git clone the Azure IoT Edge repository.

git clone https://github.com/Azure/iot-edge.git

4. Target .NET Core 2.0.0 for the Azure IoT Edge .NET Core binding and modules. To do so, modify all of the .NET Core modules’ csproj files as follows:

i. Change <TargetFramework> from “netstandard1.3” to “netcoreapp2.0” so that it looks like the following:

    <TargetFramework>netcoreapp2.0</TargetFramework>

Note: .NET Standard is a specification for the .NET APIs which form the base class libraries (BCL). Specifying netstandard1.3 in the original works because .NET Core 1.0 implements .NET Standard 1.3, which means it exposes all the APIs defined in .NET Standard versions 1.0 through 1.3. More information about this is available here.

ii. Comment out <NetStandardImplicitPackageVersion>, like the line listed here.

You can now publish specifically for the linux-arm runtime. In the shell script tools/build_dotnet_core.sh, add a -r flag to the dotnet commands. For example, after adding the -r flag to the dotnet restore and dotnet build lines, they look like the following:

dotnet restore -r linux-arm $project
dotnet build -r linux-arm $build_config $project
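If you also want a self-contained publish output for linux-arm outside of the build script (for example, to copy a module’s binaries straight to the Pi), a plain dotnet publish with the same flag should do the trick; the project path below is just a placeholder:

dotnet publish -r linux-arm -c Release <path-to-module-csproj>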

5. For some reason the test NuGet packages were missing. I trust everything is tested fine, so I commented out the last few lines of my build_dotnet_core.sh script.

#for project in $projects_to_test
#do
#    dotnet test $project
#    [ $? -eq 0 ] || exit $?
#done

6. Run

cd iot-edge
chmod 755 ./tools/build_dotnet_core.sh
./tools/build_dotnet_core.sh

7. Now it is time to cross-compile the native runtime libraries for Raspbian 8 ARM32. Create a symbolic link to /home/jenkins/RPiTools because the raspberrypi_c.sh script expects RPiTools in the home directory, which for the root user is /root:

ln -sd /home/jenkins/RPiTools/ /root/RPiTools

8. Run

chmod 755 ./jenkins/raspberrypi_c.sh
./jenkins/raspberrypi_c.sh

Note: If you encounter cmake errors, just delete the install-deps directory and re-run the shell script above.
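Concretely, from the iot-edge root that note amounts to something like the following (a sketch; adjust if your cmake errors point at a different cached directory):

rm -rf install-deps
./jenkins/raspberrypi_c.sh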

This creates a toolchain file at ./toolchain-rpi.cmake

9. Now it is time to build Azure IoT Edge with the .NET Core binding targeting 2.0.0, using the cmake toolchain file.

./tools/build.sh --disable-native-remote-modules --enable-dotnet-core-binding --toolchain-file ./toolchain-rpi.cmake

10. All the native runtime libraries required to run your gateway on Raspbian targeting .NET Core runtime 2.0.0 are in the following directories:

iot-edge/install-deps/lib/*.so
iot-edge/build/bindings/dotnetcore/libdotnetcore.so
iot-edge/build/samples/dotnet_core_module_sample/ - *.so and *.dll
iot-edge/build/modules/ - *.so and *.dll

11. Now it is worthwhile to commit the changes you have made in the Docker container to a new image which is properly tagged/labelled. You should also copy the entire iot-edge folder out of this Docker container in a tarball and move it to your Raspberry Pi device. The exact steps are outside the scope of this tutorial, but a rough sketch follows.
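A minimal sketch of what that could look like from the dev machine, using hypothetical image/host names and a placeholder for wherever you cloned the repo inside the container:

# find your container ID
docker ps -a
# commit the container as a new, tagged image (repository/tag are hypothetical)
docker commit <container-id> myrepo/iot-edge-rpi-build:netcore2.0
# copy the built iot-edge tree out of the container, then ship it to the Pi
docker cp <container-id>:<path-in-container>/iot-edge ./iot-edge
tar czf iot-edge.tar.gz iot-edge
scp iot-edge.tar.gz pi@<raspberry-pi-ip>:/home/pi/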

12. You can check out the Azure IoT Edge samples from its GitHub repo.

13. I tried out the simulated_ble sample within the dotnetcore folder. Please make sure that you have the .NET Core 2.0.0 runtime installed on your Raspberry Pi device. I added the following within the loaders section of my gw-config.json file. Actually I’m not too sure if this is how it works in Linux, but I was paranoid anyhow. The right way is to export the directories in the LD_LIBRARY_PATH environment variable, I think (see the sketch after the configuration below). 🙂

    "loaders": [
        {
            "type": "dotnetcore",
            "name": "dotnetcore",
            "configuration": {
                "binding.path": "/home/pi/iot-edge/build/bindings/dotnetcore/libdotnetcore.so",
                "binding.coreclrpath": "/opt/dotnet-2.0.0/shared/Microsoft.NETCore.App/2.0.0/libcoreclr.so",
                "binding.trustedplatformassemblieslocation": "/opt/dotnet-2.0.0/shared/Microsoft.NETCore.App/2.0.0/"
            }
        }
    ],
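If you would rather go the LD_LIBRARY_PATH route mentioned above instead of (or in addition to) the configuration entries, a sketch like the following should work on the Pi before starting the gateway, assuming the directory layout from step 10:

# make the cross-compiled native libraries discoverable at runtime
export LD_LIBRARY_PATH=/home/pi/iot-edge/install-deps/lib:/home/pi/iot-edge/build/bindings/dotnetcore:$LD_LIBRARY_PATH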

14. To be even more paranoid, I copied all the native runtime libraries (*.so) and the DLLs for the .NET Core binding modules into my execution folder. I also copied and renamed a native gateway host to gw and placed it within the same folder. A rough sketch of that copying is shown below.
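This is roughly what that copying looks like; the name of the native gateway host binary (dotnet_core_module_sample below) is my assumption, so check your own build output under build/samples for the exact name:

EXEC=~/iot-edge-samples/dotnetcore/simulated_ble/src/bin/Debug/netcoreapp2.0
# native libraries and the .NET Core binding from step 10
cp ~/iot-edge/install-deps/lib/*.so "$EXEC"/
cp ~/iot-edge/build/bindings/dotnetcore/libdotnetcore.so "$EXEC"/
cp ~/iot-edge/build/samples/dotnet_core_module_sample/*.so "$EXEC"/
cp ~/iot-edge/build/samples/dotnet_core_module_sample/*.dll "$EXEC"/
# plus any module libraries you need from ~/iot-edge/build/modules/
# the native gateway host, renamed to gw (binary name is an assumption)
cp ~/iot-edge/build/samples/dotnet_core_module_sample/dotnet_core_module_sample "$EXEC"/gw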

15. The result…..

pi@raspberryfai:~/iot-edge-samples/dotnetcore/simulated_ble/src/bin/Debug/netcoreapp2.0$ ./gw gw-config.json
Gateway is running.
Device: 01:02:03:03:02:01, Temperature: 10.00
Device: 01:02:03:03:02:01, Temperature: 11.00
Device: 01:02:03:03:02:01, Temperature: 12.00
Device: 01:02:03:03:02:01, Temperature: 13.00
Device: 01:02:03:03:02:01, Temperature: 14.00

VOILA!

Bridging network adapters to share Internet connection with your RPi2/Windows 10 IOT Core

In my previous post, I shared a workaround for sharing an Internet connection via ICS when the option is disabled by domain group policy. I have since learned that there is an easier way to share your Wi-Fi adapter’s Internet connection with devices connected to your Ethernet adapter, like a Raspberry Pi running Windows 10 IoT Core. Here are the steps:

  1. Open Network and Sharing Center.
  2. Change adapter settings.
  3. Select both your Wi-Fi and Ethernet adapters.
  4. Right-click and select the option to bridge the connections.
  5. Make sure that the Internet Protocol Version 4 (TCP/IPv4) properties are set to “Obtain an IP address automatically”.
  6. To find your Windows 10 IoT Core device’s IP address, run Windows10IoTCoreWatcher.
  7. Right-click the board entry and select “Copy IP address”.
  8. Follow the PowerShell documentation here to use PowerShell to connect to your running device. You can also follow the instructions here to use SSH to connect to your device.
If this method fails, please fall back to the ICS setup workaround.

Windows 10 IoT Core / Raspbian on Raspberry Pi 2 using Windows 10’s Internet Connection Sharing (ICS)

You just got yourself a Raspberry Pi 2 (RPi 2). You could be running Raspbian or Windows 10 IoT Core. You don’t have access to a hub/switch/router to connect the RPi 2 to for an Internet connection. The next best solution is to connect the RPi 2 to your PC via Ethernet and share your Wi-Fi’s Internet connection via Internet Connection Sharing (ICS). When you go to the Wi-Fi adapter properties, you get some bad news:

WiFi-ICS-disabled

What do you do? Here’s a workaround which is definitely NOT endorsed by your friendly network administrator, but it works. NOTE: This workaround is NOT permanent, and it is not meant to flout your network administrator’s group policy; those rules are there for good reasons (security, etc.).

  1. To enable sharing on the Wi-Fi adapter, run the following command in a Command Prompt run as Administrator.
netsh wlan set hostednetwork mode=allow

 

  2. Run regedit. Go to Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\Network Connections. Edit NC_ShowSharedAccessUI and set its value data to 1. (An equivalent command-line sketch is shown below.)
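If you prefer doing this from an elevated Command Prompt instead of regedit, something along these lines should be equivalent (assuming the value is a DWORD):

reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\Network Connections" /v NC_ShowSharedAccessUI /t REG_DWORD /d 1 /f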
  3. Go back to the Wi-Fi adapter properties; now you will see the Sharing tab. If you don’t see the Sharing tab, it could be because you have not connected your Ethernet adapter (for those that come as a USB dongle); you need at least two network adapters present in order to do ICS. Check the box that says “Allow other network users to connect through this computer’s Internet connection”.

WiFi-ICS-avail

  4. Go to your Ethernet adapter properties and check the Internet Protocol Version 4 (TCP/IPv4) properties. You will see the following preconfigured for you. Do not change these settings.

ethernet-ipv4settings

 

  5. Connect the network cable between your Raspberry Pi 2 and your Windows 10 machine via the Ethernet port.
  6. When you start up Windows 10 IoT Core on your Raspberry Pi 2, you will see that the IP address is dynamically set to something like 192.168.137.2. Voila, this means the Internet connection is shared with your RPi 2.

RPi2-win10-dashboard

  7. Follow the PowerShell documentation here to use PowerShell to connect to your running device. You can also follow the instructions here to use SSH to connect to your device.

(Source: http://ms-iot.github.io/content/en-US/win10/SetupRPI.htm)

  8. To make sure ICS is enabled properly, just ping any Internet site.

pinganysite

To start sending events from Windows 10 IoT Core to Azure IoT Hub:

In your Visual Studio 2015 UWP project, go to your project properties and configure the remote machine IP.

vs2015-rpi2-props

Configure the remote machine IP as 192.168.137.2, or whichever IP address you got from step 6. Run your project.

Check Device Explorer to confirm the event has been received at the IoT Hub.

deviceexplorer

Finally, a word of caution. If you don’t see ICS sharing available in your Wi-Fi adapter settings anymore, it is because the group policy has been re-applied to your machine. That’s OK; it’s meant to protect your machine after all. When you need to enable ICS again, just redo the steps above.

Using Azure Stream Analytics to tap into an Event Hub data stream

The prerequisite is to make sure that you have requested the Stream Analytics preview if you have not already done so.

1. Create a Stream Analytics job. Jobs can only be created in two regions while the service is in preview.

sa-1

2. Add an input to your job. This is the best part, because we get to tap into an Event Hub to analyse the continuous stream of event data and potentially transform it within the same job.

sa-2-addinput

3. Select Event Hub.

sa-3-addeh

4. Configure the event hub settings. You could also tap into an event hub from a different subscription.

sa-4-ehsettings

5. Since my event hub’s event data is serialized in JSON format, that is exactly what I select in the following step.

sa-5-serializationsetting

Under the Query tab, I just insert a simple query like the following.

SELECT * FROM getfityall 

I’m not doing any transformation yet; I just want to make sure that the event data sent by my Raspberry Pi via the Event Hub REST API is coming through properly.

Next on the list is to set up the output for the job.

6. Add an output to your job. I’m using Blob storage just to keep things simple, so that I can open the CSV file in Excel to take a look at the data stream.

sa-6-output

7. Set up the Blob storage settings.

sa-7-blobsettings

8. Specify the serialization settings. I’m choosing CSV for the obvious reason stated above.

sa-8-serialization

As I pump telemetry data from my Raspberry Pi, I can see my CSV file being created/updated. Just go to the container view in your Blob storage and download the CSV file.

sa-9

Below is what my event data stream looks like. It shows event data points captured from two Raspberry Pis, one using the MPL3115 temperature sensor (part of the Xtrinsic sensor board) and another using the MCP9808 temperature sensor. The fun begins as I can write some funky transformation logic in the query and do some real-time complex event processing.

streamanalytics-temperature

Using more accurate temperature sensor with my Raspberry Pi

When I stumbled upon the MCP9808 precision temperature sensor, I was sold on its promise of up to 0.25 degrees Celsius accuracy. Just like my Freescale Xtrinsic sensor board, there’s a Python library that allows me to use the MCP9808 temperature sensor with my Raspberry Pi.

My one and only challenge was doing a proper soldering job, because the MCP9808 temperature sensor comes as a breakout board. I referred to this tutorial to prepare the header strip (with the pins) that comes with the sensor board, inserted it into my breadboard, placed the breakout board over the pins, and soldered away! It was tough. Being a software guy, there is a big mental hurdle: there is no undo, and no “let’s step into this code to see if it works or fails”. It’s a big challenge to hold a soldering iron and not apply too much heat while applying solder to the pins, or else the sensor board would be toast.

The end result: I did a better job with the first breakout board (I had bought three sensor boards as a contingency), and the other two boards were not so lucky under the soldering iron.

Here’s what the MCP9808 sensor looks like on my breadboard, with the four connections to my Raspberry Pi through the very useful T-Cobbler Plus connector.

mcp9808

Here’s a photo of the GPIO cable nicely coming out of my Raspberry Pi clear case and connecting to the T-Cobbler Plus.

mcp9808-whole

Here’s a “creative” way of making sure the clear case cover still protects most of my Raspberry Pi even with the protruding Xtrinsic sensor board. Priceless 🙂

xtrinsicsensorboard-rubberband

In my follow-up post, I will describe how I send the temperature data stream to an Azure Event Hub, and then do simple analytics of the data in transit using Azure Stream Analytics.

Installing Windows Developer Program for IoT image on Intel Galileo Gen 2

In order to set up and install the Windows image onto the Intel Galileo Gen 2, the best way is to follow the setup instructions in the Windows Developer Program for IoT. Please be very careful with the folder where you save the .cmd and .wim files: store the downloaded files in an easy-to-find folder whose name contains no spaces, otherwise the .cmd file will not execute properly. You may also want to rename the .wim file to something simpler, like galileo_v2.wim. When the .cmd file executes correctly, you should see the following:

C:\Temp>apply-BootMedia.cmd -image galileo_v2.wim -destination D:\ -hostname galileofai -password xxxxxx

C:\Temp>rem **************************************************************************

C:\Temp>rem ** Copyright (c) Microsoft Open Technologies, Inc.  All rights reserved.

C:\Temp>rem ** Licensed under the BSD 2-Clause License.

C:\Temp>rem ** See License.txt in the project root for license information.

C:\Temp>rem **************************************************************************

**** Temporary changing time zone to 'Pacific Standard Time'
**** Fat32 is local time based, and the images are created in a pacific time zone. If there is a mismatch Windows will bug check after 5 minutes.
**** Set-up work folder: C:\Users\hoongfai\AppData\Local\Temp\apply-BootMedia-15483
**** Retrieveing C:\Temp\galileo_v2.wim
****          to C:\Users\hoongfai\AppData\Local\Temp\apply-BootMedia-15483

Deployment Image Servicing and Management tool
Version: 6.3.9600.17031

Mounting image
[==========================100.0%==========================]
The operation completed successfully.
**** Customizing image C:\Users\hoongfai\AppData\Local\Temp\apply-BootMedia-15483\galileo_v2.wim
****        mounted at C:\Users\hoongfai\AppData\Local\Temp\apply-BootMedia-15483\galileo_v2.wim.mount

Deployment Image Servicing and Management tool
Version: 6.3.9600.17031

Saving image
[==========================100.0%==========================]
Unmounting image
[==========================100.0%==========================]
The operation completed successfully.
**** Applying image C:\Users\hoongfai\AppData\Local\Temp\apply-BootMedia-15483\galileo_v2.wim
****             to D:\

Deployment Image Servicing and Management tool
Version: 6.3.9600.17031

Applying image
[==========================100.0%==========================]
The operation completed successfully.
**** Mounting D:\\Windows\System32\config\SYSTEM
****       to HKEY_USERS\Galileo-15483-SYSTEM
**** Setting hostname to galileofai
**** Restoring time zone to 'Pacific Standard Time'
****
****   Successfully applied C:\Temp\galileo_v2.wim
****                     to D:\
****
****              hostname: galileofai
****              timezone: Pacific Standard Time
****              Username: Administrator
****              Password: xxxxxxxxxx
****
**** Done.

How to send sensor data from Raspberry Pi to Azure Event Hub using a Python script

This post follows on from what I intended to do, which is to pump sensor data consisting of temperature and altitude readings from the Xtrinsic sensor board to my Azure Event Hub named getfityall. The sensor board already comes with some Python scripts, as you have seen in my earlier posts. Coupled with the Microsoft Azure Python packages from the SDK, I could easily reuse a very nifty Python script I found on the blog of Kloud, a very competent Microsoft cloud partner, to send sensor data to my Azure Event Hub. The repurposed Python script looks like the following:

import sys
import azure
import socket

from azure.servicebus.servicebusservice import (
  ServiceBusSASAuthentication
  )

from azure.http import (
  HTTPRequest,
  HTTPError
  )

from azure.http.httpclient import _HTTPClient

class EventHubClient(object):

  def sendMessage(self,body,partition):
    eventHubHost = "youreventhubname.servicebus.windows.net"

    httpclient = _HTTPClient(service_instance=self)

    sasKeyName = "yourprocessorname"
    sasKeyValue = "yourprocessoraccesskey"

    authentication = ServiceBusSASAuthentication(sasKeyName,sasKeyValue)

    request = HTTPRequest()
    request.method = "POST"
    request.host = eventHubHost
    request.protocol_override = "https"
    request.path = "/youreventhubname/publishers/" + partition \
    + "/messages?api-version=2014-05"
    request.body = body
    request.headers.append(('Content-Type', \
    'application/atom+xml;type=entry;charset=utf-8'))

    authentication.sign_request(request, httpclient)

    request.headers.append(('Content-Length', str(len(request.body))))

    status = 0

    try:
        resp = httpclient.perform_request(request)
        status = resp.status
    except HTTPError as ex:
        status = ex.status

    return status

class EventDataParser(object):

  def getMessage(self,payload,sensorId):
    host = socket.gethostname()
    body = "{ \"DeviceId\" : \"" + host + "\""
    msgs = payload.split(",")

    for msg in msgs:
      sensorType = msg.split(":")[0]
      sensorValue = msg.split(":")[1]
      body += ", "
      body += "\"SensorId\" : \"" + sensorId \
           + "\", \"SensorType\" : \"";
      body += sensorType + "\", \"SensorValue\" : " \
           + sensorValue + " }"
    return body

I saved and named this Python script sendtelemetry.py.

Then in the mpl3115a2.py script, add the following import statements:

import socket
import sendtelemetry

At the end of the script, replace the Python code with the following:

mpl = mpl3115a2()
mpl.initAlt()
#mpl.initBar()
mpl.active()
time.sleep(1)
while 1:

        #print "MPL3115:", "\tAlt.", mpl.getAlt(), "\tTemp:", mpl.getTemp()
        hubClient = sendtelemetry.EventHubClient()
        parser = sendtelemetry.EventDataParser()
        hostname = socket.gethostname()
        message = "temperature:"+repr(mpl.getTemp())+",altitude:"+repr(mpl.getAlt())
        body = parser.getMessage(message,"mpl3115")
        hubStatus = hubClient.sendMessage(body,hostname)
        print "[RPi-&gt;AzureEventHub]\t[Data]"+message+"\t[Status]"+str(hubStatus)
        time.sleep(0.1)

Then execute this script by doing

sudo python mpl3115a2.py

The Azure Event Hub REST API returns an HTTP status code to indicate the result of the Send Event REST call. An HTTP status code of 201 means success. Read more about the Send Event REST call here. You can monitor your Azure Event Hub dashboard to see the incoming messages.
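For reference, the same Send Event call can be exercised with curl from the Pi; this is only a sketch that reuses the placeholder names from the script above, and the SAS token values are placeholders you would generate yourself:

curl -i -X POST \
  "https://youreventhubname.servicebus.windows.net/youreventhubname/publishers/$(hostname)/messages?api-version=2014-05" \
  -H "Authorization: SharedAccessSignature sr=...&sig=...&se=...&skn=yourprocessorname" \
  -H "Content-Type: application/atom+xml;type=entry;charset=utf-8" \
  -d '{ "DeviceId" : "test", "SensorId" : "mpl3115", "SensorType" : "temperature", "SensorValue" : 25.0 }'
# an HTTP/1.1 201 Created response means the event was accepted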

In the next post I will share more about what I’m doing with the preview of Azure Stream Analytics to do a toll-gate analysis of event data in transit. Thus far I have only been doing descriptive analytics of data at rest. This ought to be interesting because it requires a different understanding of what I would like to analyze.

Fresh Raspberry Pi at my service

I am super glad that the items I ordered from Element14 arrived overnight. I got not one but three MEMS sensor boards, and the new board is working well. Proof that I somewhat “overused” the previous board, so much so that its temperature and altitude readings stopped working.

pi@raspberryclay ~/rpi_sensor_board $ sudo python mpl3115a2.py
MPL3115: Alt. 100.24 Temp: 28.144
MPL3115: Alt. 100.24 Temp: 28.144
MPL3115: Alt. 100.24 Temp: 28.144
MPL3115: Alt. 100.24 Temp: 28.144
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128

I got a new Raspberry Pi too; it’s for my co-worker, Clayton. It’s pretty simple to clone the micro-SD card from raspberryfai and write it onto the new one. Just use Win32DiskImager. It works like a charm.

Just to make sure my ISS agent sends the right telemetry data, which consists of the MEMS sensor temperature/altitude readings, here’s an OData feed of the data captured in the cloud.

raspclayiss

I’m thinking of the next steps in my experimentation such as:

  • Revert to the plain old way of sending telemetry data from my Raspberry Pis to Azure Event Hubs using a friendly AMQP client/library.
  • Do real-time analytics of multiple data streams from Fitbit, Strava, and my Raspberry Pis (temperature/altitude). I’ve been mucking around with the new preview of Azure Stream Analytics with some success.

I have two more packages of nice toys to be delivered. I’ve ordered an Arduino Starter Kit and two Intel Galileo Gen 2 boards, among other nifty breakout boards, sensors, and kits. Stay tuned.

 

Sending sensor board data from Raspberry Pi to Intelligent Systems Service (ISS)

I have been mucking around with the Xtrinsic-sense board, which is an add-on to my Raspberry Pi. In my previous post, I executed the Python scripts that came with the sensor library I cloned from this GitHub repository. There is a good article that explains how to enable the Xtrinsic-sense board in Raspbian.

However, my aim is to get the Intelligent Systems Service (ISS) sample C application to invoke the sensor board shared library to retrieve the altimeter and temperature values, assign them to a string property within the data model, and then send it to ISS. The sensor board library comes in the form of a shared library called sensor.so.

Hence I started on a journey to try to invoke the functions within sensor.so by mirroring the Python code, but from the sample C application. Now I must admit that my Python coding skills are non-existent, and my C coding skills were really rusty; the last time I wrote a C/C++ application was in 2000/2001! What’s surprising is that I haven’t completely forgotten C. It’s like riding a bike: suddenly remnants of makefiles, linking with dynamic/static libraries, include files, and function pointers in C slowly came back to memory.

But I’m still hopeless in Python. I gave up because I didn’t understand what the following code does, except that it does some byte shifting; for what reason, it beats me.

def getTemp(self):
    t = self.readTemp()
    t_m = (t >> 8) & 0xff;
    t_l = t & 0xff;

    if (t_l > 99):
        t_l = t_l / 1000.0
    else:
        t_l = t_l / 100.0
    return (t_m + t_l)

I knew there had to be an easier way to invoke the sensor board library in C, but where’s the sample code? After much searching, I found a reverse-engineered sensor.so with C source code! How amazing; thanks to this bloke named Lars Christensen. It was hot out of the oven, to be specific just last Saturday, 13 September. To get started, all I had to do was clone his source from GitHub. In my raspberrywifai SSH session, I issued the following command (provided you have already installed git too).

git clone https://github.com/larsch/rpi-mems-sensor.git

There was a slight problem: the mpl3115a2.h header file was empty. No worries, just go to the GitHub repository at https://github.com/larsch/rpi-mems-sensor, click on mpl3115a2.h, copy the content, and paste it into the file in your working directory on the Raspberry Pi.
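Alternatively, fetching the raw file directly onto the Pi should also work; this assumes the repository’s default branch is master:

wget https://raw.githubusercontent.com/larsch/rpi-mems-sensor/master/mpl3115a2.h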

You also need to download and extract the bcm2835 library, then follow its instructions on how to make it. This creates a static library called libbcm2835.a. Next, make the reverse-engineered sensor board library so that you create a new sensor.so and libmemssensor.a. I was only interested in the static libraries libbcm2835.a and libmemssensor.a so that I could compile test.c. There wasn’t a proper make target for test, so I just issued my own gcc command.

gcc -Wall -I /home/pi/rpi-mems-sensor  -L. -lmemssensor -lbcm2835 -static test.c -o test libmemssensor.a libbcm2835.a

Do not just run ./test, because initializing bcm2835 requires elevated permissions. If you do, you will get the following error:

pi@raspberryfai ~/rpi-mems-sensor $ ./test
bcm2835_init: Unable to open /dev/mem: Permission denied

So do this instead:

sudo ./test

And voila! I get the altimeter and temperature readings.

raw alt = 11968
alt: 46.750000, temp: 23.812500
raw alt = 11968
alt: 46.750000, temp: 23.812500
raw alt = 11968
alt: 46.750000, temp: 23.812500
raw alt = 11968

The Python code which I just couldn’t comprehend is the equivalent of the following C code:

 

double getTemp() {
    int t = MPL3115A2_Read_Temp();
    int t_m = (t >> 8) & 0xFF;
    int t_l = t & 0xFF;
    if (t_m > 0x7f) t_m = t_m - 256;
    return t_m + t_l / 256.0;
}

It still beats me what it does, but that’s alright; I just wanted this code to work within the ISS sample C application. Next I embedded this code within the DATAPROVIDER_RESULT DATA_PROVIDER_INIT function. I didn’t want to create a new data model, so I reused the sample data model and chose to assign the altimeter and temperature readings to the stringProperty, as shown in the following code:

// setting stringProperty, which contains the altitude and temperature
bcm2835_init();
MPL3115A2_Init_Alt();
MPL3115A2_Active();
sleep(1);
char tempinfo[1024];
snprintf(tempinfo, sizeof(tempinfo), "alt: %f, temp: %f\n", getAlt(), getTemp());
stringProperty = tempinfo;

Compile (this also involves modifying the makefile to ensure the sensor board header files and static libraries are correctly linked), then run the ISSAgent app, and wait for it……

Info: Microsoft Intelligent Systems Agent Console
Info: Using device endpoint: getfityall.device.intelligentsystems.azure.net
Info: Registering device: Name=raspberrywifai, FriendlyName=raspberrywifai, ModelName=Contoso.Device, Description=raspberrywifai
Info: x-ms-activity-id: 557374f8-2342-4bfb-bb7b-ce19199fa77e
Info: Getting endpoints for device raspberrywifai
Info: x-ms-activity-id: 229227bf-483b-4914-a613-e447bb7caca5
Info: Got ingress queue namespace=uswepm01c02x0sb20-0.servicebus.windows.net, name=getfityall/raspberrywifai
Info: Got commands queue namespace=uswepm01c02x0sb20-0.servicebus.windows.net, topic path=DeviceBank1, subscription name=raspberrywifai
Info: Got per device token=[secret]
raw alt = 11152
Info: Uploading message 2e5f2634-bee2-4e88-b1c3-3ed991f25f41 for device raspberrywifai (payload={"@iot.devicename":"raspberrywifai", "value":[ {"@odata.context":"Contoso\/$metadata#Device\/$entity", "@odata.id":"Device('raspberrywifai')", "@Microsoft.IntelligentSystems.Vocabulary.V1.Timestamp":"2014-09-17T00:34:41Z", "structProperty":{"simpleField":17, "structField":{"int32Field":10, "int64Field":34, "doubleField":40.000000000000000, "stringField":"Wed Sep 17 10:34:40 2014\u000A", "guidField":"0F1E2D3C-4B5A-6978-8796-A5B4C3D2E1F0", "binaryField":"AAAA", "dateTimeOffsetField":"2014-09-17T10:34:40Z", "booleanField":true}}, "stringProperty":"alt: 43.562500, temp: 22.812500\u000A", "dateTimeOffsetProperty":"2014-09-17T10:34:40Z"} ]}) ...Info: Success

Or more specifically:

"stringProperty":"alt: 43.562500, temp: 22.812500\u000A"

Woo hoo! I got it. Now moving on to the next steps. This opens up other possibilities, such as sending the temperature as an event at a pre-determined interval, and setting up an alarm when the temperature falls below or rises above a certain value. The fun begins! 🙂