Bridging network adapters to share an Internet connection with your RPi2/Windows 10 IoT Core

In my previous post, I shared a workaround for sharing an Internet connection via ICS when the option is disabled by domain group policy. I have since learned that there is an easier option: bridging the network adapters to share your Wi-Fi adapter's Internet connection with devices connected to your Ethernet adapter, like a Raspberry Pi running Windows 10 IoT Core. Here are the steps:

  1. Open Network and Sharing Center.
  2. Change adapter settings.
  3. Select both your Wi-Fi and Ethernet adapters.
  4. Right-click and select the option to bridge the connections.
  5. Make sure that the Internet Protocol Version 4 (TCP/IPv4) properties are set to “Obtain an IP address automatically”.
  6. To find your Windows 10 IoT Core device’s IP address, run Windows10IoTCoreWatcher.
  7. Right-click on the board item, and select “Copy IP address”.
  8. Follow the PowerShell documentation here to use PowerShell to connect to your running device. You can also follow the instructions here to use SSH to connect to your device. (A quick scripted reachability check is sketched after this list.)
If this method fails, please fall back to the ICS setup workaround.
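
For a quick scripted check that the bridged device is reachable before you open PowerShell or SSH, here is a minimal Python sketch. The IP address is a placeholder; substitute the one you copied from Windows10IoTCoreWatcher. It simply probes the SSH port (22) and the Windows Device Portal port (8080).

# Minimal reachability check for the IoT Core device on the bridged network.
# Replace DEVICE_IP with the address copied from Windows10IoTCoreWatcher.
import socket

DEVICE_IP = "192.168.1.23"      # placeholder - use your device's actual IP
PORTS = {22: "SSH", 8080: "Device Portal"}

for port, name in PORTS.items():
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(3)
    try:
        s.connect((DEVICE_IP, port))
        print("%s (port %d) is reachable" % (name, port))
    except socket.error as e:
        print("%s (port %d) not reachable: %s" % (name, port, e))
    finally:
        s.close()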

Windows 10 IoT Core / Raspbian on Raspberry Pi 2 using Windows 10’s Internet Connection Sharing (ICS)

You just got yourself a Raspberry Pi 2 (RPi 2). You could be running Raspbian or Windows 10 IoT Core. You don’t have access to a hub/switch/router to connect the RPi 2 to for an Internet connection. The next best solution is to connect the RPi 2 to your PC via Ethernet and share your Wi-Fi’s Internet connection via Internet Connection Sharing (ICS). When you go to the Wi-Fi adapter properties, you get some bad news:

WiFi-ICS-disabled

What do you do? Here’s a workaround which is definitely NOT endorsed by your friendly network administrator, but it works. NOTE: This workaround is NOT permanent, and it is not meant to flout your network administrator’s group policy; those rules are there for good reasons (security, etc.).

  1. To enable sharing on the Wi-Fi adapter, run the following command in a Command Prompt run as Administrator.
netsh wlan set hostednetwork mode=allow

 

  2. Run regedit. Go to Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\Network Connections. Edit NC_ShowSharedAccessUI and set its value data to 1. (A scripted alternative is sketched below.)
  3. Go back to the Wi-Fi adapter properties; you will now see the Sharing tab. If you don’t see the Sharing tab, it may be because your Ethernet adapter is not connected (for example, adapters that come as a USB dongle). You need at least two network adapters present in order to use ICS. Check the box that says “Allow other network users to connect through this computer’s Internet connection”.

WiFi-ICS-avail
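
If you prefer to script the registry change from step 2 rather than using regedit, here is a minimal Python sketch using the standard winreg module (named _winreg on Python 2), run from an elevated prompt. The key path and value name are exactly the ones from step 2.

# Set NC_ShowSharedAccessUI = 1 so the Sharing tab shows up again.
# Run from an elevated (Administrator) prompt on the Windows machine.
import winreg  # on Python 2: import _winreg as winreg

KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\Network Connections"

with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                        winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "NC_ShowSharedAccessUI", 0, winreg.REG_DWORD, 1)

print("NC_ShowSharedAccessUI set to 1")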

  4. Go to your Ethernet adapter properties and check the Internet Protocol Version 4 (TCP/IPv4) properties. You will see the following preconfigured for you. Do not change these settings.

ethernet-ipv4settings

 

  5. Connect the network cable between your Raspberry Pi 2 and your Windows 10 machine via the Ethernet port.
  6. When you start up Windows 10 IoT Core on your Raspberry Pi 2, you will see that it is dynamically assigned an IP address like 192.168.137.2. Voila, this means that your Internet connection is shared with your RPi 2.
RPi2-win10-dashboard

  7. Follow the PowerShell documentation here to use PowerShell to connect to your running device. You can also follow the instructions here to use SSH to connect to your device.

From <http://ms-iot.github.io/content/en-US/win10/SetupRPI.htm>

  8. To make sure ICS is working properly, just ping any Internet site from the device (or use the scripted check below).

pinganysite
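
If you prefer a scripted check on the Raspberry Pi itself, a minimal Python sketch like this one resolves a well-known hostname and opens a TCP connection to it; any site you trust will do.

# Quick Internet-connectivity check to run on the Raspberry Pi.
# Any well-known site works; www.bing.com is just an example.
import socket

HOST = "www.bing.com"

try:
    addr = socket.gethostbyname(HOST)                      # DNS resolution
    s = socket.create_connection((addr, 80), timeout=5)    # TCP reachability
    s.close()
    print("Internet connection looks good (%s -> %s)" % (HOST, addr))
except socket.error as e:
    print("No Internet connectivity: %s" % e)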

To start sending events from Windows 10 IoT Core to Azure IoT Hub:

In your Visual Studio 2015 UWP project, go to your project properties, and configure the remote machine IP.
vs2015-rpi2-props

Configure the remote machine IP as 192.168.137.2, or whichever IP address you got from step 6. Run your project.

Check Device Explorer to confirm that the event has been received at the IoT Hub.

deviceexplorer

Finally, a word of caution. If you don’t see ICS sharing available in your Wi-Fi adapter settings anymore, it is because the group policy has been re-applied to your machine. That’s OK; it’s meant to protect your machine after all. When you need to enable ICS again, just redo the steps above.

Azure API App – FTP Connector – How to solve the “227 Entering Passive Mode” error

I really like Azure Logic Apps. It reminds me of the good old days of workflows in WF, except that this is meant for simple workflow logic, but it does the trick. I particularly like the FTP connector and the Azure Blob connector. Because the trigger function is not yet implemented in the Azure Blob connector, I found a workaround: use the FTP connector as the trigger, then use the Blob connector as an action. But in this particular IoT scenario it is hardly a workaround, it’s a necessity, because the “thing” can only upload my payload to an FTP server or send an email with the payload as an attachment. More about this IoT scenario in a later post.

For the past few days, I’d been stuck looking at this one error. When I clicked on the “Output Links” in my trigger history, this highly elusive message was shown:

“message”: “Unable to connect to the Server. The remote server returned an error: 227 Entering Passive Mode (104,43,19,174,193,60).\r\n.”

This is super weird because “227 Entering Passive Mode” is hardly an error; it’s a valid FTP status message for passive mode. So why is this an error? Before jumping to the conclusion that this is a bug in the FTP connector, I tried 3 different options for hosting an FTP server in Azure:
1. Azure Websites
2. A Linux VM running vsftpd
3. A Windows Server 2012 R2 Datacenter VM running FTP Server

I tried all of the above in that order. Only (1) worked, but my “thing” could not upload the event data to an Azure Websites FTP server. I wasn’t even going to try to fix what’s in my “thing” because this “thing” is pretty much a black box, if you will. It’s white actually, but you get what I mean. So I tried options (2) and (3).

After trying all kinds of configurations and creating incoming rules and Azure endpoints to allow traffic, it finally dawned on me. If I want to make a VM instance reachable across a range of ports, I have to specify the ports by adding them as endpoints in the Azure management portal. The easier solution, however, is to enable an instance-level public IP address for that VM instance. With this configuration I can communicate directly with my VM instance using the public IP address. The advantage of doing so is that I can immediately enable a passive FTP server, because “passive” essentially means that the server chooses its data-channel ports dynamically and the client connects to them.
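
To see why those data-channel ports matter, here is a minimal Python sketch (host name and credentials are placeholders) that talks to the server in passive mode with the standard ftplib module. The directory listing and the upload each open a second, dynamically chosen data connection, which is exactly what fails when those ports are blocked.

# Minimal passive-mode FTP test against the VM's public IP.
# HOST, USER and PASSWORD are placeholders - substitute your own.
from ftplib import FTP
import io

HOST = "your-vm-public-ip-or-dns"
USER = "ftpuser"
PASSWORD = "secret"

ftp = FTP(HOST, timeout=10)
ftp.login(USER, PASSWORD)
ftp.set_pasv(True)                    # passive mode: the server picks the data port

ftp.retrlines("LIST")                 # directory listing: opens a data connection

payload = io.BytesIO(b"hello from the thing\n")
ftp.storbinary("STOR test-upload.txt", payload)   # upload: another data connection
ftp.quit()

In my case the control connection on port 21 was fine; it was presumably the dynamically chosen data ports that were blocked, which is why the connector surfaced the otherwise harmless 227 status as an error.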

ftpdataports
These port ranges can be huge, so I don’t think you would want to specify/add 1001 port numbers as endpoints on your VM instance. While it costs money to have an instance-level public IP address, it’s well worth it. Trust me.

Just remember to create an incoming rule for your firewall to allow your data channel port range.

What a relief when I finally saw the following in the FTP connector trigger history.

ftpconnectortriggersok

Potential solution to HTTP 500 Error with your WordPress site on Azure Websites

It’s been a while since I blogged, and the embarrassing part is that my blog has been down and I haven’t had the time to really troubleshoot it. I did what anyone would do: searched online, which pointed me to a few posts on MSDN and Stack Overflow, but nothing really did it for me. So in case you found this post through the same searches, this could be the solution for you.

If you get an HTTP 500 error, you should FTP into your Azure Websites deployment to figure out the exact error. Here’s where you can find the detailed HTTP errors:

awsdetailederror

 

Just click on the file with the latest timestamp.
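
If you would rather script this than browse manually, a minimal sketch along these lines lists the detailed-error files over FTP and downloads one of them. The /LogFiles/DetailedErrors path is where Azure Websites normally writes these pages, and the host, username and file name below are placeholders taken from your site’s publish profile.

# List the detailed-error pages on an Azure Websites deployment over FTP
# and download one of them. Host, credentials and file name are placeholders.
from ftplib import FTP

HOST = "waws-prod-xx-000.ftp.azurewebsites.windows.net"   # placeholder from the publish profile
USER = r"yoursite\$yoursite"                              # placeholder
PASSWORD = "publish-profile-password"                     # placeholder

ftp = FTP(HOST, timeout=15)
ftp.login(USER, PASSWORD)
ftp.cwd("/LogFiles/DetailedErrors")

for name in ftp.nlst():              # list the error pages; pick the newest one
    print(name)

chosen = "ErrorPage500.htm"          # substitute the file name you picked above
with open(chosen, "wb") as f:
    ftp.retrbinary("RETR " + chosen, f.write)
ftp.quit()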

If your error looks like the following:

pythonfastcgimoduleerror

That means you have enabled Python in your Azure website, which you do not need because WordPress runs on PHP.

Go to your Azure website configuration and disable it like shown below:

python-off

 

This ought to fix it.

Using Azure Stream Analytics to tap into an Event Hub data stream

The prerequisite is to make sure that you have requested access to the Stream Analytics preview, if you have not already done so.

1. Create a Stream Analytics job. Jobs can only be created in two regions while the service is in preview.

sa-1

2. Add an input to your job. This is the best part, because we get to tap into an Event Hub to analyse the continuous stream of event data and potentially transform it in the same job.

sa-2-addinput

3. Select Event Hub.

sa-3-addeh

4. Configure the event hub settings. You could also tap into an event hub from a different subscription.

sa-4-ehsettings

5. Since my event hub’s event data is serialized in JSON format, that is exactly what I select in the following step.

sa-5-serializationsetting

Under the Query tab, I just insert a simple query like the following.

SELECT * FROM getfityall 

I’m not doing any transformation yet; I just want to make sure that the event data sent by my Raspberry Pi via the Event Hub REST API is coming through properly.

Next on the list of steps is to set up the output in the job.

6. Add an output to your job. I’m using Blob storage just to keep things simple, so that I can open the CSV file in Excel to take a look at the data stream.

sa-6-output

7. Set up the Blob storage settings.

sa-7-blobsettings

8. Specify the serialization settings. I’m choosing CSV for the obvious reason stated above.

sa-8-serialization

As I pump telemetry data from my Raspberry Pi, I can see my CSV file being created/updated. Just go to the container view in your Blob storage and download the CSV file.

sa-9
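
Instead of Excel, a quick Python sketch like the one below can summarize the downloaded CSV. The file name and the column names (deviceid, sensortype, sensorvalue) are assumptions based on the JSON fields my sender script emits, so adjust them to whatever your output actually contains.

# Quick look at the Stream Analytics CSV output without opening Excel.
# Column names are assumptions based on the sender's JSON fields
# (DeviceId, SensorType, SensorValue) - adjust to match your file.
import csv
from collections import defaultdict

totals = defaultdict(lambda: [0.0, 0])   # (device, sensortype) -> [sum, count]

with open("streamanalytics-output.csv") as f:          # the downloaded blob
    for row in csv.DictReader(f):
        key = (row["deviceid"], row["sensortype"])
        totals[key][0] += float(row["sensorvalue"])
        totals[key][1] += 1

for (device, sensortype), (total, count) in sorted(totals.items()):
    print("%s %s: %d readings, average %.2f" % (device, sensortype, count, total / count))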

Below is what my event data stream looks like. It shows event data points captured from two Raspberry Pis, one using the MPL3115 temperature sensor (part of the Xtrinsic sensor board) and another using the MCP9808 temperature sensor. The fun begins when I write some funky transformation logic in the query and do some real-time complex event processing.

streamanalytics-temperature

Send accurate temperature data from Raspberry Pi to Azure Event Hub

This is a follow-up to my previous post; this time it is about sending accurate temperature data from the MCP9808 temperature sensor board to an Azure Event Hub. This is done through the MCP9808 Python library provided by Adafruit, together with the event hub code I had repurposed for the Xtrinsic sensor board (this is the updated version). Inside the ~/Adafruit_Python_MCP9808/examples directory, I made a copy of the simpletest.py script as send2eventhub.py.

The first step is to import eventhubms and socket.

import socket
import eventhubms

The parts which I had modified are the following:

# Loop printing measurements every second.
print 'Press Ctrl-C to quit.'
hubClient = eventhubms.EventHubClient()
parser = eventhubms.EventDataParser()
hostname = socket.gethostname()

# sensor and time are already set up earlier in the copied simpletest.py script.
while True:
    temp = sensor.readTempC()
    message = 'Temperature: {0:0.3F}'.format(temp)
    body = parser.getMessage(message, "MCP9808")
    print "\n" + body
    hubStatus = hubClient.sendMessage(body, hostname)
    print "[RPi->EventHub]\t[Data]" + message + "\t[Status]" + str(hubStatus)
    time.sleep(1.0)

The event hub client is no different from the one I described in my previous post for sending data from the Xtrinsic sensor board. The event data is sent to the event hub via its REST API. It’s pretty simple.

Then, to make sure that the event data is sent correctly and can be consumed from the event hub, I used Azure Stream Analytics, which is the simplest way to set the event hub as an input. Otherwise you would have to write code to either receive events directly from the event hub or use the EventProcessorHost, as I did in my scalable event hub processor in an Azure worker role.

Since this is a separate topic by itself, I will write another post about how to create a new Stream Analytics job that adds an event hub as the data stream from which event data will be consumed and transformed.

Using more accurate temperature sensor with my Raspberry Pi

When I stumbled upon the MCP9808 precision temperature sensor, I was sold on its promise of up to 0.25 degrees Celsius accuracy. Just like my Freescale Xtrinsic sensor board, there’s a Python library that allows me to use the MCP9808 temperature sensor with my Raspberry Pi.

My one and only challenge was doing a proper soldering job, because the MCP9808 temperature sensor comes as a breakout board. I referred to this tutorial to prepare the header strip (with the pins) that comes with the sensor board, inserted it into my breadboard, placed the breakout board over the pins, and soldered away! It was tough. Being a software guy, there is a big mental hurdle because there is no undo, and no “let’s step into this code to see if it works or fails”. It’s a big challenge to hold a soldering iron and not apply too much heat while applying solder to the pins, or the sensor board would be toast.

The end result: I did a better job with the first breakout board (I bought 3 sensor boards as contingency), and the other two boards were not so lucky under the soldering iron.

Here’s what the MCP9808 sensor looks like on my breadboard, with the 4 connections to my Raspberry Pi through the very useful T-Cobbler Plus connector.

mcp9808

Here’s a photo of the GPIO cable nicely coming out of my Raspberry Pi clear case and connecting to the T-Cobbler Plus.

mcp9808-whole

Here’s a “creative way” of making sure the clear case cover still protects most of my Raspberry Pi even with the protruding Xtrinsic sensor board. Priceless :)

xtrinsicsensorboard-rubberband

In my follow-up post, I will describe how I send the temperature data stream to an Azure Event Hub, and then do simple analytics of the data in transit using Azure Stream Analytics.

Installing Windows Developer Program for IoT image on Intel Galileo Gen 2

In order to set up and install the Windows image onto the Intel Galileo Gen 2, the best way is to follow the setup instructions in the Windows Developer Program for IoT.
Please be very careful about the folder where you save the .cmd and .wim files. It is best to store the downloaded files in an easy-to-find folder whose name contains no spaces; otherwise the .cmd file will not execute properly.
You may also want to rename the .wim file to a simpler name like galileo_v2.wim. When the .cmd file executes correctly, you should see the following:

C:\Temp>apply-BootMedia.cmd -image galileo_v2.wim -destination D:\ -hostname galileofai -password xxxxxx

C:\Temp>rem **************************************************************************

C:\Temp>rem ** Copyright (c) Microsoft Open Technologies, Inc.  All rights reserved.

C:\Temp>rem ** Licensed under the BSD 2-Clause License.

C:\Temp>rem ** See License.txt in the project root for license information.

C:\Temp>rem **************************************************************************

**** Temporary changing time zone to 'Pacific Standard Time'
**** Fat32 is local time based, and the images are created in a pacific time zone. If there is a mismatch Windows will bug check after 5 minutes.
**** Set-up work folder: C:\Users\hoongfai\AppData\Local\Temp\apply-BootMedia-15483
**** Retrieveing C:\Temp\galileo_v2.wim
****          to C:\Users\hoongfai\AppData\Local\Temp\apply-BootMedia-15483

Deployment Image Servicing and Management tool
Version: 6.3.9600.17031

Mounting image
[==========================100.0%==========================]
The operation completed successfully.
**** Customizing image C:\Users\hoongfai\AppData\Local\Temp\apply-BootMedia-15483\galileo_v2.wim
****        mounted at C:\Users\hoongfai\AppData\Local\Temp\apply-BootMedia-15483\galileo_v2.wim.mount

Deployment Image Servicing and Management tool
Version: 6.3.9600.17031

Saving image
[==========================100.0%==========================]
Unmounting image
[==========================100.0%==========================]
The operation completed successfully.
**** Applying image C:\Users\hoongfai\AppData\Local\Temp\apply-BootMedia-15483\galileo_v2.wim
****             to D:\

Deployment Image Servicing and Management tool
Version: 6.3.9600.17031

Applying image
[==========================100.0%==========================]
The operation completed successfully.
**** Mounting D:\\Windows\System32\config\SYSTEM
****       to HKEY_USERS\Galileo-15483-SYSTEM
**** Setting hostname to galileofai
**** Restoring time zone to 'Pacific Standard Time'
****
****   Successfully applied C:\Temp\galileo_v2.wim
****                     to D:\
****
****              hostname: galileofai
****              timezone: Pacific Standard Time
****              Username: Administrator
****              Password: xxxxxxxxxx
****
**** Done.

How to send sensor data from Raspberry Pi to Azure Event Hub using a Python script

This post follows up on what I intended to do, which is to pump sensor data consisting of temperature and altitude readings from the Xtrinsic sensor board to my Azure Event Hub named getfityall. The sensor board already comes with some Python scripts, as you have seen in my earlier posts. Coupled with the Microsoft Azure Python packages from the SDK, I could easily reuse a very nifty Python script I found on the blog of Kloud, a very competent Microsoft cloud partner, to send sensor data to my Azure Event Hub. The repurposed Python script looks like the following:

import sys
import azure
import socket

from azure.servicebus.servicebusservice import (
  ServiceBusSASAuthentication
  )

from azure.http import (
  HTTPRequest,
  HTTPError
  )

from azure.http.httpclient import _HTTPClient

class EventHubClient(object):

  def sendMessage(self,body,partition):
    eventHubHost = "youreventhubname.servicebus.windows.net"

    httpclient = _HTTPClient(service_instance=self)

    sasKeyName = "yourprocessorname"
    sasKeyValue = "yourprocessoraccesskey"

    authentication = ServiceBusSASAuthentication(sasKeyName,sasKeyValue)

    request = HTTPRequest()
    request.method = "POST"
    request.host = eventHubHost
    request.protocol_override = "https"
    request.path = "/youreventhubname/publishers/" + partition \
    + "/messages?api-version=2014-05"
    request.body = body
    request.headers.append(('Content-Type', \
    'application/atom+xml;type=entry;charset=utf-8'))

    authentication.sign_request(request, httpclient)

    request.headers.append(('Content-Length', str(len(request.body))))

    status = 0

    try:
        resp = httpclient.perform_request(request)
        status = resp.status
    except HTTPError as ex:
        status = ex.status

    return status

class EventDataParser(object):

  def getMessage(self,payload,sensorId):
    host = socket.gethostname()
    body = "{ \"DeviceId\" : \"" + host + "\""
    msgs = payload.split(",")

    for msg in msgs:
      sensorType = msg.split(":")[0]
      sensorValue = msg.split(":")[1]
      body += ", "
      body += "\"SensorId\" : \"" + sensorId \
           + "\", \"SensorType\" : \"";
      body += sensorType + "\", \"SensorValue\" : " \
           + sensorValue + " }"
    return body

I saved this Python script as sendtelemetry.py.
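
Before wiring it into the sensor script, you can sanity-check the client on its own. A minimal sketch (the payload values below are made up) might look like this:

# Quick standalone test of sendtelemetry.py with a made-up payload.
import socket
import sendtelemetry

hubClient = sendtelemetry.EventHubClient()
parser = sendtelemetry.EventDataParser()

message = "temperature:25.5,altitude:100.2"           # fake readings
body = parser.getMessage(message, "mpl3115")
status = hubClient.sendMessage(body, socket.gethostname())

print("Body: " + body)
print("HTTP status: " + str(status))                  # 201 means the event was accepted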

Then in the mpl3115a2.py script, add the following import statements:

import socket
import sendtelemetry

At the end of the script, replace the Python code with the following:

mpl = mpl3115a2()
mpl.initAlt()
#mpl.initBar()
mpl.active()
time.sleep(1)
while 1:

        #print "MPL3115:", "\tAlt.", mpl.getAlt(), "\tTemp:", mpl.getTemp()
        hubClient = sendtelemetry.EventHubClient()
        parser = sendtelemetry.EventDataParser()
        hostname = socket.gethostname()
        message = "temperature:"+repr(mpl.getTemp())+",altitude:"+repr(mpl.getAlt())
        body = parser.getMessage(message,"mpl3115")
        hubStatus = hubClient.sendMessage(body,hostname)
        print "[RPi-&gt;AzureEventHub]\t[Data]"+message+"\t[Status]"+str(hubStatus)
        time.sleep(0.1)

Then execute this script by doing

sudo python mpl3115a2.py

The Azure Event Hub REST API returns an HTTP status code to indicate the result of the Send Event REST call. An HTTP status code of 201 means success. Read more about the Send Event REST call here. You can monitor your Azure Event Hub dashboard to see the incoming messages.
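
If you want the script to tolerate transient failures, a small sketch like this wraps sendMessage with a few retries, treating anything other than 201 as a failure; the retry count and delay are arbitrary values of my own choosing.

# Retry wrapper around EventHubClient.sendMessage - anything other than
# HTTP 201 (Created) is treated as a transient failure and retried.
import time

def send_with_retry(hubClient, body, partition, attempts=3, delay=2.0):
    for attempt in range(1, attempts + 1):
        status = hubClient.sendMessage(body, partition)
        if status == 201:
            return status
        print("Attempt %d failed with HTTP %s, retrying..." % (attempt, status))
        time.sleep(delay)
    return status

In the loop above you would then call send_with_retry(hubClient, body, hostname) instead of hubClient.sendMessage(body, hostname).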

In the next post I will share more about what I’m doing with the preview of Azure Stream Analytics to do a toll-gate analysis of event data in transit. Thus far I have only been doing descriptive analytics of data at rest. This ought to be interesting because it requires a different understanding of what I would like to analyze.

Fresh Raspberry Pi at my service

I am super glad that the items I ordered from Element14 arrived overnight. I got not one but three MEMS sensor boards, and the new one is working well. Proof that I somewhat “overused” the previous board, so much so that its temperature and altitude readings stopped working.

pi@raspberryclay ~/rpi_sensor_board $ sudo python mpl3115a2.py
MPL3115: Alt. 100.24 Temp: 28.144
MPL3115: Alt. 100.24 Temp: 28.144
MPL3115: Alt. 100.24 Temp: 28.144
MPL3115: Alt. 100.24 Temp: 28.144
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128

I got a new Raspberry Pi too; it’s for my co-worker, Clayton. It’s pretty simple to clone the micro-SD card from raspberryfai and write it onto the new one. Just use Win32DiskImager. It works like a charm.

Just to make sure my ISS agent sends the right telemetry data, consisting of the MEMS sensor temperature/altitude readings, here’s an OData feed of the data captured in the cloud.

raspclayiss

I’m thinking of the next steps in my experimentation such as:

  • Revert to the plain old way of sending telemetry data from my Raspberry Pis to Azure Event Hubs using a friendly AMQP client/library.
  • Do real-time analytics of multiple data streams from Fitbit, Strava, and my Raspberry Pis (temperature/altitude). I’ve been mucking around with the new preview of Azure Stream Analytics with some success.

I have 2 more packages of nice toys to be delivered. I’d ordered an Arduino Starter Kit and 2 Intel Galileo Gen 2 boards, among other nifty breakout boards, sensors, and kits. Stay tuned.