Installing Ubuntu Core and Ubuntu 16.04 on Intel NUC

I have an Intel® NUC Kit DE3815TYKHE and I finally got some time to reinstall the OS on it. I’d also installed an SSD which had been lying on the top shelf of my study room. My intention is to install Ubuntu Core on the eMMC and Ubuntu 16.04 on the SSD: Ubuntu Core to try out snap packages, including the snap package for Azure IoT Edge, and Ubuntu 16.04 so that I have a local dev environment instead of constantly spinning up my Linux VM in Azure just to try things out.

  1. Upgrade BIOS. At the time of writing this post, the latest BIOS update for this NUC is version 0060.
  2. Follow the steps of flashing Ubuntu Core on the Intel NUC at the Ubuntu developer page.
  3. Instead of a standard Ubuntu image, I used the Linux Lite 64-bit image, which is based on an Ubuntu 16.04 (xenial) image. Follow the instructions here.
  4. Installing the grub bootloader failed for me. Instead I manually installed grub on my Linux Lite drive.
  5. Open up a shell window. Credit goes to the person who posted on this forum thread. It works like a charm. [Note: wherever you see “sdXY” or “sdX”, substitute the correct drive letter (“X”) and partition number (“Y”) for your Linux Lite root partition. To list your partitions, run the lsblk command.]
    sudo mount /dev/sdXY /mnt
    for i in /dev /dev/pts /proc /sys; do sudo mount -B $i /mnt$i; done
    sudo chroot /mnt
    grub-install /dev/sdX
    for i in /sys /proc /dev/pts /dev; do sudo umount /mnt$i; done
    sudo umount /mnt
  6. I turned off UEFI boot, and just stuck to Legacy boot in the BIOS. Works for me.

Running IoT Edge on Raspbian/arm32/arm-hf

I reckon I’ll always start my technical articles with a TL;DR like the following, to summarise a lengthy post.

TL;DR – Azure IoT Edge is a project which enables edge processing and analytics in IoT solutions. The modules within the IoT Edge gateway can be written in different programming languages (native C, as well as the module language bindings available for Node.js, .NET, .NET Core and Java) and can run on platforms such as Windows and various Linux distros. As part of my day job, I work with customers and partners as they build their edge modules. One of the key asks is to be able to write modules in .NET Core and deploy them on a Raspberry Pi, given its ease of use and popularity for PoC and prototyping purposes. This article explains how to run modules written for .NET Core within the same Azure IoT Edge framework.

This post is timely, as the .NET Core engineering team announced less than a week ago that .NET Core Runtime ARM32 builds are now available. More details about this announcement are available here. Please note that these builds are community supported, not yet supported by Microsoft, and are in preview. For prototyping purposes, I wouldn’t complain too much about this. My plan was to cross-compile the existing .NET Core modules on my dev machine for linux-arm, targeting the .NET Core 2.0.0 runtime. This cannot be done on Raspbian itself because only the .NET Core runtime is available there, not the SDK, and there are native runtime shared libraries for Raspbian which must be cross-compiled. I will demonstrate how you can pull a Docker container with the right cmake toolchain to cross-compile for Raspbian.

According to the documentation in the .NET Core sample for Azure IoT Edge, the current version of the .NET Core binding and sample modules were written and tested with .NET Core v1.1.1. However, I have also verified that this works with .NET Core 2.0.0.

Cross-compiling Azure IoT Edge native runtime for ARM32

If you do a search on NuGet, you will find a number of NuGet packages for Azure IoT Edge which contain native runtime libraries for several platforms, namely Windows x64, Ubuntu 16.04 LTS x64, Debian 8 x64 and .NET Standard. What about Raspbian, which is a flavour of Debian 8 on ARM32? Instead of waiting for this to be released, you can cross-compile the runtime libraries yourself. After all, this is one of the benefits of the open-source nature of the Azure IoT Edge project.

Within the Azure IoT Edge repo’s jenkins folder, there is a build script just for cross-compiling for Raspberry Pi. Running this script spits out a cmake toolchain file called toolchain-rpi.cmake. To make things simpler, the Azure IoT engineering team has created a Docker image, although there is no guarantee that this image will be kept on Docker Hub at all times. For now it is, and you can find it here. There are heaps of other Docker images available as well.

In your dev machine which has Docker daemon installed (not your Raspberry Pi), pull down the image by:

docker pull aziotbld/raspberrypi-c

Then run the Docker container:

docker run -it aziotbld/raspberrypi-c

Inside the Docker container do the following steps:

  1. Update and install all the prerequisite libraries.
apt-get update 

apt-get install -y libunwind8 libunwind8-dev gettext libicu-dev liblttng-ust-dev libcurl4-openssl-dev libssl-dev uuid-dev apt-transport-https

2. Install the .NET Core SDK in this Docker container. The image is based on Ubuntu 14.04.

curl | gpg --dearmor > microsoft.gpg

mv microsoft.gpg /etc/apt/trusted.gpg.d/microsoft.gpg 

sh -c 'echo "deb [arch=amd64] trusty main" > /etc/apt/sources.list.d/dotnetdev.list'

apt-get update

apt-get install dotnet-sdk-2.0.0

3. Git clone the Azure IoT Edge repository.

git clone

4. Target .NET Core 2.0.0 for the Azure IoT Edge .NET Core binding and modules. To do so, modify all of the .NET Core modules’ csproj files as follows:

i. Change <TargetFramework> from “netstandard1.3” to “netcoreapp2.0” so that it looks like the following:
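A hedged sketch of the relevant fragment after the change (just this property; a real module csproj contains more elements):

```xml
<PropertyGroup>
  <!-- was: <TargetFramework>netstandard1.3</TargetFramework> -->
  <TargetFramework>netcoreapp2.0</TargetFramework>
</PropertyGroup>
```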


Note: .NET Standard is a specification for the .NET APIs which form the base class libraries (BCL). In the original, specifying .NET Standard 1.3 means targeting a platform that implements .NET Standard 1.3, which in turn exposes all APIs defined in .NET Standard versions 1.0 through 1.3. More information about this here.

ii. Comment out <NetStandardImplicitPackageVersion>, like the line listed here.
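If you prefer not to edit every csproj by hand, the two changes above can be scripted. This is my own sketch, not part of the repo; run it from the repo root and verify the results, as the element layout may differ between project files.

```shell
# Sketch: bulk-apply csproj edits (i) and (ii) with sed.
retarget_csproj() {
    # (i) retarget from netstandard1.3 to netcoreapp2.0
    sed -i 's#<TargetFramework>netstandard1.3</TargetFramework>#<TargetFramework>netcoreapp2.0</TargetFramework>#' "$1"
    # (ii) comment out the NetStandardImplicitPackageVersion element
    sed -i 's#<NetStandardImplicitPackageVersion>.*</NetStandardImplicitPackageVersion>#<!-- & -->#' "$1"
}

# apply to every project file under the current directory
find . -name '*.csproj' | while read -r proj; do
    retarget_csproj "$proj"
done
```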

You can now publish specifically for the linux-arm runtime. In the shell script under tools/, add a -r flag to the dotnet commands. For example, after adding the -r flag to the dotnet restore and dotnet build lines, they look like the following:

dotnet restore -r linux-arm $project
dotnet build -r linux-arm $build_config $project

5. For some reason the test NuGet packages were missing. I trust that everything tests fine, so I commented out the last few lines of my script.

#for project in $projects_to_test; do
#    dotnet test $project
#    [ $? -eq 0 ] || exit $?
#done

6. Run

cd iot-edge
chmod 755 ./tools/

7. Now it is time to cross-compile the native runtime library for Raspbian 8 ARM32. The script expects RPiTools in the home directory, which for the root user is /root, so create a symbolic link from /home/jenkins:

ln -sd /home/jenkins/RPiTools/ /root/RPiTools

8. Run

chmod 755 ./jenkins/

Note: If you encounter cmake errors, just delete the install-dep directory and re-run the shell script above.

This creates a toolchain file at ./toolchain-rpi.cmake

9. Now it is time to build Azure IoT Edge with the .NET Core binding targeting 2.0.0, using the cmake toolchain file.

./tools/ --disable-native-remote-modules --enable-dotnet-core-binding --toolchain-file ./toolchain-rpi.cmake

10. All the native runtime libraries required to run your gateway on Raspbian and targeting .NET Core runtime 2.0.0 are in the following directories:

iot-edge/build/samples/dotnet_core_module_sample/ - *.so and *.dll
iot-edge/build/modules/ - *.so and *.dll

11. Now it is worthwhile to commit the changes you have made in the Docker container to a new image which is properly tagged/labelled. You should also copy the entire iot-edge folder out of this Docker container in a tarball and move it to your Raspberry Pi device. The steps required to do this are outside the scope of this tutorial.

12. You can check out the Azure IoT Edge samples from its GitHub repo.

13. I tried out the simulated_ble sample within the dotnetcore folder. Please make sure that you have the .NET Core 2.0.0 runtime installed on your Raspberry Pi device. I added the following loaders section in my gw-config.json file. Actually I’m not too sure if this is how it works in Linux, but I was paranoid anyhow. The right way is to export the directory in the LD_LIBRARY_PATH environment variable, I think. 🙂

    "loaders": [
        {
            "type": "dotnetcore",
            "name": "dotnetcore",
            "configuration": {
                "binding.path": "/home/pi/iot-edge/build/bindings/dotnetcore/",
                "binding.coreclrpath": "/opt/dotnet-2.0.0/shared/Microsoft.NETCore.App/2.0.0/",
                "binding.trustedplatformassemblieslocation": "/opt/dotnet-2.0.0/shared/Microsoft.NETCore.App/2.0.0/"
            }
        }
    ],
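Instead of copying libraries around, the dynamic linker can be pointed at the build output directly. A sketch, assuming the same paths as in the config above:

```shell
# Tell the dynamic linker where the cross-compiled .so files live
# before launching the gateway (paths assumed from this post's layout).
export LD_LIBRARY_PATH="/home/pi/iot-edge/build/modules:/home/pi/iot-edge/build/bindings/dotnetcore${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
```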

14. To be even more paranoid, I copied all the native runtime libraries (*.so) and the DLLs for the .NET Core binding modules into my execution folder. I also copied the native gateway host, renamed it gw, and placed it in the same folder.

15. The result:

pi@raspberryfai:~/iot-edge-samples/dotnetcore/simulated_ble/src/bin/Debug/netcoreapp2.0$ ./gw gw-config.json
Gateway is running.
Device: 01:02:03:03:02:01, Temperature: 10.00
Device: 01:02:03:03:02:01, Temperature: 11.00
Device: 01:02:03:03:02:01, Temperature: 12.00
Device: 01:02:03:03:02:01, Temperature: 13.00
Device: 01:02:03:03:02:01, Temperature: 14.00


Under the hood of a Connected Factory

TL;DR – Ingesting telemetry data is nothing new in the industrial IoT world. Typically, captured data is stored in a historian, although not all “historised tags” are stored there. What you have on-premises, then, is some data infrastructure. To do advanced analytics (leading to machine learning, leading to AI), the first step is to ingest telemetry from a larger variety of data sources into the cloud, which opens the door to interesting stream processing and analytics in the cloud. This post talks about the guts of a connected factory, and how to bridge with the existing components and systems in one.

I posted this on LinkedIn 3 months ago. The setup was for the Azure IoT Suite Connected Factory pre-configured solution. I have been meaning to publish this, and now’s the time. The integral parts of a connected factory are connected telemetry stations, and OPC UA is the standard for industrial IoT machines and systems on your plant floor.

This post is about streaming the telemetry data from SCADA systems or MES to the cloud, with Azure IoT Hub being the cloud gateway for ingestion and for maintaining the digital twins of these physical systems. The component which allows this integration is the OPC UA Publisher for Azure IoT Edge.

This reference implementation demonstrates how Azure IoT Edge can be used to connect to existing OPC UA servers and publish JSON-encoded telemetry data from these servers in OPC UA “Pub/Sub” format (using a JSON payload) to Azure IoT Hub. All transport protocols supported by Azure IoT Edge can be used, i.e. HTTPS, AMQP and MQTT (the default).

The target deployment environment is a Process Control Network (PCN) in which the target OPC UA server lives. My target environment consists of Windows Server 2016 virtual machines in an on-premises data centre. Azure IoT Edge modules are packaged into a Docker container, and the current requirement for Docker images on Windows is that they must be Windows images. Prior to this, the Dockerfile recipe for building the image and running the container targeted Linux, which is great for many purposes, but not for my target PCN environment. I made a pull request in the GitHub repo for the OPC UA Publisher with my contribution of a Dockerfile.Windows, which uses a Windows NanoServer image with the right version of .NET Core upon which this project depends. The pull request was accepted and merged by the engineering team behind this project, and they improved the recipe based on the new Azure IoT Edge architecture. All within the spirit of open source and making contributions back to the community.
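The shape of that recipe, sketched from memory (the base image tag and entry-point assembly name here are assumptions; the Dockerfile.Windows in the repo is the authoritative version):

```dockerfile
# Sketch only -- not the merged file. Image tag and DLL name are assumptions.
FROM microsoft/dotnet:2.0-sdk-nanoserver
WORKDIR /app
COPY . .
RUN dotnet restore
RUN dotnet publish -c Release -o out
# entry-point DLL name is hypothetical
ENTRYPOINT ["dotnet", "out/OpcPublisher.dll"]
```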

To test this out, I

  • Created a Windows Server 2016 VM in Azure with Docker extension enabled
  • Git cloned the repo
  • docker build -t gw -f Dockerfile.Windows .
  • docker run -it --rm gw <applicationName> <IoTHubOwnerConnectionString>

Note: It is not a requirement to run the OPC UA Publisher for Azure IoT Edge within a Docker container. However, doing so makes it easier to deploy your IoT Edge modules on your field gateway, and allows you to orchestrate your containers thereafter. I do know of certain industrial IoT vendors who deploy the bits directly onto their specialised hardware without a Docker container.

Note: If you are using Zscaler and you encounter issues with dotnet restore while building your Docker image, it is likely due to a certificate trust issue between the Docker container and Zscaler. Just alter the Dockerfile to add the Zscaler certificate to the Docker container’s trusted root certificates and that will fix the error.

Once you have the Docker container running with the OPC UA Publisher for Azure IoT Edge, what next? The next logical step is to publish your OPC UA server nodes. You can add more nodes to publishednodes.json after the OPC UA Publisher module is running. To do so, you need an OPC UA client to connect to the Azure IoT Edge OPC UA Publisher module on its exposed endpoint on port 62222, and publish a node.

You could expose this port when you run the Docker container by adding the following arguments when you run Docker on the CLI:

docker run -it --rm --expose 62222 -p 62222:62222 gw <applicationName> <IoTHubOwnerConnectionString>

If you have a simple OPC UA client, you can use it to connect to this endpoint. I used the sample .NET Core client, and I could see that the exposed OPC UA TCP endpoint allows you to add more nodes, create a subscription, etc. However, it does not let me invoke methods on the nodes; I reckon I need a full .NET application client to do so.

C:\UA-.NETStandardLibrary\SampleApplications\Samples\NetCoreConsoleClient>dotnet run opc.tcp://winozfactory:62222/UA/Publisher

A better way is to use the UA Sample Client and UA Sample Server from the OPC Foundation .NET Standard Library GitHub repo.

Using the UA Sample Client, I am able to connect to the Azure IoT Edge OPC UA Publisher endpoint of opc.tcp://winozfactory:62222/UA/Publisher.

Then go to Objects->PublisherInstance and call the PublishNode method. You need to provide the NodeID and the ServerEndpointURL as arguments. In my case, I wanted to subscribe to the simulated value in my UA Sample Server, so I provided a node ID of ns=5;i=40 and a server endpoint URL of opc.tcp://winozfactory:51210/UA/SampleServer.

Voila! publishednodes.json was updated by the OPC UA Publisher without a restart.
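For reference, the entry written for my node should look roughly like this (field names follow the publisher’s legacy publishednodes.json schema as I remember it; verify against the version you are running):

```json
[
  {
    "EndpointUrl": "opc.tcp://winozfactory:51210/UA/SampleServer",
    "NodeId": {
      "Identifier": "ns=5;i=40"
    }
  }
]
```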

To prove that telemetry is being ingested into Azure IoT Hub, use Device Explorer. Monitor the device whose identity you used in the connection string when you started the OPC UA Publisher, and you will see the telemetry serialised into JSON.

Once the telemetry stream lands in Azure IoT Hub, the sky’s the limit, literally, as your data-in-motion and data-at-rest are both in the cloud. The next step is to hook this up to Azure Time Series Insights, among the many other options you have as part of implementing a Lambda architecture using Azure first-party or third-party services. We continue to add more features to Time Series Insights, such as root cause analysis and time exploration updates. Read the article here, but here’s an excerpt:

“We’ve heard a lot of feedback from our manufacturing, and oil and gas customers that they are using Time Series Insights to help them conduct root cause analysis and investigations, but it’s been difficult for them to quickly pinpoint statistically significant patterns in their data. To make this process more efficient, we’ve added a feature that proactively surfaces the most statistically significant patterns in a selected data region. This relieves users from having to look at thousands of events to understand what patterns most warrant their time and energy. Further, we have made it easy to then jump directly into these statistically significant patterns to continue conducting an analysis.”