Azure API App – FTP Connector – How to solve the “227 Entering Passive Mode” error

I really like Azure Logic Apps. It reminds me of the good old days of workflows in WF, except that this is meant for simple workflow logic, and it does the trick. I particularly like the FTP Connector and the Azure BLOB Connector. Because the trigger function is not yet implemented in the Azure BLOB Connector, I found a workaround: use the FTP Connector as the trigger, then use the Blob Connector as an action. But in this particular IoT scenario it is hardly a workaround, it’s a necessity, because the “thing” can only upload my payload to an FTP server or send an email with the payload as an attachment. More about this IoT scenario I am working on in a later post.

For the past few days, I’d been stuck looking at this one error. When I clicked on the “Output Links” in my trigger history, this highly elusive message was shown:

“message”: “Unable to connect to the Server. The remote server returned an error: 227 Entering Passive Mode (104,43,19,174,193,60).\r\n.”

This is super weird because “227 Entering Passive Mode” is hardly an error; it’s a valid FTP status message for passive mode. So why is it reported as an error? Before jumping to the conclusion that this is a bug in the FTP Connector, I tried 3 different options for running an FTP server in Azure:
1. Azure websites
2. A Linux VM running vsftpd.
3. A Windows Server 2012 R2 Datacenter running FTP Server

I tried all of the above in that order. Only (1) worked, but my “thing” could not upload its event data to an Azure website FTP server. I wasn’t even going to try to fix what’s in my “thing” because this “thing” is pretty much a black box, if you will. It’s white actually, but you get what I mean. So I tried options (2) and (3).

After trying out all kinds of configurations and creating incoming rules and Azure endpoints to allow traffic, it finally dawned on me. If I want to make a VM instance reachable across a range of ports, I have to specify those ports by adding them as endpoints in the Azure management portal. The easier solution, however, is to enable an instance-level public IP address for that VM instance. With this configuration I can communicate directly with my VM instance using the public IP address, which means I can immediately run a passive FTP server, because “passive” essentially means the server chooses its data ports dynamically from a configured range.

ftpdataports
These port ranges can be huge, so I don’t think you would want to specify/add 1001 port numbers as endpoints on your VM instance. While it costs extra to have an instance-level public IP address, it’s well worth the money. Trust me.

Just remember to create an incoming rule for your firewall to allow your data channel port range.
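
If you want to sanity-check the passive setup end to end, a quick test with Python’s built-in ftplib from any machine outside Azure does the trick. This is just a rough sketch; the host, user name, and password below are placeholders for your VM’s instance-level public IP and your own FTP credentials.

from ftplib import FTP

# placeholders - replace with your VM's instance-level public IP and FTP credentials
ftp = FTP("your.vm.public.ip.address")
ftp.login("yourftpuser", "yourftppassword")

ftp.set_pasv(True)           # passive mode, the same mode the FTP Connector uses
ftp.retrlines("LIST")        # opens a data connection on one of the dynamic data ports

ftp.quit()

If the data channel port range is not open on the firewall or the endpoints, the LIST command is where it hangs or fails.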

What a relief when I finally saw the following in the FTP connector trigger history.

ftpconnectortriggersok

Potential solution to HTTP 500 Error with your WordPress site on Azure Websites

It’s been a while since I blogged, and the embarrassing part is that my blog has been down and I haven’t really had the time to troubleshoot. I did what anyone would do: searched online, which pointed me to a few posts on MSDN and Stack Overflow, but nothing really did it for me. So in case you found this post through the same searches, this could be the potential solution for you.

If you get an HTTP 500 error, you should FTP into your Azure website deployment to figure out the exact error. Here’s where you can find the detailed HTTP errors:

awsdetailederror

 

Just click on the file with the latest datetime stamp.

If your error looks like the following:

pythonfastcgimoduleerror

That means you have enabled Python in your Azure website, which you do not need because WordPress runs on PHP.

Go to your Azure website configuration and disable it as shown below:

python-off

 

This ought to fix it.

Using Azure Stream Analytics to tap into an Event Hub data stream

The prerequisite is to make sure that you have requested the Stream Analytics preview if you have not already done so.

1. Create a Stream Analytics job. Jobs can only be created in 2 regions while the service is under preview.

sa-1

2. Add an input to your job. This is the best part, because we get to tap into an Event Hub to analyse the continuous stream of event data and potentially transform it in the same job.

sa-2-addinput

3. Select Event Hub.

sa-3-addeh

4. Configure the event hub settings. You could also tap into an event hub from a different subscription.

sa-4-ehsettings

5. My event hub’s event data is serialized in JSON format, so that is exactly what I will select in the following step.

sa-5-serializationsetting

Under the Query tab, I just insert a simple query like the following.

SELECT * FROM getfityall 

I’m not doing any transformation yet; I just want to make sure that the event data sent by my Raspberry Pi via the Event Hub REST API is coming through properly.

Next on the list of steps is to setup the output in the job.

6. Add an output to your job. I’m using BLOB storage just to keep things simple, so that I can open the CSV file in Excel to take a look at the data stream.

sa-6-output

7. Set up the BLOB storage settings.

sa-7-blobsettings

8. Specify the serialization settings. I’m choosing CSV for the obvious reason stated above.

sa-8-serialization

As I pump telemetry data from my Raspberry Pi, I can see my CSV file being created/updated. Just go to the container view in your BLOB storage and download the CSV file.

sa-9
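
If you prefer to pull the output down without going through the portal, the same era azure Python package used by my telemetry script can do it too. A minimal sketch, assuming the classic BlobService API and placeholder names for the storage account, key, container, and blob:

from azure.storage import BlobService

# placeholders - your storage account name/key and the Stream Analytics output container
blob_service = BlobService(account_name="yourstorageaccount",
                           account_key="yourstorageaccountkey")

# list the blobs the job has written, then download the CSV to a local file
for blob in blob_service.list_blobs("youroutputcontainer"):
    print(blob.name)

blob_service.get_blob_to_path("youroutputcontainer", "youroutputblob.csv", "streamoutput.csv")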

Below is what my event data stream looks like. It shows event data points captured from two Raspberry Pis, one using the MPL3115 temperature sensor (part of the Xtrinsic sensor board) and another using the MCP9808 temperature sensor. The fun begins when I write some funky transformation logic in the query and do some real-time complex event processing.

streamanalytics-temperature

How to send sensor data from Raspberry Pi to Azure Event Hub using a Python script

This post follows up on what I intended to do, which is to pump sensor data consisting of temperature and altitude readings from the Xtrinsic sensor board to my Azure Event Hub named getfityall. The sensor board comes with some Python scripts already, as you have seen in my earlier posts. Coupled with the Microsoft Azure Python packages from the SDK, I could easily reuse a very nifty Python script I found on the blog of Kloud, a very competent Microsoft cloud partner, to send sensor data to my Azure Event Hub. The repurposed Python script looks like the following:

import sys
import azure
import socket

from azure.servicebus.servicebusservice import (
  ServiceBusSASAuthentication
  )

from azure.http import (
  HTTPRequest,
  HTTPError
  )

from azure.http.httpclient import _HTTPClient

class EventHubClient(object):

  # sends one event body to the event hub; 'partition' is used as the
  # publisher name in the REST path (the script passes the Pi's hostname)
  def sendMessage(self,body,partition):
    # the Service Bus namespace host; the event hub name goes in the request path below
    eventHubHost = "youreventhubname.servicebus.windows.net"

    httpclient = _HTTPClient(service_instance=self)

    # shared access policy (SAS) name and key for the event hub
    sasKeyName = "yourprocessorname"
    sasKeyValue = "yourprocessoraccesskey"

    authentication = ServiceBusSASAuthentication(sasKeyName,sasKeyValue)

    # build the POST to the event hub's publisher endpoint over HTTPS
    request = HTTPRequest()
    request.method = "POST"
    request.host = eventHubHost
    request.protocol_override = "https"
    request.path = "/youreventhubname/publishers/" + partition \
    + "/messages?api-version=2014-05"
    request.body = body
    request.headers.append(('Content-Type', \
    'application/atom+xml;type=entry;charset=utf-8'))

    # sign the request with a SAS token before sending
    authentication.sign_request(request, httpclient)

    request.headers.append(('Content-Length', str(len(request.body))))

    status = 0

    try:
        resp = httpclient.perform_request(request)
        status = resp.status
    except HTTPError as ex:
        status = ex.status

    return status

class EventDataParser(object):

  # builds the JSON message body from a payload string such as
  # "temperature:28.1,altitude:100.2", tagging it with the device hostname
  def getMessage(self,payload,sensorId):
    host = socket.gethostname()
    body = "{ \"DeviceId\" : \"" + host + "\""
    msgs = payload.split(",")

    # each comma-separated reading is a "type:value" pair
    for msg in msgs:
      sensorType = msg.split(":")[0]
      sensorValue = msg.split(":")[1]
      body += ", "
      body += "\"SensorId\" : \"" + sensorId \
           + "\", \"SensorType\" : \""
      body += sensorType + "\", \"SensorValue\" : " \
           + sensorValue + " }"
    return body

I saved and named this Python script as sendtelemetry.py.
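
For reference, here is roughly what the parser produces for a single reading; the DeviceId comes from the Pi’s hostname, so yours will differ:

import sendtelemetry

parser = sendtelemetry.EventDataParser()
body = parser.getMessage("temperature:28.1", "mpl3115")
# body is now something like:
# { "DeviceId" : "raspberryfai", "SensorId" : "mpl3115", "SensorType" : "temperature", "SensorValue" : 28.1 }
# a comma-separated payload such as "temperature:28.1,altitude:100.2" appends another
# set of SensorId/SensorType/SensorValue fields for the second reading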

Then in the mpl3115a2.py script, add the following import statements:

import socket
import sendtelemetry

At the end of the script, replace the Python code with the following:

mpl = mpl3115a2()
mpl.initAlt()
#mpl.initBar()
mpl.active()
time.sleep(1)
while 1:

        #print "MPL3115:", "\tAlt.", mpl.getAlt(), "\tTemp:", mpl.getTemp()
        # read the sensor, build the message body, and send it to the Event Hub,
        # using the Pi's hostname as the publisher name
        hubClient = sendtelemetry.EventHubClient()
        parser = sendtelemetry.EventDataParser()
        hostname = socket.gethostname()
        message = "temperature:"+repr(mpl.getTemp())+",altitude:"+repr(mpl.getAlt())
        body = parser.getMessage(message,"mpl3115")
        hubStatus = hubClient.sendMessage(body,hostname)
        print "[RPi->AzureEventHub]\t[Data]"+message+"\t[Status]"+str(hubStatus)
        time.sleep(0.1)

Then execute this script by doing

sudo python mpl3115a2.py

The Azure Event Hub REST API returns an HTTP status code to indicate the result of the Send Event REST call. An HTTP status code of 201 (Created) means success. Read more about the Send Event REST call here. You can monitor your Azure Event Hub dashboard to see the incoming messages.

In the next post I will share more about what I’m doing with the preview of Azure Stream Analytics to do a toll-gate analysis of event data in transit. Thus far I have only been doing descriptive analytics of data at rest. This ought to be interesting because it requires a different understanding of what I would like to analyze.

Fresh raspberry pi at my service

I am super glad that the items I ordered from Element14 arrived overnight. I got not one but 3 MEMS sensor boards, and they are working well. Proof that I somewhat “overused” the previous board, so much so that its temperature and altitude readings stopped working.

pi@raspberryclay ~/rpi_sensor_board $ sudo python mpl3115a2.py
MPL3115: Alt. 100.24 Temp: 28.144
MPL3115: Alt. 100.24 Temp: 28.144
MPL3115: Alt. 100.24 Temp: 28.144
MPL3115: Alt. 100.24 Temp: 28.144
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128
MPL3115: Alt. 101.32 Temp: 28.128

I got a new Raspberry Pi too; it’s for my co-worker, Clayton. It’s pretty simple to clone the micro-SD card from raspberryfai and write it onto the new one. Just use Win32DiskImager. It works like a charm.

Just to make sure my ISS agent sends the right telemetry data, which consists of the MEMS sensor temperature/altitude readings, here’s the OData feed of the data, all captured in the cloud.

raspclayiss

I’m thinking of the next steps in my experimentation such as:

  • Revert to the plain old way of sending telemetry data from my Raspberry Pis to Azure Event Hubs using a friendly AMQP client/library.
  • Do real-time analytics of multiple data streams from Fitbit, Strava, and the Raspberry Pis (temperature/altitude). I’ve been mucking around with the new preview of Azure Stream Analytics with some success.

I have 2 more packages of nice toys to be delivered. I’d ordered an Arduino Starter Kit and 2 Intel Galileo 2 boards, among other nifty breakout boards, sensors, and kits. Stay tuned.

 

Intelligent Systems (at your) Service

I was really excited that my application for the Microsoft Azure Intelligent Systems Service (ISS) Limited Public Preview (LPP) had been approved (my apologies for all the 3-letter acronyms, which I will be repeating all over my posts from now on). The ISS LPP entitles me to the following:

  • Access to the ISS service which is enabled for use with my Azure account. The ISS service icon is now available in the list of services (on the left in my Azure Management Portal) and it looks like this:

0-issazure

  • Download the ISS SDK along with a couple of other utilities such as DeviceMonitoring and the Contoso Home Automation – part 1 sample/demo (to be checked out soon)
  • Participate in a private forum.  According to the welcome email, “You are more likely to get a response in a reasonable timeframe, since you are not reliant on an individual being online and able to answer your question”, which sounds great.

I went ahead and created an ISS service for GetFitY’all. However, I cannot share too many details or screen clippings because, according to the confidential information clause in the EULA, the software and service, including the user interface, features and documentation, are confidential and proprietary to Microsoft and its suppliers.

The real fun happens inside my raspberryfai. The RPi runs Raspbian, a customized version of Debian built to run on the RPi. I used scp to copy the ISSAgent_C_Samples folder onto my raspberryfai, then compiled the sample ISS agent and ran it. The agent sends messages to my ISS account. My next step is to embed the ISS managed library into my GetFitYall device gateway, which is implemented as a WebJob described in one of my previous posts, and do the same to send the activity data points to my ISS account. Previously I implemented a simple message pump in the WebJob to send activity data points (pulled from the Fitbit and Strava APIs) asynchronously to an Azure Event Hub via AMQP. Azure Worker Role instance(s) then ingest the event hub messages by persisting them into the respective Azure Storage Tables.

VoluntaryTracking.Enabled = true;

If you’ve been using Google Maps on your phone, you probably know there is a setting that allows Google apps to use your device’s location any time it is on. Or maybe you don’t.

googlelocsettings

 

Or if you prefer the conspiracy theory version, you can read this.

The way I think of this is simply, in a geeky way, the title of this post. Since I had enabled voluntary tracking of my whereabouts, why not make good use of it and mash it all up in my Project GetFitY’all.

To get your location history as a time series of lat/long data points, go to the Google Maps Location history page. You have to sign in with your credentials first, of course, so you only see your own location history.

viewlochist

  1. Select the date/time range from the drop down list.
  2. To have a comprehensive list of location data points, click the link that says “Show All Points”.
  3. Click “Export to KML”. This downloads an XML file in the Keyhole Markup Language (KML) format.

To make sense of these location history data points, the best approach is to import them into PowerQuery. The steps were described in my previous posts about mashing up data in PowerQuery. The only difference is that this time I open an XML file and use the KML file I downloaded. It’s supposed to be a simple step of expanding the 2 columns in PowerQuery which represent the <when> and <coord> elements, but for some reason it expands the tables within those 2 columns in a weird way. Weird in the sense that for each <when> element it expands all of the <coord> elements. So if I have 1,000 data points, I end up with a total of 1,000 x 1,000 = 1 million rows. I referred to this blog post to try to expand the tables within the columns, but to no avail in my case.

Hence I did a workaround: a manual way of retrieving all the <when> rows and <coord> rows in two separate queries, copying and pasting the combined data into a separate spreadsheet, and finally saving it as a CSV file. Then I went back to PowerQuery to import the CSV file.
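
If you would rather script that workaround, a few lines of Python can pair up the <when> and <coord> elements and write the CSV directly. This is only a sketch: it assumes the export lists one coordinate per <when> in document order, and the file names are placeholders.

import xml.etree.ElementTree as ET

KML_FILE = "LocationHistory.kml"     # the KML file exported from Google Maps Location history
CSV_FILE = "locationhistory.csv"

root = ET.parse(KML_FILE).getroot()

# collect timestamps and coordinates in document order, ignoring XML namespaces
# by matching on the tag suffix (the coordinates may appear as <coord> or <gx:coord>)
whens = [e.text for e in root.iter() if e.tag.endswith("when")]
coords = [e.text for e in root.iter() if e.tag.endswith("coord")]

with open(CSV_FILE, "w") as f:
    f.write("datetime,longitude,latitude,altitude\n")
    for when, coord in zip(whens, coords):
        # a coordinate value is space-separated: longitude latitude altitude
        f.write(when + "," + ",".join(coord.split()[:3]) + "\n")

The resulting CSV imports cleanly into PowerQuery without the 1,000 x 1,000 row explosion.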

Next I need to retrieve my Fitbit data points. Good thing I already have a REST endpoint which does that for me. The REST API is implemented as a node.js app published at http://getfityall-api.azurewebsites.net/fitbit. I pass in the query strings, which consist of the date and time range, and VOILA!

Create a PowerMap, and I get this slightly different visualization below.

googlochistmashup

So there you go, the results of VoluntaryTracking.Enabled = true; and then mashing your Google location history with Fitbit data.

PowerQuery invokes a GetFitYall API endpoint, and fun with PowerMap

Is that even possible? Yes, and I’m talking about invoking it from within the PowerQuery add-in in Excel 2013 and then mucking around with the data, which is represented in JSON. Pretty awesome, I would think. To the layman: don’t worry about what this JSON thing is, it’s all transparent to you, just consume the data.

In my previous post, I wrote about the REST API I had exposed; it allows mashup on demand, which is perfect for self-analytics using PowerBI. Here’s how you can consume this API from within PowerQuery specifically. It’s just another source of data, like how I had retrieved data from Azure Table Storage. Here are the steps:

1. From the PowerQuery ribbon, click “From Web”. Then enter the URL. The URL I’m entering is the REST endpoint I have. It should be HTTPS but then this is just a PoC so I’m keeping things simple here. I pass some query strings in the URL too.

fromnode

2. Click List, which contains an array of mashed-up activity data points. Do NOT click “Into Table”, at least not yet.

powerquery-step1

 

3. Now that you have expanded the List into rows of records, click “To Table”.

powerquery-step2

4. Then you see the following dialog box, just click OK. No worries, it’ll be fine.

powerquery-step3

5. Select All Columns.

powerquery-step4

6. Fix the data type for the fields you care about, especially those you want to visualize in PowerMap. Start with datetime. Click the column header, then on the ribbon select Date/Time as the Data Type.

powerquery-step5

7. Fix the Steps column as well. Choose Whole Number. This is because you don’t have a fraction of a step, just steps. :)

powerquery-step6

8. Fix the Calories column, and set the Data Type to Decimal Number.

powerquery-step7

 

9. On the ribbon, click Close and Load To. Then this dialog box pops up. Be sure to tick “Add this data to the Data Model”. The data needs to be in the Data Model for PowerMap to work on it after this.

powerquery-step8

10. The result is a number of rows retrieved from the REST endpoint. Look ma, no JSON :)

powerquery-step9

11. If you want to look under the hood, I happen to be “tailing” the log of my node.js Azure website. Here’s proof that it’s the same 3,014 rows being returned. It took some 7 seconds to execute, which is what I mentioned in my previous post: I might not have optimized the mashup logic.

powerquery-step9-1

That’s it for the PowerQuery part. Let’s do the fun stuff of visualizing this in PowerMap.

1. Map the geography and map level by selecting the lat and long fields.

powermap-step1

 

2. Select the columns which we want to visualize in the PowerMap.

powermap-step2

 

3. Change the width of the “skyscrapers” and the colors, of course, and VOILA, you get this bird’s-eye view of where the “action” happens. In this case, I walked the most around the Sydney CBD area. I attended the Mobile Monday Sydney meeting a couple of weeks ago.

powermap-step3

 

When I showed this to my wife the other day, she asked why calories were burned even when I was sitting idle on the bus. But then she answered her own question when she said “oh yeah, we burn calories as long as we are breathing!” LOL 😀

 

 

 

GetFitYall REST API

This API exposes a single function at the moment, which does the following:

  • Mashup on demand – Lets client apps consume a mashup of fitness activity data points from different target APIs based upon user ID and time period for a specific date. Note: user ID is not implemented right now because my authorization website is not fully implemented yet.

Common usage of the GetFitYall API
1. Self-service analytics tools such as PowerQuery and PowerMap pull activity data points from the HTTP/S endpoint(s) based upon query parameters such as user ID, date, and time period.

1. Get activities mash up

GET http://getfityall-api.azurewebsites.net/mashup?<userID>&ondate=YYYY-MM-DD&aftertime=HH:mm&beforetime=HH:mm

E.g.,

http://getfityall-api.azurewebsites.net/mashup?ondate=2014-08-04&aftertime=10:00&beforetime=23:00

Description:

Gets a mashup of activity data points from different fitness APIs based upon user ID and matching timestamps. Currently supports the Fitbit intraday API and the Strava API. In order to get activity data points down to 1-minute detail, this API function only works for a specific date, as required by the Fitbit intraday API.

Query parameters
userID – The Fitbit user ID which has been authenticated and authorized by the Fitbit OAuth API (not yet implemented, see the note above)
ondate – The specific date from which to pull the Fitbit activity data points. This works on one specific day because the Fitbit Get Intraday Time Series function only allows fetching a time series for a single day, but the data points are down to 1-minute detail for that day. See https://wiki.fitbit.com/display/API/API-Get-Intraday-Time-Series
aftertime – The start of the period, in the format HH:mm
beforetime – The end of the period, in the format HH:mm

Returns: Content-type = application/json
HTTP status codes
200 – OK
400 – Error in request
500 – Error in processing

Example response:

[
{
"type":"fitbit_strava",
"date":"2014-07-20",
"data":[
{"datetime":"2014-07-20T10:00:00.000Z","latitude":-33.869406,"longitude":151.120498,"distance":12959.4,"altitude":8.5,"steps":105,"calories":13.890000343322754,"floors":0,"elevation":0},
…. omitted for brevity….
{"datetime":"2014-07-20T10:00:00.000Z","latitude":-33.869369,"longitude":151.120443,"distance":12966.1,"altitude":8.3,"steps":105,"calories":13.890000343322754,"floors":0,"elevation":0}
]
}
]
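
If you want to hit the endpoint outside of PowerQuery, here is a minimal client sketch in Python using the third-party requests package, with the same example parameters as above. The field names come straight from the example response.

import requests  # third-party: pip install requests

BASE_URL = "http://getfityall-api.azurewebsites.net/mashup"
params = {"ondate": "2014-08-04", "aftertime": "10:00", "beforetime": "23:00"}

resp = requests.get(BASE_URL, params=params)
print("HTTP status: %d" % resp.status_code)   # 200 - OK

for result in resp.json():                    # the response body is a JSON array
    print("%s on %s: %d data points" % (result["type"], result["date"], len(result["data"])))
    for point in result["data"][:3]:          # peek at the first few points
        print("%s lat=%s long=%s steps=%s" % (point["datetime"], point["latitude"],
                                              point["longitude"], point["steps"]))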

This API is implemented as a node.js app and deployed to a free/basic Azure website from WebMatrix. The reason why I chose node.js is mostly to learn a new server-side technology. The other reasons are:

  1. It is more programmable for mashup logic.
  2. I can leverage many third-party node.js modules, such as node-strava and node-fitbit, to rapidly develop prototypes of the GetFitYall API.
  3. It is scalable, as I configure this as an always-on Basic Azure website and can scale out accordingly.
  4. Due to the lightweight nature of node.js, the app can handle a large amount of traffic with low overhead.

Obvious Bottleneck

One potential problem is the sheer number of mashup requests. When I invoked this endpoint from PowerQuery, I saw that there were 3 requests for each query, which is kind of weird, but I’m not keen to find out why. Multiplied by thousands if not tens of thousands of users, it is pretty obvious that this would become a serious problem. My recommendations to address this bottleneck are as follows:

  1. Cache the HTTP Response

A simple solution is to cache the response for similar requests (based on the same query parameters). The main benefits of caching responses are reduced latency and network traffic. A rough sketch of this idea follows after the list.

2. Mashup On-demand Optimization

The node.js app makes asynchronous calls to the Fitbit API to retrieve steps, calories out, floors, and elevation, because this is how the Fitbit API works. A JavaScript promise is used to determine when all the calls have returned before processing the next step. This is another benefit of using node.js, and by further using a third-party module such as Q, callback hell can be avoided.
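
Here is the sketch of the caching idea promised above. The real endpoint is node.js, so this Python version is purely illustrative, and the TTL value is arbitrary; the point is simply that the cache key is the set of query parameters.

import time

CACHE_TTL_SECONDS = 300    # arbitrary: mashups older than 5 minutes get recomputed
_cache = {}                # (ondate, aftertime, beforetime) -> (timestamp, response)

def get_mashup_cached(ondate, aftertime, beforetime, compute_mashup):
    key = (ondate, aftertime, beforetime)
    cached = _cache.get(key)
    if cached is not None and time.time() - cached[0] < CACHE_TTL_SECONDS:
        return cached[1]                      # same query parameters: reuse the response
    response = compute_mashup(ondate, aftertime, beforetime)
    _cache[key] = (time.time(), response)
    return response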

IoT Descriptive Analysis using PowerBI

Now comes the interesting part, which is self-analytics of all the data I have collected from “the Internet of My Things” (IoMT). As a recap, I am currently ingesting activity data points from 2 devices: a Fitbit One and a Samsung S4 running 2 “sensor apps”, Strava and MapMyWalk. But it shouldn’t be limited to this, as I also have a Garmin Edge 705 with a heart-rate monitor (HRM) to track my MTB rides and a Polar FT40 wrist watch, also with HRM, to track other activities such as badminton and swimming (yeah, my one and only wearable device which works under water). A small disclaimer: I’m not a regimented fitness geek. I just want to make sense of my activities. It all started with mountain biking, and I just want to know how often I ride and for how long, to try to justify to my wife why I bought 2 mountain bikes! :) During my rides, I wear an HRM because I just don’t want to over-exert myself during those steep climbs. When I looked at my dashboards I realized there is so much information which helped me gain insights into what I’m doing well and what I’m not. It helps me be better when I ride or play sports, all without “killing” myself.

I chose PowerQuery and PowerMap, 2 very nifty PowerBI add-ins. I just want things to be simple, and nothing beats self-analytics using a friendly tool like Excel (my wife is quite an Excel junkie from her previous life). These add-ins are available as free downloads from Microsoft to enhance the data access and data visualization capabilities of Microsoft Excel 2013. You should search for the latest download links. Using these tools I can retrieve data from a variety of sources and integrate that data into my Excel data model.

I’m particularly impressed that in the “Internet of my Own Things” the data generated was pretty sizeable. There were over 8,000 Fitbit data points and 6,000 Strava data points over a few days. And this is just for myself; imagine opening this up to more devices and more users. Obviously we need a solution that is of cloud scale to make this work. If you are doing self-analytics using Excel 2013, you may want to install the 64-bit version of Excel. Your Excel may crash working on all that data. I crashed the PowerQuery and PowerMap add-ins a few times and sent feedback to Microsoft; they asked if I could reproduce it, I said yes, whenever I work on huge datasets, and they recommended I use the 64-bit version. Remember to download and install the corresponding 64-bit versions of the add-ins too.

powerqueries

 

These data were retrieved from my Azure Table Storage, which my Worker Roles diligently populated (see my previous post). You could also import data from other sources, including Facebook, which is pretty fun. Imagine being able to compare my activities with my buddies’. I am a member of a couple of mountain biking groups in Strava. This could be a side project later on.

azuretablesource

 

You need your storage account details such as the name and the storage primary key which you can get from your Azure management portal.

After I had retrieved data from the 2 Azure storage tables which store my Fitbit and Strava data points, I could “mash” them, y’all. The function for this is Merge, on the PowerQuery ribbon. First I select the Strava table, and then the Fitbit table. This is because there are more Strava data points mapping out my lat/long coordinates than Fitbit data points, which are recorded at one-minute intervals. Then I select the datetimestamp column to match on, and I only want to include matched rows. I name my merged query Getfityall. Voila, I had just mashed up both data sources without writing any code! I had tried to write the matching code myself, but I don’t think my code was all that good; I had nested for loops. I then tried to use JSONPath, but the JSONPath library I used was painfully slow. So guess what, I just let Excel do what it does best! However, in a later post I will talk about how to import the mashed-up data into Excel by calling a REST endpoint that returns the mashed-up data in JSON. And in that implementation I do have nested for loops written in node.js! (yeah please do LOL :))
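
For the curious, the Merge is essentially an inner join on the datetime stamp. A rough Python sketch of the same idea, with hypothetical field names, which avoids the nested loops by indexing one side first:

# hypothetical record shape: each row is a dict with a "datetime" key plus its own fields
def merge_by_datetime(strava_rows, fitbit_rows):
    # index the Fitbit rows (one per minute) by their timestamp
    fitbit_by_time = {row["datetime"]: row for row in fitbit_rows}

    merged = []
    for strava_row in strava_rows:            # many Strava rows can share the same minute
        fitbit_row = fitbit_by_time.get(strava_row["datetime"])
        if fitbit_row is None:
            continue                          # keep matched rows only, like the Merge
        combined = dict(strava_row)
        combined.update(fitbit_row)
        merged.append(combined)
    return merged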

Things to note when you merge the 2 tables in Excel: you have to adjust the column formats, especially for date/time and steps (by making it a whole number). Otherwise PowerMap doesn’t understand the format of your data and will be unable to render it. Then remember to load the data into the data model. This is required by PowerMap.

Next you insert a PowerMap. It is not available as a ribbon on its own. Rather, go to the Insert ribbon and click Map; you will notice a Launch PowerMap option. I wonder why such a powerful feature is tucked away here.

insertpowermap

Create a new PowerMap tour and the fun begins. Select the latitude and longitude columns; they should map correctly automatically. Next I select other columns such as DateTime, Steps and CaloriesOut.

powermap-map

Be sure not to aggregate your columns under height; otherwise you get weird-looking “skyscrapers”. Next I configure the layer options by reducing the thickness to about 25%, otherwise I get fat buildings and can’t even see the route on the roads when I click play. I also changed the layer colors accordingly. I chose the national colors of Australia, green and gold, to represent my steps and calories.

sceneoptions

Turn on the map labels. Change the playback speed. Pan, zoom in and zoom out and just play around with the map. Then create a video and it’s cool! You could even add a soundtrack!

There you go, self-service analytics of data captured from the Internet of my things. This is just descriptive analytics; I’m only visualizing data that I already have. Next you could advance to predictive and prescriptive analytics, which opens up many more possibilities. In another post I will share my thoughts on how far we have progressed in being able to derive value out of your own IoT solutions and projects in such an accelerated manner. And the best part is that you only focus on what you do best without worrying about the underlying plumbing and infrastructure. It just works!