Gordon Strodel, Archetype Consulting


Testing IoT Mesh Network Latency


In earlier posts, we looked at how IoT data can be sent to the Microsoft Azure cloud and stored in a SQL Server database, which can then be connected to Tableau or any number of other data analysis tools. If you’ve not completed that tutorial, I’d suggest starting there. This blog post builds on those foundational concepts, and we will re-use a lot of the code and objects from before.

Background

If you are familiar with IoT, you know that each board runs a piece of code and publishes results to the internet. In a mesh network, an endpoint can publish results to the mesh itself but not to the rest of the internet; the gateway (think of the central hub of a wheel) handles the wifi connection and publication to the internet. (I am obviously oversimplifying here.) In an effort to understand mesh networking and test my network for latency, I borrowed @ninjatill’s code from Github. Thank you! 🙂

The objective of this post is to save mesh network latency data into Azure for visualization in Tableau. I’ve made this deliberately more complicated than it needs to be in order to show off Tableau and the Azure IoT connectivity; you could obviously do this with Ubidots or any number of simpler solutions. Here is an example from Github showing a Losant visual from @ninjatill:

 

Set-up the Particle Mesh

In our example, I am using an Argon (wifi gateway) and three Xenons (mesh + Bluetooth endpoints). Particle has a helpful series of blog posts on the topic; start with “Mesh 101” here.

Follow the steps in the mobile app to set up your network. I’ll assume that you have the network set up and are able to get everything connected and breathing cyan, as they say in the Particle IoT world. If you have trouble, check out the Particle forums: community.particle.io.

Xenon development board (image courtesy of Particle.io)


Once you have the network set up, download the code from Github and flash the Argon (gateway) and the Xenons (endpoints) with their respective code. You can do this OTA via the Build editor to make it really easy. Of note, the Xenons took a few minutes to download the code and flash, probably because the Argon was already flashed and running the Marco/Polo code.

 

Set-Up Particle to Azure Integration

To start, let’s adjust the Particle.io webhook integration with Azure to capture the new messages. Use the out-of-the-box (OOTB) Azure integration option in the Particle Console. Give it the name of the event from your code; in this case, I am using the default:

Event = MarcoPoloHeartbeat

Supply your IoT Hub Name, Shared Policy Name, and Shared Policy Key.

Use the default format for the JSON body:

 

{ "event": "{{{PARTICLE_EVENT_NAME}}}", "data": "{{{PARTICLE_EVENT_VALUE}}}", "device_id": "{{{PARTICLE_DEVICE_ID}}}", "published_at": "{{{PARTICLE_PUBLISHED_AT}}}" }

Assuming you set it up correctly, you should start to see webhook successes near the bottom of the screen. Here’s an example (note: the hardware was already running the code at this point):

 

Set-up the Azure Cloud components

Assuming you’ve completed the previous post, you should be starting with an Azure account, a Stream Analytics job, a SQL Server database and table, and an IoT Hub. If you don’t have any of these components working, please go back and set them up! 😎
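For reference, the landing table looks roughly like this. This is only a sketch based on the column names used in the queries below (device_id, event, published_at, data); the exact data types are an assumption, so match whatever you created in the earlier post.

-- Sketch of the landing table for the webhook messages.
-- Column names match the queries below; data types are assumptions.
CREATE TABLE dbo.particle_data (
    device_id     VARCHAR(64),
    event         VARCHAR(64),
    published_at  DATETIME2,
    data          VARCHAR(255)
);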

Once you have the integration working, go adjust your Stream Analytics query:

 

SELECT
    DEVICE_ID,
    EVENT,
    PUBLISHED_AT,
    DATA
INTO [output-sql-db]
FROM [iot-hub]

Once you re-enable the job, let’s start looking at the results in SQL Server. Use a basic query as shown to view the raw output from the webhook:

Preview of the raw webhook output in SQL Server.
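If you don’t have that screenshot handy, a minimal check along these lines works; TOP 100 and the ordering are just for convenience, and the table name matches the formatting query below:

-- Preview the most recent raw webhook rows.
SELECT TOP 100 *
FROM dbo.particle_data
ORDER BY Published_At DESC;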

Assuming it looks good, we can use the following query to pull and format the output from SQL Server. (We’ll use this same SQL in our Tableau data source later on.)

 

select
    Convert(varchar(30), Published_At, 127) as PUBLISH_DATE,
    event,
    SUBSTRING(DATA, CHARINDEX(':', DATA, 1) + 1, 1) as NUM_OF_NODES,
    RIGHT(DATA, LEN(DATA) - (CHARINDEX(':', DATA, CHARINDEX(':', DATA, 1)) + CHARINDEX(';', DATA, 1) + 1)) as MILLI_RESP_TIME,
    DATA as RAW_DATA
from dbo.particle_data
where event = 'MarcoPoloHeartbeat'

Assuming you still have the table set up and the access keys remain the same, you should start to see data flowing into the SQL Server table.

 

Visualize the data in Tableau

Once you have the data in the table and can confirm the new values flow through, it’s time to connect Tableau to Azure. Use the SQL Server connector and the SQL above to pull in the data source. After the extract is created, create a basic line chart of the Response Time (milliseconds) over time (blue line). The orange line indicates the number of nodes (Xenons) that responded to the Marco call on the mesh network; ideally, this should remain at three.

The author’s Tableau dashboard showing the mesh network latency.

I used a formula to convert the string PUBLISH_DATE value into a true date-time value in Tableau, shifted to the Eastern time zone:

 
DATEADD('hour', -5,
    DATETIME(
        MID([PUBLISH_DATE], 6, 2) + "/" +
        MID([PUBLISH_DATE], 9, 2) + "/" +
        LEFT([PUBLISH_DATE], 4) + " " +
        MID([PUBLISH_DATE], FIND([PUBLISH_DATE], 'T', 1) + 1, 2) + ":" +
        MID([PUBLISH_DATE], FIND([PUBLISH_DATE], 'T', 1) + 4, 2) + ":" +
        MID([PUBLISH_DATE], FIND([PUBLISH_DATE], 'T', 1) + 7, 2)
    )
)
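If you’d rather push the time-zone conversion into SQL instead of Tableau, Azure SQL Database supports AT TIME ZONE. Here’s a rough sketch, assuming Published_At is stored as a UTC datetime; unlike the fixed -5 hour offset above, this also handles daylight saving time.

-- Optional alternative: convert the UTC timestamp to Eastern time in SQL.
-- Assumes Published_At is a UTC datetime column.
SELECT
    Published_At AT TIME ZONE 'UTC' AT TIME ZONE 'Eastern Standard Time' AS PUBLISH_DATE_ET,
    DATA
FROM dbo.particle_data
WHERE event = 'MarcoPoloHeartbeat';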
 

Here is the interactive version of the dashboard in Tableau Public:

 

Results & Conclusion

Looking at the network latency across the nodes:

  • On average, there is a 36 millisecond response time between the Marco/Polo calls (a quick way to verify this in SQL is sketched after this list).

  • Occasionally there are spikes to 2,000 milliseconds for a ping or two; these appear to occur at random over the course of the four days’ worth of data pictured in the dashboard above.

  • Generally, all three of the Xenon endpoints are on the network, with the occasional instance of a call being missed by one Xenon. The data doesn’t capture which device missed it, but we could probably modify the code if that is important.

  • There are three instances of gaps in the data. Generally they occur in the 5-5:15pm to 7pm timeframe. Anecdotally, this is also the window when I generally have the Apple TV on for my kids to watch television before dinner, plus an hour before it goes to sleep.

  • 🚧 Be Advised. Running this code for 4+ days 24/7 generates 32k records and 2.1 GB of storage in Azure’s database. 🚧
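As a rough check on the first two bullets, you can aggregate the parsed response times directly in SQL. This is just a sketch that re-uses the string parsing from the formatting query above; MILLI_RESP_TIME comes back as text, so it is cast to an integer here.

-- Summary stats over the Marco/Polo heartbeat data.
SELECT
    AVG(CAST(MILLI_RESP_TIME AS INT)) AS AVG_RESP_MS,
    MAX(CAST(MILLI_RESP_TIME AS INT)) AS MAX_RESP_MS,
    COUNT(*)                          AS TOTAL_RECORDS
FROM (
    SELECT
        RIGHT(DATA, LEN(DATA) - (CHARINDEX(':', DATA, CHARINDEX(':', DATA, 1)) + CHARINDEX(';', DATA, 1) + 1)) AS MILLI_RESP_TIME
    FROM dbo.particle_data
    WHERE event = 'MarcoPoloHeartbeat'
) AS latency;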

In conclusion, the entire mesh network and node set-up process was just as seamless as with any other of Particle.io’s devices (despite many reported early-adopter growing pains). While there were multiple steps to get all the boards talking to each other, the mobile app (iPhone) did a great job of making it easy. I was able to get the four boards unboxed, set up, on the network, and publishing data in under an hour. 👍

And thank you to @ninjatill for their code!

