
Hard grapht

You would all be forgiven for assuming that bar, pie and queue line are favourite pastimes of the British.  Yet, in fact – yes, we did learn this back in GCSE maths – they are also mechanisms through which meaningless, mundane data of suspect origin can be given a Gok Wan-grade makeover, with the prime objective of padding out biblical 187-page PowerPoint presentations and 871-page Word reports (*other Microsoft productivity tools are available).  In other words, documents that nobody has the intention of ever reading.  But it becomes apparent over the years that this is perhaps the one skill which serves you well for a lifetime in certain careers.  In sales.  Consultancy.  Politics.  Or any other profession in which the only known entry requirement is the ability to chat loudly over a whizzy graph of dubious quality and value, preferably while frantically waving your arms around.

Nevertheless, we are acutely conscious of the fact that we have spent an inordinate amount of our time throughout the I-O-Tea series collecting trivial measurements.  Temperature readings.  Current draw.  The rate at which earwax build-up occurs during a typical episode of Britain's Got Talent.  Yet we have only invested just short of 01,000,000 minutes in the beautification of our data points (which if you too did your GCSE maths you would recognise as being 3.14 minutes, or raspberry π nanoseconds).

Well, it's our lucky day.

We're taking a little io-tea break after our recent escapades filming nocturnal animals in our garden, and running visual recognition to alert us to the presence of bin bag demolishing English wolverines, otherwise known as foxes.

We have had some time to undertake a dose of graphical rehabilitation. And these are our conclusions.

Wares Wolly?

Shh.  Come a little closer.  Let us show you our (not so) curious black market wares for this experiment:
  • An ESP32 development board attached to a BME280 sensor module is central to our mission.  We're using a HelTec Automation Wireless Stick development board, but it could equally be any other variant.  And no, we're not using any LoRa here.  Everything is happily in range of our home Wi-Fi.
  • BME280 is the clever little sensor from the same German conglomerate that also builds car pressure washers that can blast tiles off a Bavarian's roof.  We'll use it to measure temperature, air pressure, and humidity.  I too see that the I2C interface on the sensor can be used to connect it to our ESP32 microprocessor.
  • We’ll hand-craft ourselves a voltage divider circuit consisting of a photo-resistor (also known as the chart-topping light-dependent resistor).  This unsophisticated circuit outputs a variable analogue voltage depending on how much ambient light it detects, which is why it’ll be connected to an Analogue to Digital (ADC) pin on the ESP32 development board.
  • Last but not least, we will forward our various sensor readings obtained by our ESP32 development board and its sensors to a Raspberry Pi 3 B running AWS IoT Greengrass.  Nope, this isn’t strictly necessary.  Yep, this is just showing off.  But as we’ve had our local Greengrass node running in situ for a long time now, we might as well use it to funnel our MQTT messages from our microprocessors back to AWS IoT Core.
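For the terminally curious, the voltage divider maths is GCSE-friendly too.  Here's a little sketch in plain Python (rather than on the ESP32 itself) that converts a raw 12-bit ADC reading back into a voltage, and estimates the photo-resistor's resistance.  The 10kΩ fixed resistor, the 3.3V supply and the divider orientation are all assumptions for illustration - your circuit may well differ.

```python
# A sketch of the voltage divider maths, under assumed values:
# a 3.3 V supply, a 12-bit ADC (0-4095), and a hypothetical 10 kOhm
# fixed resistor with the LDR on the bottom leg of the divider.

VCC = 3.3          # supply voltage (V) - assumed
ADC_MAX = 4095     # 12-bit ADC full-scale reading
R_FIXED = 10_000   # fixed resistor (Ohms) - pick your own

def adc_to_voltage(raw):
    """Convert a raw ADC reading to the divider's output voltage."""
    return VCC * raw / ADC_MAX

def ldr_resistance(raw):
    """Estimate the LDR's resistance from the ADC reading.
    Vout = VCC * R_ldr / (R_FIXED + R_ldr), rearranged for R_ldr."""
    v_out = adc_to_voltage(raw)
    return R_FIXED * v_out / (VCC - v_out)

# In this orientation, bright light -> low LDR resistance -> low reading
print(round(adc_to_voltage(2048), 2))
```

A half-scale reading lands around half the supply voltage, meaning the LDR's resistance roughly matches the fixed resistor's - handy for sanity-checking the circuit.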
But didn't we promise you some priceless pictorial pleasure?

We did indeed.  Which is why, from AWS IoT Core, we will forward our readings to, and store them in, AWS Elasticsearch Service.  This is an AWS-branded and managed service centred around... guess what... Elasticsearch, a popular open-source search engine and the "E" of the "ELK" stack.

And for the all-important killer charts?  We'll dabble in two popular (and suspiciously similar-sounding) open-source visualisation projects that allow us to prettify data stored in Elasticsearch: Grafana and Kibana.  But don't take our word for it.  Here's how Camila Cabello sold this idea to the execs in the boardroom:
Grafana, ooh na-na (ay, ay)
Half of my heart is in Kibana, ooh-na-na (ay, ay)

Losing the box-plot

Are you a scatter-plot-brain, getting hysterical about histograms?

If you expand out that y axis you'll see that there has already been considerable activity earlier in the I-O-Tea series.  Like Facebook, much of it involving the needless collection of data.
  1. Frozen Pi
  2. Have-ocado
  3. Green, green grass of /home
  4. Quantitative wheezing
  5. LoRa-Wan Kenobi
  6. Soreen seems to be the hardest word
  7. Ironed curtains
  8. 20th entry fox
Green, green grass of /home was where we deployed AWS IoT Greengrass.  Quantitative wheezing was where we demonstrated that the wholesale gathering of lots of arbitrary data can only be a good thing.  Both are worth a read.

Extremely graphic comic

Turn over the page.  Here are a few uninteresting snippets from this week's volume:
  • The not-so-bosh module from Bosch (BME280) is firstly attached to the designated I2C pins of the ESP32 development board.  From here, we use the useful BME280 MicroPython library to obtain the holy trinity of sensor readings of questionable importance: temperature, air pressure and humidity.
  • We're having a (R)IoT!  Why not collect some more data for the sake of it.  Let's create a simple voltage divider circuit using a photo-resistor, and connect it to a designated ADC pin on the ESP32 development board.  We can use the standard MicroPython ADC library to retrieve this value.  This should tell us whether it's daytime or nay time (a fact which should actually be obvious just by looking out the window, or from the actual time).  Oh well.
  • We create an Elasticsearch Service domain in AWS, and configure its access policy so that it is reachable.  Then we prepare the domain with an index named sensor, containing a timestamp field of type date - ready for the ingestion of incoming JSON data.  To get this index populated, we create an AWS IoT Core rule to forward sensor readings received via a simple Greengrass subscription on to our domain in Elasticsearch Service.
  • Once it is confirmed that data is being pumped into Elasticsearch Service, we can finally point Grafana and / or Kibana at the service's endpoint and its index.  In the case of Kibana, the tool is already included as part of AWS Elasticsearch Service's ELK stack.  Grafana, on the other hand, needs to be downloaded and run.  Before long, we will be able to stare at the life-changing awe of multi-coloured line charts, self-refreshing histograms and tantalising tetrahedrons.  This really is the life you've always wanted to live.

Chartly Chaplin

In essence, we would like to use two popular open-source data visualisation tools to make our sensor data look more Hollywood, less Halewood.  And both Grafana and Kibana are able to fulfil that function nicely.  Anecdotally, the former is used more prominently for graphical analysis, the latter for log data interrogation, but there is most definitely a healthy overlap in capabilities.  More importantly, either can give us pretty pictures.  And they both rhyme with one another, along with banana.  The perfect excuse to fire them up together with the aim of producing mesmerising works of fine (ch)art.

Both tools work with Elasticsearch as the back-end JSON document repository, which is why we'll be storing our sensor readings in an AWS Elasticsearch Service domain, and a newly created index.  And this index is populated over time using AWS IoT Core from our hard-working ESP32 development board.

But don't take our word for it.  Here's Flat Eric staring intensely at two busy monitors in a control room deep inside a top secret bunker. Just look how productive (and slightly menacing) this all looks.


We are using a HelTec Automation Wireless Stick ESP32 development board.  We didn't have to, by the way, since we're not using its Semtech LoRa transceiver, nor its OLED screen.  But it just happened to be lying around.

After looking at its pinout diagram, we decided on the following pins:
  • GPIO 39 (ADC) - Photo-resistor divider circuit output
  • GPIO 13 (I2C Clock) - BME280 I2C Clock
  • GPIO 12 (I2C Data) - BME280 I2C Data
The paparazzi have been at it again, camping outside our I-O-Tea factory.  Here's an opportunistic photo of this setup. Hmm. Such excitement.

Interpreting ADC values was discussed back in Quantitative Wheezing, so we won't go over it here again.

For retrieving readings from the BME280 sensor, we'll use the extremely helpful BME280 MicroPython library. And there doesn't appear to be much to this, other than to instantiate a BME280 class, and use its read_compensated_data() method to obtain the trio of readings over I2C.

from machine import I2C, Pin
import bme280_float

# GPIO 13 is our I2C clock, GPIO 12 our I2C data (as chosen above)
i2c = I2C(scl=Pin(13), sda=Pin(12), freq=400000)
bme = bme280_float.BME280(i2c=i2c)

# read_compensated_data() returns a (temperature, pressure, humidity) tuple
temperature, pressure, humidity = bme.read_compensated_data()

We then use MQTT to pass these values over to our AWS IoT Greengrass device as JSON data.  In this instance, we're not using Greengrass for anything other than forwarding MQTT payloads received on topic rosie/sensors to AWS IoT Core.
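For illustration, here's roughly how that JSON payload might be assembled - sketched in plain Python here, although on the ESP32 itself you'd reach for MicroPython's ujson and an MQTT client such as umqtt.simple.  Beyond the timestamp and device name fields used later in this post, the field names are our own hypothetical choices.

```python
import json
import time

def build_payload(device_name, temperature, pressure, humidity, light_raw, now=None):
    """Assemble sensor readings into the JSON document we publish on
    topic rosie/sensors.  The timestamp format matches the Elasticsearch
    date mapping used later (yyyy-MM-dd HH:mm:ss)."""
    now = now or time.localtime()
    return json.dumps({
        "timestamp": time.strftime("%Y-%m-%d %H:%M:%S", now),
        "esp32_device_name": device_name,
        "temperature": temperature,     # degrees C
        "pressure": pressure,           # Pa
        "humidity": humidity,           # % relative humidity
        "light_raw": light_raw,         # raw ADC reading
    })

msg = build_payload("test-device", 22.5, 101325.0, 48.2, 2048)
# On the ESP32, this string would then be published with something like
# umqtt.simple's MQTTClient.publish(b"rosie/sensors", msg)
print(msg)
```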

In the past, we've been using all sorts for processing and storing AWS IoT data; from DynamoDB and AWS IoT Analytics, to something a little more bespoke using Lambda.  In this episode, we'll be using Elasticsearch as it is becoming increasingly popular for storing highly searchable indices of JSON documents.  And it works with Kibana and Grafana.

We start our journey by creating a new AWS Elasticsearch Service deployment.  There are a few decisions to make here.  For instance, this is definitely not "production".  This is more bizarre than custom.  A perfect reason if any to pick the "development and testing" deployment type and be done with it.

Next, we name our domain and pick a really, really (did we say really) small instance.  We do need to ensure that we don't incur unexpected charges. Like most other AWS services, please ensure that you understand the charging model (and what is covered by the free tier) before proceeding.  Otherwise you may be eating baked beans on toast, three meals a day, for the remainder of the month.

We don't plan to use much storage.  Let's start with 10GB per node, which is probably enough storage to store Jedward's entire discography in MP3 in a triple fault-tolerant, N+2 multi-copy deployment... and still have enough left for the entire collection of the Lord of the Rings films.

Giving our domain public access will do for testing purposes (we will secure it further using access policies).

Configuring the access policy is where we need to stop and think a little.  Who needs access to this thing?  From where?  Why?  What is the meaning of life?

For our test, we gave our public IP address unfettered access to the Elasticsearch Service domain.  This is probably not wise in the long run, since our public IP address assigned by our broadband provider is subject to change, and what if we need to access it from other devices, on the go?  Unfortunately, when it comes to using Grafana with AWS Elasticsearch Service, there is currently no built-in way to authenticate access using an AWS access key and secret.  If there were, we could be less prescriptive with the IP policy, and rely on authentication instead.
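For reference, an IP-restricted access policy takes roughly this shape.  The region, account ID, domain name and IP address below are all placeholders - substitute your own:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "*" },
      "Action": "es:*",
      "Resource": "arn:aws:es:eu-west-1:123456789012:domain/rosie-sensors/*",
      "Condition": {
        "IpAddress": { "aws:SourceIp": ["203.0.113.42/32"] }
      }
    }
  ]
}
```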

It is therefore highly recommended to fine tune the permissions to provide the least possible privileges required for your purposes. 

Finally, confirm all selected options, and complete the domain creation.  Then go and make a much-deserved tea.  It takes a few minutes before the domain becomes active.  Enough time for an Earl Grey.

What we now have is a pretty empty domain.  We now want to create an index called sensor in which to store all our sensor readings.  In this instance, we used Postman to manually fashion a REST API PUT request and send it to our AWS Elasticsearch Service domain endpoint.


In the request, we house JSON data with basic mapping so that it knows from where to source the timestamp field of type date.

{
  "mappings": {
    "esp32_device_name": {
      "properties": {
        "timestamp": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss"
        }
      }
    }
  }
}
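If Postman isn't your cup of Earl Grey, the same index creation can be sketched in Python with the requests library.  The endpoint below is a placeholder, and we've left the actual PUT commented out:

```python
import json

# Placeholder endpoint - substitute your own AWS Elasticsearch Service domain
ES_ENDPOINT = "https://search-rosie-sensors-abc123.eu-west-1.es.amazonaws.com"

# Mapping for the 'sensor' index: tell Elasticsearch that 'timestamp'
# is a date field in our chosen format
mapping = {
    "mappings": {
        "esp32_device_name": {
            "properties": {
                "timestamp": {
                    "type": "date",
                    "format": "yyyy-MM-dd HH:mm:ss"
                }
            }
        }
    }
}

# The actual PUT (not executed here) would look something like:
#   import requests
#   requests.put(ES_ENDPOINT + "/sensor", json=mapping)
print(json.dumps(mapping, indent=2))
```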

Our Elasticsearch domain should now be ready to receive JSON data, and dynamically create and populate fields as the JSON payloads arrive in the form of MQTT messages.  In order to receive this data from AWS IoT, we'll be creating an IoT Rule for Elasticsearch Service.

We'll create a rule with the query SELECT * FROM 'rosie/sensors', and send matching messages to our domain, endpoint and index.
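We clicked this together in the AWS console, but for the curious, the equivalent rule could be sketched as a boto3 payload.  The role ARN, endpoint and rule name below are placeholders:

```python
# Sketch of the equivalent AWS IoT topic rule, built as a boto3 payload.
# The role ARN and endpoint are placeholders - swap in your own.
rule_payload = {
    "sql": "SELECT * FROM 'rosie/sensors'",
    "actions": [
        {
            "elasticsearch": {
                "roleArn": "arn:aws:iam::123456789012:role/iot-es-role",
                "endpoint": "https://search-rosie-sensors-abc123.eu-west-1.es.amazonaws.com",
                "index": "sensor",
                "type": "esp32_device_name",
                "id": "${newuuid()}",   # let AWS IoT generate document IDs
            }
        }
    ],
}

# To actually create the rule (not executed here):
#   import boto3
#   boto3.client("iot").create_topic_rule(
#       ruleName="rosie_sensors_to_es", topicRulePayload=rule_payload)
print(rule_payload["sql"])
```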

Let's mimic our ESP32 sending in data.  If we test-fire a simple JSON payload, we should see it increment the document count on our Elasticsearch index.

{
    "timestamp": "2019-07-07 23:58:00",
    "esp32_device_name": "test-device",
    "temperature": 22
}

Besides the initial fields we defined earlier, new fields are created dynamically as they are found in the posted data.

This is all very promising, since we now appear to have sensor readings arriving in our Elasticsearch index.  And new fields are being created.

We'll now download and run Grafana on our local machine.  Grafana installation is covered extensively in its documentation so we advise that you go and read it, rather than rely on our dodgy mutterings.

By creating a data source, and pointing it at our Elasticsearch Service endpoint URL, Grafana should be able to interrogate the data held within it.
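As an alternative to clicking through the UI, Grafana can also provision data sources from a YAML file dropped into its provisioning/datasources directory.  A sketch, assuming a Grafana 6-era Elasticsearch data source and a placeholder endpoint (esVersion 60 corresponds to Elasticsearch 6.x):

```yaml
# provisioning/datasources/elasticsearch.yaml - placeholder values throughout
apiVersion: 1
datasources:
  - name: rosie-sensors
    type: elasticsearch
    access: proxy
    url: https://search-rosie-sensors-abc123.eu-west-1.es.amazonaws.com
    database: sensor        # the index we created earlier
    jsonData:
      timeField: timestamp  # matches our date-mapped field
      esVersion: 60         # Elasticsearch 6.x
```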

If the data source is happy, it should be possible to start creating dashboards and panels.


Contains needlessly graphic content from this point onwards.

Once the data source is successfully established, it is possible to create a dashboard populated with lots of panels (visualisations).  Here's an eclectic dashboard we created that shows current readings from our sensors.  Note that refresh intervals can be toggled if you want this thing to auto-update before your eyes. 

Clearly, it is also possible to report on historical values, since all data points are stored in Elasticsearch.  Here's a dashboard housing some rather scientific looking graphs.

Grafana is really rather brilliant.  And there's an array of options that can be set to make the charts come alive.

Yet the "K" in the ELK stack stands for another tool that also allows the visualisation of data - Kibana.  Much in the same way as Grafana, we could download it and launch it on our machine - but actually - it's already included as part of AWS Elasticsearch Service.  Yay.

Access policy permitting, it can be launched directly from the AWS Elasticsearch Service console.

We found this a little fiddlier to get working, but eventually it was possible to create a similar-looking dashboard with current and historical data.

Kibana's real strength appears to be its ability to search through streams of text data (think logs).  Looking in the Discover tab gives you an idea of how this might be used out in the field.

As is customary with IoT projects, we forgot about our thing for a good week or so.  And when we revisited it, we were pleasantly surprised by the accumulation of readings.  Lots of them, in fact.  Here are visualisations for the last 7 days, which clearly show some interesting patterns.

Well, that's about it, folks!

There are now countless graphs occupying our monitors.  So many, in fact, we are struggling to keep this blog post open in our browser without blue-screening our PC.  Perfect time, therefore, to claim victory and shut down our machine for the day.

So long fellow chart toppers!

Update - July 2019

It just so happened that we had this rig running during record July temperatures that came with a Europe-wide heatwave.  Granted, this development board was left next to a closed west-facing window, but the results are quite stark.

Now we know why our 3D-printed (PLA) devices started to deform in the sun.

Plotting on...

We're using this wonderful BME280 MicroPython library for this project:
Get elasticated with AWS Elasticsearch Service:
Available AWS IoT Core Rules - including for Elasticsearch - are described here:
Kibana installation and configuration is described here:
Do you have a pretend internet-connected truck that requires graphing using Kibana and AWS Elasticsearch Service?  No problem, AWS has a walk-through about this very use case:
We're using a HelTec Automation Wireless Stick ESP32 development board.  Pinouts are described here:
And here are some friendly guides from Grafana:
For some of our manual REST API trickery, we used Postman:


