
Chariots of wire

There have been some mesmerising feats of human endurance recently.  Eliud Kipchoge became the first runner ever to run a marathon in under 2 hours.  Brigid Kosgei set a women's marathon record, only a day later.  A bloke named Rylan apparently sang karaoke non-stop for 24 hours.  And, a humble Raspberry Pi survived 18 hours outdoors, with the help of LiPo supplements that WADA have prohibited alongside EPO, Lance Armstrong's chocolate brownies and Ben Johnson's "nutritional" flapjacks.

But we should never, ever be totally satisfied with our successes in life, however grand or infinitesimal.  As consistently demonstrated by the mishaps encountered during our blog series, often centred around an empty Flora tub (or two, or three hundred), everything can be improved upon with a subtle dollop of common sense (definitely not performance-enhancing drugs), often mixed with semiconductors dubiously imported from China.  After all, it's all about the margarine-nal gains.

Yet to achieve our next ZeroDivisionError%-level gain in endurance, we aren't going to rely on an assortment of expensive, branded gear that looks good on athletic models Photoshopped in the Alps, but not so much on ordinary folk attempting a jog alongside rush hour traffic on the A4, the morning after a cousin's stag party.  An all-weather GPS watch that costs more than a year's worth of parking tickets accumulated in central Bath.  Nutritional supplements that, if taken in quick succession, make you run the fastest that you've ever run... to the bathroom, or nearest hedge.  Not even an aerodynamic onesie fashioned out of exotic zucchini extracts (because surely, that must be a thing).

Instead, we will attempt to improve our sub par walking-slash-running performance by strapping suspect microcontrollers to our already tortured legs using elasticated bands and ™™™Velcro™™™.  All in a comical attempt to analyse what we have rather academically coined: "DA. LEG. MOVEMENT".

No: this isn't remotely scientific. Yes: we will look utterly foolish in the process. But as the legendary Sir Mo Farahday once famously observed, it's not about the winning, but the taking apart (of perfectly working gadgetry) that counts.

Besides, what else is there to do while we wait for our fluorescent yellow Zu-kini to arrive through the post? C'mon eBay seller BargainZukinis4U.  We could really do with that 279% gain in performance promised by the ad.

Lance ARM®strong:

Right, let's get all juiced up*!

*In a totally legal way.

Like the life of a totally innocent and always persecuted athlete who receives mysterious packages from the team doctor at night (and we aren't talking deliveries of emergency LR44 batteries from Amazon Prime), what feels like a simple, attainable objective ends up placing us in very sticky situations (quite literally in some places).

After all, how hard could it be to automagically track the positions we find our depleted legs in?  Well, actually quite hard, it transpires.

Could we strap our ancient school protractors to our limbs using Gorilla tape and pay some cash-strapped medical students on zero-hours contracts to measure the angular displacement of our legs every couple of milliseconds?  Nah, not really practical.  Nor ethical.  Could we get our hands on a medical device that does this for us?  Nope, we don't really have the budget.  Such a device is called a goniometer, and they appear to be quite expensive and rather over-the-top for this job.

Which is why we're ultimately taking inspiration from this academic paper we stumbled upon in the Wild Wild Web.  That's right.  We'll be emulating a goniometer using the following components:

  • First, though, We Need to Talk About Kevin "DA. LEG. MOVEMENT".  For the sake of simplicity, like Bristol Road in Bath, we're going to assume that there are only two moving parts to our leg: an Upper and a Lower (sure, let's discount the importance of the feet).  Let's dumb it down some more.  We'll assume that they only rotate in one dimension at the joints - up or down - although in reality, there is clearly some sidewards displacement as well - eek!  Yep, that's right.  We're in fact talking about the "pitch".  If we're sounding suspiciously specific in our vocabulary, it's because we're deliberately getting passionate in the Language of Laaaab.  For we intend to use cheap-as-microchips Inertial Measurement Units (IMUs) found in our lab to record angular displacement of our two limbs in relation to gravitational force.  And that's why we find ourselves - in the wise words of the Walrus of Laaaab - getting it on with the InvenSense MPU6050 3-axis gyroscope / accelerometer modules - the Nissan Micra of the IMU world (whatever that means).
  • Yet, something needs to retrieve the gyroscope / accelerometer readings from these pesky IMUs at a relatively high frequency.  Any microcontroller with I2C support should do quite nicely here.  And since it's going to be inexplicably attached to our leg, we ought to go for something small and lightweight (and easily replaceable when it meets its inevitable demise at the Avon Canal).  To this end, we'll use an ESP32 development board running MicroPython - because we simply Can't Get Enough of its Love.
  • Finally, just to make sure this rig is working as expected and the readings look semi-sensible, we'll send our payloads using MQTT to a Raspberry Pi running an MQTT broker.  This bit isn't central to this post, and won't be described in huge detail, but it is certainly an area that we'll look at next so that we can do what we need to do with the data (if anything).  For now, we're simply running Flask, Flask-MQTT, and Flask-SocketIO on the Raspberry Pi so that we can make a quick and dirty graphic of the moving legs using the power of HTML5 and its canvas feature in a browser.

A Solid State Relay Team:

This is the long-awaited threequel in our so-called-adventurous, do-mundane-stuff-outside-to-make-it-look-more-interesting blog series - I-O-Mr-T.  Inevitably, however, the UK weather has deteriorated somewhat since our earlier excursions in the summer.  Just take a look at the glorious photos from our previous episodes. 

  1. Gold Filing
  2. Castle Track-a-lot

Yet - really - this craziness is a wholly illogical response to the events that unfolded in Taking a Peak: Xtreme² Edition.  Doesn't everyone have an urge to crudely attach IMUs to their legs after an endurance event to give them that Dave Brailsford-esque 1% advantage?  Anyone?  Hello?!  Anyone???


We'll take this opportunity to reiterate that cleverer people at a Colombian university have already demonstrated this concept rather more professionally using formulae and graphs and stuff.  However, not to be outdone by evidently more qualified people in academia, we will clumsily implement our very own interpretation using devices we find around our atypical household.

Here's what our recently struck off physio ordered:

  • Once the MPU6050 IMUs are cabled up to the ESP32 development board for I2C, we'll start with a quick recap on how to measure pitch using the accelerometer.  Of course, we'll be doing this from within MicroPython.  Because that's just how we roll (quite literally down the hill when we trip over our irresponsibly placed Dupont cables).
  • Then, we combine the accelerometer pitch output with the output from the gyroscope using a Complementary Filter in an attempt to estimate a value that is less prone to noise, doesn't drift over time, and is hopefully more resistant to distortions caused by running motion, black holes, The Apprentice final, swarming grasshoppers and Brexit - strictly in this order.
  • For bonus points, we're going to attach the IMUs to some elastic fabric in an attempt to make the monstrosity a trendy "wearable" that you might find in an Apple Store... discarded on the floor in the janitor's cupboard.

What viewers might find most striking about the photo below, apart from the distinct lack of a Zukini, is that there is a rather concerning intent on our part to actually use this outside, albeit under the cover of night to avoid awkward questions from the Avon and Somerset Constabulary.  Questions such as "Why on earth are you not using C for your embedded programming?" and "Why should we not arrest you tonight for possession with intent to supply Class-A bugs?"

Shoeports Science:

This entire premise is surely built on dodgy scientific ground, isn't it?  Well, like Simon Pegg in Run Fatboy Run, we're just going to keep going, regardless.  Because it's all about reaching the end credits.  The rest of the storyline is immaterial.

We explained earlier how we intend to use the IMUs' built-in accelerometer and gyroscope to take readings at a reasonably speedy sampling rate.  By combining the two readings with different weightings - using some Sensor Fusion, or more specifically, a Complementary Filter - we should be able to estimate the angles of our upper and lower legs, in near real-time.  Then, by using the magic of MQTT, JavaScript and HTML5, we ought to be able to visually depict our leg position as deciphered by the ESP32, to ensure we're collecting the data that we think we are.  Guess this bit sounds a little like the established field of Motion Capture.

So why such negativity? Well, for starters, we're British.  But here are some other causes for concern:

Firstly, the body will be in motion, so there will be all sorts of other dynamic forces in play.  Then, there's the sheer quantity of data.  Where will we store it?  How will we get it to AWS IoT?  How will we pay our AWS bills, once we've bombarded it with 7 trillion JSON messages about a pair of legs that is plodding through a saturated, slippery field covered in grass and cow manure?  And what will we do with this muddy, pongy data puddle festering in the cloud like our trail running shoes after the last run? 

So many questions... can we concoct any answers?

No.  But here's a random picture of our IMUs against a backdrop of Flora tubs to keep the suspense desperately alive.  One day - we're certain - Upfield will make us their global ambassador for a new range of delicious butter substitutes: FLoRa Byte.

Circuit Training:

The MPU6050 IMU, like its MPU9x50 cousins, has an I2C interface.  Which means we firstly need to define our SCL and SDA pins, and initialise an I2C bus using the MicroPython I2C class.  Thereafter, we can use the MPU9x50 MicroPython library to retrieve the readings from the module.

Here's a very basic example.

from machine import I2C, Pin
from imu import MPU6050
i2c_bus_scl = 19
i2c_bus_sda = 18
i2c_bus = I2C(scl=Pin(i2c_bus_scl), sda=Pin(i2c_bus_sda), freq=400000)
# Both IMUs should appear in the scan: 0x68 by default, 0x69 with AD0 pulled high
print(i2c_bus.scan())

Yep, that's right.  We have 2 IMUs connected to the same I2C bus, with one module's address pin (AD0) pulled high to make it appear on the bus at a different address to the default.  Accordingly, the scan() method has successfully detected both of them, which means we can now start to play with the MPU9x50 library to get meaningful data.

Here's a simple starter-for-ten where we retrieve some accelerometer readings from each of the IMUs.  Notice how the library abstracts from us the actual trigonometry involved in obtaining angles from raw acceleration (m/s²) readings.  We covered these principles back in Balancing the Books, but luckily for us, this library gives us the absolute pitch as established using the accelerometer via the elevation or inclination attributes.

imu_1 = MPU6050(i2c_bus, device_addr=0)
imu_2 = MPU6050(i2c_bus, device_addr=1)
print(imu_1.accel.xyz, imu_2.accel.xyz)
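For the curious, the trigonometry the library hides behind elevation boils down to something like this (a hedged sketch on our part; the exact axis convention depends on mounting, not on anything the library dictates):

```python
import math

def accel_pitch(ax, ay, az):
    # Angle of the gravity vector away from the horizontal plane, in degrees.
    # Which axis points "along the limb" is an assumption about how the
    # module is mounted, not a fact from the library's documentation.
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))
```

Flat on the desk (gravity entirely on z) reads 0 degrees; stood on its end (gravity entirely on x) reads 90.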

We're successfully interfacing with the IMUs, and getting accelerometer readings, which is a promising start.

But what exactly is it that we're looking at?

Well, if we're going to continue to work with just the accelerometer, we can use the elevation attribute of each IMU to establish the pitch.  Here's the theory, modelled to perfection, using some discarded loo rolls glued expertly together.

How the actual values play out in respect of our leg position is a matter of how the IMUs are orientated and how we intend to represent the angles (in our case, between 0-360 degrees, starting at the improbable leg-vertically-behind-our-head position).  Also, the elevation attribute needs to be interpreted along with the x vector attribute so that we can re-factor the angle to our desired 0-360 degrees scale.  Otherwise, we find ourselves at the mercy of the library's elevation attribute, which simply outputs positive or negative angular displacement from the vertical, mirrored either side of the horizontal.
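To make that re-factoring concrete, here's one possible sketch, and we stress possible: the hemisphere test on the x vector and the exact offsets depend wholly on how the IMUs are mounted, so treat this as an illustration rather than gospel:

```python
def remap_elevation(elevation, accel_x):
    # elevation: the library's signed displacement from the vertical,
    # mirrored either side of the horizontal
    # accel_x: used purely for its sign, to decide which hemisphere the limb
    # is swinging through (a mounting assumption entirely of our own making)
    if accel_x >= 0:
        return (90 - elevation) % 360
    return (270 + elevation) % 360
```

Whatever the mounting, the % 360 at the end keeps the output wrapped inside our chosen scale.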

But this alone is unlikely to cut it.

After all, there will be all sorts of external forces muddying the accelerometer readings when we're on the move.  And it's unlikely to be receptive to small movements.  OK, let's use the gyro as well.  And undertake sensor fusion to combine the steady with the wild in 90 Day Fiancé style matrimony.

Clearly, we could use a Kalman Filter.  But truth be told, we don't know how to implement one yet.  Nor are we convinced that we could run one convincingly on an ESP32.  For this reason, let's resort to the use of a Complementary Filter instead.  It involves less maths.  Less brainpower.  And less headaches overall.

Here's the equation in the most basic form:

θₙ = α × (θₙ₋₁ + ω × Δt) + (1 - α) × a

Fundamentally, the filter allows us to place higher trust in the readings calculated from the gyroscope (θₙ₋₁ + ω × Δt) in the short term, over those of the accelerometer (a).  The general principle is that estimating the current angle using angular speed (ω) is likely to be more accurate in the near term, but subject to drift over the long term in comparison to the accelerometer.  Combining the two, as Van Halen correctly observed in 5150, one of their three album titles that could be electronic components purchasable from Farnell (1984 and OU812 being the others), gives us The Best of Both Worlds.

To this end, we'll need gyroscope angular velocity readings from the IMU (ω), which can simply be obtained through the same library as before, via the gyro attribute.  Gyroscope readings (ω) are output raw, in degrees/s.  Therefore, in order to estimate the current angle (θₙ), we need to start with the last estimated angle (θₙ₋₁), and add to it the angular velocity (ω) × elapsed time (Δt) since the last measurement was taken - which would be the sampling interval if the program is perfectly timed.  This is repeated... like, forever.

Throughout, α is simply a constant that lets us give a higher weighting to the gyroscope output, and its value can be tweaked through trial and error.
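Boiled down to a function, the filter is refreshingly small.  Here's a sketch (the default α of 0.98 is merely a starting point for that trial and error):

```python
def complementary_filter(previous_angle, gyro_rate, accel_angle, dt, alpha=0.98):
    # Short term: integrate the gyro's angular velocity (degrees/s) over dt
    # seconds on top of the last estimate. Long term: blend in a small
    # fraction of the accelerometer's absolute angle so the estimate
    # cannot drift away indefinitely.
    gyro_angle = previous_angle + gyro_rate * dt
    return alpha * gyro_angle + (1 - alpha) * accel_angle
```

With α at 0.98 the accelerometer only contributes 2% per sample, yet over hundreds of samples that trickle is enough to anchor the gyro's drift.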

Let's try this out for real:

Twist the IMU around and it should pick up on the rather volatile gyroscope readings.  Now, onto the filter.

If the sampling is performed at a sufficiently high rate (we used every 20ms in the example below) and the code is not overrunning this schedule, we appear to be able to obtain a fairly stable reading that doesn't drift over time.

Here's the filter in action, but based on the incorrect assumption that the MPU library is providing an angle in line with our 0-360 degrees scale.

from micropython import const
import utime

FREQUENCY_MS = const(20)
COMPL_FILTER_A = 0.98  # weighting towards the gyroscope

angle_pitch_filtered_previous = None
while True:
    start_time_ms = utime.ticks_ms()
    accel_pitch_angle = imu_1.accel.elevation
    if angle_pitch_filtered_previous is None:
        angle_pitch_filtered_previous = accel_pitch_angle
    # Estimate angular movement since last sample
    gyro_pitch_angle = angle_pitch_filtered_previous + imu_1.gyro.y * FREQUENCY_MS / 1000
    print("Gyro says...", gyro_pitch_angle, "Accelerometer says...", accel_pitch_angle)
    # Complementary filter algorithm
    angle_pitch_filtered = COMPL_FILTER_A * gyro_pitch_angle + (1 - COMPL_FILTER_A) * accel_pitch_angle
    print("Complementary filtered...", angle_pitch_filtered)
    angle_pitch_filtered_previous = angle_pitch_filtered
    # Calculate and enforce sampling rate
    time_elapsed_ms = utime.ticks_diff(utime.ticks_ms(), start_time_ms)
    time_remaining_ms = FREQUENCY_MS - time_elapsed_ms
    if time_remaining_ms > 0:
        utime.sleep_ms(time_remaining_ms)

We now appear to have the fundamentals working in the rawest form.  Note, however, that the above example doesn't do any of the re-factoring we mentioned before, nor implement any wrapping of values inside the 0-360 scale.  For this, please see the full example at the end of the post.

For now, we'll send these values via MQTT to a Mosquitto broker running on the Raspberry Pi, and using Flask-MQTT and Flask-SocketIO, we can redraw the anatomy on an HTML5 canvas using JavaScript.  This exact technique was used before in Red Current and Serial so we won't bore you with the nitty-gritty details.
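For flavour, the payload ends up as something like this (the topic name, broker hostname and client id below are inventions of ours purely for the sketch):

```python
import json

def build_leg_payload(upper_pitch, lower_pitch):
    # Serialise the two filtered pitch angles for the browser to redraw
    return json.dumps({"upper": round(upper_pitch, 2), "lower": round(lower_pitch, 2)})

# On the ESP32, something along these lines ships it off to the Pi
# (umqtt.simple assumed available in the firmware):
# from umqtt.simple import MQTTClient
# client = MQTTClient("leg-imu", "raspberrypi.local")
# client.connect()
# client.publish(b"legs/pitch", build_leg_payload(45.0, 120.0))
```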

There is no password protection.  No encryption.  We're relying on the fact that none of the sporting world's espionage operations will be deployed against our world crass sporting academy.

Right. Time to strap these IMUs onto some elasticated bands and ™™™Velcro™™™ purchased from Hobbycraft in aisles that we simply never knew existed.

Who wants to see a photo of these purchases laid out gracefully on a table?  OK, next time we won't ask.

This is utterly crude.  And this messy chariot of wire (which additionally needs to house the ESP32 and battery) needs further work to miniaturise.  But it's ready enough to test the general concept in the most elementary fashion.  And this moment is made even more photogenic when it's possible to find shoes that match the filament currently loaded in the 3D printer.

Yet before we alarm the neighbours by dressing up in dangling wires and LEDs, and sprinting breathless through the streets, we first opted to demonstrate this indoors.  And even if this endeavour turns into a complete failure of a project out on the trails, others might find some use in this as a simple motion capture tool with which there might be much more fun to be had.  After all, double the IMUs and it can be attached to the arms also.  Add the roll and yaw elements, and the simulation can happen in three dimensions.

Now - clearly - our next episode will need to describe how we transfer this data to the intended destination, with a view to performing some meaningful analysis of "DA. LEG. MOVEMENT".  We don't want the embarrassment of being a one-man mobile rave installation to be for nothing.  Perhaps we'll trawl through the stinky data pond with some machine learning.  Graph it using Grafana or Kibana.  Etc, etc.

Oh, listen.  That's the Zukini being delivered to the door.  In the famous words of the Walrus of Laaaab, this post is The First, The Last and Our Everything (on randomly attaching IMUs to legs to take totally legitimate readings). At least, until the next one, that is.

...which turned out to be Athlete's Footnote.

And since we have lots of different components being assembled, we took the mission a little too far, and designed and had manufactured a PCB for the entire setup. 

We have also decided to house them in a suitably 3D-printed case.

Possession with Intent to Supply Class-A Bugs:

Nutty Tales from Macadamia:

There are some great posts below about the perils of using either an accelerometer or a gyroscope in isolation, and the reasons why using a Complementary Filter might be the answer:
We're using the MPU9x50 MicroPython library, which is available here:
...And here's the paper from the National University of Colombia that we keep referring to:


