Just how many times do we have to remind ourselves? If we're building a bona fide robot, we're eventually going to have to teach her/him/it how to balance like a pro. We've been making haphazard, almost wishful, assumptions that anything mechanical we build - like Rosie 2.0 - will never fall victim to the infamous (and cruel) powers of gravity as so often demonstrated by comedy clips on YouTube. That they'll gracefully glide around our physical space, no matter what, and simply blend in like the rest of us. In other words, just like humans. All while doing something extremely useful, of course. Like saving the world (when not destroying it). No pressure there!
But despite all the amazing advances made in technology to date, there's one good reason why we've yet to see humanoid robots freely roam around our cities. Sure, robots these days come on wheels. Tracks. Engines. Propellers. 4 or more legs even. But despite the fact that Blade Runner promised us an abundance of bots indistinguishable from humans back in the early 80s, there are none to be seen on our streets (not ours at least). It's because, as of yet, mechanical contraptions built with two legs can't help but stutter around clumsily, unreliably, in a tragically "robotic" fashion. Come to think of it: most un-humanlike, in fact.
Clearly, mastering the art of balance is actually quite hard for even the cleverest of brainiacs working in secret and non-secret labs around the world.
On the other hand, we can be strangely proud of the fact that we humans appear to have developed a sophisticated way of understanding our place in the physical world. And have evolved our brains and bodies sufficiently to control the large array of muscles at our disposal to do some pretty amazing things. Like jump. Walk. Run. Kick a moving ball. Then roll over and feign (terribly) an injury in a World Cup match watched by a billion people. And stand up afterwards like nothing happened. We don't need to give these tasks much thought (except perhaps the one about deceiving the referee). And as a result, we are able to saunter through our environment with some elegance and style. Generally, without crashing into things. Or suddenly collapsing to the ground in an embarrassing heap.
For robots, however, even controlling a smaller selection of electromechanical joints to simply remain upright can be an insurmountable challenge. And when they do, they are prone to walking awkwardly, or randomly losing balance without warning. Nevertheless, if we are to make a valiant attempt at it (as many companies and institutions now are), it'll probably begin with the use of a sensor of some sort that allows the robot to understand how it is positioned and moving in relation to its surroundings. Providing it with lots of data points that help it keep its balance.
And - clearly - this provides us with a nice little excuse to play around with an Inertial Measurement Unit (IMU). A WHAT? Is that something made up from Star Trek? No. It's a unit that allows us to wave a picture book about fairies around, and see its angles (in relation to the ground) represented on a terribly "retro" display (or at least, that's what we've decided to do with it). Now that sounds like some magic fairy dust that we ought to be bagging to sprinkle on Rosie in the future.
You don't want to be the tool without the tooling:
- A Raspberry Pi 3, running Raspbian OS
- A GY-521 InvenSense MPU6050 3-axis gyroscope / accelerometer module
- Use of a picture book (and hair bands!) to hold the IMU in place is strictly optional
- That's it. Are you disappointed?
Chequered history: Building Rosie 2.0 - naturally - assumes that there is a Rosie 1.0. For that reason, casting your eyes over the organised chaos that was Rosie and Rosie Patrol is highly recommended.
And then, there have also been the first few episodes of Rosie 2.0:
Bit by byte:
- Connect the MPU6050 3-axis gyroscope / accelerometer module (aka the "IMU") to the Raspberry Pi's GPIO I2C pins
- Still in Raspbian OS shell (command line), use a number of I2C commands to make sure you can detect the module, and that you can retrieve data from it
- Next, it's time to play around in Python. You gotta, right?
- Getting bored? Thought so. If you want something "visual" to show for your efforts, see if you can develop a small display using Python curses to show you the device's current angles (pitch and roll). Not to be confused with rock and roll.
You ain't seen nothing yet: An Inertial Measurement Unit (IMU) is a clever device that allows you to gather useful data about linear and angular motion (think: how your smartphone or games controller knows when you've tilted or waved them, and how enthusiastically). As it happens, these are extremely important things to be able to measure when you want to understand how your mechanical creation is interacting with the real world. Is your robot accelerating through physical space, and if so, in what direction? And at what angle? Is it about to lose its balance and fall over, hopelessly intoxicated on AI? Was Deckard a replicant? Does he know? Can we think of even more random questions to append to this paragraph?
Meet the emu (or, as it's more scientifically known, the IMU).
The InvenSense MPU6050 is a widely used IMU which does its thing to capture these useful data points, using a built-in gyroscope and accelerometer. The yin half of the device - the gyroscope - works out how fast it is rotating around the X, Y or Z axes (angular velocity) - in degrees per second. Its yang, the accelerometer... well... measures linear acceleration (change in velocity) along these axes. All this data is pre-processed for you by a built-in "Digital Motion Processor" chip. *Marketing blurb over*
Interestingly, if the chip is perfectly still, and laid flat, all readings should be near 0. That makes sense... because there's no linear or angular motion right? True. With the exception of acceleration on the Z axis. Why?
Oh, it's that dreaded nemesis gravity again. On Planet Earth (where the majority of our 10 readers are based, last time we checked), we are constantly accelerating, due to gravitational force. Just like Newton's infamous apple. At an acceleration of 9.80665 m/s² (also known as 1g in G-force) to be exact, if you really must know. These may seem like a bunch of boring scientific facts that were safely ignored back in school physics classes. But they're rather important here. Because this tells us that the accelerometer should show us a Z acceleration of 1g when the sensor is perfectly stationary. And it is what allows us to work out the device's current angle when we start to tilt the device and measure the emerging acceleration along the X and Y axes. Oh yes, using another subject area you probably chose to ignore back in maths classes: trigonometry (remember Señor Pythagoras?)
Great. So how do we interact with the device, and collect all this useful stuff? There's a bunch of Analogue to Digital Converters (ADCs) built in to the MPU6050 to interpret the sensor outputs... which means we only need to interact with the countless register addresses to read / write bytes of data. There are even ways to interrogate its built-in temperature sensor, or attach magnetometers (compass), but let's not go there... for now. We appear to have our hands full already with things that actually sound quite clever.
And this particular IMU allows us to read from or write to these registers using the beloved I2C protocol. All we need to know is the address of the register which holds a particular value. Cast your eyes over the MPU6050 Register Map provided by InvenSense and you get a sense of just how much information you can fiddle around with... along with the various settings that can be configured.
Sigh. All this sounds suspiciously quite complicated. And time consuming. But potentially very, very useful. Onwards and upwards as they say (at an as-yet-undetermined angle)!
Detail: First things first.
This little baby IMU of ours needs to be cabled up to the Pi to be of any use. Using no more than 4 cables from the Pi, as it transpires. Did we also already mention that our IMU uses the I2C protocol to communicate? We probably did (and forgot to store it in our memory register). And it means that 2 of these 4 connections are used for the DATA and CLOCK I2C signals.
Identically to the way we used I2C to communicate with our Adafruit PWM / servo controller, I2C needs to be enabled first on the Pi using raspi-config. But we're counting on the fact that it already was. If it wasn't, you'll soon find out.
Here's a short table to summarise our connections to the MPU6050 module from the Pi.
| MPU6050 Pin | Pi GPIO Pin | Description |
| --- | --- | --- |
| GND (Ground) | Any 0V ground (e.g. 6, 9, 14, 20, 25, 30, 34, 39) | 0V (Ground) |
| SDA | 3 (BCM 2) | SDA - serial I2C data |
| SCL | 5 (BCM 3) | SCL - serial I2C clock |
| VDD | Any 3.3V (e.g. 1 or 17) | Used to power the sensor using 3.3V |
Isn't it great when there is so little cabling to do?
But, as is always the case with our projects, everything ends up looking so much messier in reality. Notice that we 3D printed a little holder for our IMU. It'll come in handy when we end up using this for real on our various robot misadventures.
Now before we proceed any further, it's worth reminding ourselves of one important fact.
We mentioned earlier that if the MPU6050 is totally flat, and stationary, it should register close to absolutely nothing at all, except on the Z axis for acceleration (because of gravy tea... yuk). If you later find that this isn't the case, and there are suspicious readings all over the place, you might need to calibrate your MPU6050. As calibrating the device is a whole topic on its own, it's best to Google articles on how to calibrate a MPU6050 (i.e. we really cannot be bothered to write that up).
As before, for testing of I2C connections from the Pi, we want to make sure we have i2c-tools installed, which will allow us to use some basic I2C commands in Raspbian OS (Linux) to crudely interact with the device. And while you're at it, you should install python3-smbus to allow us to do more advanced I2C stuff later on, but programmatically, from within Python.
sudo apt-get install i2c-tools
sudo apt-get install python3-smbus
Chances are, you'll find that these packages are already installed.
If the device is all cabled up correctly, you should be able to now run i2cdetect and confirm that the device is found.
sudo i2cdetect -y 1
Here, we find the MPU6050 discovered at address 0x68 (which appears to always be the default address for MPU6050s). Good news, as this should mean we can start to communicate with it over I2C.
You can now check to see if you can retrieve values from the device's specific "memory" locations - called registers - that the unit uses to store its data. These values can be readable sensor readings, but also writeable settings used to configure various aspects of the device.
Use i2cget, passing the I2C address of the IMU (0x68), together with the specific address of the device's register (in this example 0x75) that we want to read values from.
sudo i2cget -y 1 0x68 0x75
What? The returned value is 0x68? That looks suspiciously familiar. Wait...
It just so happens that register 0x75 holds the I2C address of the MPU6050. Which is kinda boring... and maybe even pointless (since you would have had to know the address to find it in the first place!) but proves that we are able to interact with the device, and its registers.
Now, how do we know that 0x75 was the register we needed to query to get our hands on this pointless byte of data? It's because the register map published by InvenSense tells us so. And according to it, 0x75's register name is "WHO_AM_I" which sounds exactly like the kind of jokey introduction that we just experienced (think: phoning up someone and asking them their number). "R" denotes that it is read-only - which means we are unable to do anything with this register other than read values from it.
The register map is just about the most important piece of documentation you will need when working with the device. Because it tells you exactly which registers you need to address for what data. Note that each register holds a byte of data (8 bits), and sometimes, each one of these bits has a different purpose. What tells you this? The register map of course.
And now that we know how to retrieve data held in a specific register, and are now armed with the super helpful table, you should be able to steam ahead and start collecting useful data right?
Scroll through the document, and you'll soon arrive at some more useful looking registers. Try retrieving the value held in "ACCEL_XOUT_H" which is at register 0x3B - because the documentation tells us this is one half (high) of the X-axis acceleration reading.
sudo i2cget -y 1 0x68 0x3B
The output returned is 0x00 which looks suspiciously like no data.
That's exactly what it is. Because the MPU6050 needs to be taken out of sleep mode to start taking measurements. This can be done by setting the "PWR_MGMT_1" power management register at 0x6B to 0. Writeable registers can be set using the i2cset command, which passes the I2C address of the IMU (0x68), the address of the writeable register (0x6B), along with the value that we want to set it to (0x00). Give it a go.
sudo i2cset -y 1 0x68 0x6B 0x00
Now try retrieving the value from register "ACCEL_XOUT_H" again. Over and over again while wildly waving the MPU6050 device around.
sudo i2cget -y 1 0x68 0x3B
There. It's showing something now. And it's changing values too which is immensely promising, and must finally mean that it's collecting sensor readings.
But can we make any sense of it? No. As alluded to earlier, the value we retrieve is one byte of data (8 bits), from one specific register. Represented in hex. For sure, it's going to be one tough ask to continue this journey in the plain old command line.
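As a taster of what's coming, here's a rough sketch (in plain Python, no hardware needed) of the arithmetic we'll soon need: joining the high and low register bytes into one 16-bit reading, then interpreting it as a two's complement signed number. The example byte values below are made up for illustration.

```python
def combine_word(high, low):
    """Join the high byte and low byte into a single 16-bit value."""
    return (high << 8) | low

def to_signed(value):
    """Interpret a 16-bit value as a two's complement signed number."""
    return value - 65536 if value >= 32768 else value

# Say ACCEL_XOUT_H (0x3B) returned 0xFF and ACCEL_XOUT_L (0x3C) returned 0x38
raw = combine_word(0xFF, 0x38)
print(to_signed(raw))  # -200: a small negative acceleration reading
```

Doing this dance by hand with i2cget for six sensor axes would get old very quickly, hence Python.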
Time for us to take this to the next level. Using Python.
And as usual, you will be faced with two fundamental choices on how to proceed:
- Download an existing MPU6050 Python library that someone has already created and (probably) made available to the public on GitHub. Why reinvent the wheel if you don't have / want to?
- Try and re-create a MPU6050 module of your own. Because you do want to know how it sort of works, don't you?
Just be warned. This stuff can get pretty complicated, pretty rapidly. And we don't pretend to half understand some of the more advanced mathematical concepts that people apply to decipher and make use of outputs from various IMU devices (filters, signal processing, and the like). But using an IMU to measure the tilt of a device - its angular deviation from the X or Y axis - is a pretty common reason for using an accelerometer. So it's as good a place to start as any.
Now before we ramble on - and pretend to know what we're doing - here is some useful material created by people who clearly do know what they are doing.
In a desperate attempt to make use of some of what we learnt, here's our plan:
- Create a class called MPU6050. We'll work out in advance - using the register map - which registers we need to interact with using the python3-smbus module. We'll also note down the sensitivity settings for both the gyroscope and accelerometer which are needed to re-scale the output.
- During initialisation (__init__) of our class, let's make the device exit sleep mode by sending 0 to the "PWR_MGMT_1" register using the write_byte_data() method of the smbus module
- We'll then create a low-level method - _read_word() - to retrieve gyro and accelerometer readings from their respective registers using the smbus module's read_byte_data() method. Note that you need readings from both the high and low registers for each measurement (2 bytes, or 16 bits, in total). A 2's complement value is used to represent signed (negative and positive) numbers in binary.
- _read_word_2C() then centres the value retrieved on 0, allowing a range of -32768 to 32767.
- At this point, we can create a higher level method (get_gyro()) to retrieve raw gyro readings for X, Y and Z. We'll divide them by the sensitivity value of the sensor, and store the results in a dictionary for safekeeping (because we don't intend to do much else with the gyro).
- We'll do something similar with accelerometer readings through the get_accel() method. By default, let's convert the g values to m/s2.
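The plan above can be sketched roughly like this. To keep it runnable without a Pi attached, a made-up FakeBus class stands in for smbus.SMBus(1), mirroring the read_byte_data() / write_byte_data() methods we'll actually call. The register addresses and the default sensitivities (131 LSB per °/s, 16384 LSB per g) come from the InvenSense register map; everything else - names, structure - is just our own guess at a bare-bones version.

```python
PWR_MGMT_1 = 0x6B     # power management register (write 0 to exit sleep mode)
ACCEL_XOUT_H = 0x3B   # first (high) accelerometer register; X, Y, Z follow
GYRO_XOUT_H = 0x43    # first (high) gyroscope register
GYRO_SENS = 131.0     # LSB per degree/second at the default +/-250 deg/s range
ACCEL_SENS = 16384.0  # LSB per g at the default +/-2g range
G = 9.80665           # m/s^2 per g

class FakeBus:
    """Hypothetical stand-in for smbus.SMBus, backed by a plain dict."""
    def __init__(self, registers=None):
        self.registers = registers or {}
    def write_byte_data(self, address, register, value):
        self.registers[register] = value
    def read_byte_data(self, address, register):
        return self.registers.get(register, 0)

class MPU6050:
    def __init__(self, bus, address=0x68):
        self.bus = bus
        self.address = address
        # Wake the device up, otherwise every reading stays at 0
        self.bus.write_byte_data(self.address, PWR_MGMT_1, 0)

    def _read_word(self, register):
        high = self.bus.read_byte_data(self.address, register)
        low = self.bus.read_byte_data(self.address, register + 1)
        return (high << 8) | low

    def _read_word_2C(self, register):
        value = self._read_word(register)
        # Two's complement: re-centre the unsigned value on 0
        return value - 65536 if value >= 32768 else value

    def get_gyro(self):
        # X, Y, Z high registers sit 2 bytes apart, starting at GYRO_XOUT_H
        return {axis: self._read_word_2C(GYRO_XOUT_H + 2 * i) / GYRO_SENS
                for i, axis in enumerate("xyz")}

    def get_accel(self, ms2=True):
        readings = {axis: self._read_word_2C(ACCEL_XOUT_H + 2 * i) / ACCEL_SENS
                    for i, axis in enumerate("xyz")}
        if ms2:  # convert from g to m/s^2 by default
            readings = {k: v * G for k, v in readings.items()}
        return readings

# Simulate a device lying perfectly flat and still: 1g (0x4000 raw) on Z only
imu_1 = MPU6050(FakeBus({0x3F: 0x40, 0x40: 0x00}))
print(imu_1.get_accel(False))  # {'x': 0.0, 'y': 0.0, 'z': 1.0}
```

On the Pi itself you'd construct it with the real thing - something like MPU6050(smbus.SMBus(1)) - and leave the rest untouched.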
All this data is interesting, but how on Earth do we calculate the angle (tilt) the device finds itself in? You could use the gyro readings... but those will only give you the rate at which it is turning around one of the axes. This would require some code to continually track what these rates are in order to estimate where the device might now be pointing. And relative to what? Where you end up facing after turning 90 degrees very much depends on where you were facing to begin with. Gyros appear to be used to get an idea of short-term angular movement, in conjunction with other sensors. We're told they are not very accurate over longer time frames due to drift.
There is, however, a more widely used technique that involves the use of the accelerometer instead to calculate these angles (which at first seems counter-intuitive - what does linear acceleration have to do with angles?). It also comes with a pretty solid reference point - gravity - which gives us absolute measurements relative to this constant that we find anywhere and everywhere (well, on Earth that is).
And by angles, we mean two in particular: pitch and roll.
By now, hopefully you've read one of the helpful references above that explains all this in a more professional manner. But in short, calculating pitch and roll involves the use of Pythagoras' Theorem and some trigonometry. Simply put: we are able to measure acceleration along the X and Y axes of the device (which we can, using the MPU6050's accelerometer). And because gravity is constant, if the device is perfectly stationary, but tilted, we can attribute the resulting X and Y acceleration measurements to being the X and Y components of gravity. And where we know the values of certain sides of a triangle, and want to know angles, well, you know this story already (if you paid attention in class)... trigonometry is your friend.
It's worth reiterating that this process relies on the device being stationary. Any other non-gravitational acceleration taking place makes the calculations no longer so reliable.
For those interested in the actual formulas, here they are:
Note that the result of the Python math function atan2() is in the not-so-friendly radians unit, and therefore needs to be converted to the much more familiar degrees (θ) unit using the degrees() function. This way, our brains can compute the angles better.
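In Python, the trigonometry boils down to something like the sketch below. We've used one common sign convention here (pitch from the X component, roll from the Y component) - check against the references above, as conventions and signs vary between write-ups.

```python
import math

def get_pitch(ax, ay, az):
    """Tilt about the Y axis, in degrees, from accelerometer readings (in g)."""
    return math.degrees(math.atan2(ax, math.sqrt(ay * ay + az * az)))

def get_roll(ax, ay, az):
    """Tilt about the X axis, in degrees."""
    return math.degrees(math.atan2(ay, math.sqrt(ax * ax + az * az)))

# Flat and stationary: gravity is all on Z, so both angles are 0
print(get_pitch(0.0, 0.0, 1.0), get_roll(0.0, 0.0, 1.0))  # 0.0 0.0

# Tilted 30 degrees about the Y axis: half of 1g shows up on X
print(round(get_pitch(0.5, 0.0, math.sqrt(3) / 2), 1))  # 30.0
```

The sqrt() term is Pythagoras doing his thing: it's the length of the remaining side of the triangle formed by the other two axes.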
So all we do now is write some Python code that does just the above. We have a _calc_pythag() method to work out the third length of the triangle using Pythagoras' Theorem. Then get_pitch() and get_roll() methods to calculate the angles using the atan2() function.
Here's the full module that makes this all happen.
Let's test this out. Import the module (we called it rosie_imu.py), instantiate a MPU6050 object (let's call it imu_1), and run some of the top-level methods we created.
import rosie_imu
imu_1 = rosie_imu.MPU6050()
imu_1.get_gyro()
imu_1.get_accel()
imu_1.get_accel(False)
Now that looks promising. The get_gyro() method returns us the angular velocity in degrees per second (which is interesting, but we've decided not to use it). The get_accel() method returns us linear acceleration in metres per second squared (m/s²). With the "False" flag set, we'll see the same thing in just plain g.
How about our pitch and roll calculations? Let's run the methods.
Place the MPU6050 at different angles and re-run the methods a few times. And it does indeed look like we have some semi-believable tilt data.
It goes without saying, this is extremely crude.
You wouldn't want to rely on this module to auto-pilot an Airbus full of 300+ fairies, let alone taxi across the tarmac. More advanced IMU code will combine both the gyro and accelerometer outputs (perhaps even GPS receivers and magnetometers too) to get an even more stable and accurate sense of angle and direction - in a process called Sensor Fusion. Most certainly, the sensor data will also be processed using somewhat more advanced mathematical techniques, such as the Kalman Filter, to reduce the overall noise of the measurements and give the entire workings more reliability.
If you are interested in one such method of sensor fusion - the Complementary Filter - we actually use it in a later blog post - Chariots of Wire.
But, all that is - now - for another day.
By the time we reach this stage in our experiments, it's become customary for us to do something completely useless, using something potentially quite useful. Clearly we can obtain pitch and roll readings. And to prove that they are in the right ballpark, we'll attach the MPU6050 to a book and wave it around a bit (balancing the books... now get it?) In fact, it's better than that. It's a 1001 Things to Spot in Fairyland book, with the Pi, battery pack and MPU6050 attached to it using hairbands. Now that's upping the ante! There will be some very dizzy fairies shortly.
And just because we want to create a terrible-looking, blocky display that allows us to visualise these readings, like those you tend to see in science fiction films from the 80s, we'll use curses - a screen-painting library for text-based terminals that works over an SSH session. Curses should already be available in Python.
We won't cover in too much detail how we used curses to put together our pathetic "display", but in short here are the steps we took:
- Start curses and initiate a "window" object
- Take some pitch and roll readings using our rosie_imu module
- Re-scale the pitch reading to fit as a horizontal line in the pitch portion of the display
- Use trigonometry (cos and sin functions) to work out the start and end x and y positions for the diagonal line for the roll portion of the display. Draw the line in between using a while loop.
- Refresh, clear screen and repeat forever... or until keyboard interrupt (Ctrl+C) is received
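The re-scaling and trigonometry in the middle steps can be sketched as pure functions, independent of curses itself. Names like scale_pitch and roll_line are our own inventions, and the exact geometry will depend on how you lay out your display.

```python
import math

def scale_pitch(pitch, width, max_angle=90.0):
    """Map a pitch in [-max_angle, +max_angle] to a column in [0, width - 1]."""
    pitch = max(-max_angle, min(max_angle, pitch))  # clamp stray readings
    return round((pitch + max_angle) / (2 * max_angle) * (width - 1))

def roll_line(roll, cx, cy, length):
    """Start and end (x, y) positions of a line of the given length,
    rotated by `roll` degrees about the centre point (cx, cy)."""
    dx = math.cos(math.radians(roll)) * length / 2
    dy = math.sin(math.radians(roll)) * length / 2
    return (round(cx - dx), round(cy - dy)), (round(cx + dx), round(cy + dy))

print(scale_pitch(0.0, 81))        # 40 - level pitch sits mid-display
print(roll_line(0.0, 40, 10, 20))  # ((30, 10), (50, 10)) - a flat line
```

From there it's just a matter of painting characters between those endpoints with the curses window's addstr() and refreshing in a loop.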
The result is a little underwhelming, and looks like a bug-ridden Space Invaders game, but it does demonstrate in real-time that we're obtaining some plausible data from the IMU about the angles the fairyland finds itself in. Substitute the book for a robot limb, and you can suddenly see how this stuff is valuable.
If you are so inclined to recreate this primitive, monolithic excuse of a "display", here is the code. And no, not much effort was put into making this.
All things considered, it turns out, this was a rather exciting outing. It had a bit of everything: a powerful sensor, physics and... you gotta love it... maths that we should have remembered from school. And we actually ended up with something that seems rather useful when it comes to building functional robots that don't just fall over randomly.
Clearly, there is a lot more to this than meets the eye. And considerable room for improvement. But this technique goes a long way to making your robot creation... stable.