
this["orientated"]



This is it, our dedicated Terran readership (yes, all 3 of you).

We've reached the very last instalment in our third, and still incredibly unspectacular DIY robotics series - Rosie 2.0.  What has at times felt like a never-ending, rambling blog has turned into an eclectic curation of often unrelated Raspberry Pi and ESP8266 based activities (assembled under the highly misleading guise of "robotics").  And - quite frankly and not so unusually for us - we lost our bearings somewhat.  OK, badly.

Except, actually, we didn't get completely and utterly lost.

During our very last expedition, we took our GPS-enabled Raspberry Pi around the 10 peaks of the Brecon Beacons.  And perhaps, we inadvertently proved that there are some ingredients required for autonomous navigation that can be obtained on the fly (as opposed to from a fly, because that would be kind of weird) - for cheap.  In other words, you don't have to be an eccentric billionaire with a penchant to name yourself after an arbitrary element in the periodic table, or if all 118 have been exhausted, an animal, to hack together a gadget that accurately positions stuff on Earth.  Yes, that's right. You and your little bionic chihuahua can hunt down hostile aliens with reasonable precision in the unlikely event of an ensuing intergalactic invasion, and all this without the help of Uber.  Providing that those extraterrestrial marauders provide you with their precise GPS coordinates - preferably before they get too close and do unimaginable things to us.  Like probe us for our post-it note with the current Wi-Fi password.  Or ask us to explain the ins and outs of Brexit.  Yikes.

Therefore, it seems fitting that we conclude Rosie 2.0 by knocking up a gizmo that helps orientate us.  Specifically, we'll extend our arsenal to include a magnetometer... because it appears to be the crudest way to make a semi-educated guess on the correct direction of travel once a) we know where we are on Planet Earth, and b) we know where we want to go (all other trivial obstacles in between, like impassable cliff faces, deadly magma pits, and grannies crossing roads, permitting).

And since we've not spent anywhere near enough time on actually developing the chassis for the robot, we'll demonstrate the basic principle using a TFT screen connected to the Raspberry Pi.  By mashing together elements of MQTT, HTML 5, Socket.IO, JavaScript and - of course - the Gorilla Glue of our trade, Python, we'll create a digital compass, or more truthfully, a highly suspect, imaginary alien location device.  Yes, we could call it some sort of "brilliantly advanced neurotic alien neutraliser apparatus", or B.A.N.A.N.A for short.

Let's go alien hunting with a BANANA*.

*2 × AAA IoT-connected "smart" alien invaders manufactured by Samsung-Weyland-Yutani Corporation sold separately (price: your life).

The ARMoury:

It feels like there might be an extensive kit list required for this seemingly semi-ambitious project... but the list turns out to be surprisingly short.

Here's what we discovered in our space station scrap heap that we'll be using towards the build:
  • MPU9250 is a rather special Inertial Measurement Unit (IMU) by InvenSense. Unlike the MPU6050 we've already tinkered with, for a few of yer English pennies more, this little clever so-and-so not only comes with an accelerometer and gyroscope, but also a built-in AK8963 magnetometer.  Perfect for projects that require magnetic strength to be measured.  Such as a digital compass.
  • As simple and beautiful as they are, we started to quickly realise the limitations of working with small form-factor LED, LCD and OLED displays.  They are great if we don't need to show a lot of information to the occasional unsophisticated passer-by. But, unfortunately, millennials these days demand colourful, fluid 1080p / 4K /  HDR / Betamax visuals that they can Tweet to every one of their 35 followers about.  Often, while sitting idly on the loo.  Enter the 3.5 inch 480 × 320 TFT screen.  The screen we have is touch capable (via an XPT2046 touch panel controller), but we're only interested in its display for now.  We used a model by Elegoo.  It sounds a little like "alien goo".
  • Ah yes. We're never too far away from a temperature sensor (or two, or three)... and our favourite - the DS18B20 - makes a predictable return to the theatre of action like Sigourney Weaver in need of a pay cheque.  Interestingly, we noticed that the MPU9250 IMU has a built-in temperature sensor, but we'll opt to use a dedicated instrument so that we can expose it more directly to the inhospitable climate of England... sorry, deep space.  Don't forget the pull-up resistor if you just have the sensor, and not a module.  Never forget the pull-up resistor.
  • A Raspberry Pi.  Of course.  And not a single ESP8266 is in sight for this adventure.  More on the exact Pi model below.
  • Last but not least, it sounds like we'll be requiring a GPS receiver?  We sure are!  GY-NEO6MV2 is a serial (UART) GPS receiver based on the U-Blox Neo-6. A slight deviation from the USB variants we've been using to date, but nonetheless, perfectly usable.
Weapons activated; let's enter the battlefield.

3B 0 - battle of the π:

This was actually a decision that continued to confound us while the credits were still rolling, and beyond.

On the face of it, we shouldn't require the better equipped Raspberry Pi 3 Model B.  We haven't got a use for any USB or wired Ethernet network ports.  The Pi Zero has all the connections we'll be needing, and takes considerably less space.  It's noticeably cheaper as well (a significant point, since, clearly, due to an impending alien apocalypse in 2019, every member of Earth Force will be building one of these using spare change left over after their weekly Tesco's shop).  After all, every little counts.

But we can't easily shake off the question... will the Zero's single core 1GHz CPU and 512MB of RAM be capable of running all this?  Initially, the answer appeared to be a not-so-emphatic yes (just about).

Yet once we really got going with the browser in kiosk mode, running badly written JavaScript, together with a bunch of Python applications that run in the background, limitations of the Zero started to become painfully apparent.  There simply isn't any headroom left... like for when we eventually succumb to the urge to draw out the entire floor plan of U.S.S. Sulaco in JavaScript and HTML5.  While occasionally breaking out to play Minecraft.  It looks like we'll have to revert to the Pi 3 Model B with its quad-core 1.2 GHz CPU, and 1GB of RAM.  In other words, like choosing between Alien and Aliens, it all depends on whether you want to carry around a spare "s" in your reserve... the secret sauce.


Moving swiftly along (the Earth's surface)...

Prelude / prequel / plot-hole ridden backstory:

Building Rosie 2.0 - if you believe the non-existent marketing literature - insinuates that there was a Rosie 1.0.  For that reason, casting your eyes over the organised chaos that was Rosie and Rosie Patrol is highly recommended.

Specifically, this post is a natural continuation of previous episodes - Beam Me Up, Rosie! and Taking a Peak.  We also got acquainted with MQTT and JavaScript / Socket.IO in Red Current and Serial.

Rosie 2.0 has consisted of the following episodes, if you really must know:
  1. PiAMI HEAT
  2. Crawly creepy
  3. 2 crawly 2 creepy
  4. Balancing the books
  5. Shaken not steered
  6. El-oh-wire
  7. Raspberry bye, hello
  8. Red current and serial 
  9. Taking a peak

Plot line:

  • We promised you some tiki-taka graphics that are easy on the eye, and aren't Messi-y.  The sort that allows us to display a colourful, dynamic alien hunting interface that James Cameron or Ridley Scott would be proud to use as a doorstop on set, or an interactive loo roll holder.  To this end, we'll re-route the Pi's default desktop from the HDMI output to the TFT screen, connected using about the same number of SPI and GPIO pins as there are Alien films (not likely to $tay $o for long).
  • We'll then connect the MPU9250 (via I2C), GY-NEO6MV2 U-blox GPS receiver (via UART) and DS18B20 temperature sensor (via GPIO / 1-Wire) to the Pi, and frantically start obtaining readings using Python like that unimportant extra in the backdrop that always gets taken down by a Xenomorph, Predator, or the 38th Governor of California.
  • Then, we need to religiously propagate collected information around our Pi like Google or Facebook does the galaxy.  We'll use MQTT for this, since it's the "go to" technology for overwhelming networks with content of dubious social importance, a title previously held by Netflix and YouTube. 
  • Finally, we will give birth to a JavaScript-driven web-page infested by a colony of HTML5 canvas drawings.  We'll make good use of Socket.IO to ingest the MQTT data.  Flask will provide the web services.
  • Is there more?  We don't think so.  For now.  Time to get our breath back.
As you've rightly concluded by now, this is a rather over-engineered piece of "B.A.N.A.N.A Alliance® certified" makery, a hotchpotch of technologies that could probably be consolidated somewhat.  Exhibit A - the diagram:


But in our intergalactic defence, the technology choices leave this open to considerable (and increasingly nonsensical) customisation.  Like more sensors and gadgets.  More graphics.  Integration with other "stuff" on the network and beyond.  Ooh, the Cloud.  In other words, it could be turned into one of those blockbusters that has a £500 million budget, but only takes in £5 during the opening weekend in some pop-up cinema down in Land's End (and the £5.00 was for a single pack of Haribos).

All so alien to me:

So what exactly are we trying to achieve here, because we've confused ourselves somewhat.

...It feels like the overwhelming aim of this project is to demonstrate that armed with our GPS coordinates (from the GPS receiver), and that of someone else's (made-up alien coordinates), we can work out the distances between us and those pesky space-travelling critters.  Moreover, to assist us in working out in which direction we physically need to head, we could use a digital compass of some sort.

MPU9250 is one of those super clever accelerometer / gyroscope modules that we can obtain various motion-related readings from using I2C. But its secret weapon?  A 3-axis AK8963 magnetometer.  And by using it to measure Magnetic Field Strength (Teslas) out in the wild, we should be able to establish our heading relative to the Magnetic North.  This isn't new news.  The method has been tried and tested to navigate the world, by accomplished historical navigators such as Christopher Columbus, Vasco da Gama, and Donatella Versace, for centuries.  But the MPU9250 wasn't around in 15th Century Europe (there are no Renaissance paintings or Domesday records to indicate this).  And the gentry certainly weren't slaying intergalactic brutes back then that have an inherent dislike of humans.  No, they terminated peasants instead.

Great.  All in all, this warrants another rambling blog post, then.

So in the interest of mankind, and its continued survival, we'll create a digital compass, with some imaginary aliens* superimposed on-screen.  Undoubtedly a good use of a Raspberry Pi and a TFT screen, I'm sure you'll agree.


*Replace "aliens" with nearby Greggs stores, your GPS-tagged teenagers, or the entire meerkat population of the Kalahari Desert for realistic use cases that are more likely to impress the Noble Prize judges.  Or, of course, Pokemons.

Deep dive into space:

Let's launch straight into the nitty-gritty detail of this alien autopsy.  Starting with the display.

The grand plan is to automatically launch Raspbian OS's default web browser - Chromium - sometime after the Pi boots up.  Chromium will be launched in "kiosk mode", in full screen, with no boundaries and no mouse.  This will allow us to serve up a delicious HTML5-based web page which will form the basis of the B.A.N.A.N.A display.

And while we're still talking about the display, clearly, a dedicated TFT screen is being used with this Pi setup.  We chose the Elegoo 3.5 inch 480 × 320 TFT screen, for no other reason than that it was the cheapest we saw online (at the time).  Other makes and models are available.

Our TFT screen came with documentation that describes the connections required:


Our initial reaction was, wow, that's a lot.  But do we really need them all?

Our TFT is a HAT (or Hardware Attached on Top).  And the block really gets in the way.  More importantly, we shouldn't need all the connections if we don't need touch screen functionality, and it appears as though we're covering multiple power pins for the sake of this TFT being a HAT.  Let's go it alone, through the dark flight deck.  Because nothing bad ever happens in sci-fi films when people go it alone.

We tried and tested only the connections that are stated as being relevant to the display, and wired them in manually.  This means we are free to use other pins on the Pi, which would have otherwise been blocked by the HAT... to do other stuff.  We're looking at you I2C pins for the MPU, and UART TX/RX pins for the GPS receiver.

Pi Pin       | TFT Pin         | Description
6 (Ground)   | 0V (Ground)     | 0V (Ground)
2 (5V)       | Vcc             | 5V used to power the module
18 (BCM 24)  | LCD_RS          | LCD instruction control
19 (BCM 10)  | SCL             | SPI data input
22 (BCM 25)  | RST             | Reset
23 (BCM 11)  | LCD_SCK/TP_SCK  | SPI clock of LCD/Touch Panel
24 (BCM 8)   | LCD_CS          | LCD's chip selection

Next, we followed the manufacturer's steps to divert the Pi's HDMI output to the TFT display.  This process simply involves the use of GoodTFT's LCD-show, which after install, outputs the Raspbian OS's desktop via SPI to the TFT screen.  Mágico!

sudo rm -rf LCD-show
git clone https://github.com/goodtft/LCD-show.git
chmod -R 755 LCD-show
cd LCD-show/
sudo ./LCD35-show

Reboot the Pi, and that was about it.  If all connections are present and correct, Raspbian OS's glorious desktop will be splashed across the TFT screen.  A pretty significant victory on our quest, and so early on.


Now, there's a bit of hygiene to be performed in Raspbian OS, since we want to ensure that the display remains ON all the time.  For example, we don't want the screensaver to kick in.  Or an immovable mouse to appear over our display.

We changed the following two settings in /etc/kbd/config.

sudo nano /etc/kbd/config 

BLANK_TIME=0
POWERDOWN_TIME=0


Then, following even more information we found on the web about operating the Pi in kiosk mode, we also modified the ~/.config/lxsession/LXDE-pi/autostart file.

sudo nano ~/.config/lxsession/LXDE-pi/autostart

The "@unclutter" appears to require unclutter to be installed.

sudo apt-get install unclutter

@xset s 0 0
@xset s noblank
@xset s noexpose
@xset dpms 0 0 0
@unclutter -idle 0


If you want the desktop's task panel to completely disappear as well, you can edit the following.

sudo nano ~/.config/lxpanel/LXDE-pi/panels/panel

autohide=1
heightwhenhidden=0


That's all we need to do for now.  Later, the Chromium auto-startup is configured using Supervisor.

When the Chromium browser is now launched with the URL of the local Flask instance (http://localhost:5000), it won't show anything.  Not good, because we would be hunting down bloodthirsty extraterrestrials with malicious intent using a rather embarrassing "this site can't be reached" screen.  Aliens: 1, humans: 0.


Time to move on, then.  We hear unidentified flying objects are inbound.

We intend to have 3 sensors / modules.  Here they are, in a stylish table that gives the impression that we know what we're doing.

Use                | Sensor / Module | Interface | Have we used this before?
Magnetometer       | MPU9250         | I2C       | MPU9250 is a fully fledged gyroscope / accelerometer.  We'll only be using the built-in AK8963 magnetometer.  We introduced its less capable sibling - MPU6050 - in Balancing the Books.  We're not being judgemental... it's just that the MPU6050 didn't have a magnetometer.  Bit of a showstopper, really.
Temperature sensor | DS18B20         | 1-Wire    | This is a digital temperature sensor, accessed using the 1-Wire protocol.  As our favourite temperature sensor, we've used this many, many (many) times before.  Its basic premise was described in PiAMI HEAT.  And guess what, it records temperature.  End of.
GPS receiver       | GY-NEO6MV2      | UART      | We've opted to use a serial (UART) GPS receiver, instead of a USB variant that's surfaced in our posts in the past.  Beam Me Up, Rosie! has been updated with the additional steps required to interface with a serial GPS receiver.  Aren't we organised?

So all in all, our little gizmos can be accommodated using the remaining pins on the Pi, even after the TFT screen has been cabled up, thanks to amateurish creative cabling between the TFT and the Pi.

Here's another table (don't worry, we'll knock up a few more before the very end).

Pi Pin       | Sensor / Module | Pin          | Description
9 (Ground)   | MPU9250         | 0V (Ground)  | 0V (Ground)
1 (3.3V)     | MPU9250         | Vcc          | 3.3V used to power the module
3 (BCM 2)    | MPU9250         | SDA          | I2C data
5 (BCM 3)    | MPU9250         | SCL          | I2C clock
17 (3.3V)    | DS18B20         | Vcc          | 3.3V used to power the sensor
20 (Ground)  | DS18B20         | 0V (Ground)  | 0V (Ground)
7 (BCM 4)    | DS18B20         | 1-Wire GPIO  | Temperature readings (1-Wire). Don't forget the pull-up resistor if it doesn't have one already.
8 (BCM 14)   | GY-NEO6MV2      | RX           | UART - data
10 (BCM 15)  | GY-NEO6MV2      | TX           | UART - data
17 (3.3V)    | GY-NEO6MV2      | Vcc          | 3.3V used to power the module
34 (Ground)  | GY-NEO6MV2      | 0V (Ground)  | 0V (Ground)

Can you spot the anomaly?  Can you?  CAN YOU?

The most eagle-eyed observers might have noticed that we have the 3.3V (pin 17) listed twice.  That's right, the Pi only has 2 of these, and we require 3.  For now, we've simply shared pin 17, by temporarily using a mini breadboard.  This could just as easily be achieved using a splitter.  Or an external power supply.  Or a limb of a defeated alien for maximum theatrical effect (if it hasn't been ushered off to Area 51 before then).

Here is the mess we created.


There is nothing out of this world about the majority of the sensors and modules we decided to connect - and we've chronicled their use during our various outings described above.  However, the one that needs introducing is the MPU9250, and specifically, its AK8963 magnetometer.

As an I2C device, and as we saw with the MPU6050, the clever chips on the module do all the work for us, and it simply then becomes a case of accessing relevant Register Map addresses to set a specific configuration, or read its data.

For example, here's the extract from the register map for the AK8963 magnetometer.


Not unlike the accelerometer and gyroscope readings, there appears to be a dedicated set of addresses for the magnetometer.  Specifically, measurement data (micro Teslas) can be obtained through a set of addresses.


Slightly confusingly, however, the AK8963 magnetometer doesn't immediately appear when the entire module is first connected to the Pi via I2C.  We could only see 0x68, which is the accelerometer / gyroscope.


After changing the module to "bypass mode" (setting 0x02 on register 0x37), the AK8963 magnetometer sprang into action and appeared on the I2C connection with an address of 0x0c.
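For the terminally curious, here's roughly what that low-level dance looks like in Python with smbus2 - a sketch only, with the register addresses taken from the MPU9250 / AK8963 datasheets, and the ~0.15 µT-per-LSB scale factor assuming the 16-bit output mode selected below (the per-axis factory sensitivity adjustment is skipped for brevity).

# Sketch: enable bypass mode, then read raw magnetometer data straight from the AK8963.
from smbus2 import SMBus
import time

MPU9250_ADDR = 0x68   # accelerometer / gyroscope
AK8963_ADDR = 0x0c    # magnetometer - only visible once bypass mode is enabled

def to_signed(low, high):
    value = (high << 8) | low              # AK8963 data is little-endian
    return value - 65536 if value > 32767 else value

with SMBus(1) as bus:
    bus.write_byte_data(MPU9250_ADDR, 0x6b, 0x00)   # make sure the MPU9250 is awake
    bus.write_byte_data(MPU9250_ADDR, 0x37, 0x02)   # bypass mode: 0x02 -> register 0x37
    time.sleep(0.1)
    bus.write_byte_data(AK8963_ADDR, 0x0a, 0x16)    # CNTL1: continuous mode 2, 16-bit output
    time.sleep(0.1)
    data = bus.read_i2c_block_data(AK8963_ADDR, 0x03, 7)  # HXL..HZH plus ST2

scale = 0.15  # approx. micro Teslas per LSB in 16-bit mode
x = to_signed(data[0], data[1]) * scale
y = to_signed(data[2], data[3]) * scale
z = to_signed(data[4], data[5]) * scale
print(x, y, z)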


Clearly, it's best to use a ready-made library that handles all of this (including initialisation of the AK8963).

We found one by FaBoPlatform which fitted the bill perfectly.  We'll only be using the AK8963 portion of the driver, and use it to return the magnetic field strengths (in micro Teslas).  A quick glance at the web informs us that other libraries are available.

First, we cloned the library from GitHub.

git clone https://github.com/FaBoPlatform/FaBo9AXIS-MPU9250-Python


Then we navigated to the subdirectory and ran the setup.py file to install the goodies.

sudo python3 setup.py install


With the library supposedly installed, and the MPU9250 connected, let's get some readings.  Import the module, instantiate an MPU9250 object, and run the readMagnet() method.

from FaBo9Axis_MPU9250 import MPU9250
mpu = MPU9250()
mpu.readMagnet()

That's about it.  There's some raw magnetometer readings that we can work with.


We used a bit of geometry to convert this into a usable heading that the compass needle on our display will reflect.  That's it, repeat after us. S-O-H, C-A-H, T-O-A...

import math
reading = mpu.readMagnet()
180 - math.degrees(math.atan2(reading["y"], reading["x"]))

Finally... a workable heading, which basically forms the foundation of the compass display.  It is the angle in which the compass needle will be drawn on-screen using JavaScript / HTML5, and also the offset by which all other relative graphics are displayed.


But we must talk about the prehistoric mammoth in the room... calibration.

Whether due to the magnetometer not being calibrated correctly in the factory, or natural or artificial magnetic fields present in the environment, raw readings need to be offset before use.  There's plenty of material available online (along with scripts) that advise you on how to achieve this.

Just look at what happens when a speaker is in the vicinity of our old-school compass.  If this speaker was a permanent fixture on the rig (because sounding a warning alarm when a facehugger is nearby, rather than on your face, sounds incredibly useful), it would need to be compensated for during calibration.


In the end, we opted for the simplest of the calibration methods.  In short, we generated a large number of magnetometer readings over several minutes, while wildly moving the sensor around like the Macarena gone wrong.  Afterwards, we took the average readings in the X and Y axes, and stored them as offsets that we later subtract from our raw readings.  Crude.

This method is unlikely to be fit for purpose if this rig is moving around the world (more on this later), or if there is dynamic equipment in the vicinity that is only temporarily interfering with magnetic readings.  Let's just not tell our invaders about this.
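For what it's worth, the crude calibration run looked very roughly like the sketch below.  The three-minute duration and the offsets file name are ours to make up; the readMagnet() call is the same FaBo one used above.

# Crude hard-iron calibration sketch: average the raw X / Y readings while the
# sensor is waved around, then save the averages as offsets to subtract later.
import json
import time
from FaBo9Axis_MPU9250 import MPU9250

mpu = MPU9250()
samples_x, samples_y = [], []

end = time.time() + 180                    # wave the sensor around for ~3 minutes
while time.time() < end:
    reading = mpu.readMagnet()
    samples_x.append(reading["x"])
    samples_y.append(reading["y"])
    time.sleep(0.05)

offsets = {"X_OFFSET": sum(samples_x) / len(samples_x),
           "Y_OFFSET": sum(samples_y) / len(samples_y)}

with open("magnetometer_offsets.json", "w") as f:    # file name is just an example
    json.dump(offsets, f)

print(offsets)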

OFFSETS = {"X_OFFSET": 6.765999999999998, "Y_OFFSET": 21.793999999999997}
180 - math.degrees(math.atan2(reading["y"] - OFFSETS["X_OFFSET"], reading["x"] - OFFSETS["Y_OFFSET"]))


All things considered, the heading shown on our digital display appeared to resemble what was shown on the actual compass we have for legitimate outdoor use.


That's now all of our hardware ready.

Incidentally, working with magnetometer readings has a whole load of complications associated with it. We'll talk about a few here, and how we are / aren't addressing them (we're sure there are many others we missed).
 
Topic                          | What's the problem here, sunshine? | Aha, so what are YOU doing about it?
True North vs. Magnetic North  | It turns out Earth's magnetic field is surprisingly dynamic.  The actual "Magnetic North" that the compass points to is different from where we'd ideally like it to be on a map, bang in the middle of the North Pole ("True North").  What's more, we're told that the Magnetic North moves around over time. | There are people far cleverer than us who monitor the position of Earth's Magnetic North as it stands currently.  They can even tell you what the difference is between Magnetic North and True North (the "declination" angle), depending on where you are (remember, this "discrepancy" varies relative to where you are on Earth).  For now, however, we're choosing to depend solely on initial calibration.  Armed with a database of current declination values, we could compensate for it dynamically in code.  For the UK, Ordnance Survey appears to have this information, as the concept applies equally to walkers with low-tech compasses.
Haversine                      | Unless you're a fanatical denier, or no-one ever told you, Earth is NOT flat.  End of.  Therefore working out ground level distance between 2 GPS coordinates isn't simply a case of applying the Pythagoras' Theorem we all learnt in school (or not). | Keep calm, and carry on (using the Haversine method).  It allows us to work out the distance along the surface of the Earth between 2 GPS coordinates.  There's a sketch of it just below this table.
"Z" axis magnetometer reading  | We appear to only be interested in X and Y magnetometer readings, but the AK8963 is 3-axis.  What are we doing with the Z reading? | Nothing.  Happy?  No, seriously.  We didn't see a need to work with magnetic strength along the z-axis for our requirements... but we're sure there will be Internet netizens out there who will be happy to disagree.  We are assuming that the device will be held level when being used, like with an actual compass.  And we're working in relatively small scale (and not too bothered about pin point accuracy).
4th row of table               | What's this 4th row for in this table? It's just titled "4th row"! | This is indeed the 4th row in the table, which we were going to keep blank, but couldn't because you asked about it...
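And since the Haversine method got name-checked in that table, here's a self-contained Python sketch of it.  The Earth is treated as a sphere of radius 6,371 km - plenty accurate enough for alien hunting - and the example coordinates are only approximate.

# Haversine: ground-level distance between two GPS coordinates, in metres.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    r = 6371000.0                                      # mean Earth radius in metres
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlambda = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlambda / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# e.g. (very) roughly Pen y Fan to Cardiff Castle
print(haversine_m(51.884, -3.437, 51.482, -3.181))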

Once the magnetometer, temperature and GPS readings are collected using a Python-based application, we intend to publish all these sensor readings to the local Mosquitto MQTT broker, which we'll do using the lightweight paho.mqtt.publish module.  We'll call this sensor data collection application rosie-sensors.py.
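Something along these lines, in other words.  A hedged sketch only - the topic name, credential handling and the DS18B20 device path are illustrative, not lifted from rosie-sensors.py.

# Sketch: read a DS18B20 over 1-Wire (sysfs) and publish the reading via MQTT.
import glob
import os
import paho.mqtt.publish as publish

MQTT_HOST = "localhost"
MQTT_AUTH = None
if "MQTT_USER" in os.environ:                         # credentials arrive as environment variables
    MQTT_AUTH = {"username": os.environ["MQTT_USER"],
                 "password": os.environ.get("MQTT_PASSWORD", "")}

def read_ds18b20_c():
    """Read the first DS18B20 found on the 1-Wire bus, in degrees Celsius."""
    device = glob.glob("/sys/bus/w1/devices/28-*/w1_slave")[0]
    with open(device) as f:
        lines = f.read().splitlines()
    if not lines[0].endswith("YES"):
        raise IOError("DS18B20 CRC check failed")
    return int(lines[1].split("t=")[1]) / 1000.0

publish.single("rosie/temperature", payload=str(read_ds18b20_c()),
               hostname=MQTT_HOST, auth=MQTT_AUTH)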

Here are the other components that complete the deployment.

WHAT?!                  | No, what is it called actually?    | Have we used this before?
MQTT broker             | Mosquitto                          | Mosquitto will be the MQTT broker running on the Pi, receiving the sensor readings locally from our rosie-sensors.py application using MQTT.  Once installed, it runs as a service, with minimal configuration required.  We installed Mosquitto in Red Current and Serial.  The broker could just as easily receive sensor readings from other nodes on the network.  Think: IoT aliens.
Web (Flask) application | Flask, Flask-MQTT, Flask-SocketIO  | Our Flask-based web application - rosie-web.py - will subscribe to sensor readings being published to the broker, and convert them into Socket.IO data that is streamed out to the connected browser.  We proved this mechanism before, also in Red Current and Serial.  There's a rough sketch of this bridge just below the table.
Browser                 | JavaScript / HTML5                 | All the display logic is performed in the browser, based on Socket.IO messages being received.  JavaScript is also used to draw the graphics across the HTML5 canvas.  This is by far the messiest part of the entire project.
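Here's the promised rough sketch of that Flask-MQTT to Socket.IO bridge.  Topic names, the Socket.IO event name and the template are illustrative rather than being lifted verbatim from rosie-web.py.

# Sketch: subscribe to MQTT with Flask-MQTT, relay each message to the browser via Socket.IO.
from flask import Flask, render_template
from flask_mqtt import Mqtt
from flask_socketio import SocketIO

app = Flask(__name__)
app.config["MQTT_BROKER_URL"] = "localhost"
app.config["MQTT_BROKER_PORT"] = 1883

mqtt = Mqtt(app)
socketio = SocketIO(app)

@app.route("/")
def index():
    return render_template("index.html")      # the HTML5 canvas page

@mqtt.on_connect()
def handle_connect(client, userdata, flags, rc):
    mqtt.subscribe("rosie/#")                  # everything rosie-sensors.py publishes

@mqtt.on_message()
def handle_message(client, userdata, message):
    # Push the reading straight out to any connected browsers
    socketio.emit("sensor_reading", {"topic": message.topic,
                                     "payload": message.payload.decode()})

if __name__ == "__main__":
    socketio.run(app, host="0.0.0.0", port=5000)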

In short, whenever a new Socket.IO message is detected by the JavaScript running in the browser, it instructs certain contexts in the HTML5 canvas to be redrawn.  With the "display elements" - simply showing the latest readings - this is straightforward.  The rectangle is re-drawn, re-filled and the text updated.

For the compass contexts, however, there's a little more geometry involved to ensure that the compass needle is drawn as per the magnetometer heading, and that "other nodes" appear correctly in relation to the device's compass heading.  Dust off that secondary school maths textbook.  You'll be needing it.
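The geometry itself isn't JavaScript-specific, so here's the gist of it sketched in Python: work out the initial bearing from our GPS fix to the target, then subtract our magnetometer heading to get the angle at which that node should be drawn relative to the top of the compass.  Function names are ours, purely for illustration.

# Sketch of the "where do I draw the alien?" geometry.
import math

def initial_bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, 0-360 degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlambda = math.radians(lon2 - lon1)
    y = math.sin(dlambda) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlambda))
    return math.degrees(math.atan2(y, x)) % 360

def relative_angle_deg(bearing_deg, heading_deg):
    """Angle at which to draw a node, relative to the device's current heading."""
    return (bearing_deg - heading_deg) % 360   # 0 = straight ahead, 180 = behind us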

Here is a rough diagram summarising the various elements in play on the display.


Lastly, Supervisor is used to start up and manage our 3 programs when the Pi boots:
  • rosie-sensors.py - is monitoring the sensors and sending out readings via MQTT.  There is an accompanying shell script which is in fact being used to launch the Python application after setting MQTT credential details as environment variables.  Supervisor starts this shell script.
  • rosie-web.py - subscribes to sensor readings and presents web services to browser.  Yes, this also has an accompanying shell script which is in fact being used to launch the Python application after setting MQTT credential details as environment variables.  Supervisor starts this shell script, too.
  • chromium - starts up and keeps the browser open in full-screen, kiosk mode.  The browser is started after a delay to ensure that the other Python apps have time to start up (with the Zero, this delay needed to be quite large; not so much with the Pi 3 Model B).


We used Supervisor all the way back in Hello Supervision, Goodbye Supervision.  Chromium is being launched using the following flags (including 60 second delay), wrapped up neatly in a shell script:

#!/bin/bash
sleep 60
export DISPLAY=:0.0
/usr/bin/chromium-browser --incognito --start-maximized --window-size=480,320 --window-position=0,0 --kiosk --noerrdialogs --disable-translate --no-first-run --fast --fast-start --disable-infobars http://localhost:5000
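And for completeness, the Supervisor side of things might look something like this.  Program names, script paths and the user are placeholders rather than our exact configuration.

; Sketch of a Supervisor configuration for the three programs
[program:rosie-sensors]
command=/home/pi/rosie/rosie-sensors.sh
user=pi
autostart=true
autorestart=true

[program:rosie-web]
command=/home/pi/rosie/rosie-web.sh
user=pi
autostart=true
autorestart=true

[program:chromium]
command=/home/pi/rosie/chromium-kiosk.sh
user=pi
autostart=true
autorestart=true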

And this is it.  If everything is working together, the browser should launch, access the local Flask web server, and start drawing on the HTML5 canvas using data streamed through the open web socket.  Clearly, there is a load more code in JavaScript to keep track of other nodes, canvas settings, and provide the system logic, but these should be apparent from the code.

All in all, a satisfying result.


This setup warrants a 3D-printed case of some description.  One that exudes Hollywood sci-fi glamour.

  

Don't disclose your full GPS data to anyone, especially latitude / longitude information. Unless you're in dire need of rescue by emergency services (in which case please stop tinkering with a Raspberry Pi and reassess your life priorities).

By its very definition, GPS is a system that positions you... globally.  And if you made it this far in this post without realising this... this corner of the Internet probably isn't for you.


James Horner's melancholic score has just kicked in... so it's time to wrap up.

Clearly, this entire setup can be enhanced in an infinite number of ways, not least by addressing the very nonsensical reason for its existence.   For example, with the addition of an RFID reader and cards located in gruesome pretend eggs scattered around a wide open space outdoors, we could organise a "highly educational" Easter Alien Egg Hunt, under the guise of STEM advocacy.  Hey National Trust - are you interested?  Or with cellular or LoRa comms added, something more dynamic could be played out in the real world.  Because everyone seems to be talking about LoRa*...

*Fast forward to April 2019 and we do end up playing around with LoRa, in LoRa-Wan Kenobi.


Of course - more sensors and modules could also be added, and other displays created... maybe even the touch screen utilised.  Using MQTT, JavaScript, Socket.IO and HTML5, there is flexibility to fashion a user interface that makes it look like an advanced piece of cyber gear that ought to have made it into a Van Damme epic (if there was one).

And if we do eventually move onto using this to navigate a robot, we are evidently going to require some mathematical filters (such as Kalman) to reduce the noise present in the sensor readings.  Furthermore, we would need to use outputs from other sensors, namely the accelerometer and gyroscope, to accurately work out by how much we rotate the robot to point it in the right heading. GPS data too can re-confirm actual heading, once the robot starts to move.  A little bit of sensor fusion never hurt anyone.

But all that stuff can wait until the wholly unnecessary, straight-to-video sequel to this blog post that literally no-one asked for sees the light of day... This Orientated Resurrection 3: The Requiem.  In fact, there might actually even be video, courtesy of HTML5.  B.A.N.A.N.A+, coming to a theatre near you.

Now excuse us.  It's late, and we really ought to make a start on the U.S.S. Sulaco floor plan in JavaScript.

Update 1 - January 2019

Well, this rather over-engineered contraption wasn't exactly left in this state for long.

Because with the recently earned freedom to pixelate our display as much as humanly possible using HTML5's canvas, the mission to make something unashamedly over the top wasn't yet complete.

With a simple push switch attached to a GPIO pin, we made the device toggle between different views.  First off, our now familiar compass interface.  Locked on to multiple suspicious interstellar visitors?  Good.
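The button handling itself is only a few lines.  Here's a hedged sketch - the BCM pin number, the MQTT topic and the idea of announcing the view change over MQTT are all assumptions; the only thing taken from above is "push switch on a GPIO pin".

# Sketch: watch a GPIO pin and announce a "next view" change when the switch is pressed.
import time
import RPi.GPIO as GPIO
import paho.mqtt.publish as publish

BUTTON_PIN = 21          # an arbitrary spare BCM pin
NUM_VIEWS = 3            # compass, distances, map
view = 0

def next_view(channel):
    global view
    view = (view + 1) % NUM_VIEWS
    publish.single("rosie/view", payload=str(view), hostname="localhost")

GPIO.setmode(GPIO.BCM)
GPIO.setup(BUTTON_PIN, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.add_event_detect(BUTTON_PIN, GPIO.FALLING, callback=next_view, bouncetime=300)

try:
    while True:
        time.sleep(1)
finally:
    GPIO.cleanup()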


Then, we'll see exactly how far they are from us on the first of our new displays.  All the nasties are well away.  And we quite like Arnie, so nothing to worry about here (although that pesky Xenomorph is getting kinda close).


...But perhaps this is the most exciting enhancement of all.

By exporting a map from OpenStreetMap in OSM format (XML), extracting only the data we need (we used only highways, nature, waterways and buildings data) into a JavaScript dictionary using Python's xml.etree.ElementTree, and loading it into our JavaScript, we were in fact able to draw the entire map between certain longitudes / latitudes.  What's more, we drew the map on an off-screen canvas at start up, ready for it to be shown as a background canvas on our busy third screen, with the location of us and the aliens superimposed on a foreground canvas (which is the only bit that's dynamically redrawn).  Without this little trickery, we found that the map would take too long to redraw each time on this rather illustrious screen, significantly degrading the alien hunting user experience.
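A rough sketch of that OSM-squashing step is below.  The tag choices and output format are ours, but the general shape - node positions keyed by id, ways filtered by their tags - follows the structure of an OpenStreetMap .osm export.

# Sketch: boil an OpenStreetMap .osm (XML) export down to the ways we care about,
# then dump it out as one big dictionary for the JavaScript to slurp in.
import json
import xml.etree.ElementTree as ET

KEEP_KEYS = {"highway", "waterway", "building", "natural"}

root = ET.parse("map.osm").getroot()       # exported from openstreetmap.org

# All node positions, keyed by id - ways reference these by "ref"
nodes = {n.get("id"): (float(n.get("lat")), float(n.get("lon")))
         for n in root.findall("node")}

ways = []
for way in root.findall("way"):
    tags = {t.get("k"): t.get("v") for t in way.findall("tag")}
    if not KEEP_KEYS.intersection(tags):
        continue                           # not a highway / waterway / building / natural feature
    points = [nodes[nd.get("ref")] for nd in way.findall("nd") if nd.get("ref") in nodes]
    ways.append({"tags": tags, "points": points})

with open("map_data.js", "w") as f:
    f.write("var MAP_DATA = " + json.dumps(ways) + ";")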

Lastly, we've augmented this map with our current heading information from the magnetometer so that we can see which way we're looking, relative to the map.


Ok... so this is becoming slightly over the top.  And a tad creepy.  In fact, what we're likely to find is that these "alien" coordinates have been randomly generated for testing purposes (i.e. please do not visit any of the locations insinuated in the pictures or video).  Nevertheless, perhaps what we have ended up with is a rather useful device to physically test our understanding of the navigational sensors and modules in our robot building arsenal, in a very visual, tangible way.  Now, that's rather helpful.


Music in the video is the X-Files Theme by Neo Geo.

Update 2 - March 2019

What became of this silly alien-hunting device?  Well, it was given a little 3D-printed makeover, and was actually taken around Brecon Beacons National Park.  You see, if we replaced the "aliens" with the 10 peaks of the Brecon Beacons (as described by the ultramarathon challenge we are actually training towards), we can project a map of the course, the peaks we need to conquer, along with our previous data points (such as temperature, compass heading and GPS coordinates).




Is it actually useful?  Not a lot (and please set off on any treks responsibly - as we did - with OS maps, smartphones with the OS app, a real compass, and all the usual outdoor paraphernalia).  The novelty of carrying a Pi around the peaks quickly wears off when you start to (correctly) focus on the more serious task ahead of surviving 13 hours of continuous walking, covering 42 kilometres in distance and almost 2000 metres in climbs.


Nonetheless, not unlike back in Taking a Peak, we did end up with a pretty satisfactory GPS plot of the adventure based on the GPS receiver recordings.



Code:

Code will be posted here... if we haven't been ravaged by imaginary aliens by then.

Director's cut:

GoodTFT's LCD-show is being used to divert Pi's HDMI output to the TFT display:
The MPU9250 Python library by FaBoPlatform that we used is available here in GitHub:
Supervisor is used to startup and manage our Python applications, as well as the browser in kiosk mode:
Chromium is Raspbian OS's default web browser:
We are using the Elegoo 3.5 inch 480 × 320 TFT screen:
GPSD makes a glorious return in this post, since it's the most convenient way of working with GPS receivers:
Mosquitto is the MQTT broker we are using:
The InvenSense MPU9250 datasheet can be found here:
If you really have the urge to look at the DS18B20 datasheet...
Our GPS receiver is based on the U-blox Neo-6:
