Lights, camera, satisfaction!

Some of humanity's greatest questions remain unanswered.  What does a Higgs boson taste like?  Can the sofa constant ever be proven?  ...Do rabbits go rogue and ride around in race cars at night?

Like most problems, though, solving this one involves gathering proof.  Therefore, if we want the authorities to believe that there are indeed small furry mammals with big ears performing unsafe burnouts in our streets, we need the evidence.  Indisputable evidence.  Ideally, of a bunny on wheels, up to no good.  Causing chaos in our neighbourhoods.

But don't worry.  We'll proceed to equip Rosie Patrol with some of the most sophisticated devices known to mankind.  Will it be a Large Hadron Collider, shaped like a carrot?  No, that'll be in our next episode (clearly, not guaranteed).  Instead, we'll provide her with a camera, and a passive infrared motion detector, in a valiant attempt to catch the big-eared culprit in the act.

All superheroes need:

  • 2 Raspberry Pi 3s, both running Raspbian OS
  • Computer from which you're connecting to both Raspberry Pis 
  • A camera for the Pi.  We're using the official Raspberry Pi Camera V2.  Official should probably mean it just works.
  • A passive infrared potion detector.  Sorry, that's: motion detector.  Much more useful for detecting trespassers.  Not so good for detecting potions.

Already completed these missions?

You'll need to have completed the Rosie series.  Then:
  1. Lights in shining armour 
  2. I'm just angling around
  3. I would like to have some I's
  4. Eh, P.I?

Your mission, should you accept it, is to:

  • Connect more stuff to the Pi.  Like the camera and passive infrared motion sensor.
  • Test the camera.  Take some award-winning photos of table legs.
  • Go back and modify code for the head-torch.  Make it more API friendly, and allow it to set specific light modes.
  • Create a class to detect motion using the passive infrared motion detector
  • Fashion up an ambitious routine.  It will detect movement (hopefully), fire off an API call to the other Pi to turn the head-torch on at its brightest setting (again, hopefully), and take a picture (very, very hopefully).

The brief:

With the power of APIs, we were able to start stretching Rosie Patrol's brain across multiple Pis.  You could give this hack grand descriptions like distributed computing.  Or horizontal scaling.  Maybe, even, microservices.  <insert your own marketing term here>.  We just found it useful, as it doubled the hardware we had access to, like, for example, the number of available GPIO pins.  That's all.

And directly as a result, we can proceed to attach more useless useful gadgets to our second Pi: rosie-02.  You know, the one that is controlling everything attached to Rosie Patrol's head.  We've agreed on what these gizmos are: a Passive Infrared (PIR) motion sensor, and a camera.  A PIR motion sensor detects infrared radiation emitted by objects (like objects of a human variety), and indicates this discovery with a signal back to our Pi's GPIO pin.  The camera, on the other hand... well... takes photos or videos.  But you knew that, right?

So it shouldn't be beyond our capabilities to mash something up.  A program that takes photos good enough for crime labs when the PIR motion sensor detects movement.

Except most crimes happen at night.  And nights are usually dark.  ...And we didn't get ourselves a night-vision camera (which we probably should have).  That's why we are resorting to doing this the (highly) clumsy way.  When motion is detected, we'll tell rosie-01 - via REST API - to change its head-torch to the brightest setting.  And only then, we'll take the photo.  This method could just as easily be adapted to fire off our particle collider, but Tammy, the cat, would be no more.  And our little robot factory would probably be closed down by the authorities, pretty quickly.  The gang of unsocial bunnies would remain at large.

All in all, not very subtle.  We're likely to scare our big-eared petrol-heads with the sudden explosion of light (maybe that is the point?).  But - for now - let's pretend that this is all part of our grand plan to protect the universe.  You never know.  We might learn a thing or two.

The devil is in the detail:

Let's start with the cabling.  It's by far the easiest bit.

The PIR - explained here - requires three connections: a Vcc of 5V, a Ground (0V), and a PIR output which uses a 3.3V signal (so no need for a voltage divider circuit here, phew).  The PIR output can go to whatever GPIO pin is free.

Connecting the camera isn't any harder, either.  It turns out, there is a dedicated slot on the Pi to connect this thing.  We are using the official Raspberry Pi Camera V2, which we're told is an 8MP camera.  It can take videos, and photos.  After all... it's a camera.

2 minutes... and you're done with the cabling.  It should probably look a bit like this (although, here, we have other connections for our servo motors, and dot LED matrix displays).

Let's move onto the software.  The ability to use the camera does need to be enabled in the Pi (it's not enabled by default).  Run raspi-config and enable it.

sudo raspi-config
...launches the Raspberry Pi Configuration Tool.  Enable camera in here under 5. Interfacing Options.

What shall we test first?  Yes, we'd quite like the idea of winning awards for our magnificent photography.  Let's test that the camera works, using guess what - Python.

There are several different modules to choose from when it comes to interacting with the camera.  We chose PiCamera, as it appears to be aimed at doing all of this, using Python.  The PiCamera Python module is quite easy to understand, and taking a photo is as simple as running the following commands.  Which commands?  Here they are, ready for testing - interactively - in IPython.

But before we do, let's set up a directory called 'capture', just to store our images.  It's just better organised that way.

mkdir capture
...creates the directory in the current folder (in our case /home/pi/rosie/rosie-web)

Start up IPython.  We clearly need to import this module.  Luckily, our Raspbian OS appeared to have it already installed.

from picamera import PiCamera
import time

We then instantiate our PiCamera object, set its resolution, start previewing, and say cheese!  Interestingly, if you have a monitor plugged into your Pi using a HDMI cable, you can see what the camera is seeing in real-time, because of the start_preview().

c1 = PiCamera()
c1.resolution = (1024, 768)
c1.start_preview()
time.sleep(2)                    # give the sensor a couple of seconds to adjust
c1.capture("capture/test.jpg")

Run this in IPython.  Just like that, a photo called 'test.jpg' should appear in your 'capture' directory.  Download the photos from the Pi using a tool like WinSCP.  It's probably of a table leg, or of a dull wall.  Does it look focused?  Or is it blurry?  You can actually rotate the lens in the Raspberry Pi Camera V2 - but be very careful, and follow the instructions closely.  It's easy to damage the lens if you don't have the right tools.

Now we turn our attention to that funny head-torch on rosie-01.  What's wrong with it?  I hear you ask.  Well, nothing.  Except we have no way of checking what 'mode' it's in.  Nor can we order it into a certain mode (unless we keep operating the relay until we get it into the state we want).  Nor can it be operated by another machine, using an API.  OK, so there's actually lots wrong with it.  Let's proceed to address these limitations, one by one.

Let's remind ourselves of the head-torch.  We operate it using a relay, and with each momentary close of the circuit (switch) we are able to toggle through its modes, of which there are five:

  • Off
  • Bright
  • Less bright (let's call this medium then)
  • Red LED thing
  • Same red LED thing, only flashing

Now, there is absolutely no way (mechanically or electronically) to monitor which actual mode this light is in.  But what we should be able to do is keep track of how many times the relay has been operated (because it's only our program operating it).

Let's assign ourselves some class-wide variables to reflect the modes in our Light class:
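The exact snippet isn't reproduced here, so here's a sketch: _LIGHT_OFF and _LIGHT_RED_FLICKER are the names used later on, and the values 0-4 line up with the five modes above (the three middle names are our guesses):

```python
class Light:
    # Mode constants - _LIGHT_OFF and _LIGHT_RED_FLICKER are the names
    # referenced later; the three in between are assumed for illustration
    _LIGHT_OFF = 0          # off
    _LIGHT_BRIGHT = 1       # bright
    _LIGHT_MEDIUM = 2       # less bright ("medium")
    _LIGHT_RED = 3          # red LED
    _LIGHT_RED_FLICKER = 4  # red LED, flashing
```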


And during instantiation of our light, let's set a new instance variable called light_mode.  This is what we'll use to track which mode we think the light is in.

self.light_mode = self._LIGHT_OFF

It's important that this matches what mode the light is actually in when the program first starts up.  The program has no way of telling whether it's in sync with the actual un-intelligent head-torch, or not.  Throughout, we are simply assuming that the light is toggling through its settings as instructed, every time we send the relay a signal.  If we do anything outside of this (such as manually change the light's mode), we're no longer in sync.  And the plan will be in ruins.

Then, in our switch() method, we now include target_mode as an argument.  The idea is that we keep cycling through light_mode (and operate the relay) until we reach the one we're after.

while self.light_mode != target_mode:
    # ...code to toggle through modes until it reaches target

We achieve all this by incrementing the light_mode instance variable every time we momentarily close the switch in the relay.  Also, we shouldn't forget to reset it back to 0 when we reach the very last mode (_LIGHT_RED_FLICKER).

if self.light_mode == self._LIGHT_RED_FLICKER:
    self.light_mode = 0
else:
    self.light_mode += 1
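Putting the while loop and the wrap-around together, here's a pure-Python sketch of the mode-tracking logic.  The relay I/O is replaced by a click counter so it can run anywhere; LightModel and clicks are our illustrative names, not the real class:

```python
class LightModel:
    """Mode-tracking logic only - the real class would also pulse the relay."""
    _LIGHT_OFF = 0
    _LIGHT_RED_FLICKER = 4    # last mode before wrapping back to 0

    def __init__(self):
        self.light_mode = self._LIGHT_OFF
        self.clicks = 0        # stand-in for momentary relay closes

    def switch(self, target_mode):
        # Keep cycling (one relay pulse per mode) until we hit the target
        while self.light_mode != target_mode:
            self.clicks += 1   # a real relay pulse would happen here
            if self.light_mode == self._LIGHT_RED_FLICKER:
                self.light_mode = 0
            else:
                self.light_mode += 1
```

Asking for mode 1 from off takes one click; asking for mode 0 from mode 1 has to wrap all the way round with four more.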

Our class is now all set.  We should be able to instruct the light to toggle through as many modes as it needs to reach the desired mode.  Remember we wanted to create an API endpoint for the light?  Let's create a route, using Flask (we're assuming our Light object has been instantiated elsewhere as l1):

@app.route("/api/v1/light", methods = ["POST"])
def control_light():
    _light_mode = int(request.get_json()["light"])
    l1.switch(_light_mode)    # toggle the head-torch to the requested mode
    return jsonify({"light" : _light_mode})

Here, we are expecting an HTTP POST request, arriving at our endpoint URL /api/v1/light.  It expects JSON data, with a light key, and a value of 0-4 (which we now know are the different light states).  We can now quite happily test this, using curl from our Windows machine.  Depending on the value you use, the light should cycle through to your desired mode.  For example, going from 0 to 1 is one click.  Going from 1 to 0 involves 4 clicks.

curl -X POST -H "Content-Type: application/json" -d "{\"light\":\"0\"}" http://rosie-01:5000/api/v1/light
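Incidentally, those click counts follow simple modular arithmetic: with five modes arranged in a cycle, the number of relay pulses needed is just the difference modulo 5.  A quick sketch (clicks_needed() is our own illustrative helper, not part of the Rosie code):

```python
MODES = 5   # off, bright, medium, red, red-flashing

def clicks_needed(current, target):
    """Relay pulses required to cycle from one mode to another."""
    return (target - current) % MODES

# 0 -> 1 is a single click; 1 -> 0 has to wrap around with four
```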

Quite clearly, we're on a roll.  We're ahead of schedule.  And there have been no sightings of unscrupulous bunnies revving their V8 engines so far this evening.  So we thought we'd add another extra touch.  Remember our shockingly bad webpage that allows us to control Rosie?  Let's modify the existing route that accepts an HTML form POST, so that it can control the lights too.

@app.route("/light", methods = ["POST"])
def rosie_light():
    _light_mode = int(request.form.get("light"))
    l1.switch(_light_mode)    # l1: the Light instance (name assumed)
    return redirect(url_for("index"))

Which of course now means that we need to modify the Jinja2 template slightly to have multiple buttons in the form, representing each of the light modes.

<form action="/light" method="post">
    <table border=0>
        <tr>
            <td><button class="button-mode" type="submit" name="light" value=0>Off</button></td>
            <td><button class="button-mode" type="submit" name="light" value=1>Bright</button></td>
            <td><button class="button-mode" type="submit" name="light" value=2>Medium</button></td>
            <td><button class="button-mode" type="submit" name="light" value=3>Red</button></td>
            <td><button class="button-mode" type="submit" name="light" value=4>Red (Flash)</button></td>
        </tr>
    </table>
</form>

We've zoomed through the changes required on rosie-01.  It's been a blur, revisiting the stuff we've done before.  But what we should now have is an application that allows you to change light modes to the desired state, using both the API and the webpage.

The entire code now looks a bit like this:

We suggest you take a break.  Get a drink.  Because we now have more work to do, on rosie-02.  Before the marauders return.

PIR motion sensors do little other than provide a signal on their output when they detect movement.  Some parameters are adjustable on the PIR device itself using knobs, so they're worth investigating.  We want Python running on the Pi to monitor for this signal.  Let's set up a class for this - MotionSensor.

import RPi.GPIO as gpio

class MotionSensor:

    def __init__(self, GPIO_PIN = None):
        self._GPIO_PIR = GPIO_PIN
        gpio.setup(self._GPIO_PIR, gpio.IN)

    def check_motion(self):
        # A high level on the PIR output pin means movement was detected
        if gpio.input(self._GPIO_PIR):
            return True
        return False

Not much code, right?  Correct!  We simply instantiate this class with a GPIO pin in mind, then monitor that pin for a signal.  The method check_motion() will do this for us, whenever we call it.

We now need a function that, when run as a thread, continuously checks the PIR motion sensor, then does a few of the things that we want.

def detect_movement():
    API_LIGHT_URL = "http://rosie-01:5000/api/v1/light"
    DETECTION_WAIT_S = 5      # pause after a detection (value assumed)
    DETECTION_DELAY_S = 0.5   # gap between sensor checks (value assumed)
    pir1 = MotionSensor(14)
    c1 = PiCamera()
    c1.resolution = (1024, 768)
    global stop_neck_movement

The above should look familiar.  We instantiate the MotionSensor class, using GPIO pin 14, and also set up the camera.  We've also defined the URL of rosie-01's light API, which we'll call to toggle our head-torch.  DETECTION_WAIT_S will be the number of seconds the routine waits once it's detected movement, before it starts checking again.  DETECTION_DELAY_S is an artificial delay we introduce between checks of the PIR motion sensor, so that we don't overwhelm the Pi with near-constant checking.  We actually want Rosie Patrol's neck to stop moving once motion is detected.  For this, we'll change the value of a new global variable - stop_neck_movement - and incorporate a check for this in our head_movement() function*.

*Mounting the PIR motion sensor on Rosie Patrol's moving head turns out to be a pretty bad idea.  The reasons are fairly obvious once you dedicate some brain cells to it.  When the PIR motion sensor itself is moving, the whole world appears to be moving to the sensor, so it constantly triggers on any heat source in view.  For this reason, it's best placed on a still object, or operated only when the robot is stationary.

The second part of the detect_movement() function is a while loop.  It's not that exciting, other than it's where the API is fired to change rosie-01's light to 'bright', and capture a photo.  We can keep taking photos, as the filename is derived from the date / time that the photo is taken.

while True:
    if pir1.check_motion() == True:
        print("ROSIE: I've spotted movement!")
        stop_neck_movement = True
        post_json_request(API_LIGHT_URL, "light", 1)
        date ="%m_%d_%Y_%H_%M_%S")   # needs: from datetime import datetime
        c1.capture("capture/" + date + ".jpg")
        stop_neck_movement = False
        post_json_request(API_LIGHT_URL, "light", 0)
        time.sleep(DETECTION_WAIT_S)    # rest a little after a detection
    time.sleep(DETECTION_DELAY_S)       # don't hammer the GPIO pin

As expected, we'll start this as a thread to make sure it runs in the background.  This way, Rosie Patrol can be doing all sorts of other stuff at the same time (most likely drinking her favourite Earl Grey to keep herself awake all night).

t6 = Thread(target = detect_movement)
t6.daemon = True
t6.start()
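The daemon flag matters here: daemon threads are killed automatically when the main program exits, so they can't keep Rosie up past bedtime.  Here's a minimal, self-contained illustration of a background thread flipping a shared flag - the same pattern as stop_neck_movement (the names are ours):

```python
from threading import Thread
import time

movement_spotted = False       # stands in for stop_neck_movement

def watcher():
    global movement_spotted
    time.sleep(0.1)            # pretend we're polling a sensor...
    movement_spotted = True    # ...and that it fired

t = Thread(target=watcher)
t.daemon = True    # won't keep the program alive on exit
t.start()
t.join()           # here only so the example finishes deterministically
```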

As a result, the code on rosie-02 now looks a bit like this:

So there it is.  We've just added a whole bunch of new gizmos to Rosie Patrol, and she can begin to gather crucial evidence.  And just in case you were wondering, here's a photo taken by our budding superhero, just to prove that rabbits do roam around the streets at night in super cars.

It turns out this isn't the best use of her new-found powers.  You could take this further and capture or stream videos, use night-vision cameras, and maybe even upload the results to your favourite photo storage site (using whatever API they provide you with).  Clearly, attaching the PIR motion sensor to a moving object was a schoolboy error.  But fear not.  The whole neighbourhood can now sleep soundly with the knowledge that our big-eared delinquents will be brought to justice.  More so once we manage to obtain parts for our particle collider.  Time for eBay, anyone?

Information overload:

PiCamera module's official documentation can be found here:
Bit more about it on the Raspberry Pi website also:
We are using the official Raspberry Camera V2:


