
LoRa-Wan Kenobi



In the regurgitated words of Michael Bublé: "It's a new dawn / It's a new day / It's a new Star Wars film / For me / And I'm (George Lucas, and I'm) feeling good."  Unfortunately for Canadian Mike, the Grammy that year was won by the novelty disco classic with the famous refrain: We love IoT, even in Planet Tatooine*.

*Not true.

Clearly, the Star Wars producers didn't sincerely mean The Last Jedi the previous time around.  Return of the Jedi, released during the decade that spearheaded cultural renaissance 2.0 with the mullet and hair-metal, was less economical with the truth.  Either way, we're going to take inspiration from the impressive longevity of the money-spinning space-opera and reboot our franchise with some Jedi mind tricks.  Except this particular flick doesn't require an ever-growing cast of unrecognisable characters, unless ASCII or UTF counts.  In place of an ensemble gathering of Hollywood stars and starlets, we will be assembling a crack team of bargain basement electronics designed to stand out from the crowd, like Chewbacca cosplayers queueing for the loos at a Star Trek convention.

Yet, all jokes aside, we have high aspirations for this sci-fi action film produced on an austere budget. As Van Damme was busy making beer adverts on an icy, mountainous ridge with a pack of wolves (or alternatively in front of a green screen made to look like one, with donkeys CGI'd with fur and pointy teeth), we have decided to enlist the help of a growing star.  One that can likewise handle the remote wilderness, challenging terrain, and last weeks on end without a recharge back in a 5-star Bellagio suite.

Yes.  You guessed it - nah, to be fair, you probably didn't.  This is the I-O-Tea instalment in which we say hasta la vista baby (oops, wrong actor) to our ex-Wi-Fey of yesteryear, and attempt to elope with something allegedly more low maintenance, that will go the distance.

If we wanted a margarine spread made from #quote a blend of healthy fats #un-quote, we would choose Flora. On the other hand, if we want #quote a long-range wireless communication protocol based on license-free sub-gigahertz radio frequency bands #un-quote, we would choose LoRa.

Is it the latter we want?  Yes, it probably is.  Oh gosh, a pretentious harpsichord has started playing the Star Wars Theme.  Let us begin.

Development board clone wars: episode XXXVI

We are entering slightly uncharted territory here with our very first LoRa experiment.  As ™™LoRa™™ is a ™™proprietary™™ long-range communication protocol, we clearly require a new rig.  Did we leave enough ™'s behind?  Here's some more to cover us legally: ™™™™™™™™™™™™*.
  • First and foremost, we're going to reaffirm our enduring love for MicroPython on ESP32.  Moreover, we don't want to be procuring additional modules from an interstellar trading outpost to augment it with LoRa capability.  This is precisely why we embark on our mission with a no-frills LoRa "capable" ESP32 development board by HelTec Automation.  And by "capable", it simply means that the board is equipped with a LoRa transceiver - a Semtech SX1276.  In addition:
    • A good rummage through their GitHub repo reveals that the built-in OLED is a monochrome 0.96" 128×64 display, powered by a widely used SSD1306 driver
    • As is usually the case, there are also a number of embedded LEDs that we intend to toggle on and off in a way that would make George Lucas' special effects studio shudder in horror
  • We're still sending data (just much less of it).  For the sake of simplicity, we'll pick BMP280 (pressure) and DHT22 (temperature / humidity) sensors to generate our slimmed down sensor readings.  These three datapoints could be manipulated to occupy a frugal 1 byte each, making them perfect for testing high latency, low data rate networking.  Note that we only intend to test an "unconfirmed data up" message type - in which we frivolously dispatch a packet out into the airwaves in the hope of reaching a gateway, but do not expect any confirmations back.
  • Confession: we don't own our very own LoRa gateway (yet).  We're relying entirely on the kindness and generosity of LoRa warriors out there who have made their gateways available for use by the general public, specifically through The Things Network (TTN).  We salute you brave sub-gigahertz radio band volunteers.  We aim to join you in galactic battle one day.
  • ...And because we don't own our own LoRa gateway, and it appears as though there isn't a TTN gateway in the proximity of where we live, we could not fully test our transmission from the comfort of our home.  Yes, it's time to plan inexplicable family day trips based exclusively on the availability of a TTN gateway.  We've always wanted to visit Bristol and Reading, haven't we kids?
It's true, we're placing a lot of faith in the capabilities of our HelTec Automation development board.  Is there a nice marketing graphic that can fill us with some hope?  Yes there is.

*Here are some more, just in case: ™™™™™™™™™™™™.

A long time ago in a galaxy far, far away....

Is this the mythical fourth instalment of a trilogy?  Is this the perplexing sequel that arrives before the prequel?  Could we use random Roman numerals - say episode XXXVI - to make this piece of work sound like the magnum opus that it clearly isn't?

We'll make this easy for you by simply sticking to our formulaic plot currently plodding through the I-O-Tea series.  We've had 4.  According to pre-school maths, it makes this the 5th.  And they are all in the right order, we kid you not.
  1. Frozen Pi
  2. Have-ocado
  3. Green, green grass of /home
  4. Quantitative wheezing

Don't make a wookiee mistake

We'll try hard not to make a giant extraterrestrial hairball out of a mole hill, since testing a complete LoRa solution encompassing different message types would result in a bore of an epic that would warrant a dramatic score by John Williams.

This, on the other hand, is more a trailer; a teaser for the LoRa tech in action on an ESP32 running MicroPython.  And getting those sensor readings netted in AWS IoT Core - over a distance even Beckham would think twice before putting his threads through the ball.

In other words, we're stating our excuses early.  We won't be (...on purpose, you understand?) building a sophisticated solution, such as one which could be used to assess Global Warming, by remotely monitoring thawing snotsicles on the climbers at the summit of Mount Everest.  In any case, we don't yet want to transport this rig up lofty snow-covered peaks, as we're likely to find Van Damme posing there in inappropriate denim, downing a certain North American alcoholic beverage that would be dismissed as mere melting ice water here in the West Country.

Time to return our focus to the ascent up ahead.  Here's our intergalactic itinerary.
  • First things first.  Let's attach our DHT22 and BMP280 sensors to the GPIO pins of the ESP32 development board, preferably using a breadboard.  Then, we'll fire-up MicroPython to prove that we can obtain temperature, humidity and pressure readings like an amateur climatologist straight out of... erm... climatology academy.
  • We can't leave that OLED left untouched, especially if we want our un-scientific instrument to look semi-professional.  Let's cram some text onto the teeny screen.  Even better, we'll flash the on-board LED on and off a few times to show that it's alive.  Wow, this is pure, cinematic excitement.  Like watching Phantom Menace on a Bush VHS + TV combo from the 90s.
  • All of the above so far has been the highly distracting backstory.  We came here to the out-of-town cinema complex for the blockbuster: Adventures of LoRa Croft.  It's time to attempt our interaction with the Semtech SX1276 chipset using the SPI protocol.  Can we retrieve some arbitrary information from its configuration registers?  In fact, is it there?  Can we go through the entire post without writing Semtex?  So many questions.
  • Deploy (or if we're feeling brave, develop) an SX127x MicroPython library to start conquering the airwaves with unsolicited, unacknowledged messages ("unconfirmed data up").  As you'll read later, we had to fork our own library, since we couldn't find one that handled the entire stack (including LoRaWAN and TTN) on MicroPython.
  • The Things Network console beckons.  We'll create our very own "rosietheredrobot" application, then register our "rosie-esp32" device.  Unique keys linked to this device, incorporated into our outgoing LoRa packet, allow it to be identified as ours, and recognised by the gateway and the wider LoRaWAN infrastructure.
  • We've been doing a lot in AWS IoT recently.  Since we ultimately want these sensor readings to appear in AWS IoT Core (and forwarded to some other yet to be identified destinations afterwards) we'll deploy the official CloudFormation Stacks to integrate TTN with AWS IoT Core.

Smart E-wok

What shall we fry up today using our imaginary E-wok?

Until now, all our sensor readings have either been stored on a local storage device for later inspection offline (like an EEPROM), or dispatched to our designated MQTT broker using an available Wi-Fi connection.  In the context of our I-O-Tea series, MQTT payloads were sent to AWS IoT Core directly, or via our local Greengrass Core device.  In other words, we were highly reliant on wireless local area network connectivity, which also goes by the catchy name 802.11 (not to be confused with OU812, Van Halen's questionable second album featuring Sammy Hagar).  It's not always available out in the wild, however, when you're running with the devil along a highly dubious "route" someone mapped out for you on Strava.  Especially if we do not have data available using cellular networks... and our thirsty microprocessor is not near one of the billion branches of Caffè Nero with free Wi-Fi.

Think of Planet Tatooine - the "harsh desert planet".  Well, how harsh is it?  Well, it hasn't got a Caffè Nero for a start.  Can anyone - for extra loyalty stamps - tell us its surface temperature, humidity and pressure?  We bet no-one can, not because it is wholly fictitious, but because we haven't seen any entries in Wikipedia to indicate that the inhabitants have deployed a sophisticated, free Wi-Fi network.  And even if they had, what use is it for our microprocessors when they are likely to be deployed out of sight, in one of these "harsh" deserts?

What we want is a method for long-range communication, which uses as little power as possible.  It doesn't particularly need to support the ability to send lots of data (a Tatooine live web-cam, anyone?).  And we aren't too worried about the speed at which the data is transmitted or, perhaps, even the odd reading going missing.

LoRa is a protocol that appears to fit that bill.  It is a low-power, radio-based long-range communication technology, which works in sub-1GHz frequency bands that happen to be unlicensed and free from the clutches of multinational telecommunication conglomerates and their subscription-based operating models.  It can also work over a range in excess of 10km.  This sounds perfect for sending the occasional byte or two housing our sensor readings.

So that's LoRa.  Yes please, we would like WAN.

The SX1276 chipset implements what's known as the physical layer: LoRa.  We can configure the transceiver by setting its various configuration registers, switch between operating modes, and when the time comes, load it with our packet formed of various pre-agreed headers and a payload, and tell the chip to send this out by the antenna.  LoRaWAN, on the other hand, works above this physical layer.  What should our packets look like (beyond just our payload)?  Which gateways should pick them up?  Once there, how do they get routed to where they need to go?

We will be using an open implementation of LoRaWAN called The Things Network (TTN).


You will all be used to our regular in-flight warnings about impending catastrophe and mayhem.  This one, however, might be worth reading (along with the one from the past about inadvertently poking your eye out with a red hot soldering iron).

Although LoRa communication takes place within unlicensed radio frequency bands, there are still regulations around how much of it we are allowed to use (remember, the airwaves - like confusion about the general purpose of the Universal Soldier franchise - are shared by all those that inhabit this planet).  There is an article by The Things Network about how to stay compliant within your territory.  The summary is that the data should be small (as few bytes as possible), and sent out infrequently (intervals measured in minutes).  Please don't fall foul of these regulations, unless you fancy Will Smith and Tommy Lee Jones paying you a visit.

Furthermore, the precise frequencies permitted for use by LoRa differ by region.  For example, in Europe, the range stipulated by regulatory authorities is 863 to 870 MHz.  The assigned range for other regions is different, so it is essential to check what is approved for use in your country, and ensure that your transceiver is set to work in the allocated band.

Frequency bands are reserved by relevant authorities for a reason, and closely regulated.  It is imperative that Tesco customers can operate their impulsive purchases at a very specific radio control frequency of 27.145 MHz, without interference from reckless IoT hobbyists bombarding the airwaves with sensor readings.  To a lesser extent, the same applies to actual emergency services.



Lastly, when using the publicly available The Things Network - which we are - there is also a Fair Access Policy.

In conclusion, it's best to go and consult other LoRa sites and resources, rather than rely solely on a blog post found on a site that claims to be about a red robot called Rosie.

Whole lotta Yoda

This is the first (and possibly only) moan of the day.  Our HelTec ESP32 LoRa V2 development board arrived with its headers... well... headless.  We'll have to spend a couple of minutes reluctantly inhaling smouldering solder (thankfully, lead-free).


Luckily, the plastic OLED compartment could be neatly screwed off, better exposing the pads through which the headers need to be pushed and soldered.  We're only inhaling burnt solder then, not plastic.


That's 10 minutes of our lives we'll never ever see again (but still 125 minutes less than when watching Solo: A Star Wars Story).  At least this little development board is no longer legless and can stand on its own 36 feet. Same cannot be said about Republican Walkers after a boozy night out in Coruscant.


If you've been following our posts, you know what usually comes next.  And you'd of course be right.

It's time to reveal our token sensors chosen to reflect the diversity of our electronics drawer.  Namely, the knockdown price DHT22 temperature / humidity sensor (although not as knocked down as the DHT11).  And a temperature / pressure sensor that is made by the German company that also makes lawnmowers and dishwashers (or possibly even a lawn mowing dishwasher) - the BMP280.

What do they all look like when they are chaotically hooked up to the ESP32 using a breadboard and Dupont cables?  As it transpires, a little like this:


We won't dwell for too long on our interactions with the sensors and displays, as we've used (and documented) them all before.

In short, we will be using:
  • DHT22 for temperature and humidity readings, obtained using MicroPython's built-in DHT driver
  • BMP280 for pressure readings, taken using a BMP280 I2C driver available on GitHub
  • SSD1306-based 128×64 OLED to display stuff.  Oh yes, there is a MicroPython driver for this too.
  • Development board's unspectacular LEDs.  Oh c'mon.  We know how to turn GPIO outputs high and low, right? ...Don't we?
Let's start by saying ¡olé! to the OLED.  The pre-mounted 128×64 display can be driven using the standard SSD1306 display library which is available for MicroPython.  We have used these before in Taking a Peak, albeit on a Pi. As we'd expect, the MicroPython driver is more austere, but is perfectly usable to print out a few lines of "hello world, hello world, hel (oops ran out of characters already)".

The pinout diagram of the development board informs us that the following pins are used internally to connect the ESP32 to the SSD1306.  Useful information, since we will be using I2C to drive the screen.
  • I2C SCL = 15
  • I2C SDA = 4
  • Reset = 16
...Armed with this information, and the library, we can make a crude attempt at displaying an uninteresting message on-screen.

Here we go.

from machine import I2C, Pin
# OLED reset line (GPIO 16) must be held high for the display to run
reset = Pin(16, Pin.OUT, value=1)
# I2C pins used internally by the board: SCL = 15, SDA = 4
scl_pin = Pin(15, Pin.IN, Pin.PULL_UP)
sda_pin = Pin(4, Pin.IN, Pin.PULL_UP)
i2c = I2C(scl=scl_pin, sda=sda_pin, freq=400000)
i2c.scan()  # should return the SSD1306's address
import ssd1306
oled = ssd1306.SSD1306_I2C(128, 64, i2c)
oled.text("     NOT      ", 0, 0)
oled.text("    A VERY    ", 0, 16)
oled.text("   EXCITING   ", 0, 32)
oled.text("    SCREEN    ", 0, 48)
oled.show()


Ladies and gentlemen, this is not a very exciting screen.


Sensors next, then.

It gets even easier with the DHT22, because there is a ready-made MicroPython driver for it.  Here, we're using GPIO pin 13.

from machine import Pin
import dht
# DHT22 data line connected to GPIO 13
d = dht.DHT22(Pin(13, Pin.IN, Pin.PULL_UP))
d.measure()        # trigger a reading
d.temperature()    # degrees Celsius
d.humidity()       # % relative humidity


Erm, ok.  Was that it?  Not quite.

We tinkered with a BMP280 only last episode - in Quantitative Wheezing - but here's a little refresher.

from machine import I2C, Pin
# BMP280 wired to a second I2C bus: SCL = 2, SDA = 17
i2c = I2C(scl=Pin(2), sda=Pin(17), freq=400000)
i2c.scan()  # should return the BMP280's address
import bmp280
b = bmp280.BMP280(i2c)
b.get()
b.getTemp()    # temperature (we'll discard this one)
b.getPress()   # pressure


We'll discard the temperature reading from the BMP280, since we'll use the one from the DHT22 instead.  The objective here isn't to collect a gazillion datapoints.  Quite the opposite, in fact.  The very nature of LoRa means that we want to send as little as possible.

Lastly - before we move on to LoRa - let's get in on the blinky, blinky action.  How do we know this particular LED is on pin 25?  Well, because it says so in the schematics diagram.

led = Pin(25, Pin.OUT, value=1)
led.off()
led.on()
led.off()
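If a single toggle is too subtle, a quick throwaway loop of our own makes the blinking a touch more dramatic:

import time

# blink the on-board LED (GPIO 25) a few times
for _ in range(5):
    led.on()
    time.sleep(0.5)
    led.off()
    time.sleep(0.5)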


Yawn. It's not very exciting so far, is it?  What does our OLED say?


OK, well that's eminently fixable.  Here is our refreshed screen, now that we have sensor readings we can display on it.
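For the record, the refresh amounts to something like the sketch below, reusing the oled, d (DHT22) and b (BMP280) objects from earlier, and assuming getPress() returns pascals:

d.measure()
temperature = int(d.temperature())
humidity = int(d.humidity())
pressure_kpa = int(b.getPress() // 1000)   # Pa -> kPa (assumption)

oled.fill(0)                               # wipe the not-very-exciting screen
oled.text("TEMP: %d C" % temperature, 0, 0)
oled.text("HUM:  %d %%" % humidity, 0, 16)
oled.text("PRES: %d kPa" % pressure_kpa, 0, 32)
oled.show()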


That's the easy bits done.  But that's not what we came here to do, right?

First and foremost, our particular LoRa transceiver chipset stuck to our development board - the SX1276 - has an unashamedly SPI interface.  And it has a whole set of readable and writable registers that we need to interact with, either through a SPI "read" or "write" operation.

Therefore, it's a good time to test whether we can actually talk to the SX1276 using MicroPython and its standard SPI library.

Incidentally, the SX1276 datasheet is a "gripping read, a literary masterpiece, by far the best work by the author, 10/10 - The Times".  No really, it should be read, not least since it explains all the registers in detail, and what they are used for.  It's also a good idea to scan it in conjunction with the LoRaWAN Specification document, since the latter explains what shapes and sizes the packets need to be in order for them to work across a LoRaWAN.

Right.  No, write.  No, actually: read.  Let's attempt to read from register location 0x42, since this is the location of the hardware version information.  Our friend tells us that this will have a default value of 0x12 (18).


Here's an important fact.  The SPI protocol has a Chip Select "wire" that requires the line to be brought LOW before any communication, and returned HIGH once done.  We are also told by the datasheet that the first byte comprises a Most Significant Bit (MSB) of 0 for a read, 1 for a write, followed by 7 bits identifying the register address.

Then, if it is a write operation we carry on sending the payload byte on the MOSI line.  If it is a read, we monitor what comes back on the MISO line.  It also tells us that phase and polarity are 0.  We appear to have all the pieces of information we need to achieve this test.


Again, we refer to the development board pinout diagram to identify the physical pins used between the ESP32 and the LoRa transceiver.


This is what we have garnered:
  • SPI Chip Select = 18
  • SPI Clock = 5
  • SPI MOSI = 27
  • SPI MISO = 19
  • Reset = 14
  • DIO0 (IRQ) = 26
We won't worry too much about DIO0 and Reset for now.  DIO0 is used as a customisable interrupt pin (to signal that our packet has finished transmitting).  Reset, as far as we can gather, needs to be held high for the chip to work.

import machine
_REG_VERSION = 0x42
_BUFFER = bytearray(1)
buf = _BUFFER
# chip select (GPIO 18) idles high; reset (GPIO 14) held high for normal operation
_cs = machine.Pin(18, machine.Pin.OUT, value=1)
_rst = machine.Pin(14, machine.Pin.OUT, value=1)
_device = machine.SPI(baudrate=4000000, polarity=0, phase=0, bits=8, firstbit=0, sck=machine.Pin(5), mosi=machine.Pin(27), miso=machine.Pin(19))
address = _REG_VERSION
# MSB set to 0 for SPI read
# 0x7F = 01111111 used as a mask using &
_BUFFER[0] = address & 0x7F
_cs.off()                       # bring chip select LOW to start the transaction
_device.write(_BUFFER[0:1])     # send the register address on MOSI
_device.readinto(buf)           # clock the register value back in on MISO
_cs.on()                        # release chip select
print(_BUFFER[0])               # expecting 18 (0x12)


There it is.  The magic number.  The age at which we can drink Hobgoblin ale in the UK.  Get married without parental interference.  Watch ANY Van Damme film.  The periodic table group number which houses a gas that makes our voice sound like that of Jar Jar Binks if inhaled.
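Reads, done.  Writes, for the record, are the mirror image: set the MSB of the address byte to 1, then clock the value out on MOSI.  A sketch of our own (not lifted from any library) would look like this, reusing _cs and _device from above:

def write_register(address, value):
    # MSB set to 1 (address | 0x80) marks the transfer as a write
    _cs.off()
    _device.write(bytes([(address | 0x80) & 0xFF, value & 0xFF]))
    _cs.on()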

With that, our fellow stormtroopers unluckily posted on Death Star, we transition to our next phase of this blog post: LoRa-Wan Kenobi: A New Hope.

Sorry folks, this is where it gets slightly tricky.  Because we need to achieve two things:
  • Interact with the SX1276 using the SPI protocol in a precise set of steps, addressing specific registers, with the aim of configuring it and transitioning it through different operating modes.  Eventually, we write the packet we want to send out to the device's FIFO buffer, then ask it to be transmitted.
  • Make sure that the packet is in a format that is compliant with the LoRaWAN specification, so that it can be picked up by the nearest gateway and routed through The Things Network.  Besides our actual payload (sensor readings), which ends up being encrypted, there are preambles and headers that need to be added.
Ideally, there would have been a MicroPython driver for this already written by someone, somewhere.  Eventually, we found one by Adafruit called TinyLoRa, but it is coded to run on CircuitPython and its supporting libraries.  So we decided to fork it and adapt it to run on standalone MicroPython.

Our uLoRa repository is here.

We won't explain the code in detail here as this is better documented in the GitHub repo, but in essence, we instantiate the uLoRa class and send data using its send_data() method.
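The pin constants fed into the constructor below simply mirror the pinout we garnered earlier; the TTN-specific values (TTN_CONFIG, LORA_DATARATE, FPORT) come from the TTN console and the repo's own documentation, so we won't reproduce them here.  As a hedged sketch of the pin assignments:

# Pin assignments lifted from the HelTec pinout diagram above
LORA_CS = 18
LORA_SCK = 5
LORA_MOSI = 27
LORA_MISO = 19
LORA_IRQ = 26
LORA_RST = 14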

lora = uLoRa(
    cs=LORA_CS,
    sck=LORA_SCK,
    mosi=LORA_MOSI,
    miso=LORA_MISO,
    irq=LORA_IRQ,
    rst=LORA_RST,
    ttn_config=TTN_CONFIG,
    datarate=LORA_DATARATE,
    fport=FPORT
)
lora.send_data(data, len(data), lora.frame_counter)

Remember our sensor readings?  With LoRa, we need to be efficient about our payloads.  Let's see if we can distill our 3 datapoints down into 3 discrete bytes.

Ideally, for a reading to fit nicely into a byte, we need the values to be in the range of 0-255 (2^8 - 1).

Sensor Reading | Expected Range | Treatment
DHT22 - temperature | -40 to 80 °C | If we want this to be a single unsigned byte (0-255), the negative values are a problem.  We will therefore simply apply an offset of +128, so the range becomes 88 to 208.  We can subtract the offset again later at The Things Network.
DHT22 - humidity | 0 to 100 % | Easy peasy.  This fits naturally into 1 byte, without any further work.
BMP280 - pressure | 30000 to 110000 Pa | Do we care about this level of detail?  Not for our purposes.  We'll divide the figure by 1000 to make it kPa: 30 to 110.  That sounds good for a byte of data.
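Putting those treatments into practice looks something like this (a sketch of our own, reusing the d, b and lora objects from earlier and assuming getPress() returns pascals):

import ustruct

d.measure()
data = ustruct.pack(
    "BBB",
    int(d.temperature()) + 128,    # offset so -40..80 becomes 88..208
    int(d.humidity()),             # already fits into a byte
    int(b.getPress() // 1000),     # Pa -> kPa, roughly 30..110
)
lora.send_data(data, len(data), lora.frame_counter)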

And when we assemble our payload, that is exactly what we see: 3 bytes, one per sensor reading.  Not a lot, eh?  The payload is encrypted, which is why we can't spot the values in the byte array of the eventual packet.  Moreover, the packet carries other metadata - preambles, headers and a message integrity code - all additional bytes required by the LoRaWAN Specification and added by the library.


But don't trust the console.  Trust this OLED screen which now shows us that 3 bytes of data have been sent.


Sure, we feel silly for missing the "k" from the display above.  At 101 pascals, well and truly below the Armstrong limit of 6.3 kPa, whether we are successfully dispatching LoRa packets or not would be the least of our concerns.  Think: ending of Total Recall.  Yes, we meant 101 kPa.

Right.  It was always our intention to disseminate our LoRa packets out into the wild and, if there are any willing TTN gateways in range, for them to be picked up and forwarded through the network.

It's time to dabble in the TTN console.

We first create an application.  Let's call it "rosietheredrobot", since we're not feeling particularly imaginative.


Then we create a device - "rosie-esp32".


We need to modify the device to use the ABP Activation Method.  Note also that the Frame Counter Width is reduced to 16 bits, and Frame Counter Checks are disabled.


Remember we need some of these device details when we instantiate our class?  We can obtain them from here.


So the LoRa packets we dispatch are hopefully going to find their way to a nearby gateway. 

There's one more thing we ought to do while we're still in the console.  Now that we have created an application in TTN, we can add a "decoder" to decipher the bytes we send.  Not only does it map the individual bytes in our payload back into meaningful attributes, it can also perform some manipulation (for example, negating the offset we applied earlier to the temperature reading).

If we test using 3 made-up bytes here, we can see that the output looks suspiciously like JSON data that can easily be forwarded on to other networks.
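The decoder itself is a few lines of JavaScript typed into the TTN console, but the logic is simple enough to express in Python for illustration: undo the +128 temperature offset, pass humidity straight through, and treat the pressure byte as kPa.  Something along these lines:

def decode_payload(payload):
    # payload is the raw 3-byte uplink: [temperature + 128, humidity, pressure in kPa]
    return {
        "temperature": payload[0] - 128,
        "humidity": payload[1],
        "pressure": payload[2],
    }

decode_payload(bytes([149, 52, 101]))
# {'temperature': 21, 'humidity': 52, 'pressure': 101}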


We're almost there.  And we're never too far away from a Cloud or two.

There is an official guide that we need to follow to integrate TTN with AWS IoT Core, but it basically boils down to deploying ready-made Elastic Beanstalk stacks using CloudFormation.  These handle all message forwarding between the two networks, as well as providing handy synchronisation of any registered devices.


After all this, we end up with two Stacks (and also an EC2 instance).


The first sign of the integration working is when the synchronisation of our devices takes place.  Here it is: rosie-esp32.  The device we created in TTN has now appeared in AWS IoT Core.  It has a Type of LORAWAN.  And if we look at its attributes, we'll see the device details that were configured in TTN.


Can we now see payloads appearing in AWS IoT Core on topic rosietheredrobot/devices/rosie-esp32/up? We can test this by simulating uplink data in TTN, and observing if messages appear in AWS IoT Core.

It's sure looking that way!


With legible data now arriving in AWS IoT Core, we can see how these nicely formatted messages can be forwarded on to other AWS services such as DynamoDB, IoT Analytics, or Lambda, just as if the payload had arrived directly in AWS IoT Core through the usual MQTT channel.
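As a flavour of what that forwarding could look like, here's a hedged boto3 sketch (the rule name and Lambda ARN are made up) that would route the TTN uplink topic to a Lambda function:

import boto3

iot = boto3.client("iot")
iot.create_topic_rule(
    ruleName="forward_rosie_uplinks",   # hypothetical rule name
    topicRulePayload={
        "sql": "SELECT * FROM 'rosietheredrobot/devices/+/up'",
        "actions": [{
            "lambda": {
                # placeholder ARN - substitute your own function
                "functionArn": "arn:aws:lambda:eu-west-1:123456789012:function:handle-uplink",
            },
        }],
        "ruleDisabled": False,
    },
)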

Wait for it... it's the grand reveal.

We simply have to roam the post-apocalyptic landscape that some call the M4 motorway, in search of LoRa gateways connected to TTN that will accept our free-spirited packets.  This can be a challenge, since our titchy development board appears to have an inadequately sized antenna, and the range of LoRa naturally deteriorates in built-up areas.

Let's go spread our LoRa packets* like we're in search of last remnants of civilisation.


*Please do not actually attach DIY electronics to your dashboard while driving... although the explanation dispensed to the police officer who pulls you over is going to be rather amusing (yes, you are likely to be breaking the law by driving with a breadboard attached to your dashboard).

Since we have our AWS IoT Core still forwarding payloads to AWS IoT Analytics through a Rule and a Channel, we get an indication of how successfully we are dispatching LoRa packets to the nearest available TTN gateway by looking at our Channel.  There are several incoming messages shown, which could be deemed progress.

Creating a Data Set in AWS IoT Analytics allows us to interrogate the Datastore, and inspect the content of some of these LoRa payloads.  In short, we have successfully sent data from our intrepid ESP32 development board to the nearest TTN gateway using LoRa, and the payloads are appearing in AWS IoT Core / Analytics.


By inspecting the metadata, we can obtain further information about the operation and the gateway that picked up our message.


We will deem this experiment to be a resounding success.

...So another episode has come and gone.  Punters remain perplexed as to how much longer this gruelling saga can carry on for.  But enough about the latest reincarnation of the Star Wars franchise.  Today, we demonstrated that there is a use case for leveraging LoRa on Planet Tatooine (use of blockchain optional).  And all that is required is a visit to a galaxy with a bazaar stocking ESP32+LoRa development boards, and a planet inhabited by good citizens willing to lend you the use of their The Things Network LoRaWAN gateways.

We've gone the distance. So long, readers.  See you all at the Tannhäuser Gate for the next episode in which we will be setting attack ships on fire with some misconfigured C-beams.

May the docs be with you

HelTec Automation's ESP32 LoRa development board is documented in their GitHub repository.  Circuit diagrams, pinouts and more are housed here:
...And the LoRaWAN infrastructure we are connecting our device to is The Things Network:
The LoRa transceiver being used here is a Semtech SX1276.
The latest LoRaWAN specification is here:
ESP32's SPI interface is used to communicate with the SX1276.  MicroPython documentation for the interface can be found here:
SSD1306 driver for interacting with the OLED is available on GitHub:
The Things Network to AWS IoT Core integration is detailed in this document.

Comments

  1. Hey Dude,

    really nice article! It's the first time I could display anything on my ESP32 with MicroPython. Do you have any idea how I can communicate with only 2 single LoRa devices? I don't want to connect to TTN, I just want to send 1 or 0 to another device - unencrypted. Is it possible to share the whole code of your project on GitHub, so that I can maybe modify it for my project? Thanks a lot!


