Bluetooth Low Energy (BLE) locating using ML

My build log starts with a short story, so grab a drink and nestle back in your chair for a moment…

In the beginning

You see, before I joined balena I wasn’t enjoying my work life at all; I was surviving it. Then, one day I went to use Etcher and noticed it needed updating. The site showed me that “the Etcher people” did other things besides Etcher. Two hours of reading documentation and blog posts later, I was clicking the link to apply for a job!

My second interview was due to be a technical chat with @chrisys, and I received an email telling me what to expect. That email had a paragraph at the bottom telling me I should be prepared to talk about a project I was proud of, which I later found out came from a template that Chris didn’t even know was there. Too late: I had mentally left my current job, set my heart on joining balena, and the email said I needed a project I was proud of.

I didn’t have a project I was proud of.

A project was born

The interview was in ten days. So all I had to do was think of an idea, build the idea, write up the idea and rock up at the interview ready to show it to Chris. :thinking:

Somewhere in the back of my mind I remembered reading something about museums using Bluetooth Low Energy (BLE) tags dotted about their buildings, sending out advertisement beacons, and an app using them to triangulate its indoor position. The app could then show information about the exhibit the user was nearest. However, because museums tend to be large and have lots of flat surfaces (e.g. walls, display cases), the beacon signals would bounce around a lot and confuse the app, so they had to use machine learning to improve its accuracy. And I remember, when I read that, wondering if you could turn the whole design on its head: attach a BLE tag to something (e.g. a parcel) sending out beacons, and dot the receivers about a building instead. Could you then use machine learning to work out the position of the BLE tag?

I had ten days to work out if three Raspberry Pis dotted around my house could triangulate the position of a BLE tag. :worried: So this happened:

The results

“How did it go, Phil?” I hear all three readers cry. Well, it went well enough that Chris recommended I get the job! :slight_smile:
Little did he know, it failed to actually work until the night before my interview. :+1:

Here’s the repo: phil-d-wilson/balenaLocating-ML: PoC to show indoor BLE triangulation using Raspberry Pi sensors, IoT Hub and a KNN classifier model (github.com)

The new project

However, that was 2019 Phil who made that. 2019 Phil really liked putting code into “the cloud”. And 2019 Phil didn’t mind using expensive cloud services just to stream a few BLE beacon messages past some code and work out where a beacon might be in his house. The beacon he was holding. :\

2021 Phil doesn’t want to use cloud services. 2021 Phil wants to remake this project, but do all of the processing on the edge. And he also wants to improve it. He wants heatmaps. He wants X, Y and Z locating (i.e. upstairs!). And he wants alerting when movement anomalies happen.

Buckle up dear readers, we’re going on a (build) journey together. :slight_smile:


What a good story!
It will be a great trip!

Problem

Firstly, let me state the problem I am trying to solve here. I want to locate a device (phone, tag, BLE watch) in a building, based on triangulating its beacon signals. To keep things simple, let’s call them all tags, and picture that every now and then they send out an “I’M HERE!” message, like this:

As you can see (because I stole the image from my blog post), a listening device can receive those beacon signals, so it knows that it is within range of the tag and how strongly it received the signal. Roughly speaking, the stronger the signal, the closer the tag.
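If you want to see what a sensor actually receives, here’s a minimal sketch of listening for BLE advertisements and reading their signal strength (RSSI) using the Python `bleak` library. The tag’s MAC address is a made-up placeholder:

```python
import asyncio
from bleak import BleakScanner

TAG_ADDRESS = "AA:BB:CC:DD:EE:FF"  # hypothetical MAC address of the tag we care about

def on_advertisement(device, advertisement_data):
    if device.address == TAG_ADDRESS:
        # RSSI is in dBm: closer to 0 means a stronger (roughly nearer) signal
        print(f"Heard {device.address} at {advertisement_data.rssi} dBm")

async def main():
    # Listen for advertisements for 30 seconds
    async with BleakScanner(on_advertisement):
        await asyncio.sleep(30)

asyncio.run(main())
```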

Now add more listening devices (sensors) which all have the potential to receive beacons from a tag:

Now, if all those sensors hear the tag… where is the tag? Is it simply nearest the sensor that hears the beacon with the strongest signal? Well, it could be… but there are reasons why it might not be: as in the museum story, signals bounce off flat surfaces and get weakened by walls and furniture. There are also situations where a beacon is heard equally strongly by several sensors. And finally, you don’t want to have to add a sensor to every room (imagine trying to locate equipment in an office block, or luggage in a hotel) just to locate something accurately. Hence the need for machine learning! :slight_smile:

Machine Learning

To train a model, you start off by feeding it things you know to be true. So in this case we can take a BLE tag into each room and tell the model which room we are in. It can then record which sensors can hear that tag, and at what signal strength, and capture that as a fingerprint. You carry on doing that, training the model on what the fingerprint looks like for each of the rooms.
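As a sketch of what one of those fingerprints might look like in code: a fixed-order vector of RSSI readings, with a very weak placeholder value for any sensor that didn’t hear the tag. The sensor names and dBm values here are made up:

```python
SENSORS = ["kitchen-pi", "lounge-pi", "landing-pi"]  # hypothetical sensor names
MISSING = -100  # dBm placeholder for a sensor that didn't hear the beacon

def make_fingerprint(readings: dict) -> list:
    """Turn {sensor: rssi} readings into a fixed-order feature vector."""
    return [readings.get(sensor, MISSING) for sensor in SENSORS]

# One labelled training sample: the tag was in the kitchen at the time
sample = make_fingerprint({"kitchen-pi": -48, "lounge-pi": -71})
label = "kitchen"
```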

Once the model is trained, you can pose it questions: if tag1 is being heard by these sensors, at these signal strengths, which room is it in? And the model (in this simple design) finds the nearest matching fingerprint, using a K-Nearest-Neighbours (KNN) classifier.
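To make that concrete, here’s a minimal sketch of training and querying a KNN classifier with scikit-learn. All the fingerprints and labels below are made-up examples, not my real training data:

```python
from sklearn.neighbors import KNeighborsClassifier

# RSSI fingerprints, in the fixed order [kitchen-pi, lounge-pi, landing-pi]
X_train = [
    [-48, -71, -100],  # tag was in the kitchen
    [-80, -45, -90],   # tag was in the lounge
    [-95, -85, -50],   # tag was on the landing
]
y_train = ["kitchen", "lounge", "landing"]

model = KNeighborsClassifier(n_neighbors=1)
model.fit(X_train, y_train)

# Question: given these fresh readings, which room is the nearest match?
print(model.predict([[-52, -68, -98]]))  # -> ['kitchen']
```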

Design

I’m only just starting to think how I’m going to design this second go at solving the problem. As I put in the first post, the original solution involved sending data to the cloud (Azure) and paying for cloud services to do the processing. I want to do it all on the edge this time, which changes the approach completely.

Some more (self-imposed) constraints:

  • any software used should be open source
  • the solution should be as low-cost as possible
  • the solution should be anti-fragile and low maintenance (more on this another time)

Early thoughts

Well, I’m really keen on using Dapr at the moment, and my pint-sized buddy @rahul-thakoor very kindly made me a slimmed-down container for it: phil-d-wilson/daprd (github.com). So that’s going into the pot. :slight_smile:

My current plan is to use the Dapr sidecar to put any BLE signal readings a device hears into an actor model, creating virtual actors for sensors and tags, and maybe rooms… not sure yet. Then have a service which can message a tag actor and ask which sensors can hear it and at what strength, then pass the replies to the KNN ML model to find out where the tag is. Something along those lines. Here’s some early scribbles:
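To make the actor idea a little more concrete, here’s a very rough sketch using the Dapr Python SDK: one virtual actor per tag, holding the latest RSSI each sensor reported. All the names (TagActor, RecordReading, GetReadings) are my own placeholders, not anything Dapr defines, and the real design may end up quite different:

```python
from dapr.actor import Actor, ActorInterface, actormethod

class TagActorInterface(ActorInterface):
    @actormethod(name="RecordReading")
    async def record_reading(self, reading: dict) -> None:
        ...

    @actormethod(name="GetReadings")
    async def get_readings(self) -> dict:
        ...

class TagActor(Actor, TagActorInterface):
    """One virtual actor per BLE tag, keyed by the tag's ID."""

    async def record_reading(self, reading: dict) -> None:
        # reading is e.g. {"sensor": "kitchen-pi", "rssi": -48}
        found, readings = await self._state_manager.try_get_state("readings")
        readings = readings if found else {}
        readings[reading["sensor"]] = reading["rssi"]
        await self._state_manager.set_state("readings", readings)
        await self._state_manager.save_state()

    async def get_readings(self) -> dict:
        # The locating service calls this, then feeds the result to the KNN model
        found, readings = await self._state_manager.try_get_state("readings")
        return readings if found else {}
```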


Hi Phil!!

Your idea is very interesting! I’ve been reading about the Dapr sidecar, and I’m waiting for build log number 5!!

Regards

JCRC

Hi Phil!!
It would be interesting to know what has happened with this project. If you can share something about the progress or any difficulties you have encountered, I am sure it would be very good to read.

Regards

Hi there @jcramirez404 -

I’ve not worked on this project for a while, but not because I’ve abandoned it…

I am working on a new balenaBlock which also uses dapr.io. My hope is that I’ll gain more experience with dapr, and possibly make a dapr block at the same time. Then I can use all those new smarts in this BLE project. :slight_smile:

Phil

Ok Phil

Thanks for your answer and good luck!!

JCRC