Normalizing ranges across devices

Are there any tools to help normalize beacon range measurements across devices?

Just to explain - my mental model here is that each device has an Rx gain which scales the reported RSSI measurements. This differs by phone model (and probably a bit by individual phone).
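To make that mental model concrete: with the standard log-distance path-loss formula (this is the generic textbook formula, not any particular SDK's implementation), a fixed per-device offset in reported RSSI translates directly into a multiplicative error on the estimated range. The `measured_power` and `n` values below are illustrative assumptions.

```python
# Illustrative log-distance path-loss model (generic formula, not any
# specific SDK's implementation). measured_power_dbm is the expected
# RSSI at 1 m; n is the path-loss exponent (~2 in free space).
def estimated_distance_m(rssi_dbm, measured_power_dbm=-59, n=2.0):
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * n))

# A device whose Rx chain reads 6 dB low looks roughly 2x farther away:
true_rssi = -65              # what a "reference" device might see at ~2 m
biased_rssi = true_rssi - 6  # hypothetical per-model Rx gain offset
print(estimated_distance_m(true_rssi))    # ~2.0 m
print(estimated_distance_m(biased_rssi))  # ~4.0 m
```

So a constant dB offset per phone model scales all of that model's range estimates by a constant factor, which matches the "scaling factor" observation below.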

e.g. If we take a lot of phones and walk around near a beacon, we find that some types of phones consistently report being further from the beacon than others, i.e. there is a scaling factor applied to their ranges. For example, the Nexus 5X tends to report low ranges, Sony phones higher ranges, etc. iPhones tend to be much more consistent.

What I’d like to know is - has anyone calibrated a lot of phone models to try to eliminate this effect? e.g. If there was a database of scale factors where you could look up a phone model and apply a correction to the range measures to get to a more device-independent range.
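The kind of database I have in mind would be something like the sketch below. To be clear, the table and its values are entirely made up for illustration; I'm not aware of such a public database existing, which is exactly why I'm asking.

```python
# Hypothetical per-model Rx gain offsets in dB (values invented for
# illustration only; no real database is implied).
RX_GAIN_OFFSET_DB = {
    "Nexus 5X": +4.0,       # reads "hot" -> reports low ranges
    "Sony Xperia Z5": -3.0, # reads "cold" -> reports high ranges
}

def normalized_rssi(rssi_dbm, model):
    # Subtract the model's offset; unknown models get no correction.
    return rssi_dbm - RX_GAIN_OFFSET_DB.get(model, 0.0)
```

You'd then feed the normalized RSSI into whatever distance estimate you already use, to get a more device-independent range.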

(I’m assuming here that the main variation is between phone models, and that individual phones of the same model behave similarly, though I haven’t tested this.)

Hope that makes sense. Is there anything out there to help with this?

I know that AltBeacon’s open-source Android Beacon Library laid some groundwork for this, but it seems that so far, they only have data for 3 handsets.

I guess the catch is that, although it isn’t rocket science, it requires a lot of time (to do all the measurements for individual devices) and money (to buy/rent the devices — and engineers’ time is money too) … and I personally don’t believe there’s that much of a return on this investment. The vast majority of beacon apps I’m seeing only require a rough “what beacons are in range,” and sometimes “which beacon is the closest” (which is relative RSSI, and thus doesn’t require normalization other than for differences in the beacons’ Tx power).
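The "which beacon is the closest" case can be sketched as below. The point is that a per-device Rx gain offset shifts every reading by the same amount, so it cancels out of the comparison; only the beacons' Tx power differences need normalizing. The tuple format is just an assumption for the sketch.

```python
# Nearest beacon by relative RSSI. A per-device Rx gain offset shifts
# all readings equally, so it cancels out of the comparison; we only
# normalize for each beacon's advertised Tx power.
def nearest_beacon(readings):
    # readings: list of (beacon_id, rssi_dbm, tx_power_dbm) tuples.
    # rssi - tx_power is (negative) path loss; the maximum value
    # corresponds to the lowest path loss, i.e. the nearest beacon.
    return max(readings, key=lambda r: r[1] - r[2])[0]

# "B" is transmitting 8 dB hotter than "A" but is still nearer here:
print(nearest_beacon([("A", -70, -4), ("B", -60, 4)]))
```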

What would you want to do with more precise cross-device distance estimations?

OK, thanks, that’s very helpful.

For our application the absolute range does matter - e.g. We’d like to trigger one behaviour when the user is ~10m from the beacon, and another when the user is ~1m from the beacon.

But it’s proving very hard to trigger the ~1m event reliably across devices. I understand it’s a hard problem and beacons aren’t the best at absolute ranging.

We don’t really need precise absolute range; we just need an accurate, cross-device way to map readings into absolute range buckets, e.g. Far, Immediate, etc.

Hmm, got it, makes perfect sense actually!

I wonder if just grouping the RSSI into some rough zones would be enough. For example, on the edge of the beacon’s discoverability, the RSSI is going to be roughly in the -90 dBm range, no matter what device. Whereas near the beacon, it’ll track more closely toward its measured power.
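A minimal sketch of that zoning idea, with made-up thresholds that would need tuning per deployment (and ideally smoothing over several readings rather than reacting to single samples):

```python
# Rough RSSI zones, per the suggestion above. Thresholds are
# illustrative only and would need per-deployment tuning; in practice
# you'd also smooth RSSI over a window before classifying.
def zone(rssi_dbm):
    if rssi_dbm >= -55:
        return "immediate"  # close to the beacon's measured power
    if rssi_dbm >= -75:
        return "near"
    if rssi_dbm >= -90:
        return "far"
    return "unknown"        # at the edge of discoverability
```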

Another naive option would be to use two beacons, one with higher Tx power, one with lower. (Hint hint: we’ll be announcing something that’ll enable you to do something similar … only better … soon! Keep an eye on our blog and/or social media.)
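The two-beacon idea could look something like this sketch (beacon IDs are hypothetical). The appeal is that mere *discoverability* of the low-power beacon implies proximity, which is less sensitive to per-device RSSI calibration than an absolute threshold, though it still depends somewhat on each phone's Rx sensitivity.

```python
# Sketch of the two-beacon trick: a low-Tx-power beacon is only
# discoverable within a short radius, so simply hearing it implies
# proximity, mostly independent of absolute RSSI calibration.
def proximity(heard_beacon_ids, near_id="near-beacon", far_id="far-beacon"):
    # heard_beacon_ids: set of beacon IDs seen in a recent scan window.
    if near_id in heard_beacon_ids:
        return "immediate"   # low-power beacon audible -> very close
    if far_id in heard_beacon_ids:
        return "in range"    # only the high-power beacon audible
    return "out of range"
```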

> I wonder if just grouping the RSSI into some rough zones would be enough. For example, on the edge of the beacon’s discoverability, the RSSI is going to be roughly in the -90 dBm range, no matter what device. Whereas near the beacon, it’ll track more closely toward its measured power.

Yes, this is basically what we’ve been doing. It works to an extent (e.g. very near vs. very far), but it’s still a challenge to distinguish between, say, 5-10m and 1-2m in a cross-device way.

> Another naive option would be to use two beacons, one with higher Tx power, one with lower. (Hint hint: we’ll be announcing something that’ll enable you to do something similar … only better … soon! Keep an eye on our blog and/or social media.)

Ah, interesting, thanks!