I am doing a bachelor project where I want to differentiate between standing 0.5 m from a beacon, 1.0 m from it, and so on up to 5.0 m. I gathered RSSI readings at each distance for 10 minutes, but I got this: https://postimg.org/image/fkz9mpvtz/ As you can see, it was going well until 1.5 m, but then the 2.0 m readings overlapped with the 1.5 m and 1.0 m ones. Is there a logical explanation for that? (Maybe the transmission power changes automatically as the distance between the beacon and the device increases?) I did all the tests with the same beacon, with transmission power set to 4 dBm and the advertising interval set to 200 ms.
Thank you for your support.
Hi Omar,
What you’re observing is natural behavior. Bluetooth uses 2.4 GHz radio waves, which are susceptible to environmental effects like multipath propagation, absorption, blocking, and diffraction. As a result, calculating distance from raw RSSI readings is quite hard: even with identical settings and at an identical distance, you can get widely varying signal-strength results. Doing it right requires additional methods on top of the raw readings. That’s basically what our team of data scientists has been doing to develop the Estimote Indoor Location SDK: http://developer.estimote.com/indoor/
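To see why the distance bands overlap, here’s a minimal sketch (not the SDK’s actual algorithm) of the common log-distance path-loss model. The measured power at 1 m (−60 dBm) and the path-loss exponent (2.0, roughly free space) are assumed calibration values, not numbers from your setup. The key point: RSSI falls off logarithmically with distance, so a few dB of environmental noise at 2 m shifts the distance estimate by well over a meter, which is exactly the kind of overlap in your plot.

```python
import math

def estimate_distance(rssi, measured_power=-60.0, path_loss_exponent=2.0):
    """Estimate distance (m) from RSSI using the log-distance path-loss model.

    rssi:               received signal strength in dBm
    measured_power:     expected RSSI at 1 m (assumed calibration constant)
    path_loss_exponent: ~2.0 in free space, higher indoors
    """
    return 10 ** ((measured_power - rssi) / (10 * path_loss_exponent))

# At the calibration point, the model returns exactly 1 m:
print(estimate_distance(-60.0))  # 1.0

# A mere 3 dB of fading noise moves the estimate by more than 1.5 m:
print(estimate_distance(-72.0))  # ≈ 3.98 m
print(estimate_distance(-75.0))  # ≈ 5.62 m
```

Because the model is exponential in RSSI, the same ±3 dB noise that barely matters at 0.5 m smears the 1.5 m, 2.0 m, and beyond readings into one another, so averaging over a window and fusing with other signals is usually necessary.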
And if you want to read more about the physics behind beacons, there’s a post on our blog: http://blog.estimote.com/post/106913675010/how-do-beacons-work-the-physics-of-beacon-tech
Cheers.