I've just been testing your Android SDK and comparing the readings to an iPhone app. The accuracy/distance values coming from the Android SDK are very different from what I'm seeing on iOS, and they fluctuate quite a lot. I placed a beacon around 4-5 meters away from the devices. iOS showed the expected distance, with mid-to-high 4 m readings. The Android app switched constantly between values as low as 0.2 m and as high as only 1.7 m.
Would I be right in thinking that the differences are due to Estimote having to effectively reverse engineer Apple's accuracy calculations? If so, are there any improvements to this calculation planned in the future?
Sorry for the late reply. The iPhone readings are more accurate, and for the foreseeable future they'll remain so. That's because the iOS Core Location framework already has some noise reduction implemented. That's not the case in the Android Bluetooth stack, so RSSI fluctuations are heavier and the readings less accurate.
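If the raw Android RSSI jumps around, you can do some smoothing on your side. Here's a minimal sketch: a moving-average filter over the RSSI samples, plus the standard log-distance path-loss estimate. To be clear, this is not the Estimote SDK's actual internal calculation; the window size and path-loss exponent `n` are my own assumptions that you'd need to tune for your environment.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Sketch of client-side RSSI smoothing plus a log-distance estimate.
// The window size and path-loss exponent are assumptions, not values
// taken from the Estimote SDK.
public class RssiSmoother {
    private final Deque<Integer> window = new ArrayDeque<>();
    private final int windowSize;

    public RssiSmoother(int windowSize) {
        this.windowSize = windowSize;
    }

    // Feed one raw RSSI sample; returns the moving average so far.
    public double addSample(int rssi) {
        window.addLast(rssi);
        if (window.size() > windowSize) {
            window.removeFirst();
        }
        double sum = 0;
        for (int r : window) {
            sum += r;
        }
        return sum / window.size();
    }

    // Log-distance path-loss model: txPower is the calibrated RSSI at
    // 1 m, n is an environment-dependent exponent (~2.0 in free space,
    // higher indoors). Returns an estimated distance in meters.
    public static double estimateDistance(double rssi, int txPower, double n) {
        return Math.pow(10.0, (txPower - rssi) / (10.0 * n));
    }
}
```

You'd feed each scan callback's RSSI into `addSample()` and only compute the distance from the smoothed value, which takes a lot of the jitter out at the cost of slower response when you actually move.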
Thanks for the question. We've found that the values vary from device to device, which may be due to the location of the Bluetooth antenna on each phone.
Going forward, it's preferable to design real-world deployments around zones rather than an estimated distance.
On zones, please refer to the following article: http://community.estimote.com/hc/en-us/articles/201029223-RSSI-Range-Zones-Trilateration-and-Distance-Accuracy
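In app code, zone-based logic can be as simple as bucketing the estimated distance into coarse ranges, in the spirit of the immediate/near/far zones the article describes. A minimal sketch follows; the threshold values here are my own assumptions for illustration, not Estimote's official cutoffs:

```java
// Sketch: map an estimated distance (meters) to a coarse proximity
// zone. The 0.5 m and 3.0 m thresholds are illustrative assumptions.
public class ProximityZones {
    public enum Zone { IMMEDIATE, NEAR, FAR, UNKNOWN }

    public static Zone zoneFor(double distanceMeters) {
        if (distanceMeters < 0) {
            return Zone.UNKNOWN;       // invalid or missing reading
        }
        if (distanceMeters < 0.5) {
            return Zone.IMMEDIATE;     // right next to the beacon
        }
        if (distanceMeters < 3.0) {
            return Zone.NEAR;          // same area as the beacon
        }
        return Zone.FAR;               // in range, but not close
    }
}
```

Because a zone only changes when the reading crosses a threshold, small RSSI fluctuations mostly stay within one bucket, which is exactly why zones are more robust than raw distance.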
I recently tested the Estimote demo app on an iPhone 5C and a Sony Xperia (Android 4.3) and experienced the reverse. I placed a beacon 10-20 cm from both devices:
- iPhone - the app accurately placed it in the IMMEDIATE zone
- Sony - the app placed it in the NEAR zone
So there are some fairly large differences, especially if you are trying to trigger app behavior based on zones. How do we accommodate these differences between iOS and Android, e.g. when IMMEDIATE on iOS reads as NEAR on Android?
Does anyone have other comparisons between Android and iOS?