I have an AR use case with a complex model that needs to be displayed at a fixed location spanning multiple rooms (complex ductwork to be projected on an unfinished construction site).
The big issue is that AR really likes ‘relative’ positioning, and this use case cannot use relative: 50% of the wall surfaces don’t even exist at the time of projection.
Therefore, absolute positioning is needed.
The closest thing to absolute in ARKit seems to be a GPS anchor (ARGeoAnchor).
My approach would be to use an Estimote beacon constellation as a GPS provider, then tell ARKit to use this provider.
My test case would be to project a 1’ ball 2’ above an assigned beacon. Moving the beacon would then move the projected ball.
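For what it’s worth, the rendering half of that test case is tiny once you have the beacon’s position in ARKit world coordinates (however you obtain it). A sketch with SceneKit, where `beaconWorldPosition` is a hypothetical input:

```swift
import ARKit
import SceneKit

// Hypothetical helper: render a 1 ft (0.3048 m) diameter ball 2 ft
// (0.6096 m) above the beacon's position in ARKit world space.
// ARKit's world +Y axis points up (opposite gravity).
func addTestBall(above beaconWorldPosition: SIMD3<Float>, in sceneView: ARSCNView) {
    let sphere = SCNSphere(radius: 0.3048 / 2)   // 1 ft diameter
    sphere.firstMaterial?.diffuse.contents = UIColor.systemRed
    let node = SCNNode(geometry: sphere)
    node.simdPosition = beaconWorldPosition + SIMD3<Float>(0, 0.6096, 0)  // 2 ft up
    sceneView.scene.rootNode.addChildNode(node)
}
```

The hard part of the test is keeping `beaconWorldPosition` up to date as the beacon moves, which is the positioning question below.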
Does this sound like the right approach, or am I missing something?
Thanks for your thoughts.
The best way would be to integrate ARSession with NISession using our UWB beacons and our SDK.
You will essentially get, in your ARSession, a 3D vector pointing toward the UWB beacon: iPhones have multiple UWB antennas and can compute orientation/angle in addition to time-of-flight distance.
See this example: https://twitter.com/gugaoliveira/status/1538614568615821312
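The Apple-side wiring for this looks roughly like the sketch below (iOS 16+ “camera assistance” path, which fuses UWB ranging with ARKit’s visual-inertial tracking). This is a sketch only — `accessoryConfigData` is the configuration blob your beacon vendor’s SDK hands you, and the Estimote SDK specifics may differ:

```swift
import ARKit
import NearbyInteraction

// Sketch: assumes an iOS 16+ device with a UWB chip and an accessory
// that supports Nearby Interaction.
final class BeaconTracker: NSObject, NISessionDelegate {
    let arSession: ARSession
    let niSession = NISession()

    init(arSession: ARSession, accessoryConfigData: Data) throws {
        self.arSession = arSession
        super.init()
        niSession.delegate = self

        // Share the ARSession so Nearby Interaction can fuse UWB ranging
        // with ARKit's visual tracking ("camera assistance", iOS 16+).
        niSession.setARSession(arSession)

        let config = try NINearbyAccessoryConfiguration(data: accessoryConfigData)
        config.isCameraAssistanceEnabled = true
        niSession.run(config)
    }

    // Called whenever the UWB ranging updates.
    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        guard let object = nearbyObjects.first,
              // With camera assistance, NI can express the beacon's pose
              // directly in ARKit world coordinates.
              let transform = session.worldTransform(for: object) else { return }

        // Re-anchor the model at the beacon's world pose; your renderer
        // attaches the content to this anchor.
        arSession.add(anchor: ARAnchor(name: "beacon", transform: transform))
    }
}
```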
Thank you Jim,
I saw the demo - looks promising.
For an AR model that would span multiple rooms, am I correct in assuming I would need several beacons per room, all grouped into a single constellation?
The model would be anchored to some key origin point of the constellation.
The iPhone user would then pick up an NISession vector to the nearest beacon as they move through the rooms, and the AR model projection would stay ‘fixed’ based on calculations between the constellation and the iPhone’s location.
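Thinking out loud, the bookkeeping I have in mind is something like this (names are mine; this naive version ignores rotation and assumes the constellation frame is axis-aligned with the ARKit world frame — a real version would solve for a full rigid transform from several beacons at once):

```swift
// Each beacon is surveyed once at install time, storing its offset from
// the constellation origin. Whenever we range to any beacon, we can
// recover where the origin (and hence the anchored model) sits in ARKit
// world space.
struct SurveyedBeacon {
    let id: String
    let offsetFromOrigin: (x: Float, y: Float, z: Float)  // constellation frame
}

func modelOriginInWorld(beacon: SurveyedBeacon,
                        measuredWorld: (x: Float, y: Float, z: Float))
        -> (x: Float, y: Float, z: Float) {
    (measuredWorld.x - beacon.offsetFromOrigin.x,
     measuredWorld.y - beacon.offsetFromOrigin.y,
     measuredWorld.z - beacon.offsetFromOrigin.z)
}
```

Does that match how your SDK expects the constellation to be set up?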
Thanks - C