iPad Pro 2020 LiDAR vs iPhone Camera, Simple AR Test

I quickly made a very simple test app that finds and highlights horizontal and vertical surfaces. Then I put that exact app, with no changes, on both my new iPad Pro 2020 and my iPhone XS. I chose the XS for comparison because I don't have the 11, and the XS at least has two cameras it can combine for AR sensing (I first tried my old 1st-gen iPad Pro with its single camera; it did far worse than either of these).

You can clearly see the difference the LiDAR makes! Camera-based depth sensing struggles with blank, featureless surfaces, especially at a distance (notice how quickly the carpet, with all its detail, got picked up).

With ARKit and a camera, as you move your device it captures multiple points of view and compares them to measure depth, similar to how your eyes work; that's why you're supposed to sweep your device back and forth. But if all it sees is a white wall, there are no differences to gauge depth from. LiDAR, on the other hand, sprays out a grid of infrared laser pulses and times how long each one takes to reflect back; using simple math and the speed of light, it figures out how far each pulse traveled before it returned.
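The two approaches boil down to two simple formulas, which can be sketched like this (a toy illustration only; the numbers and function names are my own, not how ARKit or the LiDAR hardware actually work internally):

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def lidar_distance(round_trip_seconds: float) -> float:
    """Time-of-flight: the pulse travels out and back, so halve the path."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Parallax: depth is inversely proportional to how far a feature
    shifts between two viewpoints. A featureless white wall offers
    nothing to match between views, so there is no disparity to measure."""
    if disparity_px <= 0:
        raise ValueError("no measurable disparity -- cannot estimate depth")
    return focal_length_px * baseline_m / disparity_px

# A laser pulse that returns after ~20 nanoseconds hit something ~3 m away:
print(round(lidar_distance(20e-9), 3))              # ~2.998 m

# A feature shifting 50 px between views ~6 cm apart (made-up camera numbers):
print(round(stereo_depth(1500.0, 0.06, 50.0), 2))   # 1.8 m
```

Note the asymmetry: the time-of-flight result depends only on the clock, while the parallax result collapses entirely when the scene has no texture, which is exactly what the white-wall test shows.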