Vuforia recently released a new way to track real objects using computer vision. Vuforia Advanced Model Targets 360 introduces instant recognition and tracking of physical objects, regardless of viewing position. You need a good 3D model of your real-world object to try this out, but the results are fantastic, better than any object tracking I have used up to this point. In addition to having a model (an untextured OBJ is fine), you must upload the target model to the Vuforia cloud, where it is trained by a deep-learning process.
The result is very fast recognition of an object on a mobile phone, or even a webcam. Additionally, the phone's internal accelerometer-driven SLAM capabilities are incorporated into the tracking model for excellent stability.
In my opinion, this capability surpasses the state of Metaio before it was bought by Apple, and also surpasses the current capabilities of VisionLib or any other vision-based tracking system I have seen.
The only difficulty I had is that in Unity the camera and tracked object can jump around in the scene, a lot, for no apparent reason: the pair can shift 10 degrees and wander all over the place in world space.
Because the distance and heading between the two remain the same, you don't notice the jumps unless you place other objects in the scene. Then you either have to make everything a child of the AR camera, or broadcast the camera's coordinates and heading to the rest of the scene. Even so, this new capability is definitely a game changer, and I expect to see much more Vuforia usage very soon!
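To see why the jumps are invisible as long as everything shares the camera's frame, here is a minimal sketch (my own illustration, not Vuforia or Unity code) in 2D: if the same rigid transform is applied to both the camera pose and the tracked-object pose, the distance and heading difference between them are unchanged.

```python
import math

def apply_rigid(pose, angle, tx, ty):
    """Apply a world-frame rigid transform (rotate, then translate) to a 2D pose (x, y, heading)."""
    x, y, heading = pose
    c, s = math.cos(angle), math.sin(angle)
    return (c * x - s * y + tx, s * x + c * y + ty, heading + angle)

def relative(cam, obj):
    """Distance and heading difference between two poses."""
    dist = math.hypot(obj[0] - cam[0], obj[1] - cam[1])
    return dist, obj[2] - cam[2]

camera = (0.0, 0.0, 0.0)
target = (1.0, 2.0, math.radians(30))

before = relative(camera, target)

# Simulate a tracking "jump": camera and target move together in world space.
jump = (math.radians(45), 3.0, -1.5)
after = relative(apply_rigid(camera, *jump), apply_rigid(target, *jump))

print(before, after)  # distance and heading difference are identical
```

This is exactly what parenting scene objects to the AR camera does in Unity: their transforms become relative to the camera, so a world-space jump of the whole camera/target pair no longer moves them relative to each other.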