NeARtracker @AWE 2017

“This is the most original idea I’ve seen in this hall” – Prof. Thad Starner, Google Glass Project
“De puta madre!” (“Freaking awesome!”) – some guys from Mexico
“Hey, you actually did something I’ve also envisioned, but didn’t think was possible!”

These are some of the first-hand reactions we received at our #AWE2017 demo stand in the startup area. It was exciting and rewarding to see people’s faces light up as we presented the technology to them. We came away with a lot of input, ideas and challenges to digest and follow up on, and we’re confident we’re now significantly closer to bringing neARtracker to real-world projects.

Below are a few pictures:


Awakening paper to life: neARtracker for Vuforia + Unity

We’re excited to announce that we have reached an important milestone in bringing near-surface Augmented Reality to smart devices: the neARtracker sensor and software have been integrated with one of the industry-leading AR platforms, Vuforia + Unity, to create a proof-of-concept AR application that uses printed paper as an immersive digital medium.

Combining Vuforia + Unity’s efficient authoring workflow and 3D performance, as well as high-quality plant models from our partners at Laubwerk, with our neARtracker technology enables the creation of compelling AR experiences directly on printed paper. The neARtracker PaperTrack app showcases the augmentation of a printed house and garden plan. We will demo this, and more, at AWE Europe in Munich, 19-20 October 2017.

Highlights:

  • Works with the standard Vuforia + Unity editor / distribution
  • Currently available on Android devices, iOS to follow
  • A commercial version of the sensor is in advanced development

Unlike conventional AR, near-surface AR does not constrain the viewing device to a minimum distance away from the tracked object(s). Rather, the device can move freely directly on a surface, enabling an immersive “magic lens” type of experience that is fully integrated with the printed content and removes the awkward “focus-on-a-single-hand-held-small-screen” style of interaction. The clever use of device sensor data allows the experience to partially transcend the 2D surface and permits the user to also navigate in 3D space.
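To make the idea concrete, here is a minimal sketch of how a 2D on-surface pose could be lifted into a 3D camera pose by folding in device sensor data. All names and inputs here are hypothetical illustrations, not the actual neARtracker or Vuforia API: we assume the surface tracker reports position and heading on the paper, and the IMU supplies a tilt angle and an estimated lift off the surface.

```python
import math

def compose_pose(x_mm, y_mm, heading_deg, tilt_deg, lift_mm):
    """Combine a 2D surface-tracker pose with IMU data into a 3D camera pose.

    x_mm, y_mm      -- position on the printed sheet (hypothetical tracker output)
    heading_deg     -- rotation about the surface normal
    tilt_deg        -- device pitch from the IMU (0 = lying flat on the paper)
    lift_mm         -- estimated height above the surface (hypothetical estimate)

    Returns (position, forward) where position is the camera location in the
    sheet's coordinate frame and forward is a unit view-direction vector.
    """
    heading = math.radians(heading_deg)
    tilt = math.radians(tilt_deg)
    # Forward direction: rotate by heading in the surface plane,
    # then pitch up out of the plane by the IMU tilt.
    forward = (
        math.cos(heading) * math.cos(tilt),
        math.sin(heading) * math.cos(tilt),
        math.sin(tilt),
    )
    position = (x_mm, y_mm, lift_mm)
    return position, forward
```

With tilt and lift at zero the camera glides flat on the paper (the pure “magic lens” case); as the user tilts or raises the device, the same pose stream smoothly extends into 3D space above the printed plan.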