What’s inside?

One of the many use cases for near-surface Augmented Reality is taking a really close look at what’s inside something without taking it apart. Conventional AR technology cannot “zoom in” this far because it relies on inside-out visual tracking of the device’s position and orientation. That works well both indoors and outdoors, but breaks down when the camera gets too close to plain, featureless surfaces: tracking is lost and the magic is gone.
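
To make that failure mode concrete, here is a minimal, purely illustrative snippet (not part of neARtracker) using OpenCV’s ORB detector: a feature-based tracker has plenty of keypoints to lock onto in a textured scene, but almost none on a plain surface filling the frame. The synthetic test images are stand-ins.

```python
# Illustration: inside-out tracking needs visual features, and a plain
# surface seen up close offers almost none.
import cv2
import numpy as np

orb = cv2.ORB_create(nfeatures=500)

# Synthetic stand-ins: a "textured scene" (noise) vs. a plain close-up.
textured = np.random.randint(0, 256, (480, 640), dtype=np.uint8)
plain = np.full((480, 640), 200, dtype=np.uint8)

for name, frame in (("textured scene", textured), ("plain close-up", plain)):
    keypoints = orb.detect(frame, None)
    # Few keypoints -> pose estimation degrades and tracking is lost.
    print(f"{name}: {len(keypoints)} keypoints")
```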

NeARtracker is all about bringing the lost AR magic back: placing innovative AR content formats directly on printable, physical flat surfaces, including features like annotations and an e-commerce “paper interface”. Here’s how it turns a smartphone into a “magic lens” showing the inner life of a MacBook Pro – something we envision being used for training, industrial marketing, or simply understanding complex things.

This “paper app” was sketched out in less than ten minutes using the powerful Vuforia + Unity combo and, of course, our neARtracker sensor. Enjoy!
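
The demo itself lives in Unity with Vuforia, but the core “magic lens” idea fits in a few lines of Python. This is a hedged sketch, assuming a tracker that reports the phone’s position (in mm) and in-plane rotation on the printed sheet – the names, numbers and interface below are illustrative, not the actual neARtracker API: the lens simply shows the patch of a pre-rendered “internals” image lying under the device.

```python
import numpy as np

SHEET_MM = (297.0, 210.0)   # printed sheet, A4 landscape (width, height)
LENS_MM = (120.0, 68.0)     # physical area covered by the phone screen

def lens_region(x_mm, y_mm, theta_deg, internals, sheet_mm=SHEET_MM):
    """Return the pixel window of `internals` currently under the lens."""
    h, w = internals.shape[:2]
    px_per_mm_x, px_per_mm_y = w / sheet_mm[0], h / sheet_mm[1]
    cx, cy = x_mm * px_per_mm_x, y_mm * px_per_mm_y
    half_w = LENS_MM[0] * px_per_mm_x / 2
    half_h = LENS_MM[1] * px_per_mm_y / 2
    x0, x1 = int(max(cx - half_w, 0)), int(min(cx + half_w, w))
    y0, y1 = int(max(cy - half_h, 0)), int(min(cy + half_h, h))
    window = internals[y0:y1, x0:x1]
    # Counter-rotate so the overlay stays registered with the sheet; a real
    # renderer would apply a full homography, this just keeps the idea visible.
    return np.rot90(window, k=round(theta_deg / 90.0) % 4)

# Usage: a placeholder 1480x2100 px rendering of the "internals" layer.
internals = np.zeros((1480, 2100, 3), dtype=np.uint8)
view = lens_region(x_mm=150.0, y_mm=100.0, theta_deg=0.0, internals=internals)
print(view.shape)   # pixel window to show on the phone display
```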

The quest for pervasive displays in times of wearables @PerDis2018

Munich, June 6-8: the Seventh ACM International Symposium on Pervasive Displays (PerDis) just took place. It is a small but critically important conference, brought to life by a handful of “true believer” types – this year by Prof. Albrecht Schmidt and his team at Ludwig Maximilian University (LMU) in Munich.


Why do I say “critically important”? Because it fosters research that counterbalances what can be regarded as the current technological “local optimum” in AR/VR/XR, namely near-field display technology. Most mainstream VR/AR currently focuses on technology that works close to the human eye (near-field): VR headsets like Oculus, AR see-through displays (HoloLens, Meta), and, prospectively, retinal micro-projection and BCIs (Brain-Computer Interfaces). All these technologies are useful and have important applications as we speak. They have also been euphorically claimed to be the pinnacle of what can, and should, be achieved in bridging the digital to the physical, with some people in the field prophesying a “display-less world” within decades (meaning that only near-eye displays would exist).

However, technologies bridging the digital to the physical “out there” in the physical world – far-field 2D and 3D displays – have several uniquely desirable properties: (1) they are intrinsically shared and social (although more research is needed to develop meaningful interactions in shared display environments, as pointed out in the keynote by Prof. Nigel Davis at PerDis 2018); (2) they don’t require any awkward human augmentation with wearable devices; and (3) they allow a probably healthy degree of control over the “over-virtualization” of the physical world – the unforeseeable negative effects that might arise from replacing physical reality with a 100% controlled digital environment in which everything happens “at will”.

Thus, far-field displays must have their own place in our digital development. I would argue that the main reason for the industry’s focus on wearable, near-field display technology today is that it is significantly easier (albeit by no means trivial!) to implement “immersively” than truly pervasive displays (“anything is a display”), 3D holograms, and immersive display environments. That’s why I think of wearable displays as a “local optimum”: given sufficient advances in far-field technology, it would probably be preferable to near-field for the reasons given above.

In the long term, then, we can expect a shift towards far-field displays, and one key problem for research to solve is meaningful interaction. As we demoed at PerDis 2018, neARtracker technology that turns smartphones into tangible interfaces is a powerful tool here, with the added opportunity of using the smartphone’s display as a “magic lens”. The video below exemplifies this setup in a cultural heritage application, the virtual reconstruction of a lost garden at Sanssouci, Potsdam; a small interaction sketch follows the video. The paper in which we discuss, among other things, where this technology could be headed, titled “An 1834 Mediterranean Garden in Berlin: Engaged from 2004, 2018, 2032, and 2202”, is also available in the Proceedings of PerDis 2018.

[Video: magic-lens exploration of the reconstructed Sanssouci garden]
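
As promised, here is a minimal sketch of what “phone as tangible” interaction can look like. Both the pose interface and the year-scrubbing gesture are illustrative assumptions, not a description of the actual demo or SDK: rotating the phone in place on the surface acts as a dial, e.g. scrubbing through reconstruction years of the garden.

```python
# Hedged sketch: the phone as a tangible "dial" on the surface.
# Assumes a stream of (x_mm, y_mm, theta_deg) poses from a tracker
# (hypothetical interface, not the neARtracker SDK).
from dataclasses import dataclass

@dataclass
class Pose:
    x_mm: float
    y_mm: float
    theta_deg: float

def dial_value(prev: Pose, curr: Pose, value: float,
               degrees_per_unit: float = 10.0,
               move_tolerance_mm: float = 5.0) -> float:
    """Map in-place rotation to a scalar value; ignore rotation while sliding."""
    moved = abs(curr.x_mm - prev.x_mm) + abs(curr.y_mm - prev.y_mm)
    if moved > move_tolerance_mm:
        return value                       # translation, not a dial gesture
    # Shortest signed angular difference, wrapped to (-180, 180].
    delta = (curr.theta_deg - prev.theta_deg + 180.0) % 360.0 - 180.0
    return value + delta / degrees_per_unit

# Usage: two successive poses, phone rotated 25 degrees in place.
year = 2004.0
year = dial_value(Pose(100, 80, 0), Pose(100, 80, 25), year)
print(round(year, 1))   # 2006.5
```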

NeARtracker @AWE 2017

“This is the most original idea I’ve seen in this hall” – Prof. Thad Starner, Google Glass Project
“De puta madre!” (roughly, “freaking awesome!”) – some guys from Mexico
“Hey, you actually did something I’ve also envisioned, but didn’t think was possible!”

These are some of the first-hand reactions we received at the #AWE2017 demo stand in the startup area. It was really exciting and rewarding to see people’s faces light up when we presented the technology. There’s a lot of input, ideas, and challenges to digest and follow up on, and we’re confident we’re now significantly closer to bringing neARtracker to real-world projects.

Below are a few pictures:


devEyes at TEI 2017 in Yokohama, Japan

Last week I got the chance to demo devEyes at TEI 2017 in Yokohama as part of the work-in-progress program. TEI stands for “Tangible, Embedded and Embodied Interaction”, a computing paradigm that has its roots mainly in Mark Weiser’s visionary paper from, well, 1991: “The Computer for the 21st Century”.

Now the TEI community is a seriously cool and creative bunch of people, starting with MIT Prof. Hiroshi Ishii, the man who coined the term “tangible”, and encompassing a lot of very cool ideas and demos, all related to integrating computing into our physical environment in a ubiquitous, seamless way – as opposed to all of us being sucked into virtual reality.

I’m quite glad to say that the demo went entirely incident-free (which is rare…), drew quite a few “aaahs” and “ooohs”, and generated really interesting discussions – you could really feel people getting excited about possible applications, with some healthy skepticism mixed in as well. Here are some pictures – not so many, as I was busy with my visitors: