What’s inside?

One of the many use cases for near-surface Augmented Reality is taking a really close look at what’s inside something without taking it apart. Conventional AR technology cannot zoom in far enough for this kind of application because it relies on inside-out tracking of the device position and orientation: the camera must see enough visual features of the environment to localize itself. This works well both indoors and outdoors, but breaks down when the device gets too close to plain surfaces: too few features remain in view, tracking is lost, and the magic is gone.

NeARtracker is all about bringing the lost AR magic back and making it possible to place innovative AR content formats directly on printable physical flat surfaces, including features like annotations and an e-commerce “paper interface”. Here’s how it turns a smartphone into a “magic lens” showing the inner life of a MacBook Pro – something we envision being used for training, industrial marketing, or simply understanding complex things.

This “paper app” was sketched out in less than ten minutes using the powerful Vuforia + Unity combo and, of course, our neARtracker sensor. Enjoy!
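
For the curious, here is a minimal sketch of the core “magic lens” logic, written as illustrative Python rather than the actual Unity/C# project (all names, units, and the 50 mm virtual camera height are assumptions made for this example): the phone’s tracked 2D pose on the laptop’s surface drives a virtual camera over a registered 3D model of the internals, so the screen always shows what sits directly underneath the device.

```python
from dataclasses import dataclass

@dataclass
class SurfacePose:
    """Pose reported by the tracking sensor for a phone lying on the surface."""
    x_mm: float       # position on the printed surface, in millimetres
    y_mm: float
    theta_deg: float  # rotation of the device around the surface normal

def lens_camera_from_pose(pose: SurfacePose, height_mm: float = 50.0) -> dict:
    """Place a virtual top-down camera over the registered 3D model at the
    phone's tracked position; rendering that view on the phone's screen
    makes the device behave like a window into the model."""
    return {
        "position": (pose.x_mm, pose.y_mm, height_mm),
        "look_at": (pose.x_mm, pose.y_mm, 0.0),
        "roll_deg": pose.theta_deg,  # keep the view aligned with the device
    }

# Example: the sensor reports the phone 120 mm right and 80 mm up from the
# surface origin, rotated 15 degrees.
print(lens_camera_from_pose(SurfacePose(120.0, 80.0, 15.0)))
```

In the Unity version, the same mapping would simply update the scene camera’s transform every frame from the sensor’s pose stream.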


The quest for pervasive displays in times of wearables @PerDis2018

Munich, June 6-8: the Seventh ACM International Symposium on Pervasive Displays (PerDis) just took place. It is a small but critically important conference, brought to life by a handful of “true believer” type people, this year Prof. Albrecht Schmidt and his team at the Ludwig Maximilian University (LMU) in Munich.


Why do I say “critically important”? Because it fosters research to counterbalance what can be regarded as the current technological “local optimum” in AR/VR/XR, namely near-field display technology. Most mainstream VR/AR currently focuses on technology that works close to the human eye (near-field): VR headsets like Oculus, AR see-through displays (HoloLens, Meta) and, prospectively, retina micro-projection and BCI (Brain-Computer Interfaces). All these technologies are useful and have important applications as we speak. They have also been euphorically claimed to be the pinnacle of what can, and should, be achieved in terms of bridging the digital to the physical, with some people in the field prophesying a “display-less world” in a matter of decades (meaning that only near-eye displays would exist).

However, technologies bridging the digital to the physical “out there” in the physical world – far-field 2D and 3D displays – have several unique desirable properties: (1) they are intrinsically shared and social (although more research is needed to develop meaningful interactions in shared display environments, as pointed out in the keynote by Prof. Nigel Davies at PerDis 2018); (2) they don’t require any awkward augmentation of the human body with wearable devices; and (3) they allow a probably healthy degree of control over the “over-virtualization” of the physical world – the unforeseeable negative effects that might arise from replacing physical reality with a 100% controlled digital environment in which everything happens “at will”.

Thus, they must have their own place in our digital development. I would argue that the main reason for the industry’s focus on wearable, near-field display technology today is that it is significantly easier (albeit by no means trivial!) to implement “immersively” than truly pervasive displays (“anything is a display”), 3D holograms, and immersive display environments. That’s why I think of wearable displays as a “local optimum”: given sufficient advances in far-field technology, it would probably be preferable to near-field for the reasons given above.

In the long term, then, we can expect to see a shift towards far-field displays, and one key problem for research to solve is meaningful interaction. As we demoed at PerDis 2018, neARtracker technology, which turns smartphones into tangible interfaces, is a powerful tool here, with the additional opportunity of using the smartphone’s display as a “magic lens”. The video below exemplifies this setup in a cultural heritage application: the virtual reconstruction of a lost garden in Sanssouci, Potsdam. The paper in which we discuss, among other things, where the technology could be heading, titled “An 1834 Mediterranean Garden in Berlin: Engaged from 2004, 2018, 2032, and 2202”, is also available in the Proceedings of PerDis 2018.


Unfold your phone – the tangible AR way

At neARtracker.com, we share the vision of using the physical world as a big pervasive digital display and fusing it with the power of smart mobile devices. It is a bold, mid-to-long-term bet, but we think the currently dominating trend of wearable near-eye displays is going to reverse at some point, once technology allows for truly ubiquitous pervasive displays “out there”.

It is along these lines of thought that we develop applications showcasing our vision of combining the ease of use of a smartphone with the immersiveness of spatial AR and pervasive displays – our tracking sensor makes this possible already. You can think of these applications as regular smartphone apps, but with the key additional ability to “unfold” your smartphone onto a larger, more immersive physical display.

The first application is a collaborative photo-sharing app illustrating the key concept of “unfolding” photo content onto a shared physical display and using the smartphone to manipulate, present, and share pictures.

The second application is a tangible Air Hockey AR game that uses a smartphone for real-time interaction on a projected surface. Like the previous app, it combines near-surface AR on the smartphone’s display with spatial AR on the larger display.
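
To give a flavour of the interaction loop, here is a toy sketch in Python (not the actual game code; the function names, units, and sizes are assumptions for illustration): each frame, the tracked position of the phone acts as the hockey paddle, and the puck is advanced and reflected off it.

```python
import math

def step_puck(puck_pos, puck_vel, paddle_pos, dt=1.0 / 60,
              paddle_radius=40.0, puck_radius=15.0):
    """Advance the puck by one frame (units in mm on the projected table)
    and bounce it off the phone-paddle reported by the tracking sensor."""
    x = puck_pos[0] + puck_vel[0] * dt
    y = puck_pos[1] + puck_vel[1] * dt
    dx, dy = x - paddle_pos[0], y - paddle_pos[1]
    dist = math.hypot(dx, dy)
    if 0.0 < dist < paddle_radius + puck_radius:
        nx, ny = dx / dist, dy / dist            # contact normal
        dot = puck_vel[0] * nx + puck_vel[1] * ny
        if dot < 0:                              # only if moving toward the paddle
            # elastic reflection of the velocity about the contact normal
            puck_vel = (puck_vel[0] - 2 * dot * nx,
                        puck_vel[1] - 2 * dot * ny)
    return (x, y), puck_vel

# Example frame: puck at (100, 50) moving left, paddle (the phone) at (60, 50).
print(step_puck((100.0, 50.0), (-600.0, 0.0), (60.0, 50.0)))
```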

Enjoy!


What is near-surface Augmented Reality and why should I care?

This is understandably one of the questions we received most frequently during AWE 2017, so I believe it deserves a blog post.

Some context first: Augmented Reality, or AR, is the process of overlaying something digital on top of the real world. Exactly in which way and where the digital/physical interface should be located is not fixed a priori. That being said, most current mainstream AR is near-eye, placing the digital/physical interface more on the user’s side, either on a smartphone screen or on a head-mounted display/smart glasses. The opposite approach is, obviously, to place the augmenting digital information more on the world side, directly on surfaces in the real world. One example is projection-based AR like Lampix. Another one is neARtracker, which is smartphone-based. AR that happens directly on or very close to real-world objects is what I call near-surface AR, to differentiate it from the first, near-eye category.


As is often the case in engineering, there are pros and cons to the different AR paradigms described above. A full comparison is beyond the scope of this post (check [1] and [2] for more details).

Near-eye AR in a nutshell:

  1. Conventional smartphone-based AR is readily available and takes advantage of existing hardware. However, it forces the user to actively hold and move a device with a small screen, which is tiresome, looks awkward, keeps your hands busy, and also potentially raises privacy concerns, which is probably why smartphone-based AR is not more prevalent.
  2. Head-mounted display-based AR offers a much more immersive, hands-free experience and has seen a huge technological push lately, especially in industrial applications. However, AR displays and glasses are still heavy, expensive, and regarded by many users as intrusive and lacking naturalness.

Near-surface AR essentially frees the user’s hands and eyes from holding or wearing any devices. The “reality” part in “augmented reality” is seen directly with the naked eye.

  • Projection-based AR (also called spatial AR) uses either a fixed or a mobile projector, plus optionally a camera system to allow user interaction/input on the projected surface – typically a desk.
  • Near-surface smartphone-based AR is what we envision with neARtracker: smartphones placed directly onto arbitrary real-world surfaces. It blends together a few key features:
    • it is essentially hands-free (the phone rests on the surface), while still allowing precise touch-screen interaction
    • it takes full advantage of existing, widespread hardware (smartphones) as well as existing AR SDKs and frameworks
    • unlike conventional smartphone-based AR, which tracks specific markers, images, or objects, it enables “magic lens” usage of smartphones across surfaces of arbitrary size via a grid of almost-invisible markers (see the sketch after this list)
    • it is compatible with – but does not require – projection-based spatial augmentation (only for use cases where projection makes sense, like interactive games)
    • unlike projection-based AR, it is compatible with printed content (as long as the tracking grid is still partially visible): it can turn a paper sheet into a UI
    • it turns smartphones into tangible digital avatars, effectively bridging AR and tangible user interface technology, which seeks to use physical objects to control digital environments – check our game example.
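
The sketch below illustrates the marker-grid idea in minimal Python. It is our assumed simplification, not the actual neARtracker decoding: each marker in the printed grid encodes its own (row, col) index, so the absolute surface position follows from the grid pitch plus the device’s offset from the detected marker. The 20 mm pitch and all names are hypothetical.

```python
# Assumed mechanics, for illustration only (the real decoding is not public):
# any single decoded marker localizes the device, so the surface can be
# arbitrarily large without tracking one global image target.

GRID_PITCH_MM = 20.0  # hypothetical spacing between marker centres

def surface_position(row: int, col: int,
                     offset_x_mm: float, offset_y_mm: float):
    """Absolute device position on the surface from one decoded marker.

    row, col                -- grid indices encoded in the detected marker
    offset_x_mm, offset_y_mm -- device position relative to that marker's
                                centre, estimated from where the marker
                                appears in the camera image
    """
    return (col * GRID_PITCH_MM + offset_x_mm,
            row * GRID_PITCH_MM + offset_y_mm)

# Example: marker (row=4, col=7) detected 3.5 mm right of and 1.2 mm below
# the device centre's projection onto the grid.
print(surface_position(4, 7, 3.5, -1.2))  # -> (143.5, 78.8)
```

Because any single visible marker is enough to localize the device, printed content may cover most of the sheet: as noted above, tracking survives as long as some part of the grid remains visible.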

To conclude, it should be noted that none of the different AR flavours is good for every use case, and near-surface AR is no exception. It makes most sense for scenarios in which the interaction naturally happens on a surface (print + digital magic lens, virtual desktop, mixed reality games, smart tables for collaboration, exhibitions, education, etc.). In other scenarios, for example those requiring objects to be displayed in mid-air, head-mounted displays are the way to go.