  • Bat bike on two wheels

    Futuristic bike concept with series potential

  • EDAG SOULMATE

    When the car suddenly becomes the better smartphone

  • Success that gives you wings

    How a female software developer at EDAG sets new standards

  • THE STUFF THAT CHANGE IS MADE OF

    EDAG LIGHT COCOON – POSSIBLE PARADIGM SHIFT FOR FUTURE MOBILITY

Multimedia interactive project table EDAG INNTUO

EDAG Engineering Czech

Are you still looking for a really usable touch table? It is astonishing that, despite the advanced touch sensor technologies used in mobile devices and large screens, horizontal touch screens are rarely seen anywhere other than in info kiosks, museums or restaurants.

Apparently, one of the reasons for this scarcity is the ergonomics of using these large devices. While it is fairly easy to touch-control smaller devices such as tablets or smartphones, the horizontal positioning of interactive tables is much more demanding on body posture and movement. One way of improving the controls is to place one or more physical objects on the surface, detect their positions, and incorporate them into the user interface.

One such device has recently been developed in EDAG's EE department. Intended as a means of showcasing active car zones, it features a unique combination of a physical object laid on the table and a graphical user interface on the touch screen just beneath it. Using this interface, the user can easily switch the car's various sensors and assistance systems on and off and then see their position, angle and range on both the screen and the physical object. The electronics inside the car model communicate wirelessly with the table software and control the LEDs representing each sensor in the car's systems.
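To give an idea of what the link between the table software and the car model might look like, here is a minimal sketch assuming a simple JSON-over-UDP protocol; the address and message format are illustrative only, not the actual implementation.

```python
import json
import socket

# Hypothetical address of the car model's wireless controller.
# The real protocol and message format are not described here.
CAR_MODEL_ADDR = ("192.168.4.1", 5005)

def set_sensor_state(sensor_id: str, enabled: bool, sock: socket.socket) -> None:
    """Tell the car model to switch the LEDs of one sensor on or off."""
    message = {"sensor": sensor_id, "enabled": enabled}
    sock.sendto(json.dumps(message).encode("utf-8"), CAR_MODEL_ADDR)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    # User taps the "front radar" icon on the touch screen:
    # light the matching LEDs on the physical model.
    set_sensor_state("front_radar", True, sock)
    # Tapping the icon again switches the sensor (and its LEDs) off.
    set_sensor_state("front_radar", False, sock)
```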

However, a much more sophisticated solution is possible. Our new concept, EDAG INNTUO, incorporates a large 4K screen with a touch surface capable of detecting dozens of touch points simultaneously. A powerful PC runs the graphical interface, which can even be built on a contemporary 3D engine such as Unity 3D. The user interface is based on a custom SDK that detects not only touch inputs, but also special physical markers. The position and orientation of these markers can be incorporated into the user interface, and the markers can also be attached to various objects. Using a 3D printer, custom objects can easily be manufactured, extending the controls into a third dimension.
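To illustrate one way such marker detection can work, the sketch below assumes each tangible object carries three conductive feet whose mutual spacing forms a unique signature among the raw touch points; it is plain Python with made-up dimensions, not the actual SDK.

```python
import math
from itertools import combinations
from typing import Optional, Tuple

Point = Tuple[float, float]

# Hypothetical marker: three conductive feet whose sorted pairwise
# distances (in millimetres) form a unique signature for this marker.
MARKER_SIGNATURE = (40.0, 50.0, 65.0)
TOLERANCE_MM = 3.0

def _distances(a: Point, b: Point, c: Point):
    def d(p: Point, q: Point) -> float:
        return math.hypot(p[0] - q[0], p[1] - q[1])
    return tuple(sorted((d(a, b), d(b, c), d(c, a))))

def find_marker(touches: list) -> Optional[Tuple[Point, float]]:
    """Search all triples of touch points for the marker's signature.

    Returns (centre, orientation in degrees) or None if not present."""
    for a, b, c in combinations(touches, 3):
        dists = _distances(a, b, c)
        if all(abs(x - y) <= TOLERANCE_MM for x, y in zip(dists, MARKER_SIGNATURE)):
            centre = ((a[0] + b[0] + c[0]) / 3.0, (a[1] + b[1] + c[1]) / 3.0)
            # Orientation taken from the longest edge of the foot triangle.
            edges = [(a, b), (b, c), (c, a)]
            p, q = max(edges, key=lambda e: math.hypot(e[0][0] - e[1][0],
                                                       e[0][1] - e[1][1]))
            angle = math.degrees(math.atan2(q[1] - p[1], q[0] - p[0]))
            return centre, angle
    return None

# Example: one ordinary finger touch plus a marker standing on the table.
touches = [(120.0, 300.0), (500.0, 500.0), (540.0, 500.0), (498.4, 550.0)]
print(find_marker(touches))
```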

And even more interesting features are in the pipeline. Using augmented reality hardware in combination with our device can enhance the experience even further, bringing the interface and content into the whole volume of space above the table. Compared with other AR solutions, the work space here is distributed across all dimensions: the 2D surface of the touch screen, the physical objects, and the whole volume above the table, with almost no limits. Furthermore, an observer without an AR headset can still see the main screen and keep up with the others, at least in 2D.
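One ingredient of such an AR extension is a mapping from 2D coordinates on the table surface into the headset's 3D world space. The following minimal sketch assumes the table's pose has already been calibrated as an origin plus two in-plane axis vectors; all names and values are illustrative.

```python
import numpy as np

# Assumed calibration of the table in the AR headset's world frame:
# an origin (table corner) and two unit vectors spanning the surface.
TABLE_ORIGIN = np.array([0.0, 0.75, 0.0])   # metres, hypothetical values
TABLE_X_AXIS = np.array([1.0, 0.0, 0.0])    # along the screen's width
TABLE_Y_AXIS = np.array([0.0, 0.0, 1.0])    # along the screen's depth
TABLE_NORMAL = np.cross(TABLE_Y_AXIS, TABLE_X_AXIS)  # points up, away from the surface

def table_to_world(x_m: float, y_m: float, height_m: float = 0.0) -> np.ndarray:
    """Lift a 2D point on the table (metres from its corner) into 3D world
    space, optionally raised above the surface for floating AR content."""
    return (TABLE_ORIGIN
            + x_m * TABLE_X_AXIS
            + y_m * TABLE_Y_AXIS
            + height_m * TABLE_NORMAL)

# A car model detected at (0.6 m, 0.4 m) on the screen gets an AR label
# hovering 0.25 m above it.
print(table_to_world(0.6, 0.4, 0.25))
```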

We believe we can finally deliver a device that can be used not only for impressive presentations, but ultimately for everyday work on creative projects, without limits.