Interaction Interfaces for Mixed Reality

In my article titled The Future of Mixed Reality is Web-Based, I spoke about the very attainable goal of developing over 90% of our Mixed Reality (MR) software using new and upcoming platform-agnostic, web-based technologies. The remaining application code would need to be platform specific for access to hardware such as vision and position sensors, although this ‘layer’ can and should remain small. The target platforms of a web-based MR solution would be any modern mobile device, such as smartphones and tablets, as well as dedicated MR headset hardware that also supports newer web application technologies. While mobile devices now provide plenty of computational power and the basic set of sensors required for MR, the user experience and interaction interfaces remain severely limited.
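
To make the shape of that split concrete, here is a minimal TypeScript sketch (written for this post, not taken from the original article) that uses the standard WebXR Device API to obtain the viewer's pose each frame. Everything called from that point on — scene logic, UI, networking — can remain ordinary, platform-agnostic web code. The updateScene function is a hypothetical placeholder for that application layer, and the sketch assumes WebXR type definitions (e.g. @types/webxr) are available.

```typescript
// A minimal platform-facing layer: WebXR session setup plus a per-frame
// pose query. Everything it calls can stay platform agnostic.
async function startImmersiveAr(canvas: HTMLCanvasElement): Promise<void> {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported("immersive-ar"))) {
    console.warn("Immersive AR sessions are not supported here.");
    return;
  }

  const gl = canvas.getContext("webgl")!;
  await gl.makeXRCompatible(); // mark the WebGL context as usable by the XR device

  const session = await navigator.xr.requestSession("immersive-ar");
  session.updateRenderState({ baseLayer: new XRWebGLLayer(session, gl) });
  const refSpace = await session.requestReferenceSpace("local");

  const onFrame = (_time: DOMHighResTimeStamp, frame: XRFrame): void => {
    const pose = frame.getViewerPose(refSpace);
    if (pose) {
      // Hand the device pose to the web-based application layer.
      updateScene(pose.transform.position, pose.transform.orientation);
    }
    session.requestAnimationFrame(onFrame);
  };
  session.requestAnimationFrame(onFrame);
}

// Hypothetical placeholder for the platform-agnostic application code.
declare function updateScene(p: DOMPointReadOnly, q: DOMPointReadOnly): void;
```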

Visual Gesture Interaction Interfaces

A majority of the existing MR interface methods are based on visual hand recognition, and this presents a number of known and unavoidable problems:

The single rear-facing camera found on most current mobile hardware is also problematic. With no depth information it is difficult to calculate the distance of a user’s hand(s) within the visual range. Mathematical depth estimation can be applied to overcome this limitation, but at the expense of already limited processing resources, and with unreliably low accuracy.
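
As a purely illustrative example of what such estimation can look like (this is not the specific method the article refers to, and every constant below is an assumption), the pinhole camera model lets you infer a hand's distance from its apparent width in the image, given a guessed real-world hand width and the camera's focal length in pixels. The low accuracy mentioned above is evident here: hand sizes differ between users, and the width reported by a detector is noisy.

```typescript
// Rough monocular distance estimate from apparent size (pinhole model):
// distance = (realWidth * focalLengthPx) / apparentWidthPx
const ASSUMED_HAND_WIDTH_M = 0.085;   // average adult palm width, an assumption
const ASSUMED_FOCAL_LENGTH_PX = 1400; // depends on the device camera, an assumption

function estimateHandDistance(apparentWidthPx: number): number {
  if (apparentWidthPx <= 0) {
    throw new RangeError("apparentWidthPx must be positive");
  }
  return (ASSUMED_HAND_WIDTH_M * ASSUMED_FOCAL_LENGTH_PX) / apparentWidthPx;
}

// Example: a hand detector reports a 220 px wide bounding box,
// giving roughly (0.085 * 1400) / 220 ≈ 0.54 m from the camera.
console.log(estimateHandDistance(220).toFixed(2));
```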

Datagloves

Accurate, high-resolution tracking and a sense of touch are vitally important for a number of application domains, such as preoperative (surgery) planning and training, or fine telerobotic manipulator control for interacting with a remote environment and using small tools. A dataglove for hand-based interaction interfacing will be virtually lag free, with one-to-one, spatially accurate position tracking. Hands can be kept out of the user’s FOV for a more natural way of interfacing, with unobtrusive virtual hands providing an interaction frame of reference. Various aspects of hand motion can be controlled in software, most notably range-of-motion translation, which can make very small physical movements appear large (beneficial to mobility-impaired individuals) or large physical movements appear small (to enhance precision tasks), depending on the needs of the application domain or the end user. The same software methods can be used to correct for and smooth out small physical twitch movements, another benefit for individuals with impaired motor control (such as those with Parkinson’s or MS).
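
In software terms, the motion translation and twitch smoothing described above reduce to a gain factor applied to each position delta and a low-pass filter over the raw samples. The sketch below shows one simple way to express both, using an exponential moving average; the class name, parameters, and default values are illustrative assumptions, not part of any particular dataglove SDK.

```typescript
// Sketch of software-side motion translation for a dataglove position stream.
// scale > 1 makes small physical movements appear large; scale < 1 shrinks
// large movements for precision work. smoothing in (0, 1]: lower values filter
// small twitch movements more heavily. All values here are illustrative.
interface Vec3 { x: number; y: number; z: number; }

class HandMotionMapper {
  private filtered: Vec3 | null = null;
  private virtual: Vec3 = { x: 0, y: 0, z: 0 };

  constructor(private scale = 3.0, private smoothing = 0.2) {}

  // Feed one raw glove position sample; returns the virtual hand position.
  update(raw: Vec3): Vec3 {
    if (this.filtered === null) {
      this.filtered = { ...raw };
      return { ...this.virtual };
    }
    // Exponential moving average suppresses small tremors.
    const prev = this.filtered;
    const next: Vec3 = {
      x: prev.x + this.smoothing * (raw.x - prev.x),
      y: prev.y + this.smoothing * (raw.y - prev.y),
      z: prev.z + this.smoothing * (raw.z - prev.z),
    };
    // Scale the smoothed delta before applying it to the virtual hand.
    this.virtual = {
      x: this.virtual.x + this.scale * (next.x - prev.x),
      y: this.virtual.y + this.scale * (next.y - prev.y),
      z: this.virtual.z + this.scale * (next.z - prev.z),
    };
    this.filtered = next;
    return { ...this.virtual };
  }
}
```

A scale of 3.0 triples the apparent range of motion, while something like 0.3 would shrink large movements for precision tasks; heavier smoothing (a lower value) trades a little responsiveness for steadier virtual-hand output.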

MR Interfacing and the Dataglove
