Unreal Realm
Unreal Realm is a set of gesture-based interactions for cross-space object manipulation in augmented reality (AR). In light of the growing prevalence of commercial stereoscopic 3D displays, it explores 3D equivalents of the foundational touch gestures, as well as interfaces that span different spaces.
Concept
Background
Screen-based devices have become the dominant medium through which we interact with digital content. Despite the variety of device types and screen sizes, our experience rests on a handful of touch gestures (swiping, pinching, tapping) that act as the backbone of mobile interactions today. Yet even with the growing popularity of mixed-reality experiences, screen space, AR space, and physical space have remained relatively isolated.
Problem
What are the foundational 3D interactions of a mixed-reality future?
What are the ways through which the boundaries between spaces can be blurred?
XR interfaces should support a richer set of interactions with virtual objects, ones that are not only natural and intuitive in themselves, but that also enable seamless interaction between real and virtual objects.
Approach
This research aims to investigate how to interact with cross-space objects in XR, and how to seamlessly transition between hand gestures for conflict-free 3D digital content arrangement.
We define the three spaces as follows:
1. Screen Space - the 2D pixel coordinate system of a computer display
2. AR Space - a 3D virtual coordinate system overlaid onto a real-world camera feed
3. Physical Space - the real world with tangible materials and matter
Catalog of Gestural Interactions
1. Flip to Show the Menu
The menu is designed to be displayed on the right side of the left palm when the user's palm is facing toward the user. To press a button in the hand menu, the user presses the virtual button with the index finger of the dominant hand (the right hand).
<div style="padding-bottom: 56.25%; max-width: 100%; position: relative;"><iframe src="https://player.vimeo.com/video/806209767?background=1" width="800px" height="450px" style="position: absolute; top: 0px; left: 0px; width: 100%; height: 100%;" frameborder="0"></iframe></div>
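A minimal sketch of the palm-facing check that triggers the menu, assuming the hand tracker reports a palm-normal vector and the head (camera) position; all names here are illustrative, not the project's actual API:

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(v):
    m = math.sqrt(dot(v, v))
    return tuple(x / m for x in v)

def palm_faces_user(palm_normal, palm_pos, head_pos, threshold=0.7):
    """Show the hand menu when the palm normal points toward the user's head.

    A threshold of ~0.7 corresponds to roughly a 45-degree cone, so the menu
    does not flicker when the palm is only approximately facing the user.
    """
    to_head = norm(tuple(h - p for h, p in zip(head_pos, palm_pos)))
    return dot(norm(palm_normal), to_head) > threshold

# Palm at the origin, normal pointing back toward a head one metre behind it.
print(palm_faces_user((0, 0, -1), (0, 0, 0), (0, 0, -1)))  # True
print(palm_faces_user((0, 0, 1), (0, 0, 0), (0, 0, -1)))   # False
```

In a full system this boolean would gate the menu's visibility each frame, with a small hysteresis band to avoid flicker at the threshold.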
2. Grab to Move
By grabbing virtual objects naturally, the user is able to move them to another position and orient them as they wish.
<div style="padding-bottom: 56.25%; max-width: 100%; position: relative;"><iframe src="https://player.vimeo.com/video/806210604?background=1" width="800px" height="450px" style="position: absolute; top: 0px; left: 0px; width: 100%; height: 100%;" frameborder="0"></iframe></div>
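The core of a grab interaction like this is keeping the object's offset from the hand fixed for the duration of the grab, so the object follows the hand without snapping to the palm. A hedged sketch of that position-tracking step (rotation handling omitted; function names are hypothetical):

```python
def on_grab_start(hand_pos, object_pos):
    # Record the object's offset from the hand at the moment of the grab,
    # so the object does not jump into the palm.
    return tuple(o - h for o, h in zip(object_pos, hand_pos))

def on_grab_move(hand_pos, offset):
    # Each frame while the grab is held, the object follows the hand
    # at the same relative offset.
    return tuple(h + d for h, d in zip(hand_pos, offset))

offset = on_grab_start((1.0, 1.0, 0.0), (1.5, 1.0, 0.5))
print(on_grab_move((2.0, 1.5, 0.0), offset))  # (2.5, 1.5, 0.5)
```

A real implementation would also carry the hand's rotation delta into the object's orientation, which is what lets the user reorient the object "as they wish".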
3. Raycast and Pinch
Ray-casting is a distant-grabbing technique in which a light ray extends from the user's palm. By intersecting an object with this ray and pinching the fingers, the user can drag the object from a distance. Here, the user drags the object out of screen space into AR space.
<div style="padding-bottom: 56.25%; max-width: 100%; position: relative;"><iframe src="https://player.vimeo.com/video/806211522?background=1" width="800px" height="450px" style="position: absolute; top: 0px; left: 0px; width: 100%; height: 100%;" frameborder="0"></iframe></div>
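The two ingredients of this technique can be sketched as a thumb-to-index distance test for the pinch and a standard ray-sphere intersection for the target; thresholds and names are assumptions, not the project's actual values:

```python
import math

def pinching(thumb_tip, index_tip, max_gap=0.02):
    # A pinch is detected when the thumb and index fingertips
    # come within roughly 2 cm of each other.
    return math.dist(thumb_tip, index_tip) < max_gap

def ray_hits_sphere(origin, direction, center, radius):
    # Standard ray-sphere test: project the sphere center onto the ray,
    # then compare the perpendicular distance against the radius.
    oc = tuple(c - o for c, o in zip(center, origin))
    t = sum(a * b for a, b in zip(oc, direction))  # direction assumed unit-length
    if t < 0:  # object is behind the palm
        return False
    closest = tuple(o + t * d for o, d in zip(origin, direction))
    return math.dist(closest, center) <= radius

# Ray from the palm straight ahead (+z); target sphere 2 m away, 10 cm radius.
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0.05, 2), 0.1))  # True
print(pinching((0, 0, 0), (0.01, 0, 0)))                         # True
```

When both tests pass, the object attaches to the ray and follows the hand until the pinch releases.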
4. Pull to Explode
Two hands moving apart diagonally while pinching is a common metaphor for opening things up. Here, the user explodes the chair to inspect the product's structure up close. Moving the two hands back together while pinching returns the exploded chair to its original state.
<div style="padding-bottom: 56.25%; max-width: 100%; position: relative;"><iframe src="https://player.vimeo.com/video/806210760?background=1" width="800px" height="450px" style="position: absolute; top: 0px; left: 0px; width: 100%; height: 100%;" frameborder="0"></iframe></div>
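One way to drive an exploded view like this is to map the distance between the two pinching hands to a scalar factor, then push each part away from the assembly's centroid by that factor. A minimal sketch under those assumptions (the rest distance and gain are illustrative):

```python
def explode(parts, hand_distance, rest_distance=0.2, gain=2.0):
    """Offset each part away from the assembly centroid.

    The offset scales with how far apart the two pinching hands are;
    at the rest distance the factor is zero, so bringing the hands
    back together restores the original pose.
    """
    n = len(parts)
    centroid = tuple(sum(p[i] for p in parts) / n for i in range(3))
    factor = max(0.0, (hand_distance - rest_distance) * gain)
    return [tuple(p[i] + (p[i] - centroid[i]) * factor for i in range(3))
            for p in parts]

# Two illustrative chair parts stacked along the y-axis.
seat = [(0.0, 0.4, 0.0), (0.0, 0.8, 0.0)]
print(explode(seat, 0.2))  # hands at rest distance: original pose
print(explode(seat, 0.7))  # hands pulled apart: parts pushed away from centroid
```

Because the mapping is continuous, the explode and collapse motions are the same gesture run in opposite directions, which matches the pinch-and-pull metaphor.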
5. Magnify, Extract, and Apply Materials
Applying materials is a common step in 3D rendering and the product design process, yet obtaining a material that mimics a real object is neither intuitive nor efficient. In our implementation, we propose that users can directly search for and extract materials from real-world objects, and transform them into digital content for further AR-based and screen-based creation. The user makes an 'OK' gesture, mimicking a magnifier, to indicate a texture on a real object. Once the material is confirmed, a material ball is generated beside the user's hand, and the user drags the ball onto a virtual object to apply the texture.
<div style="padding-bottom: 56.25%; max-width: 100%; position: relative;"><iframe src="https://player.vimeo.com/video/806211219?background=1" width="800px" height="450px" style="position: absolute; top: 0px; left: 0px; width: 100%; height: 100%;" frameborder="0"></iframe></div>
<div style="padding-bottom: 56.25%; max-width: 100%; position: relative;"><iframe src="https://player.vimeo.com/video/806211190?background=1" width="800px" height="450px" style="position: absolute; top: 0px; left: 0px; width: 100%; height: 100%;" frameborder="0"></iframe></div>
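The extraction step can be approximated as sampling the camera pixels inside the circle framed by the 'OK' gesture. A simplified sketch that averages the colors in that region to seed the material ball (a real pipeline would estimate full material properties, not just a flat color):

```python
def extract_material(image, cx, cy, radius):
    """Average the pixel colors inside the circle framed by the gesture.

    `image` is a 2D grid of (r, g, b) tuples standing in for a camera frame;
    (cx, cy) and radius locate the circle in pixel coordinates.
    """
    samples = [px
               for y, row in enumerate(image)
               for x, px in enumerate(row)
               if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2]
    n = len(samples)
    return tuple(sum(c[i] for c in samples) // n for i in range(3))

# A uniform 4x4 "wood" patch; the generated material ball takes its color.
wood = [[(120, 80, 40)] * 4 for _ in range(4)]
print(extract_material(wood, 2, 2, 1))  # (120, 80, 40)
```

Averaging is the crudest possible stand-in; texture synthesis or a captured tile of the region would preserve far more of the real material's character.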
6. Scan to Duplicate
We envision a gesture-based 3D scanning process that produces a digital twin seamlessly and intuitively. The user sweeps a hand across the object in one direction to perform the scan.
<div style="padding-bottom: 56.25%; max-width: 100%; position: relative;"><iframe src="https://player.vimeo.com/video/806211522?background=1" width="800px" height="450px" style="position: absolute; top: 0px; left: 0px; width: 100%; height: 100%;" frameborder="0"></iframe></div>
7. Swipe to Switch
The user swipes to flip through a virtual object catalog.
<div style="padding-bottom: 56.25%; max-width: 100%; position: relative;"><iframe src="https://player.vimeo.com/video/806211759?background=1" width="800px" height="450px" style="position: absolute; top: 0px; left: 0px; width: 100%; height: 100%;" frameborder="0"></iframe></div>
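A swipe like this is typically classified from the palm's lateral velocity over a short window of tracked positions. A hedged sketch, with the speed threshold chosen for illustration:

```python
def detect_swipe(positions, timestamps, min_speed=1.0):
    """Classify a horizontal swipe from recent palm positions.

    positions are (x, y, z) in metres, timestamps in seconds; returns
    'next' for a fast rightward motion, 'prev' for leftward, else None.
    """
    dx = positions[-1][0] - positions[0][0]
    dt = timestamps[-1] - timestamps[0]
    if dt <= 0:
        return None
    vx = dx / dt
    if vx > min_speed:
        return "next"
    if vx < -min_speed:
        return "prev"
    return None

# Palm moves 0.4 m to the right in 0.2 s: fast enough to flip forward.
print(detect_swipe([(0.0, 1.0, 0.5), (0.4, 1.0, 0.5)], [0.0, 0.2]))  # next
```

Thresholding on velocity rather than raw displacement distinguishes a deliberate swipe from the slow hand drift that occurs while browsing.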
8. Shoot to Delete
The user removes the virtual object via a shooting gesture.
<div style="padding-bottom: 56.25%; max-width: 100%; position: relative;"><iframe src="https://player.vimeo.com/video/806211948?background=1" width="800px" height="450px" style="position: absolute; top: 0px; left: 0px; width: 100%; height: 100%;" frameborder="0"></iframe></div>
Credit
Davide Zhang: Research, Ideation, Technical Pipeline, Scripting, Prototyping
Aria Xiying Bao: Research, Ideation, UX Design, Scripting, Prototyping
Nix Liu Xin: Research, Ideation, 3D Modeling, 3D Animation, Prototyping
Advisor: Allen Sayegh