Real Vision

  • VR App
  • Visual Design
  • User Experience

In an industry that has always been home to fragmented and inconsistent experiences, Realvision strives to make real estate simple, beautiful, and effective. As a technology company, they partner with professional photographers and agents to improve their services by building the tools and platform for an engaging, holistic digital experience. A single photo shoot gives photographers not only high-quality still images but also a 3D tour and a dimensioned floor plan.

But a walk-through tour in a web browser can only be so engaging. What if we could go one step further, and fully immerse our users in the experience? By integrating the full experience into a VR environment, we’re able to pull users directly into properties, allowing them to walk through a home without ever taking a step.

VR is a rapidly evolving field for user research and behavioral study, and it was an exciting exercise to explore such a wild frontier from an interaction design perspective.

Once a list of properties is loaded into the app, the user is presented with the main facade and the top-level details of the property. From here they can inspect a bird's-eye view of the floor plan before deciding to explore. If multiple properties are loaded, they're free to cycle through the available listings.
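To make that flow concrete, here is a minimal sketch of the browsing state, assuming listings arrive as a simple array; the Listing fields and view names are illustrative, not Realvision's actual data model.

```typescript
// Hypothetical browsing state for the property carousel described above.
type View = 'facade' | 'floorPlan' | 'tour';

interface Listing {
  address: string;
  price: number;
  facadeImageUrl: string;
  floorPlanUrl: string;
}

class ListingBrowser {
  private index = 0;
  view: View = 'facade'; // every listing opens on the facade + top-level details

  constructor(private listings: Listing[]) {}

  get current(): Listing {
    return this.listings[this.index];
  }

  /** Cycle to the next/previous property when several are loaded. */
  cycle(step: 1 | -1): void {
    const n = this.listings.length;
    this.index = (this.index + step + n) % n;
    this.view = 'facade';
  }

  /** Peek at the bird's-eye floor plan before committing to the tour. */
  showFloorPlan(): void {
    this.view = 'floorPlan';
  }

  enterTour(): void {
    this.view = 'tour';
  }
}
```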

Hardware solutions for interactivity wouldn't always be available (given the variety of device contexts), so I devised a method of selection based on the user's "gaze." Interface elements float in space (anchored relative to either the user's head or the environment itself), and the user activates an element by looking directly at it for a brief duration. Shown above is the visual feedback displayed to represent the interaction.
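Here is a minimal sketch of that dwell-based selection, assuming a per-frame update loop that supplies the headset's forward vector; the dwell time, activation angle, and type names are hypothetical rather than the values we shipped.

```typescript
// Hypothetical gaze-dwell selector: hold your gaze on a target long enough
// and it activates, no controller required.
type Vec3 = { x: number; y: number; z: number };

interface GazeTarget {
  id: string;
  direction: Vec3;          // unit vector from the user's head to the element
  onActivate: () => void;   // e.g. open the floor plan, select a listing
}

const DWELL_SECONDS = 1.5;                    // how long the gaze must hold (assumed)
const ACTIVATION_ANGLE = (5 * Math.PI) / 180; // angular radius of the gaze "cursor" (assumed)

class GazeSelector {
  private dwell = 0;
  private current: GazeTarget | null = null;

  /** Call once per frame with the head's forward vector and elapsed seconds. */
  update(forward: Vec3, dt: number, targets: GazeTarget[]): void {
    const hit =
      targets.find(t => angleBetween(forward, t.direction) < ACTIVATION_ANGLE) ?? null;

    if (hit !== this.current) {
      // Gaze moved to a new element (or away): restart the dwell timer.
      this.current = hit;
      this.dwell = 0;
      return;
    }
    if (!hit) return;

    this.dwell += dt;
    if (this.dwell >= DWELL_SECONDS) {
      hit.onActivate();
      this.dwell = 0; // require a fresh dwell before re-activating
    }
  }

  /** 0..1 progress, suitable for driving the radial-fill feedback shown above. */
  get progress(): number {
    return Math.min(this.dwell / DWELL_SECONDS, 1);
  }
}

function angleBetween(a: Vec3, b: Vec3): number {
  const dot = a.x * b.x + a.y * b.y + a.z * b.z;
  return Math.acos(Math.max(-1, Math.min(1, dot))); // assumes unit vectors
}
```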

VR can be physically draining for users, so we devised a menu system that avoids unnecessary neck strain. The menu works on a reverse rotational anchor that meets the user halfway as they begin to look down.
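One way to sketch that "meet you halfway" behavior: as the head pitches down, the menu counter-rotates upward at half the head's angular rate, so gaze and menu converge well short of a full look-down. This assumes head pitch in radians with negative values meaning downward, and the rest-pitch constant is illustrative.

```typescript
// Hypothetical reverse rotational anchor for the look-down menu.
const MENU_REST_PITCH = -0.9; // where the menu sits when fully summoned (rad, assumed)

function menuPitch(headPitch: number): number {
  if (headPitch >= 0) return MENU_REST_PITCH; // looking level/up: menu stays tucked below

  // Counter-rotate: for every radian the head drops, the menu rises half a radian.
  const raised = MENU_REST_PITCH - headPitch / 2;

  // Gaze and menu meet at two-thirds of the rest pitch; hold the menu there
  // so the user never has to chase it further down.
  const meeting = (2 / 3) * MENU_REST_PITCH;
  return Math.min(raised, meeting);
}
```

With these numbers the user only needs to look down about two-thirds of the way to the menu's resting position before it reaches their gaze line, which is where the neck-strain savings come from.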

This mechanism, combined with our ‘gaze’ interaction for selection, meant the entire experience could be navigated without any need for hardware controls and with minimal strain on our users.

Hardware controls were still used when the user's headset provided them. Given the diversity of platforms, our goal was to create a uniform set of inputs that could be applied to anything from the Samsung Gear VR (our target, shown above) to the Oculus and others.
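One way to picture that uniform layer: a small vocabulary of app actions that each platform's raw inputs map onto, with gaze dwell as the universal fallback when no hardware is present. The event names, platform keys, and action set below are illustrative assumptions, not the actual bindings.

```typescript
// Hypothetical uniform input layer: every headset's inputs resolve to the
// same small set of app actions.
type AppAction = 'select' | 'back' | 'nextListing' | 'prevListing';

const bindings: Record<string, Record<string, AppAction>> = {
  gearVr: {
    touchpadTap: 'select',
    backButton: 'back',
    swipeForward: 'nextListing',
    swipeBack: 'prevListing',
  },
  oculus: {
    triggerPress: 'select',
    bButton: 'back',
    thumbstickRight: 'nextListing',
    thumbstickLeft: 'prevListing',
  },
  gazeOnly: {
    dwellComplete: 'select', // from the gaze selector sketched earlier
  },
};

function resolveAction(platform: string, deviceEvent: string): AppAction | undefined {
  return bindings[platform]?.[deviceEvent];
}
```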
