released project | December 2016

LS4D

 
 

project description

A lysergic virtual reality experience designed to experiment with the immersive capability of virtual reality. User interactions trigger "4D" feedback during the experience, such as water sprays, fans, and fragrances, turning it into a very trippy and quite immersive music visualization.

A Study On Immersion

We have long observed and interacted with virtual environments through external monitors and screens. Over time, however, not only has the visual fidelity of these environments increased rapidly, but so has the hardware's capability to immerse us in them. A long-standing challenge for scientists and companies worldwide, the chase for immersion led us to develop systems with mapped projections and simplistic tracking costing over $20,000. Rapid advancements in both sensors and computing technology today allow us to wear more immersive consumer-grade virtual reality headsets for a fraction of that cost.

In developing LS4D, we took advantage of several current technologies, such as hand tracking and external stimuli, to put together an extremely immersive experience. Rather than relying on photorealism or even resemblance to our physical reality, we chose to promote a sense of immersion by keeping the user's senses constantly activated by multiple stimuli. In addition, we present music as a way to set an emotional tone for the experience and effectively facilitate the user's suspension of disbelief, psychologically allowing the user to engage with the experience in the most entertaining way.

The quest for immersion, however, is far from over. Cloud computing, 5G streaming, photorealism, more natural interactions (hand gestures, body movement, voice), haptic feedback, brain-computer interfaces, and many other technologies show promising paths toward an even more immersive future for XR.


User hand interactions and gameplay footage


Main features

  • Four unique scenarios that interact with sound, providing a synesthetic experience. Get immersed in a virtual forest, a vaporwave sanctuary, a rooftop at night, or a more neutral sound-visualization room.

  • Interact more naturally, directly with your hands, without the need for external controllers. A powerful but easy-to-use experience that relies on the spontaneous combination of three simple hand gestures.

  • Choose between several music genres, from electronic music to binaural meditation tracks, post-rock songs, or Navajo-inspired music, to design your own unique experience.

technical aspects

The project was developed for the Oculus Rift using a custom build of Unreal Engine based on version 4.12. After writing a script of what the experience would be like, we developed a working prototype with fluid simulation, hand interactions, external stimuli activation, and sound visualization.

Fluid simulation was achieved by integrating the NVIDIA™ PhysX SDK into the Unreal Engine source code. We then implemented the NVIDIA™ FleX particle-based simulation technique to simulate and parametrize an interactive fluid in real time. At the time of production, we couldn't find any example of fluid simulation working within a virtual reality application, so we optimized the source code to maintain a solid 90 fps.
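To give a feel for what a particle-based update loop involves under a 90 Hz frame budget, here is a minimal, self-contained C++ sketch of a toy particle step. It is purely illustrative: the struct, function names, and constants are hypothetical, and it does not use the actual FleX API or the project's source code.

    // Illustrative sketch only: a toy particle update loop in the spirit of a
    // particle-based fluid, not the NVIDIA FleX integration used in the project.
    #include <cstdio>
    #include <vector>

    struct Particle { float x, y, z, vx, vy, vz; };

    // Advance all particles by one 90 Hz frame: integrate gravity, then
    // resolve collisions against the floor of a container with damping.
    void StepParticles(std::vector<Particle>& particles, float dt) {
        const float kGravity = -9.8f;
        const float kDamping = 0.5f;   // energy lost on floor contact (assumed)
        for (Particle& p : particles) {
            p.vy += kGravity * dt;     // semi-implicit Euler integration
            p.x += p.vx * dt;
            p.y += p.vy * dt;
            p.z += p.vz * dt;
            if (p.y < 0.0f) {          // collide with the container floor
                p.y = 0.0f;
                p.vy = -p.vy * kDamping;
            }
        }
    }

    int main() {
        std::vector<Particle> fluid(10000, Particle{0.f, 1.f, 0.f, 0.f, 0.f, 0.f});
        const float kFrameDt = 1.0f / 90.0f;  // VR target: 90 fps
        for (int frame = 0; frame < 90; ++frame)
            StepParticles(fluid, kFrameDt);
        std::printf("height after 1s: %.3f\n", fluid[0].y);
        return 0;
    }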

Hand capture and gesture recognition were made possible by a Leap Motion™ controller and the Leap Motion SDK for Unreal Engine. We took advantage of hand gestures already recognized by the SDK, such as a "pinching" hand and a closed hand, and took the approach of loosely combining them into predefined interactions with relatively predictable results. We also added pseudorandom parameters to give the experience itself a feeling of "self-consciousness" and lack of control.
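As a sketch of how such a combination might look: the 0-1 pinch and grab strengths below mirror the per-hand values the Leap Motion SDK reports, but the classification function, thresholds, and interaction names here are hypothetical, not the project's actual code.

    // Illustrative sketch: loosely combining two recognized gestures into a
    // predefined interaction, with a pseudorandom perturbation on intensity.
    #include <cstdio>
    #include <cstdlib>

    enum class Interaction { None, EmitFluid, AttractFluid, ScatterFluid };

    // pinchStrength and grabStrength are 0-1 values like those the Leap
    // Motion SDK exposes per hand; the threshold and mapping are assumed.
    Interaction ClassifyHand(float pinchStrength, float grabStrength) {
        const float kThreshold = 0.8f;
        bool pinching = pinchStrength > kThreshold;
        bool grabbing = grabStrength > kThreshold;
        if (pinching && grabbing) return Interaction::ScatterFluid;
        if (pinching)             return Interaction::EmitFluid;
        if (grabbing)             return Interaction::AttractFluid;
        return Interaction::None;
    }

    // Scale an interaction's intensity by a pseudorandom factor so the world
    // feels slightly out of the user's control, as described above.
    float PerturbedIntensity(float base) {
        float jitter = 0.5f + static_cast<float>(std::rand()) / RAND_MAX; // 0.5-1.5
        return base * jitter;
    }

    int main() {
        Interaction i = ClassifyHand(0.9f, 0.1f);
        std::printf("interaction=%d intensity=%.2f\n",
                    static_cast<int>(i), PerturbedIntensity(1.0f));
        return 0;
    }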

External stimuli, such as the fans and the water spray, were wired to an Arduino® protoboard and integrated into Unreal using the UE4Duino plugin developed and made available by gryzly32 on the Unreal Engine forums.
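On the microcontroller side, a setup like this typically listens for serial commands sent from the engine and switches the actuators accordingly. The Arduino sketch below is a hedged example of that pattern: the pin numbers, baud rate, and one-byte command protocol are assumptions for illustration, not the project's actual wiring or protocol.

    // Illustrative Arduino-side sketch: listen for single-byte commands over
    // serial (as a plugin like UE4Duino can send from Unreal) and toggle the
    // relays driving the fan and water spray. Pins and command bytes assumed.
    const int kFanPin   = 7;
    const int kSprayPin = 8;

    void setup() {
      pinMode(kFanPin, OUTPUT);
      pinMode(kSprayPin, OUTPUT);
      Serial.begin(9600);          // must match the baud rate set in Unreal
    }

    void loop() {
      if (Serial.available() > 0) {
        switch (Serial.read()) {
          case 'F': digitalWrite(kFanPin, HIGH);   break;  // fan on
          case 'f': digitalWrite(kFanPin, LOW);    break;  // fan off
          case 'S': digitalWrite(kSprayPin, HIGH); break;  // spray on
          case 's': digitalWrite(kSprayPin, LOW);  break;  // spray off
        }
      }
    }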

We used mainly Autodesk 3ds Max for the project's 3D pipeline, while the audio samples and music were processed in Audacity and the UI elements were designed in Adobe Photoshop.

In early 2017, LS4D was exhibited to Aerolito's course participants in Brazil.

In September 2017, VICE Brazil wrote a story on LS4D and the benefits of simulating meaningful virtual experiences without the use of psychedelic substances.

In March 2019, LS4D was exhibited as an interactive art installation at the 1º Tech Art Festival in Porto Alegre, Brazil.

Hand Interaction with the fluid