Research:
UCSB’s Research Center for Virtual Environments and Behavior (ReCVEB) uses an interesting virtual reality technology called the Head Mounted Display (HMD), a visual headset that tracks every motion of the viewer (eye movement, body movement, etc.) and sends the data back to the virtual environment software so that it can react accordingly. These virtual environments are created in 3DS Max, a modeling, animation, and rendering tool. Using the HMD, ReCVEB’s main goal is to study human behavior and interaction through these virtual environments. For example, one environment may involve the user being chased by spiders in order to induce and observe fear. Professor Blascovich, the co-founder and director of ReCVEB, has stated that immersive virtual environments “give us the experimental control we need while allowing us to realistically simulate the environments in which the behaviors we study normally occur”. In other words, researchers can conduct far-reaching experiments that induce real reactions, such as fear, while still keeping everyone safe.

The HMD.
Oculus VR, a technology company, hopes to release a similar virtual reality HMD for video games in 2013, called the Oculus Rift. The project was originally started by Palmer Luckey, Oculus VR’s founder, but has since gained the support of significant video-game industry figures such as John Carmack. Since its start, the Oculus Rift has raised over $2,400,000 on Kickstarter, a crowdfunding website where backers pledge money to projects in exchange for rewards. There are three integral parts to the Oculus Rift: immersive stereoscopic 3D rendering, a massive field of view (110 degrees diagonally), and ultra-low-latency head tracking. Currently, it is primarily being developed for use with the PC and uses a Digital Visual Interface (DVI) video input, though it can be converted to HDMI with an adapter.

To integrate the Rift with their video game, developers must (1) feed the Rift’s motion tracking into their character’s view camera transformation and (2) implement stereoscopic 3D rendering with the optical distortion adjustment that produces the correct left/right eye image on the device. The Software Development Kit (SDK) assists with these tasks, providing accelerometer and gyroscope data, screen size information, and pre-made shaders to simplify development. One of the revolutionary aspects of the Oculus Rift, however, is that the image transformation for the 3D environment is done primarily in software, not optics. Earlier HMDs used a six-lens display with very little distortion, which was costly and produced a low field of view. Now that computing power is far greater than it was in the past, the software that renders a game for the Rift can distort the image before it passes through the optics.
http://www.youtube.com/watch?v=uzCwczY1jTM

The Oculus Rift in action.
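The two integration steps described above — driving the view camera from head tracking, and pre-warping the rendered image so it looks correct through the headset's lens — can be sketched in a few lines. This is illustrative only: the distortion coefficients, the coordinate conventions, and both function names are my own assumptions, not values or APIs from the Oculus SDK.

```python
import math

def barrel_distort(x, y, k1=0.22, k2=0.24):
    """Pre-warp a normalized screen coordinate (origin at lens center)
    so that, after passing through the headset's magnifying lens,
    straight lines look straight. k1/k2 are illustrative coefficients."""
    r2 = x * x + y * y                      # squared distance from lens center
    scale = 1.0 + k1 * r2 + k2 * r2 * r2    # distortion grows toward the edges
    return x * scale, y * scale

def yaw_to_view(yaw_radians):
    """Turn a head-tracking yaw angle into a forward view vector for the
    game camera (right-handed frame, looking down -z at yaw 0)."""
    return (math.sin(yaw_radians), 0.0, -math.cos(yaw_radians))
```

A point at the center of the lens is left untouched, while points near the edge are pushed outward before the lens pulls them back in; the camera helper shows how one sensor axis would rotate the in-game view.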
The Gallant Lab at UC Berkeley is in the process of decoding dynamic natural visual experiences from the human visual cortex. Using functional magnetic resonance imaging (fMRI), "a procedure that measures brain activity by detecting associated changes in blood flow", the lab measures brain activity in the visual cortex. "The human visual cortex consists of billions of neurons. Each neuron can be viewed as a filter that takes a visual stimulus as input, and produces a spiking response as output. In early visual cortex these neural filters are selective for simple features such as spatial position, motion direction and speed." The data obtained from these neural filters in the visual cortex are then used to develop computational models that allow for the approximate reconstruction of what that person saw.
https://www.youtube.com/watch?feature=p ... sjDnYxJ0bo
fMRI
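The encoding-model idea described above — treating each voxel as a linear filter over stimulus features, fitting those filters from data, and then inverting them to identify what was seen — can be sketched with synthetic data. The feature counts, noise level, and identification-style decoder below are hypothetical stand-ins, not the Gallant Lab's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 200 video clips, each summarized by 10 visual
# features (e.g. motion energy); 50 voxels in early visual cortex.
n_clips, n_features, n_voxels = 200, 10, 50
X = rng.standard_normal((n_clips, n_features))        # stimulus features
W_true = rng.standard_normal((n_features, n_voxels))  # each voxel = a linear filter
Y = X @ W_true + 0.1 * rng.standard_normal((n_clips, n_voxels))  # noisy responses

# Encoding step: fit one linear filter per voxel by least squares.
W_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Decoding step: given a brain response, pick the clip whose predicted
# response correlates best with it (an identification-style decoder).
response = X[17] @ W_true                 # response evoked by clip 17
predicted = X @ W_hat                     # model predictions for every clip
best = int(np.argmax([np.corrcoef(p, response)[0, 1] for p in predicted]))
```

With enough clips relative to features, the fitted filters recover the true ones closely, and the decoder identifies the correct clip — the same fit-then-invert logic that underlies reconstructing what a viewer saw.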
Translation into Art:
For this project, I am partnering with ReCVEB, Oculus VR, and The Gallant Lab to develop a new HMD called The Mindscape. The Mindscape will build on the psychological research conducted by ReCVEB and the technological resources of both Oculus VR and The Gallant Lab. Using technology similar to the Oculus Rift's, users will wear an HMD that allows them to re-create what they currently see as a virtual environment and explore it through their mind alone. This will be made possible by embedding fMRI technology into the HMD to measure brain activity in the visual cortex. The neural filters within the visual cortex will then provide the data necessary to create a computational model, and the 3DS Max software will translate this computational model into a virtual environment for the user of The Mindscape to see and explore.

This virtual environment will simultaneously be projected onto a wall in the room while the user explores it, so that spectators can essentially see how the world looks through that user's eyes. By having various individuals re-create the very same surroundings, we can see the differences in perception from individual to individual. For example, if a user who is color-blind and near-sighted is in an art gallery with paintings, the entire room may be rendered into the virtual reality in different colors and the furthest painting may be rendered completely blurry. The Mindscape will be plugged into nearby computers, which will power it as well as run the programming for the device. A projector will also be connected to each computer so that the virtual environment can be seen by spectators. To navigate through this virtual environment, a game controller will be connected to The Mindscape.

A virtual art gallery rendered through the eyes of someone who is color-blind and near-sighted.
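The color-blind and near-sighted example can be made concrete with a small sketch of how a per-user perceptual profile might alter a rendered scene. The profile fields, the recoloring rule, and the blur formula below are my own illustrative assumptions about what The Mindscape could do; they are not part of any existing SDK or of the proposal itself.

```python
def render_for_user(rgb, distance_m, profile):
    """Transform one scene sample (a color and its distance from the
    viewer) according to a hypothetical user perception profile."""
    r, g, b = rgb
    if profile.get("red_green_colorblind"):
        # Collapse red/green contrast toward their shared average,
        # a crude stand-in for red-green color blindness.
        mix = 0.5 * (r + g)
        r, g = mix, mix
    blur = 0.0
    if profile.get("near_sighted"):
        far_limit = profile.get("far_point_m", 2.0)
        # Beyond the user's far point, blur grows with distance.
        blur = max(0.0, distance_m - far_limit) / distance_m
    return (r, g, b), blur

# A red painting 8 m away, seen by a red-green color-blind,
# near-sighted visitor: the red washes out and the painting blurs.
scene_color, blur = render_for_user((0.9, 0.2, 0.1), distance_m=8.0,
                                    profile={"red_green_colorblind": True,
                                             "near_sighted": True})
```

Each Mindscape user would supply a different profile, which is exactly what would make the side-by-side projections diverge for spectators.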
Museum Presentation:
For presentation within an art museum, The Mindscape will have its own large gallery room. Multiple Mindscape units (each accompanied by its own computer, projector, and game controller) will be available for use in the center of the room. All the walls will be painted white and the lights kept very dim so that the projections of each user's virtual environment appear crisper. While waiting in line, visitors can watch the projected virtual environments.

Sources:
http://thebottomline.as.ucsb.edu/2011/0 ... ertainment
http://www.kickstarter.com/projects/152 ... o-the-game
http://www.nbcnews.com/technology/ingam ... ity-918527
http://www.nbcnews.com/technology/ingam ... ift-931087
https://sites.google.com/site/gallantla ... et-al-2011
http://en.wikipedia.org/wiki/Functional ... ce_imaging
http://www.recveb.ucsb.edu/