California State University, Long Beach

Harnessing Virtual Reality

Published: May 8, 2017

An interdisciplinary team of CSULB scholars is harnessing virtual reality to conduct research on human-machine interaction with the help of a two-year $381,075 National Science Foundation (NSF) Major Research Instrumentation (MRI) grant.

The team is headed by Mechanical and Aerospace Engineering’s (MAE) Panadda Marayong, who served as the Principal Investigator (PI) for the grant. MAE’s Praveen Shankar and Emel Demircan, Physical Therapy’s Vennila Krishnan, and Kinesiology’s Will Wu served as co-PIs.

The team’s grant will fund a project running through 2018 titled “Acquisition of Dynamic Immersive Virtual Environment for Research in Human-Machine Interaction.”

MRI grants come in two forms: development grants, which support the creation of next-generation research instruments, and acquisition grants, which fund major research instruments to be shared among multiple faculty and scientific disciplines. CSULB received the latter.

“It is a way for the NSF to help universities purchase expensive equipment to help build their research infrastructure,” said Marayong. “This is the first NSF MRI grant for both the College of Engineering and College of Health and Human Services.”

The MRI program received 823 proposals representing 819 projects and accepted 812 for review, with combined funding requests of roughly $531.68 million. This year, the NSF planned to invest approximately $75 million in MRI awards, providing roughly 150 to 170 grants.

The team’s new equipment, the CAVE Automatic Virtual Environment system (CAVE), will create a virtual reality environment in a room in the Engineering Technology Building. According to Marayong, a system of projectors casts images onto three walls to create an immersive environment. The equipment can be programmed to generate different environments, such as someone walking uphill or through a community. It can even simulate the user, so that the same individual can observe and evaluate themselves virtually.

The equipment also includes a full-body tracking system, which allows the team to track users’ movements at any point in time and to coordinate those movements and positions with the virtual environment. If users move their heads, the virtual environment follows, making for a more realistic experience.
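To make the head-tracking idea concrete, the short sketch below shows, in Python, how a head position and orientation reported by a tracker might be converted into a view matrix that keeps the rendered scene aligned with the user’s gaze. This is a hypothetical illustration only; the function name, coordinate conventions, and units are assumptions, not part of the CAVE system’s actual software.

```python
# A minimal, hypothetical sketch (not the CAVE's actual software): turning a
# tracked head pose into a view matrix so the rendered scene follows the user.
import numpy as np

def view_matrix(head_pos, yaw, pitch):
    """Build a 4x4 world-to-camera (view) matrix from a tracked head pose.

    head_pos: (x, y, z) head position from the tracker, in meters (assumed).
    yaw, pitch: head rotation in radians about the vertical and lateral axes.
    """
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    r_yaw = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    r_pitch = np.array([[1.0, 0.0, 0.0], [0.0, cp, -sp], [0.0, sp, cp]])
    rot = r_pitch @ r_yaw          # head orientation (camera-to-world rotation)
    view = np.eye(4)
    view[:3, :3] = rot.T           # invert the rotation for world-to-camera
    view[:3, 3] = -rot.T @ np.asarray(head_pos, dtype=float)
    return view

# Each tracking update would re-render the wall projections from the new
# viewpoint; here we simply print the matrix for one sample pose.
print(view_matrix(head_pos=(0.1, 1.7, 0.0), yaw=0.05, pitch=-0.02))
```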

The CAVE system will enable a variety of research activities relating to human-machine interaction, reflecting work already being carried out by the co-PIs and strongly supported by existing collaborations and resources at CSULB. Demircan, who has established such collaborations across several colleges since starting at CSULB last year, believes the new infrastructure will open new avenues for highly interdisciplinary collaboration in research, student training and outreach. Demircan, head of the Human Performance and Robotics Laboratory, is an assistant professor in both MAE and the newly established Biomedical Engineering department and will lead the research in human-robot interaction for rehabilitation.

Wu, an associate professor who serves as director of the campus’ Center for Sport Training and Research, sees the new system as an exciting avenue for developing applications that enhance sport performance and rehabilitation strategies.

“Using the system to not only measure performance but track how those performances change within different environments will provide researchers and practitioners alternative methods for improving sport performance or activities of daily living,” he said. “The CAVE virtual environment system will allow a diverse group of researchers to answer complex questions of human performance in a way that can’t be done by a single scientific discipline.”

The project also drew praise from Krishnan, director of the Clinical Rehabilitation and Biomechanics lab, who explained that the lab’s work includes gait rehabilitation for chronic stroke survivors and the design of novel rehabilitation interventions.

“For stroke patients with half their bodies affected, they often drag the affected limbs. When you want them to put weight onto that side, they don’t,” she said. “When you’re looking at them, their walking is not symmetrical. This is where our project can help.”

Shankar summed up the project this way:

“This new equipment is a way of testing technologies that would be hard to test in the real world,” he said. “When you have new technology, there are safety issues. But you can test something to see if it will work in a virtual world. I am interested in the potential for interaction between machines and humans and I know things are a lot less dangerous in a virtual environment.”