Design and Evaluation of StreamBED VR

My dissertation asks whether qualitative stream assessment skills can be taught to potential citizen scientists through virtual reality (VR). VR affords users a sense of telepresence, an illusion of actually "being" in a virtual world. Previous research suggests that VR can help learners develop spatial knowledge and train environmental judgment skills. The goal of this work was to consider whether VR can be employed to train citizen scientists to make qualitative stream assessments the way experts do, by giving them practice rating virtual stream habitats.


My research asks: (1) How can technology effectively train scientific qualitative judgment tasks? (2) Do multisensory cues help learners make habitat observations? (3) How can learning be scaffolded to support the expert process?


To address these questions, I first gathered requirements through expert interviews and in-person training with expert monitors. I then designed an initial prototype of the StreamBED VR tool through an iterative process and conducted an initial evaluation. To understand how multisensory cues contribute to habitat observations, I prototyped an ambient multisensory tool called the Ambient Holodeck and evaluated whether ambient sensory information helps learners make better observations of their environment. Finally, I designed the second prototype of StreamBED VR through a user-centered process and evaluated its value, with and without multisensory cues, against traditional PowerPoint training.


Requirement Gathering

Study 1 observed and reported on differences in background and training between professional and volunteer monitors, using these observations to synthesize findings about volunteer training needs. My findings reveal that to successfully make qualitative stream assessments, volunteers need to: 1) experience a diverse range of streams, 2) discuss judgments with other monitors, and 3) construct internal narratives about water quality.


Initial Tool Design

Initial Tool Evaluation

Study 2 evaluated the initial StreamBED VR training, which teaches volunteers to make qualitative assessments of stream habitats through physical exploration. The goal of this study was to assess the viability of the initial training design and to understand novice learner needs. To accomplish this goal, the study taught non-expert participants to make and update qualitative assessments of 4 EPA protocol metrics. During training, participants first saw differences in habitat quality by experiencing an "optimal" and a "poor" habitat stream, then learned to calibrate assessments by evaluating, and getting feedback on, virtual streams exhibiting diverse quality characteristics.





My findings reveal that 1) the StreamBED VR training immersed and motivated study participants, inspiring them to 2) interact with the environment through storytelling and virtual surveying. However, pilot findings revealed that 3) participants had trouble with several facets of the training: they were distracted by the rendered training environment, had trouble interpreting protocol subjectivity, and anchored to extreme protocol judgments.

Evaluation of Multisensory Realism on Observation Skills

Study 3 evaluated the role of multisensory realism in VR observation skills. I first built a new multisensory environment interface for VR, called the Ambient Holodeck, that allowed users to feel and smell landscape and environmental conditions (wind, temperature, humidity). I used this system to compare how participants made observations and inferred patterns across 2 stream habitats in VR, either in Standard VR with audio-visual cues or in Multisensory VR with added ambient sensory cues. My findings reveal that multisensory information 1) increased the number of observations participants made, and 2) positively impacted engagement and immersion.

Final Prototype





The final study tested the effectiveness of this prototype with and without multisensory cues, and against a baseline PowerPoint training. Findings between the Standard and Multisensory VR conditions reveal that participants in both conditions 1) found the StreamBED VR training system easy to use, 2) enjoyed and benefited from collaborative learning, and 3) were engaged by the system. Comparing all three conditions showed that VR participants were more excited for future training, and that Standard VR participants scored closest to a gold-standard assessment. Notably, PowerPoint training was also effective when it presented material using the training scaffold I developed.
