While therapeutic approaches incorporating alternative treatments such as energy psychology are beyond the scope of this review, it may nonetheless be worthwhile, in terms of future research, to note these findings and to consider the potential of virtual reality exposure therapy (VRET) for treating PTSD at an extended familial level. Specifically, this review will focus on olfaction as a variable in virtual reality exposure therapy for PTSD. Notably, a number of studies, such as Vasterling et al. (2000) and Dileo et al. (2000), have supported the argument that PTSD patients have significant olfactory deficits. These data provide an opportunity to investigate cognitive aspects of olfactory function in PTSD, leading to the consideration of remedial and therapeutic olfactory stimulation in virtual reality therapy programs.
Reger et al. (2011) evaluated the effectiveness of virtual reality exposure therapy (VRET) for 24 active duty soldiers seeking treatment following a deployment to Iraq or Afghanistan. The study showed that VRET produced significant reductions in PTSD symptoms after an average of seven treatment sessions (Reger et al. 2011). Additionally, 15 (62 %) patients reported a clinically meaningful, reliable reduction in PTSD symptoms, supporting the effectiveness of exposure therapy for active duty soldiers. These findings were supported by McLay et al. (2012), who tested a method for applying VRET to active duty service members diagnosed with combat posttraumatic stress disorder (PTSD). Of the 42 service members with PTSD who were recruited, 20 completed the treatment; of those who completed the posttreatment assessment, 75 % experienced at least a 50 % reduction in PTSD symptoms. Notably, there were no adverse events associated with VRET, providing additional support for its use in combat-related PTSD (Reger et al. 2011; McLay et al. 2012).
Two main virtual reality setups are used to immerse patients in a virtual environment: the head-mounted display (HMD) and the computer automatic virtual environment (CAVE). The virtual environment used by Rothbaum et al. (2001) consisted of a basic hovering helicopter simulation, experienced via an HMD, with therapist-controlled visual and auditory effects. The study reported reductions in PTSD symptoms ranging from 15 to 67 %; however, there was no control group. Following the attack on the World Trade Center, Difede and Hoffman (2002) developed a virtual environment with a gradual immersion simulation process. Patients were exposed to explosions, sound effects, and virtual subjects jumping from the burning towers, and reported a significant reduction in PTSD symptoms following treatment; however, the study included only ten participants. Both Rothbaum et al. (2001) and Difede and Hoffman (2002) indicated positive results, albeit with limited participants and some design flaws.
Rizzo et al. (2006) studied the design and development of a virtual Iraq PTSD VR application; notably, olfactory stimuli including the scent of burning rubber, cordite, body odor, diesel fuel, Iraqi spices and gun powder were deployed. While the need to add olfactory and tactile stimuli to VR prototype environments was noted, no data were reported to quantify their effectiveness in terms of gradual, staged (step-by-step) immersion. This gap in the literature was partly addressed by Josman et al. (2008), who measured participant distress precipitated by staged sensory VR exposure. Results indicated that the staged addition of sound to visual stimulation elicited emotional responses incrementally: the more realistic the sensory environment, the greater the emotional response. However, a study of sensory modality in VR therapeutic environments (DiScalfani 2012) reported that virtual reality exposure comprising visual and auditory stimulation alone was sufficient to evoke distress, and that the addition of olfactory and tactile stimulation did not have a significant impact. The authors did, however, note a number of limitations, including independent variable considerations and potential experimenter effects (DiScalfani 2012). Overall, findings support the implementation of sense-specific staged stimuli in VR treatment of PTSD, and the need to make the experience as real as possible. However, the way in which sensory modalities may work together to heighten stimulus sensation in VR PTSD treatment methodologies requires further study.
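The staged immersion designs discussed above can be thought of as a simple protocol: modalities are enabled cumulatively, and a distress rating is recorded after each stage. The following is a minimal, purely illustrative sketch of such a schedule; the class, stage names, and 0-100 rating scale are assumptions for exposition and are not drawn from any of the cited studies.

```python
# Hypothetical sketch of a staged ("step-by-step") sensory immersion
# schedule: modalities are added cumulatively and a subjective distress
# rating (0-100) is recorded after each stage. Illustrative only.
from dataclasses import dataclass, field
from typing import Callable, List


@dataclass
class ImmersionSession:
    stages: List[List[str]]                      # cumulative modality sets
    ratings: List[int] = field(default_factory=list)

    def run(self, rate_distress: Callable[[List[str]], int]) -> List[int]:
        """Present each cumulative stage and record one distress rating."""
        for active_modalities in self.stages:
            self.ratings.append(rate_distress(active_modalities))
        return self.ratings


# Stages mirror the designs described in the text: visual alone, then
# visual + auditory, then visual + auditory + olfactory.
session = ImmersionSession(stages=[
    ["visual"],
    ["visual", "auditory"],
    ["visual", "auditory", "olfactory"],
])
```

In practice `rate_distress` would be a clinician- or patient-reported measure taken at each stage; here any callable mapping the active modality set to a rating can be supplied, which makes the incremental-response hypothesis (more modalities, higher distress) easy to express and test.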
Buxton (1994) maintains that modern technologies have failed to take advantage of the full range of human physical abilities, perhaps reflecting a distorted view of the human senses. For example, standard input devices such as keyboards almost completely fail to exploit highly developed capacities such as touch and control over pressure. In terms of olfaction, smells inform us about our immediate vicinity, be that burning toast or fragrant floor polish. As with sound, a smell may be strongly tied to a specific source (such as a perfumed flower), or smells may form ambient mixtures in the background (such as the powerful smell of pine trees in a forest). Cater (1992) emphasizes the importance of ambient smell in a physical environment for creating a sense of presence in the virtual environment. The sense of smell constantly informs us about our immediate environment; logically, then, smell plays a key role in a simulated or virtual reality environment, and researchers and technology developers should focus on its potential.
In terms of the behavioral sciences, Spooner and Pachana (2006) maintain that replication of everyday-life environments in laboratory experiments is crucial, as it directly improves the validity of results, especially concerning subtle interactions. For the focus of this paper, replication of real-world stimuli is therefore critical to the research design of VR therapeutic environments. Regarding current VR design, Nakamoto et al. (2008) argue that real-world auditory and visual perceptions are almost perfectly replicated, the senses working together to create the overall experience. However, according to Craig et al. (2009), this simulation almost never includes chemosensory perception. Arguably, because olfaction is more complex to implement and control (Chen 2006), its use in VR environments remains the exception rather than the rule. Barfield and Danas (1995) outlined this issue early on, maintaining that olfactory information has been largely ignored as input to virtual environment participants despite the fact that olfactory receptors provide a rich source of information to humans. Nonetheless, some research to date has incorporated olfactory stimulation, for example in virtual environments for military training (Vlahos 2006), fire-fighter training, and medical diagnosis (Spencer 2006).
Evaluation of the role of sensory stimulation in the field of VR has highlighted olfactory stimulation as a potentially powerful yet underutilized therapeutic protocol. Historically, olfactory deficit has been noted as a component of PTSD (Myers 1915), and notably, early designs incorporated smell in the virtual experience (Mortonheilig.com 2010). However, olfaction in virtual environment design and development has arguably failed to maintain a position commensurate with its sensory capacity (Chu and Downes 2000; Chen 2006; Ischer et al. 2014), as evidenced by the paucity of research and applications, likely compounded by associated development costs.
Furthermore, research is required into the causative or reactive mechanisms that may underlie olfactory deficit in PTSD, and perhaps in other disparate syndromes that present with olfactory dysfunction. Undoubtedly, there will be continued debate as to the effectiveness of olfaction in virtual reality. Arguably, the literature supports the hypothesis that olfaction, as an element of multi-sensory reconstruction in a virtual environment PTSD program, may have a positive impact, with the proviso that research be undertaken to maximize the potential and effectiveness of olfaction as a variable in any form of virtual reality exposure therapy.
As always, there are cost implications in researching and developing new technical prototypes, and future research projects are further complicated by the vulnerable nature of the specific PTSD population under study. The emergence of apparent online content-moderation-induced PTSD arguably provides impetus for technology companies to engage actively in researching and developing VR therapeutic protocols. Given the likely cost-efficiencies of technology-facilitated, early-intervention, immersive multi-sensory virtual reality therapy versus long-term standard treatment of PTSD, further innovative research approaches are undoubtedly warranted.