After Harrison left ASU in 2019 to become Director of Strategic Science Initiatives at business partner NewSpace Planet, the project continued and took off.
The Meteor Studio, LiKamWa's augmented and virtual reality research laboratory where Lai and Bahremand work, took on much of the engineering side of the project. LiKamWa and Spackman collaborated on a grant proposal that won them $850,000 in funding from the National Science Foundation to support the work. The research led to a prototype platform that the team developed for testing.
“It’s exciting to collaborate across disciplines on this project, combining the burgeoning software and hardware engineering expertise of our PhD students with the rich understanding of olfactory systems of Dr. Brian Smith and Dr. Rick Gerkin,” says LiKamWa. “Most importantly, integrating this partnership with Dr. Spackman’s socio-cultural lens of how our sense of smell directs our relationships with food, water, education and training has allowed this collaboration to seek broader research applicability.”
Bahremand took the initiative to write an academic article on the functioning of the Smell Engine, published by the Institute of Electrical and Electronics Engineers (IEEE) and titled “The Smell Engine: A system for the artificial synthesis of smells in virtual environments.”
“I spent several years attending conferences and reviewing the literature on olfactory displays to understand the software,” says Bahremand. “During this period of intensive research, we saw the need to design a hardware-software framework capable of calculating and delivering olfactory signals on the fly in virtual environments.”
The technology uses an olfactometry system that delivers odors through a device placed on a user’s nose.
LiKamWa leads the engineering side of the project, overseeing the development of the hardware and software needed to deliver scents to a user, while Spackman takes the lead in developing training materials for future use of the system in educational applications.
One of the challenges the team faces is how to mix different chemical compounds to recreate real-world smells. This is where Lai’s role comes in. Her first test of the technology is to accurately represent the smell of a strawberry at different stages of freshness.
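One way to think about the mixing problem is as an optimization: choose nonnegative amounts of a few available odorants so that their combined profile comes as close as possible to a target smell. The sketch below is purely illustrative, assuming invented odorant "feature" vectors and a simple projected gradient descent; the Smell Engine's actual chemistry and algorithms are far more involved.

```python
# Illustrative only: fitting a target odor as a nonnegative mixture of
# a few odorant components. The component profiles below are invented
# three-feature vectors (fruity, green, sour), not real chemistry.

def mix_error(weights, components, target):
    """Squared error between the weighted mixture and the target profile."""
    mixture = [sum(w * c[i] for w, c in zip(weights, components))
               for i in range(len(target))]
    return sum((m - t) ** 2 for m, t in zip(mixture, target))

def fit_mixture(components, target, steps=2000, lr=0.01, eps=1e-6):
    """Projected gradient descent keeping all mixing weights nonnegative."""
    n = len(components)
    weights = [1.0 / n] * n
    for _ in range(steps):
        base = mix_error(weights, components, target)
        grad = []
        for j in range(n):          # numerical gradient, one weight at a time
            bumped = weights[:]
            bumped[j] += eps
            grad.append((mix_error(bumped, components, target) - base) / eps)
        # gradient step, then project back onto weights >= 0
        weights = [max(0.0, w - lr * g) for w, g in zip(weights, grad)]
    return weights

# Invented odorant profiles over (fruity, green, sour) features:
components = [
    [0.9, 0.1, 0.0],   # ester-like odorant
    [0.1, 0.8, 0.1],   # grassy odorant
    [0.0, 0.2, 0.9],   # acidic odorant
]
ripe_strawberry = [0.7, 0.3, 0.1]  # invented target profile

weights = fit_mixture(components, ripe_strawberry)
```

Under these made-up numbers, the fitted weights land close to the target profile; the point is only that "recreating a smell" can be posed as choosing component proportions under a nonnegativity constraint.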
“Olfaction can evoke and enhance a range of emotions, and emotions are the basic layer of human thought and action,” says Lai, a PhD student in electrical engineering. “The pervasive digital smell could benefit people by expanding their set of digital media tools to enhance different emotions and their perception of what is real.”
While accurately representing a scent is a major factor, Smith helps keep the project on track with his understanding of the biology of scent. He notes that a real environment has many factors at play that the Smell Engine will need to replicate, including the turbulence that carries odors through a space and the varying strength of an odor from one area to another.
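To give a sense of the "varying strength" problem Smith describes, here is a deliberately toy model of odor intensity falling off with distance from a source in still air. The exponential decay form and the decay constant are assumptions for illustration; modeling real turbulence, as the project requires, takes far more sophisticated airflow simulation.

```python
import math

# Toy model only: perceived odor intensity decaying with distance from
# a source in still air. The exponential form and 'decay' constant are
# invented for illustration, not taken from the Smell Engine.

def odor_concentration(source_strength, distance, decay=1.5):
    """Exponential falloff of odor strength with distance (assumed model)."""
    return source_strength * math.exp(-decay * distance)

# A user standing near the source smells more than one farther away:
near = odor_concentration(1.0, 0.5)
far = odor_concentration(1.0, 3.0)
```

Even this crude falloff curve shows why a virtual environment must compute a different odor signal for each user position, which is part of what the team's hardware-software framework calculates on the fly.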
Given the Smell Engine’s goal of accurately replicating smells in a chaotic environment, Smith envisions applications that include educating firefighters about the hazards they need to be alert for and teaching would-be space colonists what Mars could smell like.
Spackman sees another potential educational application for the Smell Engine: training people to know what water should smell like. She says some of the main reasons people choose to drink bottled water instead of tap water are that they don’t trust the safety of municipal water systems and they don’t like the taste.
Spackman hopes adding a smell to virtual reality will help those working in water management recognize when water is contaminated.
“Teaching people who are going to be on the front line to work with water all the time so they get up to speed faster, that could have a big impact,” Spackman says.
Beyond educational applications, the team envisions a world of possibilities for implementing smell in VR, such as VR game and movie experiences.