A Multi-Sensorial Food Kiosk
Aiming to provide a more informative, universal food selection process, I built a kiosk that engages all five human senses and tested it with the public.
See more in the paper: "A Multi-Sensorial Kiosk Interface to Familiarize the Public to Try New Healthy Foods," IMWUT '24
My Role: Individual Research
Problem definition, Multi-sensorial design, Kiosk implementation, Field experiment, Quantitative and qualitative analysis
Location
Hamyang Expo, Korea
Time
3 months of prototyping and experimentation + 1 month of documentation
Deliverables
Sensory Workshop, Arduino-operated touchscreen kiosk, Publication
Table of Contents
01 Problem Definition
02 Study Design
03 Study Procedure
04 Experiment Data Analysis
How Do We Normally Choose Food?
Sensory Workshop
I chose an unfamiliar food item that is also healthy and rich in antioxidants, a turmeric latte, designed a novel sensory workshop to extract the necessary design requirements, and observed how its unfamiliar flavors were perceived.
Multisensory Prototyping
Ingredient Exposure was devised so that the user touches each presented ingredient. Auditory stimuli were transmitted through embedded speakers, tactile stimuli through molded textures, and olfactory stimuli were diffused through an Arduino-controlled humidifier module.
Sensory Design Requirements
Two design majors and two food and nutrition majors with no food allergies to the item or its ingredients (turmeric, cinnamon, honey, ginger, and milk) were recruited for the workshop. They attended one session each, two people at a time, and spent around 45 to 60 minutes completing the worksheet.
Research Questions:
How Can We Effectively Educate the Public About New Foods and Familiarize Them with Such Foods?
How Can We Test the Theory of Sensory Exposure Using Computation?
How Can We Design a Multisensory Kiosk?
Sensory Workshop
Vision: Color, shape, typography, imagery
Sound: Food sound, origin sound, associative music
Smell: General smell, origin smell, imagery
Touch: Temperature, texture, shape-feel
Taste: General taste, pleasure, imagery
Interface Design
Prototyping
-
Visual
Using Framer for the kiosk's visuals and on-screen interactions, I designed a food selection flow for choosing the menu item and starting individual ingredient exploration.
-
Audio
With the extracted auditory modalities, I composed a sound file combining the associative music, the food's own sound, and the sound of its origin, played upon selection.
-
Olfactory
To implement the olfactory modalities, fragrance oils from Nature In Flavor Co., Ltd. were matched to each ingredient's smell attributes and diffused through an embedded Arduino-controlled humidifier module.
-
Tactile
Tactile stimuli were formed from modeling clay, resin, and appropriate ornamental components, each fixed with a tact switch attached to the pertinent ingredient.
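The interaction pattern above, where pressing an ingredient's tact switch triggers its paired audio and scent cues, can be sketched in Python. This is a minimal illustration of the dispatch logic, not the actual kiosk firmware; the pin numbers and helper functions are hypothetical stand-ins for the Arduino-side playback and diffusion routines.

```python
# Hypothetical mapping from tact-switch input pin to ingredient.
INGREDIENTS = {
    2: "turmeric",
    3: "cinnamon",
    4: "honey",
    5: "ginger",
    6: "milk",
}

def play_audio(ingredient):
    # Placeholder: would play the associative music / food / origin sounds.
    return f"playing {ingredient}.wav"

def diffuse_scent(ingredient):
    # Placeholder: would pulse the humidifier module for this ingredient's oil.
    return f"diffusing {ingredient} fragrance"

def on_switch_press(pin):
    """Dispatch the paired sensory cues for the pressed ingredient's switch."""
    ingredient = INGREDIENTS.get(pin)
    if ingredient is None:
        return None  # ignore presses on unmapped pins
    return (play_audio(ingredient), diffuse_scent(ingredient))
```

For example, `on_switch_press(4)` would trigger the honey cues, while an unmapped pin is silently ignored.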
User Testing and Results
Conditions: Control vs. Audio-visual vs. Multi-sensorial
A one-way ANOVA revealed significant increases in willingness to try and in food familiarity. Paired t-tests showed significant differences between pre- and post-exposure food familiarity levels.
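For readers unfamiliar with these tests, the two statistics can be computed from first principles. The sketch below uses only the Python standard library (in practice one would use `scipy.stats.f_oneway` and `scipy.stats.ttest_rel`); the ratings shown are hypothetical, not the study's data.

```python
import statistics as st

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA: between-group over within-group mean square."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (st.mean(g) - grand_mean) ** 2 for g in groups)
    ss_within = sum(sum((x - st.mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

def paired_t(pre, post):
    """t statistic for a paired-samples t-test on pre/post scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    return st.mean(diffs) / (st.stdev(diffs) / len(diffs) ** 0.5)

# Hypothetical willingness-to-try ratings per condition:
f_stat = one_way_anova_f([1, 2, 3], [2, 3, 4], [4, 5, 6])

# Hypothetical pre/post familiarity ratings for the same participants:
t_stat = paired_t([1, 2, 3, 4], [2, 4, 5, 5])
```

The F and t statistics are then compared against the F(k−1, n−k) and t(n−1) distributions to obtain p-values.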
More results and detailed insights can be found in the manuscript here.