A Multisensory Embedded System for Interactive Spatial Experience
Keywords:
Human-Computer Interaction
Multimodal Interaction
Embedded System
Sound Interaction
Pavilion
Interaction part: Ruyi YANG, Bingxin SHI, Yu MI, Xiongwei LUO
Instructors: Professor Chao YAN & Feng DENG, Tongji University
Source code for interaction part: https://github.com/Emmmmmmaa/FirForestLight_ESP
Spectrum Mode Samples
Pure Music
Instrumental
Vocal
Designed a multimodal interactive system combining sound-driven and proximity-based interaction, built on ESP32 master-slave communication and embedded sensors (a communication sketch follows this overview).
The master unit integrates a microphone for real-time sound-signal analysis, while each slave unit drives 35 LEDs and collects distance data from an ultrasonic sensor (see the sensing sketches below).
Implemented three interaction modes: constant lighting (color/ambient/breathing), sound-reactive lighting (volume & spectrum), and human-triggered effects (growth simulation & particle fireworks); a mode-dispatch sketch appears below.
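The firmware in the linked repository is the authoritative implementation; as a rough illustration of the master-slave link, here is a minimal ESP-NOW broadcast sketch. The LightCommand struct, the broadcast addressing, and the 20 Hz update rate are assumptions for illustration, not details taken from the repo.

```cpp
// Hypothetical master-side sketch: broadcasts a lighting command to all slaves.
#include <WiFi.h>
#include <esp_now.h>

// Broadcast address reaches every slave unit; a real build might
// register each slave's MAC address individually instead.
uint8_t broadcastAddr[6] = {0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF};

// Assumed packet layout: one mode byte plus an intensity level
// derived from the master's microphone analysis.
typedef struct {
  uint8_t mode;    // 0 = constant, 1 = sound-reactive, 2 = human-triggered
  uint8_t level;   // 0-255 intensity computed from the audio signal
} LightCommand;

void setup() {
  WiFi.mode(WIFI_STA);               // ESP-NOW runs on the Wi-Fi station interface
  esp_now_init();
  esp_now_peer_info_t peer = {};
  memcpy(peer.peer_addr, broadcastAddr, 6);
  peer.channel = 0;
  peer.encrypt = false;
  esp_now_add_peer(&peer);
  // Slaves would register esp_now_register_recv_cb(...) to unpack the struct.
}

void loop() {
  LightCommand cmd = {1, 128};       // placeholder values for illustration
  esp_now_send(broadcastAddr, (uint8_t *)&cmd, sizeof(cmd));
  delay(50);                         // ~20 Hz update rate
}
```

ESP-NOW is a natural fit for this kind of installation because it is connectionless and needs no router, which keeps latency low for lighting that must stay synchronized across units.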
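For the master's sound analysis, a sketch along these lines could compute the volume (RMS) and spectrum (FFT magnitudes) that drive the two sound-reactive modes. The ADC pin, sampling rate, FFT size, and the use of the arduinoFFT 1.x API are all assumptions, not details from the project.

```cpp
// Hypothetical master-side audio analysis with an analog microphone.
#include <Arduino.h>
#include <arduinoFFT.h>

#define MIC_PIN     34
#define SAMPLES     128        // must be a power of two for the FFT
#define SAMPLE_FREQ 8000.0     // Hz, assumed sampling rate

arduinoFFT FFT;
double vReal[SAMPLES], vImag[SAMPLES];

void setup() {
  analogReadResolution(12);    // ESP32 ADC: 0-4095
}

void loop() {
  // Sample the microphone at a fixed rate.
  for (int i = 0; i < SAMPLES; i++) {
    vReal[i] = analogRead(MIC_PIN);
    vImag[i] = 0;
    delayMicroseconds((unsigned long)(1000000.0 / SAMPLE_FREQ));
  }

  // Volume mode: RMS amplitude around the DC offset.
  double mean = 0, rms = 0;
  for (int i = 0; i < SAMPLES; i++) mean += vReal[i];
  mean /= SAMPLES;
  for (int i = 0; i < SAMPLES; i++) rms += (vReal[i] - mean) * (vReal[i] - mean);
  rms = sqrt(rms / SAMPLES);

  // Spectrum mode: magnitude per frequency bin via FFT.
  FFT.Windowing(vReal, SAMPLES, FFT_WIN_TYP_HAMMING, FFT_FORWARD);
  FFT.Compute(vReal, vImag, SAMPLES, FFT_FORWARD);
  FFT.ComplexToMagnitude(vReal, vImag, SAMPLES);
  // vReal[1..SAMPLES/2] now holds the spectrum to map onto the LED units.
}
```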
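On the slave side, the sensing-and-lighting loop might look roughly like the sketch below, assuming WS2812-style LEDs driven with FastLED and an HC-SR04 ultrasonic sensor; the pin numbers, sensor model, and distance-to-brightness mapping are illustrative, while the 35-LED count comes from the project description.

```cpp
// Hypothetical slave-side sketch: proximity sensing drives LED brightness.
#include <FastLED.h>

#define NUM_LEDS 35      // each slave unit drives 35 LEDs
#define LED_PIN  5
#define TRIG_PIN 12
#define ECHO_PIN 14

CRGB leds[NUM_LEDS];

// Trigger one HC-SR04 measurement and convert the echo time to centimeters.
float readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  unsigned long us = pulseIn(ECHO_PIN, HIGH, 30000UL);  // timeout after 30 ms
  return us * 0.0343f / 2.0f;   // speed of sound ~343 m/s, out and back
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  FastLED.addLeds<WS2812B, LED_PIN, GRB>(leds, NUM_LEDS);
}

void loop() {
  // Map proximity to brightness: closer visitors make the unit glow brighter.
  float d = readDistanceCm();
  uint8_t v = (d > 0 && d < 200) ? map((long)d, 0, 200, 255, 0) : 0;
  fill_solid(leds, NUM_LEDS, CHSV(32, 200, v));  // warm amber, assumed palette
  FastLED.show();
  delay(30);
}
```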
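Finally, the three interaction modes could be dispatched per frame roughly as below. The gMode and gLevel globals are hypothetical and assumed to be updated by the ESP-NOW receive callback, and the breathing, volume-bar, and growth effects are simplified stand-ins for the project's actual animations.

```cpp
// Hypothetical per-frame mode dispatch on a slave unit.
#include <FastLED.h>

#define NUM_LEDS 35
CRGB leds[NUM_LEDS];
volatile uint8_t gMode = 0;   // 0 constant, 1 sound-reactive, 2 human-triggered
volatile uint8_t gLevel = 0;  // latest audio level pushed from the master

void setup() {
  FastLED.addLeds<WS2812B, 5, GRB>(leds, NUM_LEDS);
}

void loop() {
  switch (gMode) {
    case 0: {  // constant mode: slow sinusoidal "breathing"
      uint8_t v = beatsin8(12, 40, 255);           // 12 BPM sine from FastLED
      fill_solid(leds, NUM_LEDS, CHSV(160, 180, v));
      break;
    }
    case 1: {  // volume mode: bar of lit LEDs proportional to loudness
      int lit = map(gLevel, 0, 255, 0, NUM_LEDS);
      fill_solid(leds, NUM_LEDS, CRGB::Black);
      fill_solid(leds, lit, CHSV(96, 255, 255));
      break;
    }
    case 2: {  // growth simulation: LEDs ignite outward one per frame
      static int grown = 0;
      if (grown < NUM_LEDS) grown++;
      fill_solid(leds, grown, CHSV(20, 255, 200));
      break;
    }
  }
  FastLED.show();
  delay(30);   // ~33 fps
}
```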
Construction Process
Credit