Feel What You See: A Novel Sensory Interface Linking Visual Heat Cues and Instant Thermal Feedback

Published in SIGGRAPH Asia 2025 Emerging Technologies, 2025

We present a novel sensory interface that allows users to truly feel what they see by transforming visually implied heat cues into physical temperature sensations. Using a vision-language model, the system semantically analyzes video content to detect heat-related elements—such as fire, ice, or sunlight—and determines the appropriate thermal response based on both type and screen prominence. The core of this experience is a custom-designed, non-contact thermal device combining two physical mechanisms: an ethanol-based aerosol module for cooling, and a visible-spectrum radiation module for heating. The system dynamically adjusts the type and intensity of stimulation to reflect the semantic properties of each visual scene. This prototype demonstrates a new form of semantic-driven multisensory interaction, extending the expressive power of visual media. It opens new possibilities in immersive storytelling, sensory accessibility, and emotion-enhanced interfaces—where the ambient temperature of a scene can be directly felt, not just seen.
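The mapping described above, from detected heat cues and their screen prominence to a device command, can be sketched roughly as follows. This is an illustrative sketch only: the cue vocabularies, the `HeatCue`/`thermal_command` names, and the prominence-weighted winner-takes-all rule are assumptions, not the authors' published implementation.

```python
from dataclasses import dataclass

# Hypothetical cue vocabularies a vision-language model might report per frame.
HEATING_CUES = {"fire", "sunlight", "steam"}
COOLING_CUES = {"ice", "snow", "rain"}

@dataclass
class HeatCue:
    label: str         # e.g. "fire" or "ice", from the VLM's scene analysis
    prominence: float  # fraction of the frame the cue occupies, in [0, 1]

def thermal_command(cues: list[HeatCue]) -> tuple[str, float]:
    """Map detected cues to a device command: (mode, intensity in [0, 1]).

    Heating (visible-spectrum radiation module) and cooling (ethanol aerosol
    module) contributions are weighted by screen prominence; the stronger
    side drives the actuator.
    """
    heat = sum(c.prominence for c in cues if c.label in HEATING_CUES)
    cool = sum(c.prominence for c in cues if c.label in COOLING_CUES)
    if heat == 0.0 and cool == 0.0:
        return ("off", 0.0)
    if heat >= cool:
        return ("heat", min(1.0, heat))
    return ("cool", min(1.0, cool))

# A frame dominated by fire with a small patch of ice yields a heating command.
print(thermal_command([HeatCue("fire", 0.4), HeatCue("ice", 0.1)]))  # ('heat', 0.4)
```

In practice the real system presumably smooths commands over time and handles many more cue categories; the sketch only illustrates the type-plus-prominence decision the abstract describes.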

Recommended citation: Xu J., Chen K., Kuroda Y., et al. "Feel What You See: A Novel Sensory Interface Linking Visual Heat Cues and Instant Thermal Feedback." In Proceedings of SIGGRAPH Asia 2025 Emerging Technologies, 2025, pp. 1-2.