OverwriteXR

Resources and content on everything XR

Beyond Sight and Sound

Most of what we call “media” (newspapers, movies, books, social feeds, even VR right now) is built around sight and sound. Audio and visual go hand in hand: the word “television” only names the picture, and since the set carries sound too it could just as accurately be called a “televisionphone.” Why these two senses? Because they’re scalable, controllable, and easy to design for, exactly what a free market craves. They’re also the foundation of how humans have historically communicated, through spoken language, writing, and body cues, and how we’re trained to learn from an early age.

Senses like smell, taste, and touch have often been treated as more primitive (picture a caveman sniffing or tasting something to learn about it), while sight and sound were framed as intellectual, and that shaped which senses media valued. Sight is focused: you point your eyes at what you want to see. Sound is unfocused: it fills the room and lets you multitask while engaging with it, but it’s temporary, which makes it manageable. And both scale, whether you’re blasting sound through a stadium or beaming an image to a billion screens.

Other senses are trickier. Smell is also unfocused, but it lingers. You can’t direct it, and it sticks around too long for most media formats. Touch and taste require direct contact, and simulating them means dealing with pressure, temperature, moisture, surface texture, and nerve response. Balance and proprioception (your sense of movement and body position) are even harder; they involve your inner ear and full-body tracking. That’s part of why Smell-O-Vision failed, why touchscreens feel flat, and why most digital content treats you like a floating pair of eyeballs.

Haptic gloves can now simulate the feel of grabbing rough or soft objects. VR treadmills and motion rigs engage your balance and movement. Some headsets trigger temperature shifts or simulate wind. Prototypes exist that emit scents or stimulate nerves with electricity to mimic touch. Some systems are even exploring ways to respond to internal states like heart rate or breath. These tools don’t just add cool features: they make digital worlds more immersive and increase accessibility for people with different sensory needs.
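To make that last idea concrete, here is a minimal TypeScript sketch of a biofeedback loop. The `BiometricSensor` and `MultisensoryOutput` interfaces, the 60–120 bpm range, and the heart-rate-to-intensity mapping are all hypothetical stand-ins, not any real headset SDK; the point is only that biometric input can drive haptic or thermal output the same way a game loop already drives pixels and audio.

```typescript
// Hypothetical interfaces; real devices would expose vendor-specific SDKs.
interface BiometricSensor {
  readHeartRate(): Promise<number>; // beats per minute
}

interface MultisensoryOutput {
  setHapticIntensity(level: number): void; // 0..1
  setThermalLevel(level: number): void;    // 0..1, e.g. campfire warmth
}

// Map a heart rate to an effect intensity: calmer user, gentler effects.
// The 60–120 bpm range is an illustrative assumption, not a standard.
function intensityFromHeartRate(bpm: number): number {
  const clamped = Math.min(Math.max(bpm, 60), 120);
  return (clamped - 60) / 60; // normalize to 0..1
}

// One tick of the feedback loop: sample the body, adjust the scene.
async function updateScene(sensor: BiometricSensor, out: MultisensoryOutput): Promise<void> {
  const bpm = await sensor.readHeartRate();
  const intensity = intensityFromHeartRate(bpm);
  out.setHapticIntensity(intensity);
  out.setThermalLevel(1 - intensity); // ease off the warmth as the user's pulse climbs
}

// Mock implementations so the sketch runs without hardware.
const mockSensor: BiometricSensor = {
  readHeartRate: async () => 70 + Math.random() * 30,
};

const mockOutput: MultisensoryOutput = {
  setHapticIntensity: (level) => console.log(`haptics: ${level.toFixed(2)}`),
  setThermalLevel: (level) => console.log(`warmth:  ${level.toFixed(2)}`),
};

// Poll a few times a second, the same cadence a game loop might use.
setInterval(() => void updateScene(mockSensor, mockOutput), 250);
```

In a real system the readings would come from the headset or a wearable and the mapping would be tuned per experience, but the control loop can stay roughly this simple.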

Picture a virtual campfire where you can smell and taste the marshmallows, feel the campground underfoot, see the forest, hear the birds and the crackle of the fire, and sense the heat of the flames. Don’t forget the omnidirectional treadmill for a sense of motion when you’re ready to head back to the tent. Or perhaps, further down the line, direct brain implants will provide all of these sensations. The metaverse can engage the whole body, and that changes what media can be.