We are auditory beings
Introduction: A World We See More Than We Hear
Modern life trains us to believe that sight is our primary way of knowing the world. We watch the news, scroll through information, visualize ideas, and measure understanding in terms of what is “clear” or “visible.” Hearing, by contrast, is often treated as secondary: useful, but supplementary.
Yet this sensory hierarchy is historically recent and biologically misleading.
For most of human existence, hearing was essential to survival, orientation, and meaning. Sound warned us of danger before we could see it, connected us socially across distance and darkness, and allowed us to navigate environments where vision failed. The question is not whether hearing is important, but how and why it came to be sidelined and what we lost when it was.

1. Hearing Before Vision: A Sense of Time, Space, and Presence
Hearing as an Early and Fundamental Sense
From a developmental standpoint, hearing precedes vision. The auditory system becomes functional before birth, while vision matures only after weeks or months of postnatal experience. This early primacy is not incidental: hearing is deeply tied to rhythm, bonding, language, and temporal awareness.
Unlike vision, which frames the world spatially, hearing structures it temporally. We do not see movement the way we hear it; sound unfolds over time and forces attention. You can close your eyes, but you cannot shut your ears. You see only within your field of vision, and only in light; you hear up, down, left, right, in front, behind, by day and by night.
Philosophers as early as Aristotle recognized this distinction. While he ranked sight as the noblest sense for acquiring knowledge, he acknowledged that hearing was essential for speech, learning, and shared understanding (De Anima).
2. Sound and Navigation: Knowing Without Seeing
Auditory Space and Orientation
Long before neuroscience, observers noticed that humans can orient themselves through sound alone. Blind individuals were reported to avoid obstacles, judge distances, and perceive spatial layouts without touch.
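One well-documented mechanism behind this auditory orientation is the brain's comparison of when a sound reaches each ear: the interaural time difference. The arithmetic can be sketched in a few lines. (This is a simplified illustration, not a claim from the text; the speed of sound and ear separation used below are rounded approximations.)

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 °C (approximate)
EAR_DISTANCE = 0.18      # m between the ears (rough average, an assumption)

def interaural_time_difference(azimuth_deg: float) -> float:
    """Simplified model of the extra travel time to the far ear for a
    distant source at the given azimuth (0 = straight ahead, 90 = to
    one side). Returns the time difference in seconds."""
    extra_path = EAR_DISTANCE * math.sin(math.radians(azimuth_deg))
    return extra_path / SPEED_OF_SOUND

# A source directly to one side arrives about half a millisecond
# earlier at the near ear; the brain resolves differences far smaller.
print(f"{interaural_time_difference(90) * 1e6:.0f} microseconds")
```

Even under this crude model, the cue is on the order of hundreds of microseconds, which gives a sense of how finely tuned auditory spatial processing must be.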
In 1749, Denis Diderot, in Letter on the Blind for the Use of Those Who See, described blind people sensing nearby objects without contact. While initially mysterious, later research revealed this ability to be auditory echolocation: the interpretation of reflected sound. By the mid-20th century, experiments at Cornell University demonstrated that this “obstacle sense” disappeared when auditory cues were masked, confirming that hearing, not touch or air pressure, was responsible.
Today, studies show that expert human echolocators recruit brain areas typically associated with vision, underscoring a critical insight:
The brain is not strictly visual or auditory; it is spatial and predictive.
Hearing, in other words, can stand in for vision when needed.
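The physics underlying echolocation is a simple time-of-flight computation: a sound travels out, reflects, and returns, so the distance to an obstacle is half the round-trip delay times the speed of sound. A minimal sketch of that arithmetic (the speed-of-sound value is a rounded assumption, not a figure from the text):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C (approximate)

def distance_from_echo(delay_seconds: float) -> float:
    """Distance to a reflecting surface, given the round-trip echo
    delay. The sound covers the distance twice (out and back),
    hence the division by two."""
    return SPEED_OF_SOUND * delay_seconds / 2

# An echo returning after 20 ms implies a surface about 3.4 m away.
print(f"{distance_from_echo(0.020):.2f} m")
```

Expert echolocators do not compute this consciously, of course; the point is that the acoustic signal carries enough information for the brain to extract distance, which is exactly what the Cornell experiments and later neuroimaging work suggest it does.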

3. When Did Vision Take Over?
Classical Roots of Visual Dominance
Western philosophy planted the seeds early. Plato’s metaphors of light, illumination, and vision dominate his theory of knowledge. Truth is something to be “seen.” Ignorance is darkness. This visual bias persisted through Euclidean geometry, Renaissance perspective, scientific illustration, and optical instruments (telescopes, microscopes). Vision became associated with objectivity, distance, and control, while sound remained tied to subjectivity, emotion, and proximity.
The Industrial Revolution: Acceleration, Not Origin
The Industrial Revolution did not invent visual dominance, but it amplified it. Key shifts included print culture and mass literacy, mechanical clocks standardizing visual time, factory layouts privileging sight over sound, and urban noise reframed as “pollution” rather than information. At the same time, sound became harder to interpret meaningfully: natural soundscapes were replaced by mechanical noise, weakening our ability to use hearing for orientation. Later technologies (photography, cinema, television, screens) completed the transformation. The modern subject became a spectator.

4. Science Reinforces the Bias
Psychology itself inherited this hierarchy. Experiments such as the Colavita visual dominance effect show that when visual and auditory stimuli compete, people often fail to notice the sound entirely. Neuroscience research has historically devoted far more resources to vision than audition, reinforcing the idea that sight is “primary.” Language mirrors this bias: “I see” means I understand. “Clear” means true. “Obvious” means visually apparent. By contrast, hearing metaphors (“I hear you”) imply empathy rather than knowledge.

5. What We Miss When We Ignore Hearing
Sound as Relational and Immersive
Vision separates subject and object; hearing connects them. Sound places us inside events rather than outside them. It conveys emotional nuance in speech, social presence, environmental change, and threat or opportunity beyond the field of view. Anthropologists note that many non-Western cultures place greater emphasis on listening than looking, valuing attunement over observation.
Cognitive and Emotional Consequences
Modern research links hearing loss not only to communication difficulties but also to increased cognitive load, social withdrawal, and a higher risk of dementia. These effects suggest that hearing is not an accessory sense; it is foundational to cognition and social life.
6. Toward a Rebalanced Sensory World
The story of hearing is not one of decline, but of neglect.
Rebalancing our sensory priorities does not mean diminishing vision. It means recognizing that we navigate the world multimodally, that sound provides information vision cannot, and that listening is an active, skilled form of perception. In an age saturated with images, reclaiming hearing may be one of the most radical perceptual acts available to us.
How could we reclaim hearing? Must it come through a complete abandonment of technology, or through designing technology for the ear, allowing audition to reclaim its place alongside vision?
Whichever path we take, what is certain is that reclaiming hearing is necessary. We are moving into an era in which sound matters more, and we ought to be ready to receive and process auditory information better.
References
- Aristotle, De Anima – early sensory theory; the distinction between sight and hearing
- Plato, Republic – visual metaphors shaping Western epistemology
- Diderot, D. (1749), Letter on the Blind for the Use of Those Who See – early observation of auditory spatial perception
- Supa, M., Cotzin, M., & Dallenbach, K. M. (1944), “Facial Vision: The Perception of Obstacles by the Blind” – the Cornell obstacle-sense experiments
- Gibson, J. J. (1966), The Senses Considered as Perceptual Systems – an important corrective to visual dominance
- Colavita, F. B. (1974) – the visual dominance effect
- Thaler, L., Arnott, S. R., et al. – research on human echolocation
- Cross-modal plasticity in blind individuals – visual cortex recruited for processing sound
- Bregman, A. S., Auditory Scene Analysis – how hearing parses overlapping sound sources