Psychoacoustic principles

Psychoacoustic Principles in the field of spatial audio perception involve understanding how the human auditory system processes and interprets sounds in different environments. This knowledge is essential for creating immersive and realistic audio experiences in various applications, such as virtual reality, gaming, and music production. Here are some key terms and vocabulary related to psychoacoustic principles:

1. Frequency and Pitch

Frequency refers to the number of cycles or oscillations of a sound wave per second, measured in Hertz (Hz). The pitch of a sound is the perceived highness or lowness of a tone, which is related to its frequency. Higher frequency sounds have a higher pitch, while lower frequency sounds have a lower pitch.
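
The frequency-pitch relationship is logarithmic: each doubling of frequency is heard as one octave (12 semitones). As a minimal sketch (the helper names are my own), here is a mapping from frequency to the nearest note, using the MIDI convention of A4 = 440 Hz = note 69:

```python
import math

def frequency_to_midi(freq_hz: float) -> int:
    """Map a frequency to the nearest MIDI note number (A4 = 440 Hz = note 69).
    Pitch is roughly logarithmic in frequency: each doubling is one octave."""
    return round(69 + 12 * math.log2(freq_hz / 440.0))

def midi_to_name(note: int) -> str:
    """Convert a MIDI note number to a name like 'A4' or 'C4'."""
    names = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
    return f"{names[note % 12]}{note // 12 - 1}"

print(midi_to_name(frequency_to_midi(440.0)))   # A4 (concert pitch)
print(midi_to_name(frequency_to_midi(261.63)))  # C4 (middle C)
```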

Example: The A4 note on a piano has a frequency of 440 Hz; this is the standard tuning reference known as "concert A" (the A above middle C).

Practical Application: In music production, producers can use EQ (equalization) to adjust the frequency content of a sound, boosting or cutting certain frequencies to make the sound fit better in the mix.

Challenge: Try to identify the pitches of different sounds you hear in your environment, and think about how their frequencies might be adjusted to change their perceived pitch.

2. Amplitude and Loudness

Amplitude refers to the magnitude or strength of a sound wave, which largely determines its perceived loudness or volume: louder sounds have higher amplitudes, while softer sounds have lower amplitudes. (Perceived loudness also depends on frequency, which is why two tones of equal amplitude can sound unequally loud.)
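
The amplitude-loudness relationship is also roughly logarithmic, which is why audio levels are expressed in decibels. A quick sketch of the standard conversion (the function name is my own):

```python
import math

def amplitude_ratio_to_db(ratio: float) -> float:
    """Convert an amplitude ratio to decibels: dB = 20 * log10(ratio).
    Perceived loudness grows roughly logarithmically with amplitude."""
    return 20 * math.log10(ratio)

print(amplitude_ratio_to_db(2.0))    # ~6.02 dB: doubling the amplitude
print(amplitude_ratio_to_db(10.0))   # 20 dB: ten times the amplitude
print(amplitude_ratio_to_db(0.5))    # ~-6.02 dB: halving the amplitude
```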

Example: A loud car horn has a higher amplitude than a soft whisper.

Practical Application: In audio mixing, engineers can use compression to adjust the amplitude of sounds, reducing the dynamic range and making quieter sounds more audible.

Challenge: Try to adjust the volume of different sounds in your environment, and think about how their amplitudes might be changed to make them sound louder or softer.

3. Timbre and Harmonics

Timbre refers to the unique tonal quality or character of a sound, which allows us to distinguish between different instruments or voices playing the same pitch. Timbre is determined by the harmonics or overtones of a sound, which are additional frequency components that are integer multiples of the fundamental frequency.
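
As a sketch of how harmonic content shapes timbre, here is a minimal additive synthesizer that sums sine waves at integer multiples of a fundamental. The harmonic amplitude lists are purely illustrative, not measured instrument spectra:

```python
import math

def harmonic_tone(fundamental_hz, harmonic_amps, sample_rate=44100, duration=0.01):
    """Synthesize a tone by summing sines at integer multiples of the
    fundamental. The relative harmonic amplitudes shape the timbre."""
    n = int(sample_rate * duration)
    samples = []
    for i in range(n):
        t = i / sample_rate
        s = sum(amp * math.sin(2 * math.pi * fundamental_hz * (k + 1) * t)
                for k, amp in enumerate(harmonic_amps))
        samples.append(s)
    return samples

# Two tones with the same 220 Hz pitch but different timbres:
flute_like = harmonic_tone(220, [1.0, 0.1, 0.05])      # dominant fundamental
reed_like  = harmonic_tone(220, [0.5, 0.9, 0.7, 0.6])  # strong upper harmonics
```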

Example: A flute and a saxophone playing the same note will have different timbres due to their different harmonic structures.

Practical Application: In sound design, artists can manipulate the harmonics of a sound to create unique timbres and textures.

Challenge: Try to identify the timbres of different sounds you hear in your environment, and think about how their harmonics might be changed to create new and interesting sounds.

4. Localization and Binaural Hearing

Localization refers to the ability of the auditory system to determine the direction and distance of a sound source. This is achieved through binaural hearing, which involves comparing the differences in time and level between the sounds arriving at each ear.
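
The interaural time difference (ITD) cue can be approximated with the classic Woodworth spherical-head model; the default head radius and the function name below are illustrative assumptions:

```python
import math

def interaural_time_difference(azimuth_deg, head_radius_m=0.0875,
                               speed_of_sound=343.0):
    """Approximate ITD with the Woodworth spherical-head model:
    ITD = (r / c) * (theta + sin(theta)), theta = azimuth in radians.
    0 deg = straight ahead, 90 deg = directly to one side."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

for az in (0, 30, 60, 90):
    print(f"{az:3d} deg -> ITD {interaural_time_difference(az) * 1e6:.0f} us")
```

At 90 degrees this gives roughly 650 microseconds, close to the commonly cited maximum ITD for an adult head.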

Example: When you hear a sound coming from your left, your brain is able to locate the source by analyzing the time and level differences between the sound arriving at your left and right ears.

Practical Application: In virtual reality applications, binaural audio can be used to create a more immersive and realistic audio experience, allowing users to hear sounds in 3D space.

Challenge: Try to locate the sources of different sounds you hear in your environment, and think about how your brain is using binaural hearing to determine their direction and distance.

5. Reverberation and Decay Time

Reverberation refers to the persistence of sound in a space after the original source has stopped. It is caused by multiple reflections of the sound wave off surfaces in the environment, which create a complex pattern of echoes that gradually decay over time. Decay time, commonly quantified as RT60, is the time it takes for the reverberant sound level to decrease by 60 dB after the source has stopped.
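
Decay time can be estimated from room geometry with Sabine's formula, RT60 = 0.161 * V / A. The room dimensions and absorption coefficients below are hypothetical values chosen for illustration:

```python
def sabine_rt60(volume_m3, surface_absorptions):
    """Estimate reverberation time with Sabine's formula:
    RT60 = 0.161 * V / A, where A = sum(area * absorption coefficient).
    A rough model, valid for fairly diffuse, not-too-absorbent rooms."""
    total_absorption = sum(area * alpha for area, alpha in surface_absorptions)
    return 0.161 * volume_m3 / total_absorption

# Hypothetical 5 x 4 x 2.5 m living room: plastered walls and ceiling, carpet.
surfaces = [
    (2 * (5 * 2.5) + 2 * (4 * 2.5), 0.03),  # walls, plaster
    (5 * 4, 0.04),                          # ceiling, plaster
    (5 * 4, 0.30),                          # floor, carpet
]
print(f"RT60 ~ {sabine_rt60(5 * 4 * 2.5, surfaces):.2f} s")
```

Swapping in the larger volume and harder surfaces of a concert hall pushes the estimate toward several seconds, matching the example above.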

Example: A large concert hall will have a longer decay time than a small living room, due to the larger surface area and greater number of reflections.

Practical Application: In recording and mixing, engineers can add reverb to simulate different acoustic environments, creating a sense of space and depth.

Challenge: Try to listen to the reverberation in different spaces you enter, and think about how the decay time might be affected by the size and surface materials of the space.

6. Masking and Critical Bands

Masking refers to the phenomenon where a loud sound can make a softer sound less audible or even inaudible. This occurs because the louder sound can partially or completely cover up the softer sound, especially if they are within the same critical band of frequencies.
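
One common estimate of critical bandwidth is the Equivalent Rectangular Bandwidth (ERB) of Glasberg and Moore. As a sketch (the helper names are my own), here is a rough check of whether two tones fall within one critical band and are therefore likely to mask each other:

```python
def erb_bandwidth_hz(center_freq_hz):
    """Equivalent Rectangular Bandwidth (Glasberg & Moore, 1990), a common
    estimate of the critical band around a centre frequency:
    ERB = 24.7 * (4.37 * f/1000 + 1), with f in Hz."""
    return 24.7 * (4.37 * center_freq_hz / 1000.0 + 1.0)

def likely_to_mask(freq_a_hz, freq_b_hz):
    """Rough check: two tones share a critical band if their separation
    is less than the ERB at the lower of the two frequencies."""
    lower = min(freq_a_hz, freq_b_hz)
    return abs(freq_a_hz - freq_b_hz) < erb_bandwidth_hz(lower)

print(likely_to_mask(1000, 1050))  # True: within one ERB of 1 kHz (~133 Hz)
print(likely_to_mask(1000, 1500))  # False: well outside the critical band
```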

Example: If you are listening to music with headphones and someone speaks to you, you may not be able to hear the music as clearly until the person stops speaking.

Practical Application: Masking is central to perceptual audio coding: formats such as MP3 and AAC save space by discarding frequency components that louder nearby sounds would render inaudible anyway. In mixing, engineers work against masking, cutting overlapping frequencies so that competing instruments remain audible.

Challenge: Try to listen to different sounds in your environment and think about how they might be masking each other, especially if they are within the same critical band of frequencies.

7. Binaural Beats and Entrainment

Binaural beats are an auditory illusion created when two slightly different frequencies are presented separately to each ear. The brain perceives the difference between the two frequencies as a beating or pulsing sound, which can have a variety of effects on brain waves and cognitive states. Entrainment is the phenomenon where external stimuli, such as sound, can synchronize brain waves to a specific frequency.
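
A binaural-beat stimulus is simple to generate: play one frequency in the left channel and a slightly offset frequency in the right. A sketch using only the Python standard library (the file name, carrier frequency, and duration are arbitrary choices):

```python
import math
import struct
import wave

def binaural_beat_wav(path, carrier_hz=200.0, beat_hz=4.0,
                      duration_s=5.0, sample_rate=44100):
    """Write a stereo WAV: carrier_hz in the left ear, carrier_hz + beat_hz
    in the right. Over headphones, listeners perceive a beat_hz pulsation."""
    with wave.open(path, "wb") as w:
        w.setnchannels(2)
        w.setsampwidth(2)  # 16-bit samples
        w.setframerate(sample_rate)
        frames = bytearray()
        for i in range(int(duration_s * sample_rate)):
            t = i / sample_rate
            left = int(32000 * math.sin(2 * math.pi * carrier_hz * t))
            right = int(32000 * math.sin(2 * math.pi * (carrier_hz + beat_hz) * t))
            frames += struct.pack("<hh", left, right)
        w.writeframes(frames)

binaural_beat_wav("beat_4hz.wav")  # must be heard on headphones, not speakers
```

Headphones are essential: played over speakers, the two tones mix acoustically into an ordinary (monaural) beat before reaching the ears.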

Example: Listening to a binaural beat with a 4 Hz difference frequency is often reported to promote relaxation, on the theory that brain waves entrain toward the theta range; the scientific evidence for these effects is mixed.

Practical Application: Binaural beats are used in meditation and focus apps for purposes such as stress relief, concentration, and relaxation, though controlled studies show inconsistent results.

Challenge: Try listening to a binaural beat with a specific frequency, and pay attention to any changes in your cognitive state or perception.

8. Spatial Release from Masking

Spatial release from masking refers to the ability of the auditory system to separate sounds in space, even if they are within the same critical band of frequencies. This is achieved through a combination of binaural hearing and other spatial cues, such as timing and level differences between the ears.
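
One practical way a mix separates competing sources in space is panning. A sketch of a constant-power pan law (the function name is my own):

```python
import math

def constant_power_pan(sample, pan_pos):
    """Place a mono sample in the stereo field with a constant-power pan law.
    pan_pos: -1.0 = hard left, 0.0 = centre, +1.0 = hard right.
    Panning competing sources apart lets a mix exploit spatial release
    from masking: spatially separated sounds mask each other less."""
    angle = (pan_pos + 1.0) * math.pi / 4.0  # maps to 0 .. pi/2
    return sample * math.cos(angle), sample * math.sin(angle)

print(constant_power_pan(1.0, -1.0))  # (1.0, 0.0): hard left
print(constant_power_pan(1.0, 0.0))   # (~0.707, ~0.707): centre
```

The constant-power law keeps total energy steady as a source moves across the stereo field, so panning changes position without changing loudness.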

Example: If two people are speaking to you at the same time, you may be able to focus on one person's voice by orienting your head towards them, which enhances the spatial cues and reduces masking.

Practical Application: In audio production, engineers can use spatial release from masking to create a more natural and realistic sounding mix, by separating sounds in space and reducing masking.

Challenge: Try listening to two sounds in your environment that are within the same critical band of frequencies, and see if you can separate them in space by orienting your head or using other spatial cues.

9. Precedence Effect and Haas Effect

The precedence effect (also known as the law of the first wavefront) refers to the phenomenon where the earliest sound arrival determines the perceived location of a source, even if later arrivals have a higher amplitude or level. The closely related Haas effect refers to the ability of the auditory system to fuse multiple sound arrivals into a single percept, as long as they fall within a certain time window (typically around 30-50 ms).
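
A sketch of the Haas effect in code: mixing a signal with a delayed, attenuated copy of itself, keeping the delay inside the fusion window (the parameter values below are illustrative):

```python
def delay_and_mix(dry, delay_ms, level_db, sample_rate=44100):
    """Mix a signal with a delayed, attenuated copy of itself. With the delay
    inside the Haas window (roughly 1-30 ms), listeners hear one fused sound
    localized toward the earlier arrival, rather than a separate echo."""
    delay_samples = int(sample_rate * delay_ms / 1000.0)
    gain = 10 ** (level_db / 20.0)  # dB level of the delayed copy
    out = list(dry) + [0.0] * delay_samples
    for i, s in enumerate(dry):
        out[i + delay_samples] += gain * s
    return out

click = [1.0] + [0.0] * 999           # a single impulse
fused = delay_and_mix(click, delay_ms=15, level_db=-3)  # heard as one event
```

Pushing the delay past the fusion window (say, 80 ms) would instead be heard as a distinct echo, which is why engineers keep early reflections short.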

Example: If you are listening to a live concert, the sound from the speakers may arrive at your ears slightly later than the sound from the stage, but your brain will still localize the sound source to the stage due to the precedence effect.

Practical Application: In audio production, engineers can use the precedence and Haas effects to create a more natural and realistic sounding mix, by controlling the timing and level differences between different sound sources.

Challenge: Try listening to different sounds in your environment, pay attention to the timing and level differences between them, and see if you can identify the precedence and Haas effects at work.

Key takeaways

  • Psychoacoustic Principles in the field of spatial audio perception involve understanding how the human auditory system processes and interprets sounds in different environments.
  • Frequency refers to the number of cycles or oscillations of a sound wave per second, measured in Hertz (Hz).
  • Example: The A4 note on a piano has a frequency of 440 Hz; this is the standard tuning reference known as "concert A" (the A above middle C).
  • Practical Application: In music production, producers can use EQ (equalization) to adjust the frequency content of a sound, boosting or cutting certain frequencies to make the sound fit better in the mix.
  • Challenge: Try to identify the pitches of different sounds you hear in your environment, and think about how their frequencies might be adjusted to change their perceived pitch.
  • Amplitude refers to the magnitude or strength of a sound wave, which determines its loudness or volume.
  • Example: A loud car horn has a higher amplitude than a soft whisper.