Originally published by Pro Audio Files http://theproaudiofiles.com/music-and-the-brain/
What’s the Brain Got to Do With It?
The Perception of Sound
How do we perceive sound? Well, it seems simple enough (ok, it’s not that simple): sound waves travel through the air as compressions and rarefactions. These pass through our outer ears and cause our eardrums to vibrate; the vibration is transmitted via tiny bones in the middle ear to the cochlea, where thousands of little hair cells translate it into electrical impulses.
Once the mechanical sound has been transformed into electrical signals and picked up by the auditory nerve, it gets a lot less straightforward. Our brain now has to make sense of the vibrations that hit our eardrum. This is where psychoacoustics comes into play.
Filling in the Gaps
We don’t always receive perfect and coherent information through those vibrations hitting our ears. Our brain then needs to fill in the gaps. More often than not, our brain guesses right and fills in the missing pieces of our reality.
An example of this is a telephone conversation. The frequency range of a telephone’s microphone and speaker is so limited (traditionally about 300 Hz to 3.4 kHz) that the actual pitch of our voice (the fundamental frequency, which often sits below 300 Hz) is not transmitted at all.
The Missing Fundamental
The harmonics in our voice form a series whose spacing is recognizable to our brain, so the brain simply creates the fundamental frequency for us. It’s a pretty convincing illusion. Even though our intellect can understand that the fundamental is missing, there is really no way for us not to hear it.
This effect, known as the missing fundamental, can be useful to us in a mixing situation.
The missing fundamental is the illusion at play when we choose to distort a low bassline so that it can be heard on small speakers. Again, it’s the harmonics that fool our brain into “hearing” the fundamental. Unfortunately, there doesn’t seem to be a way to fool ourselves into feeling the bass resonate in our gut.
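The missing fundamental is easy to demonstrate for yourself. Here’s a minimal Python sketch (using NumPy; the 110 Hz note and the particular harmonics are just illustrative choices): it builds a tone from harmonics 2 through 5 of a 110 Hz fundamental while leaving the fundamental itself out. Write the result to a file and play it back, and you will still hear a pitch at 110 Hz:

```python
import numpy as np

SR = 44100        # sample rate in Hz
F0 = 110.0        # the fundamental we deliberately omit (roughly an A2 bass note)
DUR = 1.0         # seconds

t = np.arange(int(SR * DUR)) / SR

# Sum only the upper harmonics (2*F0 .. 5*F0) -- no energy at F0 itself.
harmonics = [2, 3, 4, 5]
signal = sum(np.sin(2 * np.pi * F0 * h * t) for h in harmonics)
signal /= len(harmonics)  # normalize to avoid clipping

# Verify with an FFT that the fundamental bin really is (numerically) empty.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / SR)
bin_f0 = np.argmin(np.abs(freqs - F0))        # bin at 110 Hz
bin_2f0 = np.argmin(np.abs(freqs - 2 * F0))   # bin at 220 Hz

print(spectrum[bin_f0] < 1e-3 * spectrum[bin_2f0])  # True: no energy at 110 Hz
```

Even though the spectrum confirms there is nothing at 110 Hz, your ear locks onto the common spacing of the harmonics and supplies the pitch anyway.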
Is What We See What We Hear?
Our eyes can certainly do some strange things to influence how we perceive a sound. One of these, known as the McGurk effect, shows us once again that even though we know something to be true on an intellectual level, it doesn’t seem to change the way our perception works.
In this video you see a man saying “ga-ga-ga-ga-ga-ga,” but the sound you’re hearing is actually “ba-ba-ba-da-ba-ba.” When the two are combined, your brain will tell you that he’s saying “da-da-da-da-da-da.” Again, knowing this to be the case doesn’t make you hear “ba-ba” unless you close your eyes or look away.
Our eyes certainly play a big part in evaluating sound, so keep this in mind next time you lay your eyes on the fancy GUI of your new plugin. And make a habit of turning off your screen every once in a while when you’re listening to your mix. If you’re not yet convinced that your eyes can distort your interpretation of sound, keep reading.
To further illustrate how ingenious our brain is and how much it helps us interpret sound, let’s talk briefly about sound localization.
We as engineers apply the science of sound localization whenever we use a stereo miking technique or pan a track on our console or in our DAW. The brain essentially relies on two cues when determining where a sound source is located: the differences in phase (timing) and in amplitude (level) between our two ears.
When you turn a pan knob, you change the balance in level between the left and right channels, making the brain assume the sound source is located where it sounds loudest. Timing (phase) becomes the big factor with a spaced stereo miking technique such as A/B, where the relative arrival time of the sound wave at each microphone is vital information for our brain; a coincident pair like X/Y, by contrast, works mainly with level differences.
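The level side of this can be sketched in a few lines of code. Below is a constant-power pan law, one common choice among several (the cos/sin mapping here is an assumption for illustration, not a universal standard); it keeps the combined power of the two channels even across the whole pan range, so nothing seems to get quieter as it moves off center:

```python
import math

def pan_gains(pan: float) -> tuple[float, float]:
    """Constant-power pan law.

    pan is in [-1.0, 1.0]: -1 = hard left, 0 = center, +1 = hard right.
    We map pan onto an angle between 0 and pi/2 and use cos/sin for the
    left/right gains, so left**2 + right**2 == 1 at every position.
    """
    angle = (pan + 1.0) * math.pi / 4.0   # -1 -> 0, +1 -> pi/2
    return math.cos(angle), math.sin(angle)

# At center, both channels sit at ~0.707 (about -3 dB), not 0.5,
# so the total power stays constant instead of dipping in the middle.
left, right = pan_gains(0.0)
print(round(left, 3), round(right, 3))  # 0.707 0.707
```

Many consoles and DAWs offer a choice of pan laws (-3 dB, -4.5 dB, -6 dB at center); the constant-power version above corresponds to the -3 dB option.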
A Different Train of Thought
Sound localization may seem quite straightforward. But even though our brain is relying on principles that are very much a part of the physical world, our eyes can be a powerful deceiver.
An experiment was done in which the participants watched a stationary train in front of them and heard a sound that seemingly originated from the train.

After a while, the train began to move and traveled in a circle around the participant, while the sound source stayed in front of them and did not move along with the train. Now a spectacular thing happened: since the sound was obviously not moving and the assumed sound source obviously was, the brain had to find a new plausible (stationary) origin for the sound, one that did not conflict with what the eyes were seeing. The solution was to place the source directly above the listener, and that’s exactly where the participants heard the sound once they saw the train moving.
Another part of psychoacoustics that plays a major role in our job as mixers is auditory masking.
The general principles of masking go like this: a louder sound masks a quieter sound, and low frequencies mask high frequencies more than the other way around. The way we counter these effects in mixing is with our level faders, EQs and pan pots, getting one sound out of the way of another so our brain can distinguish between them.
Acknowledging that our brain is a big part of what we hear also tells us that our visual perception, general thoughts and expectations play a monumental role in our assessment of what we hear. Regularly doing blind tests (simple ones, like clicking the bypass button on a plugin several times while keeping your eyes closed) is a really good way to make sure you’re actually making positive changes in your production or mix.