What does EDM music teach you?
Estimated reading time: 12 minutes
Video killed the Radio Star
MTV was just the beginning. Video seems to have gained the upper hand over good ol' audio for good. We know from communication science that images are the most direct form of communication. But does that make visual communication generally superior? How does it affect the way we consume and produce music? Is seeing more important than hearing?
The coronavirus pandemic has changed our world for good. Since it began, video traffic in Germany has quadrupled, and instead of picking up the phone, people prefer to hop on a Zoom call. This has had a significant impact on our communication habits. But as with every mass phenomenon, there is a counterweight, a countermovement. In this case, it manifests itself in the form of the good old record player.
In 2021, for the first time, more records than CDs were sold in Germany. This decelerated way of consuming music completely contradicts the prevailing zeitgeist. Yet the desire to hold your favorite music in your hands, pressed on vinyl, is stronger than ever. Processing music from a turntable with nothing but our ears is so archaic that it seems out of time.
Enjoying music from a record player corresponds phylogenetically, that is, in terms of our evolutionary history, entirely to human nature. We will clarify why that is in what follows. We learn from the past for the future, and that also applies to producing music. The aim of successful music production should be to move an audience; music is not an end in itself. For that, we only need to look at the making of music.
The making of music
Germany is an old cultural nation. Very old, to be precise. This is shown by archaeological finds from excavations in a cave on the Swabian Alb. Researchers discovered flutes made of bone and ivory that are thought to be up to 50,000 years old. The flute makers even carved finger holes that allow the pitch to be changed. Experts conclude that humanity has been expressing itself rhythmically and melodically for a very long time. These non-linguistic acoustic events are assumed to have served primarily social purposes. Music made it possible to give voice to emotions and established itself as a second communication system parallel to language.
Much of the emotional dimension of making music has been preserved to this day, such as the so-called "chill effect". It occurs when music gives you goosebumps: the goosebumps are the physical reaction to a chill moment. The chill effect also stimulates the brain's reward system and triggers the release of happiness hormones. This happens when the music holds special moments in store for the listener, and those moments are often highly subjective. Yet this is exactly what listeners get out of music. Emotion is the currency of music. For this reason alone, children should be given the opportunity to learn a musical instrument.
Along with language, music is a profoundly human means of expression. It teaches children to experience feelings and to express their own: an alternative channel of expression for when language fails. It is this desire for emotionality that makes us reach for vinyl as our preferred medium in special moments.
Then and now
The record is preserved music. The flutists of the Swabian Alb could only practice their music in the here and now: no recording, no playback, handmade music for the moment. For the longest period in human history, that is what making music meant. With the digital revolution, making music changed radically. In the 80s, keyboards, drum machines, samplers and sequencers joined the traditional instruments. The linearity of music making was broken: the parts no longer had to be played at the same time. A single musician could record one instrument after another and was no longer dependent on fellow musicians. As a result, several new musical styles emerged side by side within a short time, a trademark of the 80s.
The triumph of digital recording and sampling technology continued in the 1990s. Real sounds were replaced by samplers and romplers, which in turn faced competition from MIDI programming. With MIDI sequencers, screens and monitors increasingly found their way into recording studios, and music became visible for the first time: the arrangement could be heard and seen at the same time. The 2000s brought the comprehensive visualization of music production. Drums, guitars, basses and synths are all available as VST instruments and have been virtually at home on our monitors ever since.
At the same time, the DAW replaced the hard disk recorders used until then. The waveform display in a DAW is the most comprehensive visual representation of music to date and allows precise intervention in the audio material. For many users, the DAW has become a universal means of production offering theoretically infinite resources in terms of mix channels, effects, EQs and dynamics tools. In recent years, the familiar personnel structure has changed as well: it is no longer the band but the producer who creates the music. Almost everything takes place inside the computer.
This paradigm shift has created new styles of music that are particularly at home in the electronic field (trap, dubstep, EDM). It is not uncommon for these productions to use no audio hardware or real instruments at all.
Burnout from a wellness vacation
A computer with several monitors is the most important means of production for many creative people. The advantages are obvious: inexpensive, an unlimited number of tracks, lossless recordings, complex arrangements made manageable, an unlimited number of VST instruments and plugins. Everything can be automated and saved; total recall comes as standard. If you get stuck at any point in a production, YouTube offers a suitable tutorial on almost every audio topic. Painting by numbers. Music from the Thermomix: predefined ingredients guarantee a predictable result without much headache.
Our Swabian flautists would be astonished. Music made only by eye? No hardware required? No hands-on work? The Neanderthal hidden in our brain stem resists, subconsciously. Does the eye replace the ear? Something is going wrong here. In fact, this type of production contradicts the natural prioritization of the human senses. The Stone Age flute player could usually hear danger before he could see it. Thanks to our ears, we can even locate the direction from which a saber-toothed tiger is approaching with astonishing accuracy.
Evolution had its reasons for making hearing the only sense that cannot be completely shut off. You can hold your nose or close your eyes, but even with fingers in both ears, a person can still perceive the approaching mammoth herd: the dull vibrations trigger a feeling of fear. That was, and is, vital. Sounds are always linked to emotions. According to Carl Gustav Jung (1875-1961), the human psyche carries collective memories in the subconscious. He called these primal images archetypes.
Sounds like thunder, wind or water evoke immediate emotions in us. Conversely, emotions such as joy or sadness are best expressed with music, and hearing is paramount here. Hands and ears are the classical musician's most important tools, which is why there are relatively many blind musicians who play at the highest level. Anyone who relies solely on the computer for music production deprives themselves of one of their best tools. Music production with keyboard and mouse is seldom more than sober data processing with artificial icing. Operating a DAW with the mouse requires constant monitoring by our eyes, and there is no tactile feedback. In the long run this is tiring, and it does not come without collateral damage; intuition is usually the first casualty.
Seeing instead of hearing?
The visualization of music is not problematic per se. On the contrary, sometimes it is extremely helpful: capturing complex song structures or precisely editing audio files is a blessing with adequate visualization. Where the core competence of music production is concerned, however, the balance is far more ambivalent. Setting an EQ, a compressor or an effect, or even adjusting volume relationships, exclusively with monitor and mouse is ergonomically questionable. It is like trying to saw through a board with a plane: simply an unfortunate choice of tool.
Another aspect also has a direct influence on our mix.
The visual representation of the EQ curve in a DAW or a digital mixer has a lasting effect on how we process signals with the EQ. Depending on the resolution of the display, we apply the filters sometimes more, sometimes less drastically. When the display conjures up a massive EQ hump on the screen, our brain inevitably questions that EQ decision. Experience shows that with an analog EQ, without graphic representation, these doubts are far less pronounced.
The reason: the reference of an analog EQ is the ear, not the eye. If a guitar needs a wide boost at 1.2 kHz to assert itself in the mix, we are more willing to make that drastic correction with an analog EQ than with a DAW EQ whose visualization piles up a massive hump on the monitor. Successful producers and mixers sometimes work with drastic EQ settings without giving them a second thought. Inexperienced users of an equalizer with a graphic curve display, by contrast, too often search for suitable settings with their eyes instead of their ears, and that frequently leads to wrong decisions.
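To make the "wide boost at 1.2 kHz" concrete: a parametric bell (peaking) EQ band is fully described by three numbers, center frequency, gain and Q, no matter how dramatic its curve looks on screen. The sketch below computes such a band with the well-known RBJ Audio EQ Cookbook formulas; the sample rate, gain and Q values are illustrative assumptions, not settings taken from this article.

```python
import cmath
import math

def peaking_eq(fs, f0, gain_db, q):
    """Biquad coefficients (b, a) for a peaking/bell EQ band,
    per the RBJ Audio EQ Cookbook."""
    A = 10 ** (gain_db / 40)            # amplitude factor
    w0 = 2 * math.pi * f0 / fs          # center frequency in rad/sample
    alpha = math.sin(w0) / (2 * q)      # bandwidth term
    b = [1 + alpha * A, -2 * math.cos(w0), 1 - alpha * A]
    a = [1 + alpha / A, -2 * math.cos(w0), 1 - alpha / A]
    return b, a

def gain_db_at(b, a, fs, f):
    """Magnitude response of the biquad at frequency f, in dB."""
    z = cmath.exp(-2j * math.pi * f / fs)
    h = (b[0] + b[1] * z + b[2] * z * z) / (a[0] + a[1] * z + a[2] * z * z)
    return 20 * math.log10(abs(h))

# A wide (low-Q) +6 dB bell at 1.2 kHz, 48 kHz sample rate.
b, a = peaking_eq(48000, 1200, 6.0, 0.7)
print(round(gain_db_at(b, a, 48000, 1200), 2))  # exactly +6.0 dB at the center
print(round(gain_db_at(b, a, 48000, 100), 2))   # far below: near 0 dB
```

However tall the hump may look in a plugin window, the boost measures exactly +6 dB at the center frequency while barely touching 100 Hz; the ear, not the drawing, is the only meaningful judge of whether that is too much.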
Embrace the chaos
Asked what is missing most in current music productions, the answer is: intuition, interaction and improvisation. When interacting with other musicians, we are forced to make spontaneous decisions and to modify chords, arrangements, tempos and melodies on the fly. Improvisation creates new ideas, or even the framework of a song, whose DNA can be traced back to hearing and the sense of touch.
Touch and feel
The sense of touch, in combination with a real instrument, offers unfiltered access to the subconscious, or, following Jung, to the archetypes, those primal images. Keyboard and mouse lack this direct connection. To interact musically with VST instruments and plugins, we therefore need new user interfaces that satisfy our desire for haptics and tactility. A lot has happened on this front in recent years: the number of DAW and plugin controllers is growing steadily, creating a countermovement to keyboard and mouse.
Faders, potentiometers and knobs are fun!
Feeling the position of a potentiometer allows operation without consciously looking, as with a car radio. For the same reason, the German Federal Motor Transport Authority considers it problematic that some modern electric cars are operated predominantly via touchscreen: with this operating concept, the driver's gaze wanders from the road to the touchscreen more often than in conventional cars with hardware buttons and switches. The wrong tool for the job? The parallels are striking. A good drummer can record a song in just a few takes, yet some producers prefer to program the drums, even if it takes significantly longer, especially when they then try to inject something like feel and groove into the binary drum takes.
The same applies to programming automation curves for synth sounds, for example the cutoff of a TB-303. It is recorded faster than it is programmed, and the result is always more organic. It is no coincidence that seasoned sound engineers regard their old SSL or Neve desk as an instrument, in the literal sense. Intuitive interventions in the mix with potentiometers and faders put the hearing center stage and deliver original results in real time.
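As a side note on recorded automation: a knob sweep captured from a controller arrives as a dense stream of (time, value) points, far denser than a hand-drawn curve, and DAWs typically thin it afterwards. The function below is a minimal, hypothetical sketch of such thinning, not any specific DAW's algorithm: it drops every point that lies close to the straight line between its neighbours, keeping the recorded gesture while shedding redundant data.

```python
def thin_automation(points, tol=0.01):
    """Thin a recorded automation curve given as (time, value) pairs
    with strictly increasing times. A point is dropped when it sits
    within `tol` of the straight line from the last kept point to the
    following point, so the shape of the gesture is preserved."""
    if len(points) <= 2:
        return list(points)
    kept = [points[0]]
    for i in range(1, len(points) - 1):
        (t0, v0) = kept[-1]
        (t1, v1) = points[i]
        (t2, v2) = points[i + 1]
        # Value the straight line from the last kept point to the
        # next point would take at time t1.
        on_line = v0 + (v2 - v0) * (t1 - t0) / (t2 - t0)
        if abs(v1 - on_line) > tol:
            kept.append(points[i])
    kept.append(points[-1])
    return kept

# A perfectly linear filter sweep collapses to its two endpoints ...
ramp = [(t, t / 4) for t in range(5)]
print(thin_automation(ramp))         # [(0, 0.0), (4, 1.0)]

# ... while a sudden wiggle in the middle survives the thinning.
wiggle = [(0, 0.0), (1, 0.9), (2, 0.0)]
print(len(thin_automation(wiggle)))  # 3
```

Real DAWs use more sophisticated reduction, but the principle is the same: the recorded gesture, not a mouse-drawn curve, is the source of truth.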
Maximum reduction as a recipe for success
In the analog days you could only afford a limited number of instruments and pro audio devices. Purchasing decisions were made more consciously, and the little equipment available was used to its full potential. Today it is easy to flood the plugin slots of your DAW with countless plugins on a small budget. One fact is often overlooked.
Reduction to carefully selected instruments is very often style-defining. Many musicians generate a unique musical fingerprint precisely through their limited choice of instruments. Concentrating on a few, deliberately chosen tools defines a signature sound, which in the best case becomes an acoustic trademark. This applies to musicians as much as to sound engineers and producers. Would Andy Wallace deliver the same mixes if he swapped his preferred tool (an SSL 4000 G+) for a plugin bundle and a DAW? It is no coincidence that plugin manufacturers try to port the essence of successful producers and sound engineers to the plugin level: plugins are supposed to capture the sound of Chris Lord-Alge, Al Schmitt or Bob Clearmountain.
An understandable approach, though with the odd aftertaste that these gentlemen are hardly known for preferring to work with plugins. Another example is the revival of popular hardware classics as plugin emulations, where a handsome GUI is meant to convey a value comparable to that of the hardware. Yet only the programming, the code, determines the sound of a plugin. Here, too, you can see how visualization influences the choice of audio tools.
So that we don't misunderstand each other: good music can also be produced with mouse and keyboard. But there are good long-term reasons to question this way of working. We are not preaching a sound engineering gospel; we simply want to offer an alternative to visualized audio production and direct the focus from the eye back to the ear. The fact that music is often relegated to the background noise of the zeitgeist is something we will hardly be able to reverse.
But maybe it helps to remember the archetypes of music. Hear music instead of seeing it, and literally lend a hand again: use real instruments, interact with other musicians, and use pro audio hardware that gives tactile feedback.
Limit yourself to a few consciously selected instruments, analog audio hardware, and plugins that can be paired with a hardware controller. This intuitive workflow can help break through familiar structures and ultimately create something new that touches the audience. Ideally, we find our way back to the real essence of music: emotion!
Finally, one last tip: just switch off the DAW monitor. Listen to the song instead of watching it. No plugin windows, no meter displays, no waveform: hear the song without any visualization. Like a record, which, unlike MTV, has a future.
What do you think? Leave a comment and share this post if you like it.
With best regards, your Ruben