SONIC DESIGN / LECTURE NOTES
Naura / 0356798 / Interactive Spatial Design
VSA60304 / Sonic Design
Lecture Notes
- Weekly Sonic Exercises
- Week 1 - Introduction
- Week 2 - Sound Fundamentals
- Week 3 - Sound Design Tools
- Week 4 - Diegetic vs Non-Diegetic Sound
- Week 9 - Microphones
- Week 12 - Game/Film Audio
Week 1 - Introduction
In Week 1, we received an introduction to the module along with an overview of the assignments for the semester. We were also advised to have high-quality headphones ready for the upcoming projects.
__________________________________________________________
Week 2 - Sound Fundamentals
The lecture was essentially a review of concepts from IGCSE physics and biology, covering the basics of sound and how it interacts with the human ear. Sound is a vibration of particles that stimulates the eardrum, and it needs a medium, such as air, to travel through.
Fig 1.0 The Human Ear | Britannica
The human ear consists of three parts: the outer ear, the middle ear (which includes the eardrum and an air-filled cavity containing three tiny bones—malleus, incus, and stapes), and the inner ear (which contains the cochlea, endolymphatic sac, and semicircular canals). These structures work together to send sound signals to the brain, where they are processed and perceived.
There are two types of waves:
- Transverse waves: where vibrations occur at right angles to the direction of the wave.
- Longitudinal waves: where vibrations are parallel to the wave's direction. Sound waves are longitudinal.
Fig 1.1 Mediums | We Grow Thinkers
Sound travels at different speeds depending on the medium: it moves fastest in solids, slower in liquids, and slowest in gases. This variation is due to the differing distances between particles in each state.
A wavelength refers to the distance between two consecutive rarefactions (troughs) or compressions (peaks) in a wave.
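This is my own worked example rather than lecture material: frequency, wavelength, and the speed of sound are tied together by v = f × λ, so once the speed of sound in a medium is known, the wavelength of any tone can be estimated. The speeds below are approximate textbook values.

```python
# Rough wavelength calculator: v = f * wavelength, so wavelength = v / f.
# Speeds of sound are approximate textbook values; they vary with temperature.
SPEED_OF_SOUND = {"air": 343.0, "water": 1480.0, "steel": 5960.0}  # metres per second

def wavelength(frequency_hz: float, medium: str = "air") -> float:
    """Return the wavelength in metres of a tone at the given frequency."""
    return SPEED_OF_SOUND[medium] / frequency_hz

print(round(wavelength(1000, "air"), 3))    # ~0.343 m for a 1 kHz tone in air
print(round(wavelength(1000, "water"), 3))  # ~1.48 m for the same tone in water
```

The same 1 kHz tone is over four times longer in water than in air, which lines up with sound travelling faster in denser, more tightly packed media.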
The study of how humans perceive sound is known as psychoacoustics, which explores how we perceive pitch, loudness, volume, and timbre. It also examines how individuals may experience sound differently—for example, why some people can concentrate or study with loud music in the background.
The properties of sound include:
- Pitch
- Loudness
- Timbre
- Perceived duration
- Envelope (how sound changes over time)
- Spatialization (how sound is perceived in space)
Humans can typically hear within a frequency range of 20 Hz to 20 kHz. Based on the hearing-test website shared during the lecture, my hearing cuts off at around 17 kHz.
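As a small sketch of my own (not from the lecture), the script below generates a sine-wave test tone at a chosen frequency, which is an easy way to double-check where your hearing cuts off. It assumes numpy and scipy are installed, and the output filename tone_17khz.wav is just a placeholder.

```python
import numpy as np
from scipy.io import wavfile

def write_test_tone(path: str, freq_hz: float, seconds: float = 3.0, sr: int = 48000):
    """Write a quiet 16-bit sine-wave test tone to a WAV file."""
    t = np.linspace(0, seconds, int(sr * seconds), endpoint=False)
    tone = 0.2 * np.sin(2 * np.pi * freq_hz * t)   # low amplitude; protect your ears
    wavfile.write(path, sr, (tone * 32767).astype(np.int16))

# 17 kHz is roughly where my own hearing cut off on the in-class test.
write_test_tone("tone_17khz.wav", 17000)
```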
__________________________________________________________
Week 3 - Sound Design Tools
Most sound editing software comes with several key tools for sound design. Here are five common ones:
- Layering: Combines different sounds to create a fuller, more unique sound. It's useful for blending ambient effects, vocals, and sound effects.
- Time Stretching/Compression: Alters the speed or tempo of audio without changing its pitch. For natural results, avoid stretching by more than 50%. This technique helps sync sound with visuals or alter pacing.
- Pitch Shifting: Changes the pitch of a sound. Higher pitches make it feel thinner and smaller, while lower pitches make it deeper and larger, so pitch shifting can also suggest the size of the sound's source.
- Reversing: Plays audio backward, creating an eerie or unnatural effect. When layered with other sounds, it can add mystery or tension.
- Mouth It: Uses vocalization to create flexible sound effects, especially for hard-to-replicate noises. It's a go-to for improvising sound effects when the exact one isn't available.
These techniques enhance the complexity and depth of sound in creative projects.
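To make these techniques concrete, here is a minimal Python sketch of my own (not the software demonstrated in class) using the librosa and soundfile libraries; voice.wav and ambience.wav are placeholder file names.

```python
import librosa
import soundfile as sf

# Load two placeholder clips; the second is resampled to match the first.
voice, sr = librosa.load("voice.wav", sr=None)
ambience, _ = librosa.load("ambience.wav", sr=sr)

# Layering: mix the two clips (trim to the shorter one, then average).
n = min(len(voice), len(ambience))
layered = 0.5 * voice[:n] + 0.5 * ambience[:n]

# Time stretching: 1.25x faster without changing pitch (keep within ~50% for natural results).
stretched = librosa.effects.time_stretch(voice, rate=1.25)

# Pitch shifting: drop 5 semitones for a deeper, "larger" sound.
shifted = librosa.effects.pitch_shift(voice, sr=sr, n_steps=-5)

# Reversing: play the clip backwards for an eerie effect.
reversed_clip = voice[::-1]

sf.write("layered.wav", layered, sr)
```

"Mouth It" is the odd one out here: it happens at the microphone, not in software, so there is nothing to code for it.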
__________________________________________________________
Week 4 - Diegetic vs Non-Diegetic Sound
Filmmakers use these categories of sound to immerse audiences, and sometimes break the traditional rules to create unique effects:
- Diegetic Sound: Sound heard by both the characters and the audience, such as dialogue, footsteps, and music playing within the film's world.
- Non-Diegetic Sound: Sound only the audience hears, like the background score or a voiceover, which enhances the scene's emotional or narrative impact.
- Trans-Diegetic Sound: Sound that shifts between diegetic and non-diegetic, adding creative layers to the storytelling.
__________________________________________________________
Week 9 - Microphones
Dynamic and Condenser Microphones
- Dynamic Microphone:
- Durable and does not require external power.
- Condenser Microphone:
- Requires external power, typically supplied as +48V phantom power.
Microphone Polar Patterns
- Omnidirectional:
- Captures sound equally from all directions.
- Best for recording environmental sounds.
- Note: Proximity effect does not apply to omnidirectional microphones.
- Cardioid:
- Focuses on sound from the front, minimizing noise from the sides.
- Commonly used for vocals and presentations.
- Hypercardioid (Shotgun Microphone):
- Highly directional, capturing clear audio from the front.
- Rejects most sound from the sides and rear.
- Figure-of-Eight:
- Captures sound from the front and back while rejecting noise from the sides.
- Rare; used in studio settings for interviews or dual-source recordings.
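As a rough model of my own (not covered in the lecture), all of the first-order patterns above can be written as sensitivity(θ) = a + (1 − a)·cos θ, where a = 1 is omnidirectional, a ≈ 0.5 cardioid, a ≈ 0.25 hypercardioid, and a = 0 figure-of-eight. The small sketch below compares how much each pattern picks up from the side and the rear.

```python
import math

# First-order polar patterns: sensitivity(theta) = a + (1 - a) * cos(theta).
PATTERNS = {"omnidirectional": 1.0, "cardioid": 0.5, "hypercardioid": 0.25, "figure-of-eight": 0.0}

def sensitivity(pattern: str, angle_deg: float) -> float:
    a = PATTERNS[pattern]
    return a + (1 - a) * math.cos(math.radians(angle_deg))

for name in PATTERNS:
    side = sensitivity(name, 90)    # sound arriving from the side
    rear = sensitivity(name, 180)   # sound arriving from behind
    print(f"{name:16s} side={side:+.2f} rear={rear:+.2f}")
# Cardioid drops to 0 at the rear; figure-of-eight is 0 at the sides and inverted (-1) at the rear.
```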
Connectors
- Quarter-Inch Jack: Unbalanced connector.
- XLR Connector: Balanced connector.
- Key Difference:
- Unbalanced: More prone to noise interference.
- Balanced: Provides cleaner, interference-free audio.
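Here is a tiny illustration of my own (not from the lecture) of why balanced connections stay cleaner: the signal is sent twice, once inverted, so any interference picked up along the cable appears identically on both wires and cancels when the receiving end subtracts them.

```python
import numpy as np

rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 2 * np.pi, 100))   # the audio we want to send
noise = 0.3 * rng.standard_normal(100)            # interference picked up along the cable

# Balanced (XLR-style): hot carries the signal, cold carries the inverted signal.
hot = signal + noise
cold = -signal + noise
received_balanced = hot - cold                     # = 2 * signal; the noise cancels

# Unbalanced (quarter-inch jack): one signal wire, so the noise stays in.
received_unbalanced = signal + noise

print(np.allclose(received_balanced, 2 * signal))  # True: interference removed
```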
Proximity Effect in Microphones
- When the microphone is close to the sound source, the bass frequencies are amplified.
- This effect does not occur with omnidirectional microphones.
Studio Equipment
- The school’s studio uses a hybrid mixer, which combines a mixer and a controller.
Tips for Good Recordings
- Control the Environment:
- Reduce background noise (turn off fans, AC, etc.).
- Record in quiet spaces like a car, under a blanket, or using makeshift "forts."
- Timing:
- Choose less noisy times, avoiding daytime when background noise is higher.
- Environment Setup:
- Minimize sound reflections by recording in a smaller, noise-dampened space.
__________________________________________________________
Week 12 - Game/Film Audio
Film Audio vs. Game Audio:
- Film audio follows a linear progression, while game audio is non-linear, adapting to the player's state, choices, and actions.
Event Mapping and Cue Spotting:
- In games, we plan and predict potential player actions so that the corresponding audio responses are ready.
- This process, known as "Event Mapping" or "Cue Spotting" in audio design, identifies every possible action that requires a sound or a change in sound state.
Triggers, Cues, and Events:
- These are the key elements that initiate audio changes in a game.
- Triggers may include parameters that define how and when a sound plays (see the sketch after this list).
Art Style and Audio Design:
- The art style influences the type and tone of audio used in the game, and is often shaped by the game's genre.
Inspirational Influence:
- The audio design in this case draws significant inspiration from Miyazaki's animation style.
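To tie these ideas together, below is a small hypothetical sketch of event mapping and triggers in Python; every name in it (the cue map, the trigger function, the file paths) is invented for illustration, and real games would normally handle this through audio middleware such as Wwise or FMOD.

```python
# Hypothetical event-to-cue map built during "cue spotting": every player action
# that needs a sound, or a change of sound state, gets an entry.
CUE_MAP = {
    "footstep_grass": "sfx/footstep_grass.wav",
    "footstep_stone": "sfx/footstep_stone.wav",
    "door_open": "sfx/door_creak.wav",
    "combat_music": "music/combat_loop.ogg",
    "combat_music_low_hp": "music/combat_tense_loop.ogg",
    "explore_music": "music/explore_loop.ogg",
}

def trigger(event: str, *, surface: str = "grass", player_health: float = 1.0) -> str:
    """Resolve a game event plus its parameters to the audio cue that should play."""
    if event == "footstep":
        # Parameters (here, the surface the player walks on) pick the variant.
        return CUE_MAP[f"footstep_{surface}"]
    if event == "enter_combat":
        # Non-linear: the game's current state decides which music state to switch to.
        return CUE_MAP["combat_music" if player_health > 0.3 else "combat_music_low_hp"]
    return CUE_MAP.get(event, "")

print(trigger("footstep", surface="stone"))        # sfx/footstep_stone.wav
print(trigger("enter_combat", player_health=0.2))  # music/combat_tense_loop.ogg
```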