March 10, 2014

By: Elyse                Categories: Books, Games

Sound in Unity is built around two Game Object Components: the Audio Listener and the Audio Source. Both of these components work in combination with the sound file itself. You can think of the Audio Listener as you, or more specifically as your ears (or perhaps a microphone), while the Audio Source is, as you might expect, the source of the audio, which can be any object in the game world.

Audio Listener

The Audio Listener Component is usually attached to a first- or third-person controller combined with a camera (which represents your eyes, or your view of yourself). An Audio Listener is necessary only if the sound functions within a three-dimensional environment; it is not generally needed for a two-dimensional game. Unity allows only one active Audio Listener per scene.
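
As a rough illustration, a short C# script along these lines could be attached to the camera on your controller. It is only a sketch: the class name is arbitrary, and it simply makes sure this object carries the one active Audio Listener, disabling any extras it finds in the scene.

using UnityEngine;

// Example only: keep a single active Audio Listener on this camera.
public class ListenerSetup : MonoBehaviour
{
    void Awake()
    {
        // Add an Audio Listener to this object (the camera) if it has none.
        if (GetComponent<AudioListener>() == null)
        {
            gameObject.AddComponent<AudioListener>();
        }

        // Disable any other Audio Listeners found elsewhere in the scene.
        foreach (AudioListener listener in FindObjectsOfType<AudioListener>())
        {
            if (listener.gameObject != gameObject)
            {
                listener.enabled = false;
            }
        }
    }
}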


Audio Source

The Audio Source Component is just that: a source of audio in the game world. Audio Sources can be set up to play 2D or 3D sounds in a variety of ways. One way is a simple ambient background that loops. You’ve already heard examples of this in the Mushroom Forest game environment we covered in a previous level.
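
For illustration, one way to set up such a looping ambient background in C# might look like the sketch below. The ambientClip field name is just a placeholder; you would assign your own sound file in the Inspector.

using UnityEngine;

// Example only: a looping ambient background sound.
public class AmbientLoop : MonoBehaviour
{
    public AudioClip ambientClip;   // placeholder name; assign your clip in the Inspector

    void Start()
    {
        AudioSource source = gameObject.AddComponent<AudioSource>();
        source.clip = ambientClip;
        source.loop = true;          // keep the background running
        source.spatialBlend = 0f;    // 0 = fully 2D (Unity 5 and later), so distance has no effect
        source.Play();
    }
}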

If the Audio Source is attached to an object and plays continuously, distance from the object becomes important in the game, and the volume of the Audio Source can be customized in a wide variety of ways based on that distance. Audio Filters, if available, can also be employed in the same way, enabling more realistic depictions of occlusion and obstruction.
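
As a sketch of what those distance settings might look like when set from a script (the same properties are exposed in the Inspector), assume an Audio Source is already attached to the object; the specific values here are placeholders.

using UnityEngine;

// Example only: distance-based attenuation plus a simple occlusion-style filter.
[RequireComponent(typeof(AudioSource))]
public class DistanceSetup : MonoBehaviour
{
    void Start()
    {
        AudioSource source = GetComponent<AudioSource>();

        // Full volume inside minDistance, fading out toward maxDistance.
        source.rolloffMode = AudioRolloffMode.Linear;
        source.minDistance = 2f;
        source.maxDistance = 25f;

        // If available, an Audio Low Pass Filter can suggest occlusion by muffling the sound.
        AudioLowPassFilter filter = gameObject.AddComponent<AudioLowPassFilter>();
        filter.cutoffFrequency = 1200f;   // lower values sound more muffled
    }
}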

Triggering an Audio Source

An Audio Source can be triggered by a number of conditions, but only a few of them can be configured without any coding or scripting expertise. The simplest of these is playing a sound when the object appears in the game space for the first time. The sound is tied to the source by a sound trigger, and looping can be turned on or off as well.

Any other type of triggering will require some scripting knowledge and the use of other components. For example, an object might make a sound when the player runs into it. The object will not play the sound on its own, however; the sound requires the trigger event, which in this case is the collision, and the direction to play a sound comes from the script. Another example would be a sound that occurs when you click the mouse to fire in a first-person shooter. In this case, a script is set up to look for mouse events, and when one is detected, the script triggers the sound to play.
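
By way of illustration, those two cases might look something like the C# sketches below. The class names are arbitrary, each script assumes an Audio Source with a clip assigned on the same object, and in practice each class would live in its own script file.

using UnityEngine;

// Example only: play a sound when something collides with this object.
// Requires a Collider here and a Rigidbody on at least one of the colliding objects.
public class CollisionSound : MonoBehaviour
{
    void OnCollisionEnter(Collision collision)
    {
        GetComponent<AudioSource>().Play();
    }
}

// Example only: play a firing sound when the left mouse button is clicked.
public class FireSound : MonoBehaviour
{
    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            GetComponent<AudioSource>().Play();
        }
    }
}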

BONUS FOOTAGE – Watch this video on the basic structure of Unity

__________________________________________
Excerpt from The Essential Guide to Game Audio by Steve Horowitz and Scott Looney © 2013 Taylor & Francis Group. All Rights Reserved.

About the Book

The Essential Guide to Game Audio: The Theory and Practice of Sound for Games is a first-of-its-kind textbook and must-have reference guide for everything you ever wanted to know about sound for games. This book provides a complete overview of game audio, how it has developed over time, and how you can make a career in this industry.

Written by two experts in the field of sound for games, this book focuses on the practical details of designing and implementing sound for the social, console, mobile, and even cloud gaming markets in a fun, engaging, and interactive way. Steve Horowitz and Scott Looney take you step-by-step through the implementation of sound in one of today’s most popular game engines, Unity 3D.
