One of my first impressions of the iPad was how quiet it is mechanically. The most tactile sound it makes, besides the three manual buttons, is made when tapping the glass, and even then, when I watch people interact with it, they tend to press, slide and swipe, relatively silent gestures, rather than hit and poke loudly enough to make a sound.
Computing up until now, by comparison, has been a cacophony: the clack of the keys as we type, the clicks and thumps of the mouse as we scrape it over the mouse pad, the whirr of the hard drive and other inherent mechanical sounds. Yet it is all so ubiquitous and commonplace that we hardly notice it anymore. It has become sonically invisible, like a cooling fan we have tuned out, until it suddenly stops and we are snapped into silence.
It’s very interesting. The physical act of writing has always been accompanied by tool sounds: a stick scratching images and symbols into dirt, hands and fingers slapping muddy paint onto cave walls, chisels chipping out messages on stone tablets, styluses squelching wedges into wet clay, brushes swishing against dry papyrus. As technology improved, the art of writing became quieter and quieter until, after all this progress into silence, the noisy typewriter unceremoniously placed itself between hand and page and reintroduced the sound of a tool to the writing process.
And what a sound!
A competent typewriter is a percussionist extraordinaire, but as technologies have improved, writing has again become quieter and quieter, and we find the cycle repeating itself.
Now we face a device even more imposing and alienating than the noisy typewriter ever was. The iPad’s glass interface is silent and smooth, the antithesis of mechanical input, and what’s more, it is only a precursor of things to come. Peering into the near future, we can see that even smooth glass may give way to air itself: developments in tactile holography, for example, may in some cases eliminate touchable surfaces entirely.
Technology is silencing the act of writing once again.
At the first sound design meeting at iA we all agreed that “realistic” sounds just wouldn’t work. A slick word processor app that sounds like an actual typewriter comes across as cheap and gaudy, and the sound is ultimately distracting, unsophisticated and gimmicky. The iPad is not a computer, a typewriter or a book, and how it behaves sonically needs to reflect this fact.
“Realistic” or natural sounds are sounds recorded from actual sources (samples of pages turning, for example, or real buttons being pushed); characteristic sounds are sounds that meet perceptions or expectations of what should be heard (I just pushed what looks like a button, and it should make a click of some kind).
Characteristic sounds are not necessarily synthetic; they are most often the sounds of one thing being manipulated to imitate or emulate another. The creepy creaking of the little girl’s neck twisting 360° around in “The Exorcist”, for example, is the sound of an old leather wallet being twisted and turned near the microphone. The natural sound of an actual neck being twisted that way would probably be far less dramatic and, in fact, far less believable.
This is why natural sound can be counterintuitive to desired interface effects.
There is a fundamental disconnect.
These natural sounds can rapidly fall into the sonic equivalent of the “uncanny valley”: the more “real” the sound gets, the less believable it becomes, until at a certain point it turns outright distracting, even comical.
On the iPad, in most cases, natural sounds only alienate the GUI design or animation from its own function, especially when the GUI emulates 3D metaphors of real-world interfaces like books and magazines. Playing a recording of an actual page being turned, for example, does not convince the user that a page is turning; it only exacerbates the falsity of the interaction.
The reality is that the iPad is just not paper.
Thus the design goal is to create sound that meets user expectations yet does not rely on overtly didactic sonic metaphors to do so. It should inspire and facilitate creation and productivity through interaction with the app, rather than drawing attention to how different the app is from the medium it is replacing or emulating.
Also key, of course, is designing sounds that don’t drive the user nuts after hours of exposure.
For now the generic A#5 “Tock” sound will remain as the sound for letter-key input, so the focus has been primarily on mapping sound to the custom navigation keys and the two principal GUI animations.
The two main animation sounds (emulating a “zoom in” and “zoom out” function) are a combination of a manipulated human voice and samples of synthetically generated sound using a random noise generator in Reaktor 5.
The Word Select “swish” sounds (backward and forward) are an equalised human voice mixed with white noise, filtered and set one semitone apart in pitch.
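The semitone relationship between the two swish sounds can be sketched in code. The snippet below is an illustrative reconstruction, not iA’s actual Reaktor 5 patch: it shapes white noise with a simple one-pole low-pass filter and an envelope, then pitch-shifts one copy up by an equal-temperament semitone (a frequency ratio of 2^(1/12) ≈ 1.0595) via naive resampling. The function name and all parameter values are hypothetical.

```python
import numpy as np

SR = 44100                 # sample rate in Hz
SEMITONE = 2 ** (1 / 12)   # equal-temperament semitone ratio ≈ 1.0595

def swish(duration=0.25, pitch_shift=1.0, seed=0):
    """Generate a short filtered-noise 'swish' (illustrative sketch).

    White noise -> one-pole low-pass -> sine envelope -> resample.
    pitch_shift > 1 reads the buffer faster, raising the pitch.
    """
    rng = np.random.default_rng(seed)
    n = int(SR * duration)
    noise = rng.standard_normal(n)

    # One-pole low-pass to soften the noise (alpha chosen by ear, a guess)
    alpha = 0.15
    filtered = np.empty(n)
    acc = 0.0
    for i, x in enumerate(noise):
        acc += alpha * (x - acc)
        filtered[i] = acc

    # Sine attack/decay envelope so the sound breathes rather than clicks
    env = np.sin(np.linspace(0.0, np.pi, n))
    shaped = filtered * env

    # Crude pitch shift: resample by the given ratio
    idx = np.arange(0.0, n - 1, pitch_shift)
    return np.interp(idx, np.arange(n), shaped)

forward = swish(pitch_shift=SEMITONE)   # one semitone higher
backward = swish(pitch_shift=1.0)       # reference pitch
```

Because the resampled copy plays back faster, it comes out both higher in pitch and slightly shorter; a production version would time-stretch to keep the two swishes the same length.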
In the end, the sounds turned out simple, elegant and airy. They let you know something is happening without smashing you over the head with the fact.
As interfaces become increasingly ephemeral, sound will grow in importance as an indicator of human interaction with design elements within GUIs.
The future of computing is leaning towards mechanical silence; ubiquitous computing will not whirr and click in and of itself, so understanding the role of designed sound on the iPad outside of games is essential to understanding the role of sound in the next generation of user interfaces.
“The key to good sound effects is working with natural and real sounds. (…) These analogue sounds can be digitally reworked as much as necessary, but the origin has to be natural.”
– Per Sundström, lead sound designer on the film “Låt Den Rätte Komma In” (Let the Right One In).
Why the obsession over sound? “If it’s not there, you will notice, it’s just that little organic element.”
– Jeffrey Wilhoit (Foley artist at Todd-AO Studios)