Installations

Sound Mapping

Iain Mott - www.reverberant.com

Sound Mapping is a participatory work of sound art made for outdoor environments. The work is installed in the environment by means of a Global Positioning System (GPS), which tracks the movement of individuals through the space. Participants wheel four movement-sensitive, sound-producing suitcases to realise a composition that spans space as well as time. The suitcases play music in response to nearby architectural features and the movements of individuals. Sound Mapping aims to assert a sense of place, physicality and engagement, reaffirming the relationship between art and the everyday.

ASX Voices

Fabio C. Ciardi

ASX Voices is an audio-visual performance by Fabio Cifariello Ciardi based on the sonification of real trading data for the largest-cap stocks of the Australian Stock Exchange. ASX Voices aims to establish a multimodal, real-time landscape of the Australian economy that can be entered and explored by the audience.
There are no metaphors or illusions here: it's an "audio reality-show" about billions of dollars changing hands every second. ASX Voices provocatively invites you to listen to and judge the Australian "voices" of the global economy.
The performance is driven by Cifariello Ciardi's "sMAX", a toolkit for stock market data sonification to be presented at the conference.

PlantA

Garth Paine
University of Western Sydney

PlantA is an interactive sound installation by Dr Garth Paine, which continues his exploration of the relationship between computer-based real-time music composition and the behaviour patterns of natural phenomena. The PlantA installation contains a weather station gathering wind speed, wind direction, temperature and solar radiation data. These meteorological conditions, vital to a plant's life processes, are transmitted ten times a second to a computer, where they are analysed and the data transformed into eight channels of musical sound. These sounds give a voice to the secret activity of the inner life processes of the plant. PlantA offers a unique soundscape that reflects the subtle changes of the weather: in essence, the environmental conditions encountered in the moment. As conditions change, so too will the sounds. The nuances of nature are mirrored in the nuances of the score.

As an artist my interest lies in exploring ways of contextualising digital art processes within the natural organic environment. I have little interest in the purely synthetic, that is, the synthesis of sound or images from a purely academic or theoretical viewpoint; I prefer, as the PlantA project illustrates, to take a fundamentally organic source as the basis for the synthesis process. In so doing, I hope that some quality of that organic material will permeate the work, bringing the synthetic output at least a small way towards the organic world, and therefore within the human context. Located in one of Sydney's most beautiful settings, PlantA promises to provide a truly unique aural and visual experience. More details are available from Garth Paine, ga.paine@uws.edu.au
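
The data path described above can be sketched roughly as follows; this is only an illustration, and the reading function, the mapping rules and the per-channel parameters are assumptions rather than details of the installation.

    # A minimal sketch of the kind of data path PlantA describes: poll a weather
    # station ten times a second and map the readings onto parameters for eight
    # channels of sound. All names, units and scalings here are illustrative.
    import time

    NUM_CHANNELS = 8

    def read_weather_station():
        """Placeholder for the weather-station interface (hypothetical values)."""
        return {"wind_speed": 3.2,     # m/s
                "wind_dir": 140.0,     # degrees
                "temperature": 22.5,   # degrees C
                "solar": 480.0}        # W/m^2

    def map_to_channels(w):
        """Turn one set of readings into a parameter dict for each channel."""
        channels = []
        for i in range(NUM_CHANNELS):
            channels.append({
                "pitch": 100.0 + w["temperature"] * 20 + i * 15,   # temperature sets register
                "level": min(w["wind_speed"] / 20.0, 1.0),         # wind speed sets loudness
                "pan": ((w["wind_dir"] / 360.0 + i / NUM_CHANNELS) % 1.0) * 2 - 1,
                "brightness": w["solar"] / 1000.0,                 # sunlight sets timbre
            })
        return channels

    while True:
        params = map_to_channels(read_weather_station())
        # here the parameter sets would be handed to the synthesis engine
        time.sleep(0.1)   # ten updates per second, as in the installation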

Dr Garth Paine is Head of Program for Electronic Arts at the University of Western Sydney

Hypersense Complex

Alistair Riddell, Somaya Langley, Simon Burton

The goal of the Hypersense project is to explore different ways of interacting with the computer to produce sound.

At the moment we are using flex sensors, which give a variable resistance the more you bend them. Each performer has an MCU with 8 analog inputs, so each of us has ended up with 8 fingers wired up. The MCUs sample each sensor 100 times per second. This raw data is then encoded in a MIDI-like protocol and sent over USB. The receiving Python script is where most of the "smarts" are: it interprets gestures, builds compositional structures and translates these into individual sound events.

The audio engine is the freely available SuperCollider (version 3, OS X only), which the Python script communicates with over the network. Being network-based has the handy benefit of allowing the use of two laptops, one for the Python processing and the other for sound generation. The protocol connecting Python and SuperCollider is the UDP-based Open Sound Control (OSC), which is kind of like "MIDI meets the internet" and is widely used in sound applications. These OSC commands contain instructions to start and stop sampled sounds, change sound effects such as reverb and echo, and make other control changes such as moving a sound to a different channel or changing the playback rate of a sound. A rough sketch of this data path appears below.
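
The following Python sketch shows the shape of that pipeline: read sensor frames from the MCU over a serial link and translate bend gestures into OSC messages for the SuperCollider server. The serial frame format, the python-osc and pyserial packages, the "grain" synth name and the threshold value are assumptions for illustration, not details of the actual system.

    # Illustrative sketch: flex-sensor frames in, OSC commands to scsynth out.
    import serial                                  # hypothetical serial link to the MCU
    from pythonosc.udp_client import SimpleUDPClient

    SC_HOST, SC_PORT = "192.168.0.2", 57110        # second laptop running scsynth
    NUM_SENSORS = 8                                # one flex sensor per finger
    BEND_THRESHOLD = 90                            # arbitrary trigger level

    sc = SimpleUDPClient(SC_HOST, SC_PORT)
    mcu = serial.Serial("/dev/ttyUSB0", 115200, timeout=0.1)

    node_id = 1000
    while True:
        frame = mcu.read(NUM_SENSORS)              # one byte per sensor, ~100 frames/s
        if len(frame) < NUM_SENSORS:
            continue
        for finger, value in enumerate(frame):
            if value > BEND_THRESHOLD:
                # /s_new is a standard scsynth command: synth name, node ID,
                # add action, target group, then control name/value pairs
                node_id += 1
                sc.send_message("/s_new", ["grain", node_id, 0, 1,
                                           "rate", value / 64.0,
                                           "pan", finger / (NUM_SENSORS - 1)])

In practice a gesture interpreter would sit between the raw frames and the sound events, but the example shows the essential translation from sensor values to OSC commands.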

Sponsors



Editors: Stephen Barrass & Paul Vickers
Published by the International Community for Auditory Display (ICAD).
CD-ROM ISBN: 1-74108-048-7     Website ISBN: 1-74108-062-2

Additional ICAD information and publications can be found at www.icad.org.


Copyright © 2004 by the ICAD contributors.
All rights reserved. Copyright remains with the individual authors. No part of this publication can be reproduced, stored in a retrieval system, or transmitted in any form by any means, electronic, mechanical, photocopying, recording, or otherwise without prior written permission of the individual authors.

Created: 19 August 2003     Last modified: 30 June 2004