
bibliography

Madhyastha, T. M., and D. A. Reed. "A Framework for Sonification Design." In Auditory Display: Sonification, Audification, and Auditory Interfaces, edited by G. Kramer. Santa Fe Institute Studies in the Sciences of Complexity, Proc. Vol. XVIII. Reading, MA: Addison Wesley, 1994.
The authors describe Porsonify, a toolkit that provides a uniform network interface to sound devices through table-driven sound servers. All device-specific functions are encapsulated in control files, so that user interfaces to configure sound devices and sonifications can be generated independently of the underlying hardware. Creation of some example sonifications using this toolkit is discussed.
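
As a rough illustration of the table-driven idea, the sketch below keeps all device-specific commands in a lookup table so that the sonification code itself stays device-independent. It is only a sketch of the general approach; the table, function, and device names are invented here and are not Porsonify's actual interfaces.

# A sketch of a table-driven sound layer in the spirit of Porsonify.
# The device "control file" is modeled as a plain dictionary; all names
# here (DEVICE_TABLE, sonify, ...) are invented for illustration and do
# not reflect Porsonify's actual interfaces.

DEVICE_TABLE = {
    "midi_synth": {
        "note_on":  lambda pitch, vel: f"MIDI NOTE_ON {pitch} {vel}",
        "note_off": lambda pitch: f"MIDI NOTE_OFF {pitch}",
    },
    "sample_player": {
        "note_on":  lambda pitch, vel: f"PLAY tone_{pitch}.wav gain={vel / 127:.2f}",
        "note_off": lambda pitch: f"STOP tone_{pitch}.wav",
    },
}

def sonify(values, device="midi_synth", low=48, high=84):
    """Map data values to pitches; device specifics come from the table."""
    commands = DEVICE_TABLE[device]
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    for v in values:
        pitch = int(low + (high - low) * (v - lo) / span)
        print(commands["note_on"](pitch, 100))
        print(commands["note_off"](pitch))

sonify([0.1, 0.5, 0.9, 0.3], device="sample_player")

Because the device-specific strings live entirely in the table, switching hardware only means supplying a different control table, which is the independence from the underlying hardware that the authors emphasize.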

Mansur, D. L. "Graphs in Sound: A Numerical Data Analysis Method for the Blind." Unpublished Thesis, University of California, Davis, 1984.

The author tested the ability of subjects to make certain judgements about x-y "plots" that used continuously varying pitch to represent the dependent variable (y) and time to represent the independent variable (x). He was primarily concerned with the development of displays that make exploratory data analysis possible for visually impaired analysts. He found that, with limited training, subjects were able to recognize key features of the data, such as linearity, monotonicity, and symmetry, in between 79 and 95 percent of the trials.

Mansur, D. L., M. M. Blattner, and K. I. Joy. "Sound-Graphs: A Numerical Data Analysis Method for the Blind." In Proceedings of the 18th Annual Hawaii International Conference on System Sciences, held January 1985 in Honolulu, Hawaii. Los Alamitos, CA: IEEE Computer Society Press, 1985. Also in J. Med. Sys. 9 (1985): 163--174.

Sound-Graphs are composed of three-second periods of continuously varying pitch. They were developed and used to provide the blind with a rapid and intuitive understanding of numerical data (x--y graphs). This work is primarily from the M.S. thesis of the same name by Douglass L. Mansur, University of California, Davis, 1984; also published as Lawrence Livermore Technical Report UCRL-53548.
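
As a rough sketch of the mapping these papers describe (y controls a continuously varying pitch, x is laid out over a few seconds of time), the code below renders a data series as a short pitch sweep. The pitch range and synthesis details are assumptions for illustration, not Mansur's implementation.

import numpy as np
from scipy.io import wavfile

# Sketch of a sound-graph: y values drive a continuously varying pitch,
# the x axis is mapped onto time. Parameter choices are illustrative only.

def sound_graph(y, duration=3.0, fs=22050, f_lo=220.0, f_hi=880.0):
    y = np.asarray(y, dtype=float)
    y = (y - y.min()) / (np.ptp(y) or 1.0)                      # normalize to 0..1
    t = np.linspace(0.0, duration, int(fs * duration), endpoint=False)
    y_t = np.interp(t, np.linspace(0.0, duration, len(y)), y)   # resample onto the time axis
    freq = f_lo * (f_hi / f_lo) ** y_t                          # higher y -> higher pitch
    phase = 2.0 * np.pi * np.cumsum(freq) / fs                  # integrate frequency to phase
    return (0.5 * np.sin(phase)).astype(np.float32)

# Example: a sine-shaped "graph" rendered as a three-second sweep
wavfile.write("sound_graph.wav", 22050, sound_graph(np.sin(np.linspace(0, 2 * np.pi, 100))))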

Mansur, D. L., M. M. Blattner, and K. I. Joy. "The Representation of Line Graphs Through Audio Images." Technical Report UCRL-91586, Lawrence Livermore National Laboratory, Livermore, CA, September 1984.

Holistic sound images and graphical images bear certain resemblances in the way we manipulate them. This article examines tools that manipulate both graphical line graphs and their sonic equivalents.

Mansur, D. L., M. M. Blattner, and K. I. Joy. "Sound-Graphs: A Numerical Data Analysis Method for the Blind." J. Med. Sys. 9 (1985): 163--174.

The authors describe how simple line graphs can be translated into nonspeech sounds for presentation to blind people.

Matlin, M. W. Sensation and Perception, 2nd ed. Massachusetts: Allyn and Bacon, 1988.

A good introductory text to perception that distinguishes between the physical responses to stimuli and the perceptual effects.

Mayer-Kress, G., R. Bargar, and I. Choi. "Musical Structures in Data From Chaotic Attractors." In Auditory Display: Sonification, Audification, and Auditory Interfaces, edited by G. Kramer. Santa Fe Institute Studies in the Sciences of Complexity, Proc. Vol. XVIII. Reading, MA: Addison Wesley, 1994.

The authors exhibit parallels between structures of chaotic dynamical systems and music and indicate the possibility of using this connection to enhance the perception of recurrent features in complex signals. They describe three auditory representations of chaotic systems.

McAdams, S. "Spectral Fusion and the Creation of Auditory Images." In Music, Mind and Brain, Chap. XV. New York: Plenum, 1982.

McAdams discusses aspects of musical perception that lie beyond the boundaries of acoustic analysis: the roles played in evoking an auditory image from an acoustic signal by familiarity, by learning and context, by synthetic and analytic listening, and by interaction with the primitive grouping processes of harmonicity and coordinated modulation.

McCabe, R. K., and A. A. Rangwalla. "Auditory Display of Computational Fluid Dynamics Data." In Auditory Display: Sonification, Audification, and Auditory Interfaces, edited by G. Kramer. Santa Fe Institute Studies in the Sciences of Complexity, Proc. Vol. XVIII. Reading, MA: Addison Wesley, 1994.

Direct simulation and parameter mapping techniques are discussed in the context of how they can be used to enhance the understanding of data from computational fluid dynamics simulations. Two case studies are presented. The first describes how parameter mapping techniques were used to help analyze the results from a simulation of the Penn State artificial heart. The second shows how direct simulation was used to better understand the tonal acoustics of rotor-stator interactions inside a jet turbine.

McIntyre, M. E., R. T. Schumacher, and J. Woodhouse. "On the Oscillations of Instruments." JASA 74 (1983): S52.

An account of temporally based physical modeling techniques with examples; an excellent work.

Meijer, P. B. L. "An Experimental System for Auditory Image Representations." IEEE Transactions on Biomedical Engineering 39 (1992): 112--121.

Meijer presents an experimental system for the conversion of arbitrary images into sound patterns, possibly as a step towards the development of a vision substitution device for the blind. The soundscapes generated by the system provide a resolution of up to 64 x 64 pixels with 16 grey-tones per pixel. The actual resolution obtainable with human perception of these soundscapes remains to be evaluated. Spectrographic reconstructions were made to prove that much of the image content is indeed preserved in the soundscapes.
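
The column-by-column, spectrographic character of the mapping can be sketched as follows: the image is scanned left to right over roughly a second, vertical position sets frequency, and pixel brightness sets the amplitude of the corresponding sinusoid. This is a simplified illustration under assumed parameter values, not the published system.

import numpy as np

# Simplified image-to-soundscape scan in the spirit of Meijer's system:
# columns are played left to right, row position -> frequency,
# pixel brightness -> amplitude. All parameters are illustrative assumptions.

def image_to_soundscape(image, fs=22050, scan_time=1.05, f_lo=500.0, f_hi=5000.0):
    rows, cols = image.shape                                       # e.g., a 64 x 64 grey-scale image
    samples_per_col = int(fs * scan_time / cols)
    freqs = f_lo * (f_hi / f_lo) ** np.linspace(1.0, 0.0, rows)    # top row = highest pitch
    t = np.arange(samples_per_col) / fs
    tones = np.sin(2.0 * np.pi * np.outer(freqs, t))               # one sinusoid per row
    columns = []
    for c in range(cols):
        amps = image[:, c, None] / (image.max() or 1.0)            # brightness -> amplitude
        columns.append((amps * tones).sum(axis=0))
    out = np.concatenate(columns)
    return (out / np.abs(out).max()).astype(np.float32)

# Example: a random 64 x 64 image with 16 grey levels
soundscape = image_to_soundscape(np.random.randint(0, 16, (64, 64)).astype(float))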

Meyer, L. B. Emotion and Meaning in Music. Chicago: University of Chicago Press, 1956.

Forming the basis of Meyer's past 35 years of research and theoretical writing, this text is without doubt a classic in the field of music psychology. Proceeding from John Dewey's (1894) "conflict theory of emotion," the author provides results of experimental investigations and musical examples to support his premise that "emotion or affect is aroused when a tendency to respond is arrested or inhibited."

Mezrich, J. J., S. P. Frysinger, and R. Slivjanovski. "Dynamic Representation of Multivariate Time-Series Data." J. Am. Stat. Assoc. 79 (1984): 34--40.

The authors describe a dynamic data representation that employs both auditory and visual components for multivariate time-series displays, such as economic indicators. In their scheme, the analyst is confronted at any moment with a single multivariate sample from the time series rather than the whole data set; successive samples are displayed rather like frames in a movie. The results of their experiment indicate that the dynamic auditory/visual display outperforms static visual displays in most cases for the correlation detection task.
A 30-second audio excerpt of DRI economic indicators from 1948 to 1980 accompanies the online version of this entry.
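
The frame-by-frame scheme can be pictured as follows: at each step one multivariate sample becomes a brief chord, one pitch per variable, and the chords are played in succession like movie frames. The sketch below is only an illustration of that idea under assumed mappings, not the authors' actual display.

import numpy as np

# Sketch of a frame-by-frame auditory display for a multivariate time series:
# each sample (one "frame") becomes a short chord, one pitch per variable.
# Mapping choices here are assumptions, not the authors' actual display.

def frames_to_audio(series, fs=22050, frame_time=0.25, f_lo=262.0, f_hi=1047.0):
    series = np.asarray(series, dtype=float)               # shape: (frames, variables)
    norm = (series - series.min(axis=0)) / (np.ptp(series, axis=0) + 1e-9)
    t = np.arange(int(fs * frame_time)) / fs
    env = np.hanning(t.size)                               # soften frame boundaries
    frames = []
    for sample in norm:
        freqs = f_lo * (f_hi / f_lo) ** sample             # one pitch per variable
        chord = np.sin(2.0 * np.pi * np.outer(freqs, t)).sum(axis=0)
        frames.append(env * chord / len(sample))
    return np.concatenate(frames).astype(np.float32)

# Example: three drifting "indicators" over 120 samples
audio = frames_to_audio(np.random.randn(120, 3).cumsum(axis=0))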

Monk, A. "Mode Errors: A User-Centered Analysis and Some Preventative Measures Using Keying-Contingent Sound." IJMMS 24 (1986): 313--327.

Monk uses sound to reduce the number of mode errors in an interface.

Moore, F. R. Elements of Computer Music. Englewood Cliffs, NJ: Prentice Hall, 1990. ISBN 0-13-252552-6.

Moore covers how to analyze, process, and synthesize musical sound. The book includes a good deal of digital signal processing, composition techniques (random numbers, Markov processes, etc.), and uses of music.

Mulligan, B. E., D. K. McBride, and L. S. Goodman. "A Design Guide for Nonspeech Auditory Displays." Pensacola, FL: Naval Aerospace Medical Research Laboratory, 1987.

The authors provide algorithms that assist in the design of auditory signals, especially ways to enhance the detectability of signals in noise and to increase loudness without increasing signal length.

Mynatt, E. "Auditory Presentation of Graphical User Interfaces." In Auditory Display: Sonification, Audification, and Auditory Interfaces, edited by G. Kramer. Santa Fe Institute Studies in the Sciences of Complexity, Proc. Vol. XVIII. Reading, MA: Addison Wesley, 1994.

Mynatt presents work in designing interactive, auditory interfaces that provide access to graphical user interfaces for people who are blind. She discusses a prototype system called Mercator, which explores conveying symbolic information and supporting navigation in the auditory interface.
