Academic Journals Database
Disseminating quality controlled scientific knowledge

Real-Time Gesture-Controlled Physical Modelling Music Synthesis with Tactile Feedback

Author(s): David M. Howard | Stuart Rimell

Journal: EURASIP Journal on Advances in Signal Processing
ISSN 1687-6172

Volume: 2004
Issue: 7
Start page: 830184
Date: 2004

Keywords: physical modelling | music synthesis | haptic interface | force feedback | gestural control

ABSTRACT
Electronic sound synthesis continues to offer enormous potential for the creation of new musical instruments. The traditional approach is, however, seriously limited: it provides only auditory feedback, and it typically relies on a sound synthesis model (e.g., additive, subtractive, wavetable, or sampling) that is inherently limited and often unintuitive to the musician. In a direct attempt to address these issues, this paper describes a system that provides tactile as well as acoustic feedback, with real-time synthesis that invokes a more intuitive response from players because it is based on mass-spring physical modelling. Virtual instruments are set up via a graphical user interface in terms of the physical properties of basic, well-understood sounding objects such as strings, membranes, and solids, which can be interconnected to form complex integrated structures. Acoustic excitation can be applied at any point mass via virtual bowing, plucking, striking, a specified waveform, or any external sound source, and virtual microphones can be placed at any point masses to deliver the acoustic output. These aspects of the instrument are described along with the nature of the resulting acoustic output.
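To illustrate the mass-spring principle the abstract describes, the following is a minimal sketch (not the authors' implementation) of a one-dimensional chain of point masses joined by springs, modelling a plucked string. One mass is displaced as the "pluck" excitation, and the displacement of another mass is read out as a virtual microphone signal. All parameter values here are illustrative assumptions.

```python
def pluck_string(n_masses=40, k=8000.0, mass=0.001, damping=0.2,
                 pluck_at=10, mic_at=30, sr=8000, n_samples=4000):
    """Simulate a fixed-ended mass-spring chain; return the virtual-mic signal.

    A sketch of mass-spring physical modelling: each interior mass feels a
    spring force from its two neighbours plus a small damping term, and is
    integrated with semi-implicit Euler steps at sample rate `sr`.
    """
    pos = [0.0] * n_masses   # displacements; ends stay clamped at 0
    vel = [0.0] * n_masses
    pos[pluck_at] = 1.0      # initial displacement: the "pluck" excitation
    dt = 1.0 / sr
    out = []
    for _ in range(n_samples):
        # Hooke's-law force from each neighbour, minus viscous damping
        for i in range(1, n_masses - 1):
            force = k * (pos[i - 1] - 2 * pos[i] + pos[i + 1]) - damping * vel[i]
            vel[i] += (force / mass) * dt
        # update positions after all velocities (semi-implicit Euler)
        for i in range(1, n_masses - 1):
            pos[i] += vel[i] * dt
        out.append(pos[mic_at])  # virtual microphone at one point mass
    return out

signal = pluck_string()  # 0.5 s of samples at 8 kHz
```

The same update rule generalises to membranes (2D grids) and solids (3D lattices) by summing forces over more neighbours, and bowing or striking correspond to different ways of injecting force at a chosen point mass, which is the intuitive appeal the paper claims for this class of model.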