In this paper we describe the different investigations that are part of the development of a new singing digital musical instrument, adapted to real-time performance. These investigations concern the improvement of low-level synthesis modules, the mapping strategies underlying the development of a coherent and expressive control space, and the building of a concrete bi-manual controller.
Cite as: D’Alessandro, N., Dutoit, T. (2007) RAMCESS/handsketch : a multi-representation framework for realtime and expressive singing synthesis. Proc. Interspeech 2007, 4011-4012
@inproceedings{dalessandro07b_interspeech,
  author={Nicolas D’Alessandro and Thierry Dutoit},
  title={{RAMCESS/handsketch : a multi-representation framework for realtime and expressive singing synthesis}},
  year=2007,
  booktitle={Proc. Interspeech 2007},
  pages={4011--4012}
}