A Weighted Superposition of Functional Contours Model for Modelling Contextual Prominence of Elementary Prosodic Contours

Branislav Gerazov, Gérard Bailly, Yi Xu


The way speech prosody encodes linguistic, paralinguistic and non-linguistic information via multiparametric representations of the speech signal is still an open issue. The Superposition of Functional Contours (SFC) model proposes to decompose prosody into elementary multiparametric functional contours through the iterative training of neural network contour generators using analysis-by-synthesis. Each generator is responsible for computing multiparametric contours that encode a given piece of linguistic, paralinguistic or non-linguistic information over a variable scope of rhythmic units. The contributions of all generators' outputs are then overlapped and added to produce the prosody of the utterance. We propose an extension of the contour generators, the weighted SFC (WSFC) model, that allows them to modulate the prominence of the elementary contours based on contextual information. The WSFC jointly learns the patterns of the elementary multiparametric functional contours and their weights as a function of the contours' contexts. The experimental results show that the proposed WSFC model can successfully capture contour prominence and thus improve SFC modelling performance. The WSFC is also shown to be effective at modelling the impact of attitudes on the prominence of functional contours cuing syntactic relations in French, and that of emphasis on the prominence of tone contours in Chinese.
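The core operation the abstract describes can be sketched in a few lines: each elementary contour spans a scope of rhythmic units, is scaled by a learned prominence weight, and the scaled contours are overlapped and added to form the utterance prosody. The sketch below is illustrative only; the function name, array shapes, and scalar weights are assumptions, not the paper's implementation (where contours and weights are produced by trained neural network generators).

```python
import numpy as np

def weighted_overlap_add(contours, weights, starts, n_units, n_params):
    """Illustrative weighted overlap-and-add of elementary prosodic contours.

    contours : list of arrays, each of shape (scope_len, n_params),
               one elementary multiparametric contour per functional unit
    weights  : per-contour scalar prominence weights (context-dependent
               in the WSFC; fixed scalars here for illustration)
    starts   : index of the first rhythmic unit each contour covers
    Returns the summed prosody, shape (n_units, n_params).
    """
    prosody = np.zeros((n_units, n_params))
    for contour, w, start in zip(contours, weights, starts):
        # Scale the contour by its prominence weight, then add it
        # into the utterance at its scope of rhythmic units.
        prosody[start:start + len(contour)] += w * contour
    return prosody

# Hypothetical example: two single-parameter contours over 3 rhythmic units.
c1 = np.ones((2, 1))   # contour over units 0-1
c2 = np.ones((2, 1))   # contour over units 1-2
out = weighted_overlap_add([c1, c2], weights=[0.5, 2.0],
                           starts=[0, 1], n_units=3, n_params=1)
# out.ravel() → [0.5, 2.5, 2.0]
```

In the full model, the weight for each contour would be predicted from contextual features (e.g. attitude or emphasis) rather than supplied as a constant.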


 DOI: 10.21437/Interspeech.2018-1286

Cite as: Gerazov, B., Bailly, G., Xu, Y. (2018) A Weighted Superposition of Functional Contours Model for Modelling Contextual Prominence of Elementary Prosodic Contours. Proc. Interspeech 2018, 2524-2528, DOI: 10.21437/Interspeech.2018-1286.


@inproceedings{Gerazov2018,
  author={Branislav Gerazov and Gérard Bailly and Yi Xu},
  title={A Weighted Superposition of Functional Contours Model for Modelling Contextual Prominence of Elementary Prosodic Contours},
  year=2018,
  booktitle={Proc. Interspeech 2018},
  pages={2524--2528},
  doi={10.21437/Interspeech.2018-1286},
  url={http://dx.doi.org/10.21437/Interspeech.2018-1286}
}