Kent Academic Repository

The Common Prosody Platform (CPP) — Where Theories of Prosody Can Be Directly Compared

Prom-on, Santitham, Xu, Yi, Gu, Wentao, Arvaniti, Amalia, Nam, Hosung and Whalen, D. H. (2016) The Common Prosody Platform (CPP) — Where Theories of Prosody Can Be Directly Compared. In: Proceedings of Speech Prosody 2016. Speech Prosody Special Interest Group (SProSIG), Urbana, USA (doi:10.21437/SpeechProsody.2016) (KAR id:60723)

Abstract

This paper introduces the Common Prosody Platform (CPP), a computational platform that implements major theories and models of prosody. CPP aims at a) adapting theory-specific assumptions into computational algorithms that can generate surface prosodic forms, and b) making all the models trainable through global optimization based on automatic analysis-by-synthesis learning. CPP allows examination of prosody in much finer detail than has been previously possible and provides a means for speech scientists to directly compare theories and their models. So far, four theories have been included in the platform: the Command-Response model, the Autosegmental-Metrical theory, the Task Dynamic model, and the Parallel Encoding and Target Approximation model. Preliminary tests show that all the implemented models can achieve good local contour fitting with low errors and high correlations.
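To make the analysis-by-synthesis idea concrete, the sketch below fits a hypothetical target-approximation-style F0 synthesizer to an observed contour by minimizing resynthesis error. The equation, parameter names and helper functions (synthesize_f0, fit_by_synthesis) are illustrative assumptions for this page, not the CPP's actual implementation or code.

import numpy as np
from scipy.optimize import least_squares

def synthesize_f0(params, t, f0_start):
    # Exponential approach from f0_start toward a linear pitch target
    # (slope m, height b) at rate lam: a simplified stand-in for a
    # target-approximation model of a single syllable.
    m, b, lam = params
    target = m * t + b
    return target + (f0_start - b) * np.exp(-lam * t)

def fit_by_synthesis(t, f0_observed):
    # Analysis-by-synthesis: search for the parameters whose synthesized
    # contour best matches the observed one in the least-squares sense.
    def residual(params):
        return synthesize_f0(params, t, f0_observed[0]) - f0_observed
    init = np.array([0.0, f0_observed.mean(), 20.0])  # slope, height, rate
    result = least_squares(residual, init)
    synth = synthesize_f0(result.x, t, f0_observed[0])
    rmse = np.sqrt(np.mean((synth - f0_observed) ** 2))
    corr = np.corrcoef(synth, f0_observed)[0, 1]
    return result.x, rmse, corr

# Usage: fit a short rising contour sampled every 10 ms.
t = np.linspace(0.0, 0.2, 21)
f0 = 120.0 + 30.0 * (1 - np.exp(-25.0 * t)) + np.random.normal(0, 0.5, t.size)
params, rmse, corr = fit_by_synthesis(t, f0)
print(f"fitted params={params}, RMSE={rmse:.2f} Hz, r={corr:.3f}")

The RMSE and correlation computed here correspond to the kind of local contour-fitting measures the abstract reports; in the CPP the same optimization scheme is applied across the different theory-specific synthesizers so that their fits can be compared directly.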

Item Type: Conference or workshop item (Proceeding)
DOI/Identification number: 10.21437/SpeechProsody.2016
Uncontrolled keywords: speech prosody, theory comparison, software package, parametric modeling
Divisions: Division of Arts and Humanities > School of Culture and Languages
Depositing User: Amalia Arvaniti
Date Deposited: 06 Mar 2017 10:34 UTC
Last Modified: 08 Dec 2022 22:53 UTC
Resource URI: https://kar.kent.ac.uk/id/eprint/60723 (The current URI for this page, for reference purposes)
