
A Conditional Generative Model for Speech Enhancement

Li, Zheng-xi, Dai, Li-Rong, Song, Yan, McLoughlin, Ian Vince (2018) A Conditional Generative Model for Speech Enhancement. Circuits, Systems, and Signal Processing. ISSN 0278-081X. E-ISSN 1531-5878. (doi:10.1007/s00034-018-0798-4) (KAR id:66126)

PDF (Author's Accepted Manuscript, not final)
Language: English
Official URL
http://dx.doi.org/10.1007/s00034-018-0798-4

Abstract

Deep learning based speech enhancement approaches such as Deep Neural Networks (DNN) and Long Short-Term Memory (LSTM) networks have already demonstrated results superior to classical methods. However, they remain prone to issues such as over-smoothing of the estimated spectra.

This paper proposes a novel architecture to address these issues, which we term a conditional generative model (CGM). By adopting an adversarial training scheme applied to a generator of deep dilated convolutional layers, the CGM is designed to model the joint and symmetric conditions of both the noisy and the estimated clean spectra. We evaluate CGM against both DNN and LSTM baselines in terms of Perceptual Evaluation of Speech Quality (PESQ) and Short-Time Objective Intelligibility (STOI) on TIMIT sentences corrupted by ITU-T P.501 and NOISEX-92 noise, in a range of matched and mismatched noise conditions. Results show that both the CGM architecture and the adversarial training mechanism lead to better PESQ and STOI in all tested noise conditions. In addition to yielding these significant improvements, CGM and adversarial training both mitigate against over-smoothing.
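The generator described in the abstract stacks deep dilated convolutional layers. As a rough illustration of why dilation is attractive here, the following numpy sketch implements a single 1-D dilated convolution; the kernel, dilation rate, and input are illustrative assumptions, not the paper's actual CGM configuration:

```python
import numpy as np

def dilated_conv1d(x, w, dilation):
    """Valid-mode 1-D convolution of signal x with kernel w,
    sampling input taps `dilation` steps apart."""
    k = len(w)
    span = (k - 1) * dilation + 1      # receptive field of this one layer
    out_len = len(x) - span + 1
    # Gather the dilated taps for every output position, then combine.
    taps = np.stack([x[i * dilation : i * dilation + out_len] for i in range(k)])
    return w @ taps

# Stacking layers with dilations 1, 2, 4, ... grows the receptive field
# exponentially in depth while parameter count grows only linearly --
# the usual motivation for dilated generators over plain convolutions.
x = np.arange(16, dtype=float)                      # toy ramp "spectrum" slice
y = dilated_conv1d(x, np.array([1.0, 0.0, -1.0]), dilation=2)
# For this difference kernel, each output is x[t] - x[t + 4] = -4.0 on a ramp.
```

Stacking three such layers with kernel size 3 and dilations 1, 2, 4 would cover a receptive field of (2+4+8)+1 = 15 samples with only 9 weights, which is the kind of context/parameter trade-off dilated architectures exploit.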

Item Type: Article
DOI/Identification number: 10.1007/s00034-018-0798-4
Uncontrolled keywords: Deep learning, Speech enhancement, Generative model
Subjects: T Technology
Divisions: Faculties > Sciences > School of Computing > Data Science
Depositing User: Ian McLoughlin
Date Deposited: 26 Feb 2018 10:39 UTC
Last Modified: 09 Jul 2019 09:13 UTC
Resource URI: https://kar.kent.ac.uk/id/eprint/66126 (The current URI for this page, for reference purposes)
McLoughlin, Ian Vince: https://orcid.org/0000-0001-7111-2008
