Adaptive Block Transform Coding of Speech Based on LPC Vector Quantization

Yunus Hussain, Nariman Farvardin

Research output: Contribution to journal › Article › peer-review


Abstract

In this paper we describe several adaptive block transform speech coding systems based on vector quantization of LPC parameters. To account for power fluctuations, the speech signal is normalized to have a unit-energy prediction residual. The temporal variations in the short-term spectrum, on the other hand, are taken into account by vector quantizing the LPC parameters associated with the vector of speech samples and transmitting the codeword index. A variation of the scheme in which the pitch information is used to better estimate the spectrum is also considered. For each block, based on the codevector associated with the input vector, an optimum bit assignment map is used to quantize the transform coefficients. We consider two types of zero-memory quantizers for encoding the transform coefficients, namely the Lloyd-Max quantizer and the entropy-coded quantizer. The performance of these schemes is compared with that of other adaptive transform coding schemes. We show by means of simulations that the system based on the entropy-coded quantizer performs best, yielding in most cases as much as a 5-dB improvement in segmental signal-to-noise ratio over the adaptive block transform coding scheme of Zelinski and Noll. The effect of incorporating the pitch information in the coder is studied and numerical results are presented. The effects of the bit rate and the size of the codebook on the performance of the systems are also studied in detail.
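The bit assignment step described above — distributing a fixed bit budget over transform coefficients according to their (codevector-implied) variances — is conventionally done with the high-rate log-variance rule. A minimal sketch, assuming the per-coefficient variances are available (the function name and the integer-adjustment heuristic are illustrative, not taken from the paper):

```python
import math

def bit_allocation(variances, total_bits):
    """Classic high-rate bit assignment for transform coefficients:
    b_i = B/N + 0.5 * log2(var_i / geometric_mean(var)),
    rounded to non-negative integers and adjusted to meet the budget."""
    n = len(variances)
    log_gm = sum(math.log2(v) for v in variances) / n  # log2 of geometric mean
    raw = [total_bits / n + 0.5 * (math.log2(v) - log_gm) for v in variances]
    bits = [max(0, round(b)) for b in raw]
    # Greedy adjustment so the integer allocation sums to total_bits:
    while sum(bits) > total_bits:
        j = max(range(n), key=lambda i: bits[i])          # take from the largest
        bits[j] -= 1
    while sum(bits) < total_bits:
        j = min(range(n), key=lambda i: bits[i] - raw[i])  # give to most under-allocated
        bits[j] += 1
    return bits
```

High-variance coefficients receive proportionally more bits, which is what makes the allocation adaptive once the codevector selects the variance profile for the block.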

Original language: English
Pages (from-to): 2611-2620
Number of pages: 10
Journal: IEEE Transactions on Signal Processing
Volume: 39
Issue number: 12
State: Published - Dec 1991
