Structured fixed-rate vector quantizer derived from a variable-length scalar quantizer. Part II. Vector sources

Rajiv Laroia, Nariman Farvardin

Research output: Contribution to journal › Article › peer-review

7 Scopus citations

Abstract

The fixed-rate scalar-vector quantizer (SVQ) was described in Part I for quantizing stationary memoryless sources. In this sequel, the SVQ is extended to a specific type of vector source in which each component is a stationary memoryless scalar subsource independent of the other components. The algorithms for the design and implementation of the original SVQ are modified to apply to this case. The resulting SVQ, referred to as the extended SVQ (ESVQ), is then used to quantize stationary sources with memory (with known autocorrelation function). This is done by first using a linear orthonormal block transformation, such as the Karhunen-Loeve transform, to decorrelate a block of source samples. The transform output vectors, which can be approximated as the output of an independent-component vector source, are then quantized using the ESVQ. Numerical results are presented for the quantization of first-order Gauss-Markov sources using this scheme. It is shown that the ESVQ-based scheme performs very close to entropy-coded transform quantization while maintaining a fixed-rate output, and outperforms the fixed-rate scheme that uses scalar Lloyd-Max quantization of the transform coefficients. Finally, it is shown that this scheme also performs better than implementable vector quantizers, especially at high rates.
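
The transform front end described in the abstract (decorrelating a block of a correlated source with the Karhunen-Loeve transform before fixed-rate quantization of the coefficients) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes a unit-variance first-order Gauss-Markov source with correlation coefficient rho, builds the KLT from the known autocorrelation matrix R[i, j] = rho^|i - j|, and stands in for the ESVQ stage with a hypothetical placeholder function quantize_coefficients.

```python
import numpy as np

def gauss_markov_autocorrelation(n, rho):
    """Autocorrelation matrix R[i, j] = rho**|i - j| of a unit-variance
    first-order Gauss-Markov (AR(1)) source, assumed known as in the paper."""
    idx = np.arange(n)
    return rho ** np.abs(idx[:, None] - idx[None, :])

def klt_matrix(R):
    """Orthonormal KLT for autocorrelation R: rows are eigenvectors of R,
    ordered by decreasing eigenvalue (i.e., decreasing coefficient variance)."""
    eigvals, eigvecs = np.linalg.eigh(R)          # R is symmetric
    order = np.argsort(eigvals)[::-1]
    return eigvecs[:, order].T, eigvals[order]

def quantize_coefficients(coeffs, variances):
    """Hypothetical placeholder for the ESVQ coefficient quantizer described
    in the paper; here the coefficients are passed through unchanged."""
    return coeffs

# Example: decorrelate one block of an AR(1) source with rho = 0.9.
rng = np.random.default_rng(0)
n, rho = 8, 0.9
x = np.empty(n)
x[0] = rng.standard_normal()
for i in range(1, n):
    x[i] = rho * x[i - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()

R = gauss_markov_autocorrelation(n, rho)
T, variances = klt_matrix(R)                      # variances = eigenvalues of R
y = T @ x                                         # approximately uncorrelated coefficients
y_hat = quantize_coefficients(y, variances)       # ESVQ stage would go here
x_hat = T.T @ y_hat                               # inverse transform (T is orthonormal)
```

Because the transform is orthonormal, reconstruction is just the transpose applied to the quantized coefficients, and the eigenvalues of R give the per-coefficient variances on which the design of the coefficient quantizer would depend.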

Original language: English
Pages (from-to): 868-876
Number of pages: 9
Journal: IEEE Transactions on Information Theory
Volume: 39
Issue number: 3
DOIs
State: Published - 1993
