TY - JOUR
T1 - Structured fixed-rate vector quantizer derived from a variable-length scalar quantizer. Part II. Vector sources
AU - Laroia, Rajiv
AU - Farvardin, Nariman
PY - 1993
Y1 - 1993
N2 - The fixed-rate scalar-vector quantizer (SVQ) was described in Part I for quantizing stationary memoryless sources. In this sequel, the SVQ is extended to a specific type of vector source in which each component is a stationary memoryless scalar subsource independent of the other components. Algorithms for the design and implementation of the original SVQ are modified to apply to this case. The resulting SVQ, referred to as the extended SVQ (ESVQ), is then used to quantize stationary sources with memory (with known autocorrelation function). This is done by first using a linear orthonormal block transformation, such as the Karhunen-Loeve transform, to decorrelate a block of source samples. The transform output vectors, which can be approximated as the output of an independent-component vector source, are then quantized using the ESVQ. Numerical results are presented for the quantization of first-order Gauss-Markov sources using this scheme. It is shown that the ESVQ-based scheme performs very close to entropy-coded transform quantization while maintaining a fixed-rate output, and outperforms the fixed-rate scheme that uses scalar Lloyd-Max quantization of the transform coefficients. Finally, it is shown that this scheme also performs better than implementable vector quantizers, especially at high rates.
UR - http://www.scopus.com/inward/record.url?scp=0027597057&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0027597057&partnerID=8YFLogxK
U2 - 10.1109/18.256494
DO - 10.1109/18.256494
M3 - Article
AN - SCOPUS:0027597057
SN - 0018-9448
VL - 39
SP - 868
EP - 876
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 3
ER -