Optimal precoding design and power allocation for decentralized detection of deterministic signals

Jun Fang, Hongbin Li, Zhi Chen, Shaoqian Li

Research output: Contribution to journal › Article › peer-review

22 Scopus citations

Abstract

We consider a decentralized detection problem in a power-constrained wireless sensor network (WSN), in which a number of sensor nodes collaborate to detect the presence of a deterministic vector signal. The signal to be detected is assumed known a priori. Each sensor performs local linear processing to convert its observations into one or multiple messages. The messages are conveyed to the fusion center (FC) by an uncoded amplify-and-forward scheme, where a global decision is made. Given a total network transmit power constraint, we investigate the optimal linear processing strategy for each sensor. Our study finds that the optimal linear precoder has the form of a matched filter. Depending on the channel characteristics, one or multiple versions of the filtered/compressed message should be reported to the FC. In addition, assuming a fixed total transmit power, we examine how the detection performance behaves as the number of sensors in the network grows. Analysis shows that increasing the number of sensors can substantially improve the system's detection reliability. Finally, decentralized detection with unknown signals is studied and a heuristic precoding design is proposed. Numerical simulations are conducted to corroborate our theoretical analysis and to illustrate the performance of the proposed algorithm.
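The pipeline the abstract describes can be sketched in a few lines: each sensor applies a matched-filter precoder to its noisy observation of the known signal, forwards the resulting scalar message with an uncoded amplify-and-forward gain under a total power budget, and the FC combines the received messages into a test statistic. The signal, power budget, noise levels, and uniform power allocation below are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumed, not from the paper)
s = np.array([1.0, -0.5, 0.8, 0.3])  # known deterministic signal
K = 20                               # number of sensors
P_total = 10.0                       # total network transmit power
sigma_obs = 1.0                      # observation noise std at each sensor
sigma_ch = 0.5                       # channel noise std at the FC

# Matched-filter precoder: project each observation onto the known signal
w = s / np.linalg.norm(s)

def fc_statistic(signal_present: bool) -> float:
    """One trial: sensors observe, matched-filter, amplify-and-forward; FC sums."""
    gain = np.sqrt(P_total / K)  # uniform power allocation (simplifying assumption)
    messages = []
    for _ in range(K):
        x = sigma_obs * rng.standard_normal(s.size)
        if signal_present:
            x = x + s
        m = w @ x                   # local matched-filter output (scalar message)
        messages.append(gain * m)   # uncoded amplify-and-forward
    # Each message reaches the FC through a noisy channel; FC combines coherently
    y = np.array(messages) + sigma_ch * rng.standard_normal(K)
    return y.sum()

# Monte Carlo sanity check: statistic should be larger on average under H1
t0 = np.mean([fc_statistic(False) for _ in range(500)])
t1 = np.mean([fc_statistic(True) for _ in range(500)])
print(f"mean FC statistic: H0={t0:.2f}, H1={t1:.2f}")
```

Comparing the statistic against a threshold chosen for a target false-alarm rate then yields the global decision; the paper's contribution is showing that this matched-filter form of the precoder is optimal and how power should be allocated across (possibly multiple) messages per sensor.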

Original language: English
Article number: 6168287
Pages (from-to): 3149-3163
Number of pages: 15
Journal: IEEE Transactions on Signal Processing
Volume: 60
Issue number: 6
DOIs
State: Published - Jun 2012

Keywords

  • Decentralized detection
  • Detection outage
  • Precoding design
  • Wireless sensor networks
