Improving Data Integrity with Randomness -- A Compressive Sensing Approach [Conference Paper]

NESL Technical Report #: 2009-6-2


Abstract: Data loss in wireless sensor systems is inevitable, whether due to exogenous causes (such as transmission-medium impediments) or endogenous ones (such as faulty sensors). While there have been many attempts at coping with this issue, recent developments in the area of Compressive Sensing (CS) enable a new perspective. Since many natural signals are compressible, it is possible to employ CS not only to reduce the effective sampling rate but also to improve the robustness of the system at a given Quality of Information (QoI). This is possible because reconstruction algorithms for compressively sampled signals are not hampered by the stochastic nature of wireless link disturbances and sensor malfunctions, which has traditionally plagued attempts at proactively handling the effects of these errors. In this paper, we show that the reconstruction error can be held unchanged despite extreme data losses by marginally increasing the average sampling rate. A challenge with this approach is that link errors and sensor faults produce bursty, exponentially distributed losses, while CS strategies assume sampling instants drawn independently and uniformly at random. We show that a simple re-ordering of samples prior to communication re-enables successful reconstruction with high probability.
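The re-ordering idea from the abstract can be sketched as follows. This is a minimal illustration under our own assumptions, not the paper's implementation: the sender transmits samples in a pseudo-random order shared with the receiver, so a contiguous burst of channel losses maps to scattered, uniform-random erasures in the original index space, which matches the random-sampling model CS reconstruction assumes.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
samples = np.sin(2 * np.pi * np.arange(n) / n)  # stand-in for a compressible signal

# Sender: transmit the samples in a random order agreed with the receiver
# (e.g., a permutation derived from a shared seed).
perm = rng.permutation(n)
tx = samples[perm]

# Channel: a bursty loss wipes out a contiguous run of transmissions.
lost = np.zeros(n, dtype=bool)
lost[40:60] = True  # a 20-sample burst

# Receiver: map the surviving transmissions back to their original indices.
rx_idx = perm[~lost]   # original positions of the surviving samples
rx_val = tx[~lost]     # their values

# In the original index space, the burst now appears as scattered erasures,
# leaving an (approximately) uniform random subset of sampling instants
# that a CS reconstruction algorithm can work with.
```

Because the permutation is known to both ends, no side information needs to be transmitted; the receiver simply inverts the ordering before running reconstruction.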

Publication Forum: Annual Conference of the ITA

Page Count: 2

Date: 2009-09-10

Place: Maryland, USA

Public Document?: Yes

NESL Document?: Yes

Document category: Conference Paper