Recovering high dimensional sparse signals from noisy, low dimensional measurements is a long-standing problem known as compressive sensing, and it is central to both signal processing and machine learning.
Although compressive sensing has advanced considerably, only a few algorithms can offer credible support recovery and estimation performance without knowledge of either the noise variance or the signal sparsity. In this paper, Sreejith Kallummil of Walmart Global Technology Solutions India, Bangalore, India, and Prof. Sheetal Kalyani of the Department of Electrical Engineering, Indian Institute of Technology Madras, Chennai, India, perform sparse recovery without relying on any knowledge of noise variance or signal sparsity.
Four sparse recovery scenarios are considered in this paper:
1. Single Measurement Vector (SMV)
2. Block Single Measurement Vector (BSMV)
3. Multiple Measurement Vector (MMV)
4. Block Multiple Measurement Vector (BMMV)
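For concreteness, the SMV and MMV measurement models can be sketched as follows (a minimal NumPy illustration with invented problem sizes; the block variants additionally group the non-zero entries into contiguous blocks):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 32, 64, 4          # measurements, ambient dimension, sparsity (illustrative sizes)

# SMV model: y = A @ x + w, where x has only k non-zero entries.
A = rng.standard_normal((n, m)) / np.sqrt(n)   # random sensing matrix
x = np.zeros(m)
support = rng.choice(m, size=k, replace=False)  # unknown support
x[support] = rng.standard_normal(k)
w = 0.05 * rng.standard_normal(n)               # noise of unknown variance
y = A @ x + w

# MMV model: L measurement vectors whose sparse columns share one support.
L = 3
X = np.zeros((m, L))
X[support, :] = rng.standard_normal((k, L))
Y = A @ X + 0.05 * rng.standard_normal((n, L))
```

The recovery task is to estimate `support` (and the non-zero values) from `y` or `Y` alone, without being told `k` or the noise level.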
Many sparse recovery algorithms have been proposed for these four scenarios, including extensions of algorithms such as orthogonal matching pursuit (OMP) and the least absolute shrinkage and selection operator (LASSO), which were originally developed for the SMV scenario.
The technique proposed in this paper generalizes a support recovery method called residual ratio thresholding (RRT), which was first developed by the same authors and published at ICML 2018, a top-rated machine learning conference.
A sequence of statistics called residual ratios is computed; these are associated with a sequence of candidate supports generated by OMP and are compared against a sequence of deterministic thresholds.
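That idea can be sketched in a few lines (an illustrative sketch, not the authors' implementation; in the paper the deterministic thresholds come from Beta-distribution quantiles, whereas here the threshold sequence is a user-supplied stand-in):

```python
import numpy as np

def omp_residual_ratios(A, y, k_max):
    """Run OMP for k_max iterations, returning the candidate supports and
    the residual ratios ||r_k|| / ||r_{k-1}|| described in the article."""
    support, supports = [], []
    residual = y.copy()
    norms = [np.linalg.norm(y)]
    for _ in range(k_max):
        # Greedy step: pick the column most correlated with the residual.
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        A_s = A[:, support]
        coef, *_ = np.linalg.lstsq(A_s, y, rcond=None)
        residual = y - A_s @ coef
        norms.append(np.linalg.norm(residual))
        supports.append(list(support))
    ratios = [norms[k + 1] / norms[k] for k in range(k_max)]
    return supports, ratios

def rrt_select(supports, ratios, thresholds):
    """Return the support at the last stage whose residual ratio falls below
    its threshold (thresholds here stand in for the deterministic sequence
    derived in the paper)."""
    passing = [k for k, (r, t) in enumerate(zip(ratios, thresholds)) if r < t]
    return supports[passing[-1]] if passing else supports[0]
```

On a well-conditioned problem the residual ratios drop sharply while true support elements are being added and hover near one afterwards, which is the behaviour the thresholding exploits.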
However, RRT has two limitations:
1. RRT is not applicable to BSMV, MMV, or BMMV scenarios.
2. Even in SMV scenarios, RRT is applicable only to OMP and not to algorithms like LASSO.
To overcome these limitations, the authors developed a support aggregation strategy that allows non-OMP algorithms like LASSO to operate in all four scenarios within the RRT framework. The resulting “generalized RRT” (GRRT) combines support aggregation, a generalized threshold, and the RRT technique.
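One way to picture the aggregation step (an illustrative reading, not the authors' exact procedure): whatever algorithm produced them, an ordered sequence of candidate supports can be turned into residual ratios and handed to the same thresholding comparison.

```python
import numpy as np

def residual_ratios_for_supports(A, y, supports):
    """Project y onto each candidate support in turn and form the ratio of
    successive residual norms. The supports could come from any algorithm,
    e.g. the distinct supports collected along a LASSO regularization path."""
    norms = [np.linalg.norm(y)]
    for S in supports:
        A_s = A[:, sorted(S)]                       # least-squares fit on S
        coef, *_ = np.linalg.lstsq(A_s, y, rcond=None)
        norms.append(np.linalg.norm(y - A_s @ coef))
    return [norms[k + 1] / norms[k] for k in range(len(supports))]
```

This decouples the residual-ratio statistic from OMP's greedy iterations, which is what lets the framework cover algorithms like LASSO.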
Using GRRT, support recovery guarantees were derived for algorithms such as simultaneous OMP (SOMP), block OMP (BOMP), and LASSO, without any knowledge of noise variance or sparsity. To the best of the authors' knowledge, this is the first time such sparsity and noise variance agnostic guarantees have been derived.
Thus, a novel support recovery technique enables sparsity and noise variance agnostic operation of support recovery algorithms in all four sparse recovery scenarios. Algorithms operated using GRRT were found to suffer only a small signal-to-noise ratio penalty compared with the same algorithms supplied with knowledge of noise variance and sparsity. In practice, one rarely knows the sparsity level of the underlying signal or the variance of the noise perturbing the measurements, so this method makes compressive sensing applicable to real-life applications.
Prof. Chandra Murthy from the Department of Electrical Communication Engineering, Indian Institute of Science, Bangalore, India, emphasized the importance of this work with the following comments: “In the paper “Generalized residual ratio thresholding”, Kallummil and Kalyani tackle the difficult problem of sparse signal recovery, which is the problem of recovering high dimensional vectors from noisy, underdetermined, linear measurements, when either the noise variance or the sparsity level is unknown. Further, they consider a bouquet of related sparse signal recovery problems with additional structure, such as block sparsity, joint sparsity, and block-and-joint sparsity. There are two key contributions. The first is the development of novel algorithms based on the idea of generalized residual ratio thresholding, which can recover sparse vectors without requiring knowledge of either the noise variance or sparsity level. Second, and more importantly, they derive rigorous theoretical guarantees for the developed algorithms, in terms of their ability to correctly recover sparse vectors from a finite number of noisy underdetermined linear measurements. While the guarantees are mathematically deep, it suffices to say here that they come at a negligible cost in both computational complexity and performance of the algorithms. The paper also provides extensive numerical evidence for the efficacy of the algorithms, by comparing them with the state-of-the-art. Overall, this is an impactful and beautifully written paper that breaks new ground for the use of sparse recovery algorithms in a myriad of practically useful applications.”
Article by Akshay Anantharaman
Here is the original link to the paper: