Topics on LASSO and Approximate Message Passing

dc.contributor.advisorBaraniuk, Richard G.en_US
dc.contributor.committeeMemberVeeraraghavan, Ashoken_US
dc.contributor.committeeMemberZhang, Yinen_US
dc.creatorMousavi, Alien_US
dc.date.accessioned2014-09-22T19:57:53Zen_US
dc.date.available2014-09-22T19:57:53Zen_US
dc.date.created2014-05en_US
dc.date.issued2014-04-25en_US
dc.date.submittedMay 2014en_US
dc.date.updated2014-09-22T19:57:54Zen_US
dc.description.abstractThis thesis studies the performance of the LASSO (also known as basis pursuit denoising) for recovering sparse signals from undersampled, randomized, noisy measurements. We consider the recovery of the signal $$x_o \in \mathbb{R}^N$$ from $$n$$ random and noisy linear observations $$y= Ax_o + w$$, where $$A$$ is the measurement matrix and $$w$$ is the noise. The LASSO estimate of $$x_o$$ is given by the solution to the optimization problem $$\hat{x}_{\lambda} = \arg \min_x \frac{1}{2} \|y-Ax\|_2^2 + \lambda \|x\|_1$$. Despite major progress in the theoretical analysis of the LASSO solution, little is known about its behavior as a function of the regularization parameter $$\lambda$$. In this thesis we study two questions in the asymptotic setting (i.e., where $$N \rightarrow \infty$$, $$n \rightarrow \infty$$ while the ratio $$n/N$$ converges to a fixed number in $$(0,1)$$): (i) How does the size of the active set $$\|\hat{x}_\lambda\|_0/N$$ behave as a function of $$\lambda$$? (ii) How does the mean square error $$\|\hat{x}_{\lambda} - x_o\|_2^2/N$$ behave as a function of $$\lambda$$? We then employ these results in a new, reliable algorithm for solving the LASSO based on approximate message passing (AMP). Furthermore, we propose a parameter-free AMP algorithm that sets the threshold parameter at each iteration in a fully automatic way, without requiring any information about the signal to be reconstructed or any tuning from the user. We show that the proposed method attains the minimum reconstruction error in the fewest iterations. Our method is based on applying the Stein unbiased risk estimate (SURE) along with a modified gradient descent to find the optimal threshold in each iteration. Motivated by the connections between AMP and the LASSO, our method can also be employed to find the LASSO solution at the optimal regularization parameter.
To the best of our knowledge, this is the first work on parameter tuning that obtains the smallest MSE in the fewest iterations with theoretical guarantees.en_US
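The AMP recursion underlying the abstract can be sketched as follows. This is a minimal illustration only, not the thesis's SURE-tuned algorithm: the simple threshold rule $$\tau_t = \alpha \hat{\sigma}_t$$ (with $$\hat{\sigma}_t$$ estimated from the residual) and all function and parameter names here are assumptions standing in for the tuning scheme the thesis develops.

```python
import numpy as np

def soft_threshold(v, tau):
    # Soft-thresholding: the proximal operator of the l1 norm
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def amp(y, A, tau_scale=2.0, iters=50):
    """Basic AMP iteration for y = A x + w with soft thresholding.

    The threshold tau_t = tau_scale * sigma_t uses a residual-based noise
    estimate; the thesis replaces this heuristic with SURE-based tuning.
    """
    n, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(iters):
        # Pseudo-data: current estimate plus back-projected residual
        v = x + A.T @ z
        # Estimate the effective noise level from the residual
        sigma = np.linalg.norm(z) / np.sqrt(n)
        x = soft_threshold(v, tau_scale * sigma)
        # Residual with the Onsager correction term; for soft thresholding
        # the correction is (||x||_0 / n) times the previous residual
        z = y - A @ x + (np.count_nonzero(x) / n) * z
    return x
```

On a sparse problem with i.i.d. Gaussian $$A$$ (entries scaled by $$1/\sqrt{n}$$), this recursion typically converges to an accurate estimate in a few dozen iterations; the quality of the fixed `tau_scale` heuristic is exactly what the parameter-free method in the thesis avoids having to hand-tune.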
dc.format.mimetypeapplication/pdfen_US
dc.identifier.citationMousavi, Ali. "Topics on LASSO and Approximate Message Passing." (2014) Master’s Thesis, Rice University. <a href="https://hdl.handle.net/1911/77216">https://hdl.handle.net/1911/77216</a>.en_US
dc.identifier.urihttps://hdl.handle.net/1911/77216en_US
dc.language.isoengen_US
dc.rightsCopyright is held by the author, unless otherwise indicated. Permission to reuse, publish, or reproduce the work beyond the bounds of fair use or other exemptions to copyright law must be obtained from the copyright holder.en_US
dc.subjectLASSOen_US
dc.subjectSparsityen_US
dc.subjectMessage passingen_US
dc.titleTopics on LASSO and Approximate Message Passingen_US
dc.typeThesisen_US
dc.type.materialTexten_US
thesis.degree.departmentElectrical and Computer Engineeringen_US
thesis.degree.disciplineEngineeringen_US
thesis.degree.grantorRice Universityen_US
thesis.degree.levelMastersen_US
thesis.degree.nameMaster of Scienceen_US
Files
Original bundle:
MOUSAVI-DOCUMENT-2014.pdf (1.83 MB, Adobe Portable Document Format)
License bundle:
LICENSE.txt (2.6 KB, Plain Text)
PROQUEST_LICENSE.txt (5.83 KB, Plain Text)