Distributed Algorithms for Computing Very Large Thresholded Covariance Matrices
Abstract
Computation of covariance matrices from observed data is an important problem, as such matrices are used in applications such as PCA, LDA, and increasingly in the learning and application of probabilistic graphical models. One of the most challenging aspects of constructing and managing covariance matrices is that they can be huge, and their size makes them expensive to compute. For a p-dimensional data set with n rows, the covariance matrix will have p(p-1)/2 distinct off-diagonal entries, and the naive algorithm to compute the matrix takes O(np^2) time. For large p (greater than 10,000) and n much greater than p, this is debilitating. In this thesis, we consider the problem of computing a large covariance matrix efficiently in a distributed fashion over a large data set. We begin by considering the naive algorithm in detail, pointing out where it will and will not be feasible. We then consider reducing the time complexity using sampling-based methods to compute an approximate, thresholded version of the covariance matrix. Here "thresholding" means that all of the unimportant values in the matrix have been dropped and replaced with zeroes. Our algorithms have probabilistic bounds which imply that, with high probability, all of the top K entries in the matrix have been retained.
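As a point of reference for the abstract, the following is a minimal sketch of the naive O(np^2) covariance computation and a simple top-K thresholding step. The function names and the exact thresholding rule (keep the K largest-magnitude off-diagonal entries) are illustrative assumptions, not the thesis's distributed algorithms.

```python
# Illustrative sketch only: naive covariance plus top-K thresholding.
# Names and the thresholding rule are assumptions, not from the thesis.
import numpy as np

def naive_covariance(X):
    """Sample covariance of an n x p data matrix; the matrix product is the O(n p^2) step."""
    n, p = X.shape
    centered = X - X.mean(axis=0)
    return centered.T @ centered / (n - 1)

def threshold_top_k(cov, k):
    """Zero out all off-diagonal entries except the k largest in magnitude (diagonal kept)."""
    p = cov.shape[0]
    out = np.zeros_like(cov)
    np.fill_diagonal(out, np.diag(cov))
    # Each off-diagonal pair is considered once; the matrix is symmetric.
    iu = np.triu_indices(p, k=1)
    if k > 0:
        top = np.argsort(np.abs(cov[iu]))[-k:]
        rows, cols = iu[0][top], iu[1][top]
        out[rows, cols] = cov[rows, cols]
        out[cols, rows] = cov[cols, rows]
    return out

X = np.random.default_rng(0).normal(size=(100, 5))
cov = naive_covariance(X)
sparse_cov = threshold_top_k(cov, k=3)
```

For large p this dense computation is exactly what becomes infeasible; the thesis's sampling-based methods aim to recover the top-K entries without forming the full product.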
Citation
Gao, Zekai. "Distributed Algorithms for Computing Very Large Thresholded Covariance Matrices." Master's Thesis, Rice University, 2014. https://hdl.handle.net/1911/87863.