Advisor: Jermaine, Christopher
Date accessioned: 2016-01-15
Date available: 2016-01-15
Date issued: December 2014
Date submitted: 2014-09-26
Citation: Gao, Zekai. "Distributed Algorithms for Computing Very Large Thresholded Covariance Matrices." Master's Thesis, Rice University, December 2014. https://hdl.handle.net/1911/87863.
URI: https://hdl.handle.net/1911/87863
Abstract: Computation of covariance matrices from observed data is an important problem, as such matrices are used in applications such as PCA, LDA, and increasingly in the learning and application of probabilistic graphical models. One of the most challenging aspects of constructing and managing covariance matrices is that they can be huge, and their size makes them expensive to compute. For a p-dimensional data set with n rows, the covariance matrix has p(p-1)/2 distinct off-diagonal entries, and the naive algorithm to compute the matrix takes O(np^2) time. For large p (greater than 10,000) and n much greater than p, this is prohibitive. In this thesis, we consider the problem of computing a large covariance matrix efficiently, in a distributed fashion, over a large data set. We begin by considering the naive algorithm in detail, pointing out where it will and will not be feasible. We then consider reducing the time complexity using sampling-based methods to compute an approximate, thresholded version of the covariance matrix. Here "thresholding" means that all of the unimportant values in the matrix have been dropped and replaced with zeroes. Our algorithms come with probabilistic bounds which imply that, with high probability, all of the top K entries in the matrix have been retained.
Format: application/pdf
Language: eng
Rights: Copyright is held by the author, unless otherwise indicated. Permission to reuse, publish, or reproduce the work beyond the bounds of fair use or other exemptions to copyright law must be obtained from the copyright holder.
Keywords: Distributed algorithms; covariance matrices
Title: Distributed Algorithms for Computing Very Large Thresholded Covariance Matrices
Type: Thesis
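
To make the abstract's baseline concrete, the following is a minimal single-machine sketch of the naive O(np^2) covariance computation followed by a direct thresholding step that keeps only the K largest-magnitude off-diagonal entries. It is an illustration in NumPy, not the thesis's distributed algorithm; the function name `thresholded_covariance` and the choice of absolute value as the importance measure are assumptions for the example.

```python
import numpy as np

def thresholded_covariance(X, k):
    """Naive covariance (O(n p^2) work), then zero out every
    off-diagonal entry except the k largest by absolute value."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)            # center each column
    cov = (Xc.T @ Xc) / (n - 1)        # p x p sample covariance

    # Rank off-diagonal entries by magnitude; exclude the diagonal
    # so variances are always retained.
    scores = np.abs(cov).astype(float)
    np.fill_diagonal(scores, -np.inf)
    cutoff = np.partition(scores, -k, axis=None)[-k]  # k-th largest
    mask = scores >= cutoff
    np.fill_diagonal(mask, True)
    return np.where(mask, cov, 0.0)

# Tiny usage example: 1,000 rows, 50 dimensions, keep top 100 entries.
X = np.random.default_rng(0).normal(size=(1000, 50))
C = thresholded_covariance(X, k=100)
```

Because the covariance matrix is symmetric, the retained entries come in mirrored pairs, so "top K" here counts both copies of each off-diagonal value; a real implementation would likely work on one triangle only.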
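
The abstract also mentions sampling-based methods for reducing the O(np^2) cost. As a rough illustration of the general idea only (the thesis's actual sampling scheme and its probabilistic bounds are not reproduced here), a uniform row sample of size m << n cuts the work to O(mp^2) while, with high probability, preserving the large entries that thresholding would keep. The function name `sampled_covariance` is hypothetical.

```python
import numpy as np

def sampled_covariance(X, m, seed=None):
    """Estimate the covariance from a uniform sample of m rows,
    reducing the cost from O(n p^2) to O(m p^2). NOT the thesis's
    algorithm; a generic sampling baseline for intuition."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    idx = rng.choice(n, size=m, replace=False)  # uniform row sample
    S = X[idx] - X[idx].mean(axis=0)            # center the sample
    return (S.T @ S) / (m - 1)
```

An estimate like this could then be passed through a thresholding step such as the one sketched above; the error in each entry shrinks as m grows, which is the kind of trade-off the probabilistic bounds in the thesis quantify.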