Advisor: Palem, Krishna V.
Author: Jiang, Mingchao
Date issued: August 2020
Date available: 2020-11-03
Citation: Jiang, Mingchao. "Influence based bit-quantization for machine learning: Cost Quality Tradeoffs." (2020) Master's Thesis, Rice University. <a href="https://hdl.handle.net/1911/109494">https://hdl.handle.net/1911/109494</a>.
URI: https://hdl.handle.net/1911/109494
Abstract: Due to the significant computational cost associated with machine learning architectures such as neural networks, or networks for short, there has been significant interest in quantizing, that is, reducing the number of bits used. Current quantization approaches treat all of the network parameters equally by allocating the same bit-width budget to each of them. In this work we propose a quantization approach that allocates bit budgets to parameters preferentially based on their influence. Here, our notion of influence is inspired by the traditional definition of this concept from the Fourier analysis of Boolean functions. We show that guiding the investment of bit budgets using influence can achieve acceptable accuracy with lower overall bit budgets when compared to approaches that do not use quantization. We show that by trading 4.5% in accuracy, we can gain in bit budget by a factor of 28. To better understand our approach, we also considered allocating bit budgets through random allocations, and found that our influence-based approach outperforms them most of the time by noticeable margins. All of these results are based on the MNIST data set, and our algorithm for computing influence is based on a simple and easy-to-implement greedy approach.
Format: application/pdf
Language: eng
Rights: Copyright is held by the author, unless otherwise indicated. Permission to reuse, publish, or reproduce the work beyond the bounds of fair use or other exemptions to copyright law must be obtained from the copyright holder.
Keywords: Neural Networks; Quantization; Bit Influence
Title: Influence based bit-quantization for machine learning: Cost Quality Tradeoffs
Type: Thesis
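The abstract describes allocating bit budgets to parameters preferentially by influence via a simple greedy procedure, but publishes no code. A minimal sketch of one such greedy rule follows; the diminishing-returns gain `inf / (1 + bits)`, the `max_bits_per_param` cap, and the function name are illustrative assumptions, not the thesis's actual algorithm.

```python
def allocate_bits(influences, total_bits, max_bits_per_param=8):
    """Greedily hand out one bit at a time to the parameter whose
    marginal influence-per-bit is currently highest.

    NOTE: a hypothetical sketch; the gain model below is an
    assumption, not the method used in the thesis.
    """
    alloc = [0] * len(influences)
    for _ in range(total_bits):
        best, best_gain = None, float("-inf")
        for i, inf in enumerate(influences):
            if alloc[i] >= max_bits_per_param:
                continue  # this parameter has hit its bit-width cap
            # marginal value of the next bit, modeled as influence
            # discounted by the bits already allocated
            gain = inf / (1 + alloc[i])
            if gain > best_gain:
                best, best_gain = i, gain
        if best is None:
            break  # every parameter is at its cap
        alloc[best] += 1
    return alloc

# High-influence parameters receive more of the shared budget:
bits = allocate_bits([0.9, 0.5, 0.1], total_bits=6)  # → [4, 2, 0]
```

Under this rule a parameter with high influence keeps winning bits until its discounted gain drops below the others', which concentrates the budget where (by assumption) quantization error hurts accuracy most.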