Inference by Reparameterization using Neural Population Codes

dc.contributor.advisor: Pitkow, Xaq
dc.contributor.committeeMember: Aazhang, Behnaam
dc.contributor.committeeMember: Ernst, Philip
dc.contributor.committeeMember: Josic, Kresimir
dc.creator: Vasudeva Raju, Rajkumar
dc.date.accessioned: 2016-01-27T17:46:45Z
dc.date.available: 2016-01-27T17:46:45Z
dc.date.created: 2015-12
dc.date.issued: 2015-12-04
dc.date.submitted: December 2015
dc.date.updated: 2016-01-27T17:46:45Z
dc.description.abstract: Behavioral experiments on humans and animals suggest that the brain performs probabilistic inference to interpret its environment. Here we present a general-purpose, biologically plausible implementation of approximate inference based on Probabilistic Population Codes (PPCs). PPCs are distributed neural representations of probability distributions that can implement marginalization and cue integration in a biologically plausible way. By connecting multiple PPCs together, we can naturally represent multivariate probability distributions and capture their conditional dependency structure by setting those connections as in a probabilistic graphical model. To perform inference in general graphical models, one convenient and often accurate algorithm is Loopy Belief Propagation (LBP), a ‘message-passing’ algorithm that uses local marginalization and integration operations to perform approximate inference efficiently, even for complex models. In LBP, the message from one node to a neighboring node is a function of the incoming messages from all neighboring nodes except the recipient. This exception renders it neurally implausible, because neurons cannot readily send many different signals to many different target neurons. Interestingly, however, LBP can be reformulated as a sequence of Tree-based Re-Parameterization (TRP) updates on the graphical model, each of which re-factorizes a portion of the probability distribution. Although this formulation still implicitly has the message-exclusion problem, we show that it can be circumvented by converting the algorithm to a nonlinear dynamical system with auxiliary variables and a separation of time scales. By combining these ideas, we show that a network of PPCs can represent multivariate probability distributions and implement the TRP updates for the graphical model to perform probabilistic inference. Simulations with Gaussian graphical models demonstrate that the performance of this PPC-based neural network implementation of TRP updates is comparable to direct evaluation of LBP, suggesting that it provides a compelling substrate for general probabilistic inference in the brain.
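For reference, the standard forms of the two updates named in the abstract, as given in the general literature (not reproduced from the thesis itself), are as follows. In LBP on a pairwise graphical model with node potentials $\psi_i$ and edge potentials $\psi_{ij}$, the message from node $i$ to node $j$ is

  $m_{i \to j}(x_j) \propto \sum_{x_i} \psi_i(x_i)\, \psi_{ij}(x_i, x_j) \prod_{k \in N(i) \setminus \{j\}} m_{k \to i}(x_i)$,

where the product excludes the recipient $j$; this exclusion is what the abstract identifies as neurally implausible. TRP instead maintains a reparameterization of the same joint distribution in terms of pseudo-marginals $T_i$ and $T_{ij}$,

  $p(x) \propto \prod_{i \in V} T_i(x_i) \prod_{(i,j) \in E} \frac{T_{ij}(x_i, x_j)}{T_i(x_i)\, T_j(x_j)}$,

updating the pseudo-marginals one spanning tree at a time, so that each update re-factorizes a portion of the distribution without explicit recipient-specific messages.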
dc.format.mimetype: application/pdf
dc.identifier.citation: Vasudeva Raju, Rajkumar. "Inference by Reparameterization using Neural Population Codes." (2015) Master’s Thesis, Rice University. https://hdl.handle.net/1911/88182.
dc.identifier.uri: https://hdl.handle.net/1911/88182
dc.language.iso: eng
dc.rights: Copyright is held by the author, unless otherwise indicated. Permission to reuse, publish, or reproduce the work beyond the bounds of fair use or other exemptions to copyright law must be obtained from the copyright holder.
dc.subject: Probabilistic Inference
dc.subject: Probabilistic Population Codes
dc.subject: Tree-based Re-parameterization
dc.subject: neural network
dc.title: Inference by Reparameterization using Neural Population Codes
dc.type: Thesis
dc.type.material: Text
thesis.degree.department: Electrical and Computer Engineering
thesis.degree.discipline: Engineering
thesis.degree.grantor: Rice University
thesis.degree.level: Masters
thesis.degree.name: Master of Science
Files

Original bundle (1 of 1):
Name: VASUDEVARAJU-DOCUMENT-2015.pdf
Size: 5.9 MB
Format: Adobe Portable Document Format

License bundle (2 of 2):
Name: PROQUEST_LICENSE.txt
Size: 5.85 KB
Format: Plain Text

Name: LICENSE.txt
Size: 2.62 KB
Format: Plain Text