Browsing by Author "Johnson, Don"
Now showing 1 - 20 of 37
Analog-to-Digital Conversion (Rice University, 2013-05-18)
Johnson, Don
Basic analog-to-digital (A/D) conversion and digital filtering of analog signals.

Analysis of noise reduction in redundant expansions under distributed processing requirements (2005-03-01)
Rozell, Chris; Johnson, Don; Digital Signal Processing (http://dsp.rice.edu/)
We considered signal reconstruction with redundant expansions under distributed processing in noisy environments. Redundant expansions have the ability to reduce noise corrupting the coefficients, but distributed processing schemes will not be able to take full advantage of the redundancy present. We apply frame theory and a generalization called "frames of subspaces" to find conditions under which distributed reconstruction suffers no loss in noise reduction ability, and we bound the performance loss in more general cases.

Analyzing the robustness of redundant population codes in sensory and feature extraction systems (2005-07-01)
Rozell, Chris; Johnson, Don
Sensorineural systems often use groups of redundant neurons to represent stimulus information both during transduction and population coding of features. This redundancy makes the system more robust to corruption in the representation. We approximate neural coding as a projection of the stimulus onto a set of vectors, with the result encoded by spike trains. We use the formalism of frame theory to quantify the inherent noise reduction properties of such population codes. Additionally, computing features from the stimulus signal can also be thought of as projecting the coefficients of a sensory representation onto another set of vectors specific to the feature of interest. The conditions under which a combination of different features forms a complete representation of the stimulus signal can be found through a recent extension to frame theory called "frames of subspaces". We extend the frames-of-subspaces theory to quantify the noise reduction properties of a collection of redundant feature spaces.

Asymptotic rates of the information transfer ratio (2002-05-20)
Sinanovic, Sinan; Johnson, Don; Digital Signal Processing (http://dsp.rice.edu/)
A system processes information when it preserves the aspects of its input related to what the input represents while removing other aspects. To describe a system's information processing capability, input and output must be compared in a way that is invariant to how signals represent information. The Kullback-Leibler distance, an information-theoretic measure that reflects the data processing theorem, is calculated on the input and output separately; comparing the two yields the information transfer ratio. We consider the special case where the input serves several parallel systems and show that this configuration can represent the input information without loss. We also derive bounds on the asymptotic rates at which the loss decreases as more parallel systems are added, and show that the rate depends on the input distribution.
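Since several of the entries in this listing lean on the information transfer ratio, a compact statement of the definition may help; the notation below (input X, output Y, stimulus parameters theta_0 and theta_1) is shorthand introduced here, not drawn from the papers themselves:

    % Information transfer ratio: the Kullback-Leibler distance between the
    % output distributions under two stimulus conditions, normalized by the
    % same distance measured at the system's input.
    \gamma_{X \to Y}(\theta_0, \theta_1)
      = \frac{D\bigl(p_Y(\,\cdot\,;\theta_1) \,\|\, p_Y(\,\cdot\,;\theta_0)\bigr)}
             {D\bigl(p_X(\,\cdot\,;\theta_1) \,\|\, p_X(\,\cdot\,;\theta_0)\bigr)},
    \qquad 0 \le \gamma_{X \to Y} \le 1 .

The upper bound of one follows from the data processing theorem the abstracts cite: a system can at best preserve, never increase, the distinguishability of two stimulus conditions.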
Basics of information processing (2002-01-20)
Johnson, Don; Digital Signal Processing (http://dsp.rice.edu/)
Basic probability theory, statistical signal processing and information theory, and the inter-relationships among these disciplines form the foundations of a theory of information processing. Examples are drawn from point-process applications.

Broadcast Detection Structures with Applications to Sensor Networks (2007)
Johnson, Don; Lexa, Michael; Digital Signal Processing (http://dsp.rice.edu/)
Data broadcasting is potentially an effective and efficient way to share information in wireless sensor networks. Broadcasts offer energy savings over multiple, directed transmissions, and they provide a vehicle to exploit the statistical dependencies often present in distributed data. In this paper, we examine two broadcast structures in the context of a distributed detection problem whose inputs are statistically dependent. Specifically, we develop a suboptimal approach to maximizing the Kullback-Leibler divergence over a set of binary quantization rules. Our approach not only leads to simple parameterizations of the quantization rules in terms of likelihood ratio thresholds, but also provides insight into the inherent constraints distributed structures impose. We then present two examples in detail and compare the performance of the broadcast structures to that of a centralized system and a noncooperative system. These examples suggest that in situations where the detection problem is difficult (small input divergence), broadcasting solitary bits (or even nothing at all) may be nearly as effective as broadcasting real-valued observations.

Dialogue Concerning Neural Coding and Information Theory (2003-08-20)
Johnson, Don; Digital Signal Processing (http://dsp.rice.edu/)
The relations between information theory and neural coding are discussed by two researchers, one knowledgeable in information theory, the other in neuroscience. The classic information-theoretic notions of entropy, mutual information, and channel capacity are clarified and possible research applications proposed.

A Different First Course in Electrical Engineering (1998-05-20)
Johnson, Don; Wise, J D; Digital Signal Processing (http://dsp.rice.edu/)
Traditional introductory courses in electrical engineering are typically circuit theory courses, which may include both analog and digital hardware and possibly software. The alternatives have focused more on how to teach (using discrete-time signals rather than analog) than on what to teach. We developed a top-down course sequence that uses as its underlying principle the transmission and manipulation of information. Students are given a broad perspective on both analog and digital approaches, with the goals of helping students appreciate electrical and computer engineering and framing a context for advanced courses. Laboratories stress the construction of analog systems and analysis with signal processing tools.

A Different First Course in Electrical Engineering (1999-09-20)
Johnson, Don; Wise, J D; Digital Signal Processing (http://dsp.rice.edu/)
A Different Course in Electrical Engineering

Discrete-Time Fourier Analysis (Rice University, 2013-05-18)
Johnson, Don
Definitions and basic algorithms for Fourier analysis of discrete-time signals.

Elements of Detection Theory (Rice University, 2009-06-11)
Johnson, Don
Introduction to the theory of detection.
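To make concrete the likelihood-ratio-threshold idea that recurs in the detection entries above, here is a minimal sketch of a binary hypothesis test; the Gaussian model, the threshold choice, and all variable names are illustrative assumptions rather than anything taken from these papers:

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypotheses: H0: x ~ N(0, 1) versus H1: x ~ N(1, 1).
    mu0, mu1, sigma = 0.0, 1.0, 1.0

    def log_likelihood_ratio(x):
        # log p1(x) - log p0(x) for the two Gaussian densities
        return ((x - mu0) ** 2 - (x - mu1) ** 2) / (2 * sigma ** 2)

    threshold = 0.0  # LLR threshold; equal priors put it at zero

    x0 = rng.normal(mu0, sigma, 100_000)  # observations under H0
    x1 = rng.normal(mu1, sigma, 100_000)  # observations under H1

    p_false_alarm = np.mean(log_likelihood_ratio(x0) > threshold)
    p_detection = np.mean(log_likelihood_ratio(x1) > threshold)
    print(f"P_FA = {p_false_alarm:.3f}, P_D = {p_detection:.3f}")

Deciding H1 whenever the log-likelihood ratio exceeds a threshold is exactly the kind of one-bit quantization rule that the broadcast-detection abstract parameterizes for its sensor nodes.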
The Equivalent Circuit Concept: The Current-Source Equivalent (2003-05-20)
Johnson, Don; CITI (http://citi.rice.edu/)
The equivalent circuit concept derives from the Superposition Principle and Ohm's Law. Two forms of the equivalent circuit, the Thevenin equivalent and the Norton equivalent, distill any linear circuit into a source and an impedance. The development of these equivalents spans almost seventy-five years, with people other than the eponymous figures assuming equally important roles. This report describes the pertinent biographies of Mayer and Norton and provides the relevant sections from their original papers on equivalent circuits.

Examining methods for estimating mutual information in spiking neural systems (2005-06-01)
Rozell, Chris; Johnson, Don; Center for Multimedia Communications (http://cmc.rice.edu/); Digital Signal Processing (http://dsp.rice.edu/); CITI (http://citi.rice.edu/)
Mutual information enjoys wide use in the computational neuroscience community for analyzing spiking neural systems. Its direct calculation is difficult because estimating the joint stimulus-response distribution requires a prohibitive amount of data. Consequently, several techniques have appeared for bounding mutual information that rely on less data. We examine two upper-bound techniques and find that they are unreliable and can introduce strong assumptions about the neural code. We also examine two lower bounds, showing that they can be very loose and may bear little relation to the mutual information's actual value.

Four top reasons why mutual information does not assess neural information processing (2002-07-20)
Johnson, Don; Digital Signal Processing (http://dsp.rice.edu/)
Mutual information between stimulus and response has been advocated as an information-theoretic measure of a neural system's capability to process information. Once calculated, the result is a single number that supposedly captures the system's information characteristics over the range of stimulus conditions used to measure it. I show that mutual information is a flawed measure, that the standard approach to measuring it has theoretical difficulties, and that relating capacity to information processing capability is quite complicated.

Fundamentals of Electrical Engineering I (Rice University, 2015-08-17)
Johnson, Don
The course focuses on the creation, manipulation, transmission, and reception of information by electronic means. Elementary signal theory; time- and frequency-domain analysis; Sampling Theorem. Digital information theory; digital transmission of analog signals; error-correcting codes.

Improving the Resolution of Bearing in Passive Sonar Arrays by Eigenvalue Analysis (1981-08-01)
Johnson, Don; DeGraaf, Stuart
A method of improving the bearing-resolving capabilities of a passive array is discussed. This method is an adaptive beamforming method with many similarities to the minimum energy approach. The evaluation of energy in each steered beam is preceded by an eigenvalue-eigenvector analysis of the empirical correlation matrix. Modifying the computations according to the eigenvalue structure results in improved resolution of the bearing of acoustic sources. The increase in resolution is related to the time-bandwidth product of the computation of the correlation matrix. However, this increased resolution is obtained at the expense of array gain.
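The eigenvalue-analysis idea in the sonar abstract above survives today in subspace methods such as MUSIC. The sketch below is a generic MUSIC-style pseudospectrum, not a reconstruction of the 1981 paper's specific modification; the array geometry, source setup, and every parameter are assumptions made for illustration:

    import numpy as np

    rng = np.random.default_rng(1)

    # Uniform linear array: M sensors at half-wavelength spacing,
    # one narrowband source at 20 degrees, plus white noise.
    M, snapshots, true_angle = 8, 400, 20.0

    def steering(theta_deg):
        k = np.pi * np.sin(np.deg2rad(theta_deg))  # d = lambda / 2
        return np.exp(1j * k * np.arange(M))

    s = rng.normal(size=snapshots) + 1j * rng.normal(size=snapshots)
    noise = 0.1 * (rng.normal(size=(M, snapshots))
                   + 1j * rng.normal(size=(M, snapshots)))
    X = np.outer(steering(true_angle), s) + noise

    # Empirical correlation matrix and its eigendecomposition
    R = X @ X.conj().T / snapshots
    eigvals, eigvecs = np.linalg.eigh(R)  # eigenvalues in ascending order
    En = eigvecs[:, :-1]  # noise subspace: drop the dominant eigenvector

    # Pseudospectrum peaks where steering vectors are nearly
    # orthogonal to the noise subspace
    angles = np.linspace(-90, 90, 721)
    spectrum = [1.0 / np.linalg.norm(En.conj().T @ steering(a)) ** 2
                for a in angles]
    print(f"estimated bearing: {angles[int(np.argmax(spectrum))]:.2f} deg")

Deciding how many eigenvectors belong to the signal subspace is where the eigenvalue structure enters, in the spirit of the modification the abstract describes.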
An Information processing approach to distributed detection (2003-09-20)
Lexa, Michael; Johnson, Don; Digital Signal Processing (http://dsp.rice.edu/)
We apply the recent theory of information processing to a hybrid distributed detection architecture that combines the traditional parallel and tandem architectures. Central to this theory are the Kullback-Leibler discrimination distance and a quantity known as the information transfer ratio, defined as the ratio of the KL distances between the distributions characterizing the input and output of a system. We characterize the asymptotic performance of the proposed hybrid system and compare it with the performance of the parallel, tandem, and centralized architectures. We conclude with an illustrative example.

Information processing during transient responses in the crayfish visual system (2003-06-01)
Rozell, Chris; Johnson, Don; Glantz, Raymon; Digital Signal Processing (http://dsp.rice.edu/)
We analyzed sustaining fiber responses in the crayfish visual system to light pulses using information processing techniques. The light pulse stimuli elicited a transient and a steady-state component in both the EPSP input and the firing rate of the spike train output. The overall information transfer of the system was very low (10⁻⁴), with a sharp increase during the transient portion of the response followed by a steady decrease. The change in the information transfer rate is related to the difference in the communication rates possible in spike trains with varying rates. This analysis corroborates the observed light reflex behavior.

Information processing of linear block decoders (2002-10-20)
Lexa, Michael; Johnson, Don; Digital Signal Processing (http://dsp.rice.edu/)
This paper develops a systematic method of studying the benefits of soft decoding for linear block codes by applying the concepts of information processing. We show that soft decoding uniformly improves decoder performance in terms of the information transfer ratio.

Limits of population coding (2003-07-20)
Johnson, Don; Digital Signal Processing (http://dsp.rice.edu/)
To understand whether the population response expresses information better than the aggregate of the individual responses, the sum of the individual contributions is frequently used as a baseline against which to assess the population's coding capabilities. Using information processing theory, we show that this baseline is illusory: the independent baseline case is theoretically impossible to apply consistently to any population. Instead, we use as a baseline the noncooperative population, in which each neuron processes a common input independently of the others. Using the information transfer ratio (the ratio of Kullback-Leibler distances evaluated at a population's input and output) to measure a population's coding ability, we show that cooperative populations can perform either better or worse than this baseline. Furthermore, we show that population coding is effective only when each neuron codes information poorly when considered outside the context of the population.
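As a small numerical companion to the information-transfer-ratio entries above, the sketch below computes the ratio for a single binary symmetric channel; the channel model, crossover probability, and parameter values are illustrative assumptions, not taken from any of the papers:

    import numpy as np

    def kl_bernoulli(p, q):
        # Kullback-Leibler distance between Bernoulli(p) and Bernoulli(q)
        return (p * np.log(p / q)
                + (1 - p) * np.log((1 - p) / (1 - q)))

    def through_bsc(p, eps):
        # P(output = 1) when a Bernoulli(p) input passes through a binary
        # symmetric channel with crossover probability eps
        return p * (1 - eps) + (1 - p) * eps

    # Two "stimulus conditions" expressed as input spike probabilities
    theta0, theta1, eps = 0.2, 0.4, 0.1

    d_input = kl_bernoulli(theta1, theta0)
    d_output = kl_bernoulli(through_bsc(theta1, eps),
                            through_bsc(theta0, eps))

    # Always at most 1, per the data processing theorem
    print(f"information transfer ratio = {d_output / d_input:.3f}")

A noncooperative population in the sense of the last abstract would be modeled here as several independent copies of this input-output map acting on a common input, which is the baseline against which cooperative populations are compared.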