DESCRIPTION: (provided by applicant) Everybody knows that the purpose of the brain is to process information. But what does that mean? How should neuroscientists quantify such information processing? Surely the brain cannot be understood as a digital computer. And although information theory has something to say about how signals should or should not be passed between neurons, there is much in the brain that Shannon's basic theory, by itself, does not seem to explain. For example, how will we compare the computations performed by one type of neuron with those of another? Why are certain anatomies and physiologies preferred for one type of computation versus another? The proposed research seeks appropriate measures of microscopic neuronal function that will make sense of quantitative aspects of neurons and their physiology. If we can quantify and measure computation in a way that explains and predicts a diverse set of quantitative observations, then these measures will qualify as an appropriate language for describing the information processing performed by the nervous system. The proposed approach will merge information theory with biologically inescapable issues, the principal issue being the cost of computation and communication. To establish the appropriate measures, the research will answer questions such as: Why are resting potentials around -70 mV? Why not smaller; why not larger? Why aren't energetically wasteful resting conductances smaller? Why not have brains half the size that compute twice as fast? Why do neurons fire in the frequency ranges observed? Why do synaptic failures occur in some systems and not in others, and what explains the observed quantal failure rates? In answering these questions, the research will advance some measures as conduits of our understanding while disqualifying others.
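The resting-potential question above can be made concrete with the Goldman-Hodgkin-Katz voltage equation. The concentrations and permeability ratios below are standard textbook values for a resting mammalian neuron, used here only as an illustrative sketch; they are not data from the proposal itself.

```python
import math

def ghk_voltage(pK, pNa, pCl, K_o, K_i, Na_o, Na_i, Cl_o, Cl_i, T=310.15):
    """Goldman-Hodgkin-Katz voltage equation (volts).

    Cl-, being an anion, enters with its inside/outside
    concentrations swapped relative to the cations.
    """
    R, F = 8.314, 96485.0  # J/(mol K), C/mol
    num = pK * K_o + pNa * Na_o + pCl * Cl_i
    den = pK * K_i + pNa * Na_i + pCl * Cl_o
    return (R * T / F) * math.log(num / den)

# Textbook concentrations (mM) and relative permeabilities
# (pK : pNa : pCl = 1 : 0.05 : 0.45) for a resting neuron at 37 C.
v_rest = ghk_voltage(pK=1.0, pNa=0.05, pCl=0.45,
                     K_o=5.0, K_i=140.0,
                     Na_o=145.0, Na_i=15.0,
                     Cl_o=110.0, Cl_i=10.0)
print(f"Predicted resting potential: {v_rest * 1e3:.1f} mV")  # ~ -65 mV
```

Lowering the resting Na+ permeability would push the potential nearer the K+ equilibrium potential but change pump costs, which is exactly the kind of energy-versus-function tradeoff the proposed measures are meant to capture.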
Such qualification, or disqualification, arises from successful, or unsuccessful, quantitative matching of different sets of biological data. The essential organizing and interrelating principle is: identify those aspects of biology that quantitatively limit information processing in the brain. After all, the brain is a costly organ: food and water must be consumed to keep it working properly, and, in terms of its absolute size, the brain is a burden for us to carry around. In performing this research, we will use mathematical analysis, computer-based calculations, and biophysical simulations. All of this work will draw on the most basic published data about axons, dendrites, and synapses. The proposed research promises to tie together diverse sets of anatomical and physiological observations, some of which are over fifty years old, well observed, often used, but never fully explained. The proposed research is necessarily theoretical, theory being what is needed to produce a quantitative language for describing and understanding information processing. Because higher brain functions are built out of simpler bits of computation, such a solid foundation will benefit how neuroscientists study and understand higher-order brain functions.
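The idea that cost limits information processing can be sketched with a toy energy-efficient-coding calculation: treat a neuron's output in a time bin as a Bernoulli variable and ask which firing probability maximizes information per unit energy rather than raw information. The cost ratio below (a spike costing ten times the per-bin resting cost) is an illustrative assumption, not a figure from the proposal.

```python
import math

def binary_entropy(p):
    """Shannon entropy (bits) of a Bernoulli(p) variable."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Illustrative costs per time bin: 1 unit at rest, plus 10 units
# whenever a spike occurs (hypothetical ratio, for the sketch only).
E_REST, E_SPIKE = 1.0, 10.0

def bits_per_energy(p):
    """Information transmitted per unit of energy spent."""
    return binary_entropy(p) / (E_REST + p * E_SPIKE)

# Grid search for the firing probability maximizing bits per unit energy.
p_opt = max((i / 1000 for i in range(1, 1000)), key=bits_per_energy)
print(f"p maximizing bits per unit energy: {p_opt:.2f}")
```

Raw entropy is maximized at p = 1/2, but once spiking is costly the optimum shifts to sparse firing, well below 1/2. This is the flavor of argument by which metabolic cost, merged with information theory, could explain the low firing rates observed in real neurons.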