Files in This Item: b1243176.mp4 (Streaming Video)
Title: First Passage through a Fluctuating Boundary: A Phase Transition with Implications for Neural Coding
Originating Office: IAS
Speaker: Magnasco, Marcelo
Issue Date: 19-Jul-2013
Event Date: 19-Jul-2013
Group/Series/Folder: Record Group 8.15 - Institute for Advanced Study
Series 3 - Audio-visual Materials
Location: 8.15:3 EF
Notes: StatPhysHK Conference. Talk no. 18
Title from title slide.
IAS Program on Statistical Physics and Computational Neuroscience, held 2-19 July, 2013, at Hong Kong University of Science and Technology. Sponsors: Hong Kong Baptist University, Croucher Foundation, K. C. Wong Education Foundation.
StatPhysHK Conference, a satellite of STATPHYS 25, held 17-19 July, 2013, at Hong Kong University of Science and Technology and Hong Kong Baptist University.
Abstract: Finding the first time a fluctuating quantity reaches a given boundary is a deceptively simple-looking problem of vast practical importance in physics, biology, chemistry, neuroscience, economics, and industrial engineering. Problems in which the bound to be traversed is itself a fluctuating function of time include widely studied problems in neural coding, such as neuronal integrators with irregular inputs and internal noise. We show that the probability p(t) that a Gauss-Markov process will first exceed the boundary at time t suffers a phase transition as a function of the roughness of the boundary, as measured by its Hölder exponent H. The critical value occurs when the roughness of the boundary equals the roughness of the process, so for diffusive processes the critical value is H_c = 1/2. For smoother boundaries, H > 1/2, the probability density is a continuous function of time. For rougher boundaries, H < 1/2, the probability is concentrated on a Cantor-like set of zero measure: the probability density becomes divergent, almost everywhere either zero or infinity. The critical point H_c = 1/2 corresponds to a widely studied case in the theory of neural coding, in which the external input integrated by a model neuron is a white-noise process, as in the case of uncorrelated but precisely balanced excitatory and inhibitory inputs. We argue that this transition corresponds to a sharp boundary between rate codes, in which the neural firing probability varies smoothly, and temporal codes, in which the neuron fires at sharply defined times regardless of the intensity of internal noise.
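The effect described in the abstract can be explored numerically. The following is a minimal Monte Carlo sketch, not taken from the talk: the Weierstrass-type boundary, the function names, and all parameter values are illustrative assumptions. It draws Brownian paths, records the first time each path exceeds a boundary of prescribed Hölder exponent H, and compares how concentrated the first-passage times are for a smooth boundary (H > 1/2) versus a rough one (H < 1/2).

```python
import numpy as np

# Illustrative Monte Carlo sketch (not from the talk): first passage of a
# Brownian motion through a boundary with a prescribed Hölder exponent H.
# The Weierstrass-type boundary and all parameter values are assumptions.

rng = np.random.default_rng(0)

T, n_steps, n_paths = 1.0, 2000, 2000
dt = T / n_steps
t = np.linspace(0.0, T, n_steps + 1)

def rough_boundary(t, H, level=1.0, amp=0.2, base=3.0, terms=20):
    """Weierstrass-type curve with Hölder exponent H, oscillating around `level`."""
    n = np.arange(terms)[:, None]
    return level + amp * (base ** (-n * H) * np.cos(np.pi * base ** n * t)).sum(axis=0)

def first_passage_times(H):
    """First times at which Brownian paths started at 0 exceed the boundary."""
    boundary = rough_boundary(t, H)
    steps = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    x = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(steps, axis=1)], axis=1)
    crossed = x >= boundary            # boundary broadcasts over all paths
    hit = crossed.any(axis=1)
    return t[crossed.argmax(axis=1)[hit]]

# Smooth boundary (H > 1/2) vs rough boundary (H < 1/2): the crude
# concentration statistic below tends to be larger in the rough case,
# echoing the concentration of the first-passage density described above.
for H in (0.8, 0.3):
    fpt = first_passage_times(H)
    hist, _ = np.histogram(fpt, bins=200, range=(0.0, T))
    busiest = np.sort(hist)[::-1][:20].sum()   # crossings in the busiest 10% of bins
    print(f"H={H:.1f}: {fpt.size}/{n_paths} paths crossed; "
          f"share of crossings in the busiest 10% of time bins: {busiest / max(fpt.size, 1):.2f}")
```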
Duration: 56 min.
Appears in Series: 8.15:3 - Audio-visual Materials
Videos for Public -- Distinguished Lectures