Intrinsic Computation of a Monod-Wyman-Changeux Molecule

Causal states are minimal sufficient statistics of prediction of a stochastic process, their coding cost is called statistical complexity, and the implied causal structure yields a sense of the process' "intrinsic computation". We discuss how statistical complexity changes with slight changes to the underlying model, in this case a biologically-motivated dynamical model, that of a Monod-Wyman-Changeux molecule. Perturbations to kinetic rates cause statistical complexity to jump from finite to infinite. The same is not true for excess entropy, the mutual information between past and future, or for the molecule's transfer function. We discuss the implications of this for the relationship between intrinsic and functional computation of biological sensory systems.

Bibliographic Details
Main Author: Marzen, Sarah E. (Contributor)
Other Authors: Massachusetts Institute of Technology. Department of Physics (Contributor)
Format: Article
Language: English
Published: MDPI AG, 2018-08-27T14:43:53Z.
Subjects:
Online Access: Get fulltext
LEADER 01341 am a22001573u 4500
001 117535
042 |a dc 
100 1 0 |a Marzen, Sarah E.  |e author 
100 1 0 |a Massachusetts Institute of Technology. Department of Physics  |e contributor 
100 1 0 |a Marzen, Sarah E.  |e contributor 
245 0 0 |a Intrinsic Computation of a Monod-Wyman-Changeux Molecule 
260 |b MDPI AG,   |c 2018-08-27T14:43:53Z. 
856 |z Get fulltext  |u http://hdl.handle.net/1721.1/117535 
520 |a Causal states are minimal sufficient statistics of prediction of a stochastic process, their coding cost is called statistical complexity, and the implied causal structure yields a sense of the process' "intrinsic computation". We discuss how statistical complexity changes with slight changes to the underlying model, in this case a biologically-motivated dynamical model, that of a Monod-Wyman-Changeux molecule. Perturbations to kinetic rates cause statistical complexity to jump from finite to infinite. The same is not true for excess entropy, the mutual information between past and future, or for the molecule's transfer function. We discuss the implications of this for the relationship between intrinsic and functional computation of biological sensory systems. Keywords: statistical complexity; intrinsic computation; excess entropy 
655 7 |a Article 
773 |t Entropy