Comptes Rendus

Is there chaos in the brain? II. Experimental evidence and related models
Comptes Rendus. Biologies, Volume 326 (2003) no. 9, pp. 787-840.


The search for chaotic patterns has occupied numerous investigators in neuroscience, as in many other fields of science. Their results and main conclusions are reviewed in the light of the most recent criteria, established since the first descriptions of the surrogate strategy, that claims of chaos need to satisfy. The methods used in each of these studies have almost invariably combined the analysis of experimental data with simulations using formal models, often based on modified Hodgkin and Huxley equations and/or on the Hindmarsh and Rose model of bursting neurons. Due to technical limitations, the results of these simulations have prevailed over experimental ones in studies on the nonlinear properties of large cortical networks and higher brain functions. Yet, although a convincing proof of chaos (as defined mathematically) has only been obtained at the level of axons and of single and coupled cells, convergent results can be interpreted as compatible with the notion that signals in the brain are distributed according to chaotic patterns at all levels of its various forms of hierarchy.

This chronological account of the main landmarks of nonlinear neurosciences follows an earlier publication [Faure, Korn, C. R. Acad. Sci. Paris, Ser. III 324 (2001) 773–793] that was focused on the basic concepts of nonlinear dynamics and methods of investigations which allow chaotic processes to be distinguished from stochastic ones and on the rationale for envisioning their control using external perturbations. Here we present the data and main arguments that support the existence of chaos at all levels from the simplest to the most complex forms of organization of the nervous system.

We first provide a short mathematical description of the models of excitable cells and of the different modes of firing of bursting neurons (Section 1). The deterministic behavior reported in giant axons (principally squid), in pacemaker cells, and in isolated or paired neurons of Invertebrates acting as coupled oscillators is then described (Section 2). We also consider chaotic processes exhibited by coupled Vertebrate neurons and by several components of Central Pattern Generators (Section 3). It is then shown that, as indicated by studies of synaptic noise, deterministic patterns of firing in presynaptic interneurons are reliably transmitted to their postsynaptic targets via probabilistic synapses (Section 4). This raises the more general issue of chaos as a possible neuronal code and of the emerging concept of stochastic resonance. Considerations on cortical dynamics and EEGs are divided into two parts. The first concerns the early attempts by several pioneer authors to demonstrate chaos in experimental material such as the olfactory system or in human recordings during various forms of epilepsies, and the belief in ‘dynamical diseases’ (Section 5). The second part explores the more recent period, during which surrogate testing, the definition of unstable periodic orbits and period-doubling bifurcations have been used to establish more firmly the nonlinear features of retinal and cortical activities and to define predictors of epileptic seizures (Section 6). Finally, studies of multidimensional systems have founded radical hypotheses on the role of neuronal attractors in information processing, perception and memory, and two elaborate models of the internal states of the brain (i.e. ‘winnerless competition’ and ‘chaotic itinerancy’).
Their modifications during cognitive functions are given special attention due to their functional and adaptive capabilities (Section 7), despite the difficulties that still exist in the practical use of topological profiles in a state space to identify the underlying physical correlates. The reality of ‘neurochaos’ and its relations with information theory are discussed in the conclusion (Section 8), where the similarities between the theory of chaos and that of dynamical systems are also emphasized. Both theories strongly challenge computationalism and suggest that new models are needed to describe how the external world is represented in the brain.

DOI: 10.1016/j.crvi.2003.09.011
Keywords: neuronal dynamics, neurochaos, networks, chaotic itinerancy, winnerless competition, representation, neuronal code

Henri Korn 1; Philippe Faure 1

1 ‘Récepteurs et Cognition’, CNRS 2182, Institut Pasteur, 25, rue du Docteur-Roux, 75724 Paris cedex 15, France


1 Introduction

There is growing evidence that future research on neural systems and higher brain functions will be a combination of classical (sometimes called reductionist) neuroscience with the more recent nonlinear science. This conclusion will remain valid despite the difficulties in applying the tools and concepts developed to describe low dimensional and noise-free mathematical models of deterministic chaos to the brain and to biological systems. Indeed, it has become obvious in a number of laboratories over the last two decades that the different regimes of activities generated by nerve cells, neural assemblies and behavioral patterns, their linkage and their modifications over time cannot be fully understood in the context of any ‘integrative’ physiology without using the tools and models that establish a connection between the microscopic and the macroscopic levels of the investigated processes.

Part I of this review [1] focused on briefly presenting the fundamental aspects of nonlinear dynamics, the most publicized of which is chaos theory. More fundamental textbooks can also be consulted by the mathematically oriented reader [2–5]. After a general history and definition of this theory, we described the principles of analysis of time series in phase spaces and the general properties of dynamic trajectories, as well as the ‘coarse-grained’ measures which permit a process to be classified as chaotic in ideal systems and models. We insisted on how these methods need to be adapted for handling biological time series and on the pitfalls faced when dealing with nonstationary and most often noisy data. Special attention was paid to two fundamental issues.

The first was whether, and how, one can distinguish deterministic patterns from stochastic ones. This question is particularly important in the nervous system, where variability is the rule at all levels of organization [6] and where, for example, time series of synaptic potentials or trains of spikes are often qualified as conforming to Poisson distributions on the basis of standard inter-event histograms (see also [7]). Yet this conclusion can be ruled out if the same data are analyzed in depth with nonlinear tools, such as first- or second-order return maps, and using the above-mentioned measures confronted with those of randomly shuffled data called surrogates. The critical issue here is to determine whether intrinsic variability, which is an essential ingredient of successful behavior and survival in living systems, reflects true randomness or is produced by an apparently stochastic underlying determinism and order. In other words, how can the effects of ‘noise’ be distinguished from those resulting from a small number of interacting nonlinear elements? In the latter case the dynamics also appear highly unpredictable, but their advantage is that they can be dissected out and the physical correlates of their interacting parameters can be identified physiologically.
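The surrogate logic can be made concrete with a small numerical sketch (the logistic map stands in for an apparently random recording; everything here is illustrative and not taken from the cited studies). The shuffled surrogate keeps exactly the same amplitude histogram as the original series, yet a return-map-style test separates the two at once: in the deterministic series, states that are close in value have close successors, whereas in the surrogate they do not.

```python
import random

def logistic_series(n, x0=0.2, r=4.0):
    """Iterate the logistic map, a stand-in for a chaotic recording."""
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

def determinism_score(xs, eps=0.01):
    """Mean distance between the successors of value-close pairs of points.
    Small for deterministic data (close states have close futures),
    large for shuffled surrogates."""
    diffs = []
    n = len(xs) - 1
    for i in range(n):
        for j in range(i + 1, n):
            if abs(xs[i] - xs[j]) < eps:
                diffs.append(abs(xs[i + 1] - xs[j + 1]))
    return sum(diffs) / len(diffs)

random.seed(0)
series = logistic_series(400)
surrogate = series[:]
random.shuffle(surrogate)      # same histogram, temporal order destroyed

score_det = determinism_score(series)
score_sur = determinism_score(surrogate)
print(score_det, score_sur)    # the deterministic score is far smaller
```

The same contrast in measured quantities between original data and surrogates is, in essence, what the surrogate strategy formalizes statistically.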

The second issue concerned the possible benefits of chaotic systems over stochastic processes, namely the possibility of controlling the former. Theoretically such a control can be achieved by taking advantage of the sensitivity of chaotic trajectories to initial conditions and by ‘redirecting’ them, with a small perturbation, along a selected unstable periodic orbit, toward a desired state. A related and almost philosophical problem, which we will not consider further, is whether the output of a given organism can be under its own control as opposed to being fully determined by ‘in-principle-knowable causal factors’ [6]. The metaphysical counterpart of this query consists in speculating, as did a number of authors, about the existence and nature of free will [8,9].

In the present part II of this review, we critically examine most of the results obtained at the level of single cells and their membrane conductances, in real networks and during studies of higher brain functions, in the light of the most recent criteria for judging the validity of claims for chaos. These constraints have become progressively more rigorous, particularly with the advent of the surrogate strategy (which, however, can also be misleading; references in [1]). Thus experts can easily argue that some early ‘demonstrations’ of deterministic chaos founded on weak experimental evidence were accepted without sufficient analysis [9]. But this is only one side of the story. Indeed, we will see that the tools of nonlinear dynamics have become irreplaceable for revealing hidden mechanisms subserving, for example, neuronal synchronization and periodic oscillations, and for studies of cognitive functions and behavior viewed as dynamic phenomena rather than as processes that can be studied in isolation from their environmental context.

The history of the search for chaos in the nervous system, of its successes and its errors, and of the advent of what has become neurodynamics is truly fascinating. It starts in the 1980s (see [10]) with the observation that when rabbits inhale an odorant, their EEGs display oscillations in the high-frequency range of 20–80 Hz that Bressler and Freeman [11] named ‘gamma’ in analogy to the high end of the X-ray spectrum! Odor information was then shown to exist as a pattern of neural activity that could be discriminated whenever there was a change in the odor environment or after training. Furthermore, the ‘carrier wave’ of this information was aperiodic. Further dissection of the experimental data led to the conclusion that the activity of the olfactory bulb is chaotic and may switch to any desired perceptual state (or attractor) at any time. To compensate for experimental limitations, the olfactory bulb was then simulated by constructing arrays of local oscillators interconnected by excitatory synapses that generated a common waveform. The inclusion of inhibitory cells and synapses facilitated the emergence of amplitude fluctuations in the waveform. Learning could strengthen the synapses between oscillators and favored the formation of Hebbian nerve cell assemblies in a self-regulatory manner, which opened new ways of thinking about the nature of perception and the storing of ‘representations’ of the outside world.

This complementary experimental and theoretical approach of Freeman and his collaborators was similar to that of other authors searching for chaos, during the same period, in the temporal structure of the firing patterns of squid axons, of invertebrate pacemaker cells and of human epileptic EEGs. We will show that, regardless of today's judgment on their hasty conclusions and on a naive enthusiasm that relied on measures ill-adapted to multidimensional and noisy systems, these precursors had amazingly sharp insights. Not only were their conclusions often vindicated with more sophisticated methods, but they blossomed, more recently, in the form of the dynamical approach to brain operations and cognition.

We have certainly omitted several important issues from this general overview, which is largely a chronological description of the successes, and occasional disenchantments, of this still evolving field. One can mention the problem of the stabilization of chaos by noise, the phylogeny and evolution of neural chaotic systems, whether or not coupled chaotic systems behave as one, and the nature of their feedbacks, to name a few. These issues will most likely be addressed in depth in the context of research on complex systems, to which the brain obviously belongs.

2 Subcellular and cellular levels

Carefully controlled experiments, during which it was possible to collect large amounts of stationary data, have unambiguously demonstrated chaotic dynamics at the level of neurons. This conclusion was reached using classical intracellular electrophysiological recordings of action potentials in single neurons, with the additional help of macroscopic models. These models describe the dynamical modes of neuronal firing and enable a comparison of the results of simulations with those obtained in living cells. On the other hand, and at a lower level of analysis, the advent of patch-clamp techniques for studying directly the properties of single ion channels did not make it necessary to invoke deterministic equations to describe the opening and closing of these channels, which show the same statistical features as random Markov processes [12,13], although deterministic chaotic models may be consistent with channel dynamics [7,14,15].

It is generally believed that information is secured in the brain by trains of impulses, or action potentials, often organized in sequences of bursts. It is therefore essential to determine the temporal patterns of such trains. The generation of action potentials and their rhythmic behavior are linked to the opening and closing of selected classes of ionic channels. Since the membrane potential of neurons can be modified by acting on a combination of different ionic mechanisms, the most common models used for this approach take advantage of the Hodgkin and Huxley equations (see [16–18]), as first simplified by FitzHugh [19] in what became the FitzHugh–Nagumo model [20].

Briefly, knowing the physical counterpart of the parameters of these models, it becomes easy to determine for which of these terms, and for what values, the firing mode of the simulated neurons undergoes transformations, from rest to different attractors, through successive bifurcations. A quick reminder of the history and significance of the mathematical formalism proposed by Hodgkin and Huxley, and later by other authors, is necessary for clarifying this paradigm.

2.1 Models of excitable cells and of neuronal firing

2.1.1 The Hodgkin and Huxley model

It is the paving-stone upon which most conductance-based models are built. The ionic mechanisms underlying the initiation and propagation of action potentials were beautifully elucidated by applying to the squid giant axon the voltage-clamp technique, in which the membrane potential can be displaced and held at a new value by an electronic feedback (for a full account see [21–23]). As shown in Fig. 1A, it was found that the membrane potential of the axon is determined by three conductances, i.e. gNa, gK and gL, which are placed in series with their associated batteries VNa, VK and VL, and in parallel with the membrane capacitance C. Before activation, the membrane voltage, V, is at rest and the voltage-dependent channels permeable to sodium (Na+) and potassium (K+) can be viewed as closed. Under the effect of a stimulation, the capacitor discharges, so that the membrane potential is shifted in the depolarizing direction and, owing to the subsequent opening of channels, a current is generated. This current consists of two phases. First, sodium moves down its concentration gradient, giving rise to an inward current and a depolarization. Second, this transient component is replaced by an outward potassium current and the axon repolarizes (Fig. 1B).

Fig. 1

Ionic currents involved in the generation of action potentials. (A) Equivalent circuit of a patch of excitable membrane. There are two active conductances, gNa and gK, and a third passive ‘leak’ conductance, gL, which is relatively unimportant and which carries other ions, including chloride. Each of them is associated with a battery and is placed in parallel with the capacitance C (see text for explanations). Vertical arrows (labeled I) point in the direction of the corresponding ionic currents. (B) Theoretical solution for a propagated action potential (V, broken line) and its underlying activated conductances (gNa and gK), as a function of time; note their good agreement with those of experimentally recorded impulses. The upper and lower horizontal dashed lines designate the equilibrium potentials of sodium and potassium, respectively. (Adapted from [21], with permission of the Journal of Physiology.)

To describe the changes in potassium conductance, Hodgkin and Huxley assumed that a channel has two states, open and closed, with voltage-dependent rate constants for the transitions between them. That relation is formally expressed as
dn/dt = αn (1 − n) − βn n  (1)
where n is the probability that a single particle is in the ‘right place’, V is the voltage, and αn and βn are rate constants.

Fitting the experimental data to this relationship revealed gK = ḡK n⁴, where ḡK is the maximal conductance. Thus it was postulated that four particles or sensors need to undergo transitions for a channel to open. Similarly, for the sodium channel, it was postulated that three events, each with a probability m, open the gate and that a single event, with a probability (1 − h), blocks it. Then
gNa = ḡNa m³h
with m and h obeying first-order kinetic equations (Eqs. (2) and (3)) of the same form as Eq. (1).

The important point here is that this formalism rests on nonlinear differential equations which, in addition, are coupled by the membrane potential, V. It follows that the total membrane current density is:
I = C dV/dt + ḡK n⁴ (V − VK) + ḡNa m³h (V − VNa) + ḡL (V − VL)  (4)
Eqs. (1) to (4), which underlie the generation of action currents in a limited patch of membrane, can be completed to account for the propagation of action potentials along the core of axons by adding to this formalism equations pertaining to their specific cable properties (see [24,25]).
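As an illustration, Eqs. (1) to (4) can be integrated numerically. The rate functions and maximal conductances below are the classical squid-axon values in the modern voltage convention (rest near −65 mV), and the forward-Euler scheme is a deliberately simple sketch rather than a production-quality solver:

```python
import math

def vtrap(x, y):
    # x / (1 - exp(-x/y)), with its limiting value y at x = 0
    return y if abs(x) < 1e-7 else x / (1.0 - math.exp(-x / y))

def hh_spike_count(i_app, t_max=100.0, dt=0.01):
    """Euler integration of the Hodgkin-Huxley equations; returns the
    number of upward zero-crossings (spikes) of the membrane potential."""
    g_na, g_k, g_l = 120.0, 36.0, 0.3        # maximal conductances, mS/cm^2
    e_na, e_k, e_l = 50.0, -77.0, -54.4      # reversal potentials, mV
    c_m = 1.0                                 # membrane capacitance, uF/cm^2
    v, n, m, h = -65.0, 0.317, 0.053, 0.596   # resting state
    spikes, above = 0, False
    for _ in range(int(t_max / dt)):
        a_n, b_n = 0.01 * vtrap(v + 55.0, 10.0), 0.125 * math.exp(-(v + 65.0) / 80.0)
        a_m, b_m = 0.1 * vtrap(v + 40.0, 10.0), 4.0 * math.exp(-(v + 65.0) / 18.0)
        a_h, b_h = 0.07 * math.exp(-(v + 65.0) / 20.0), 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
        i_ion = (g_na * m**3 * h * (v - e_na)     # transient sodium current
                 + g_k * n**4 * (v - e_k)          # delayed potassium current
                 + g_l * (v - e_l))                # passive leak
        v += dt * (i_app - i_ion) / c_m
        n += dt * (a_n * (1.0 - n) - b_n * n)      # Eq. (1) and its m, h analogues
        m += dt * (a_m * (1.0 - m) - b_m * m)
        h += dt * (a_h * (1.0 - h) - b_h * h)
        if v > 0.0 and not above:
            spikes += 1
        above = v > 0.0
    return spikes

print(hh_spike_count(0.0))   # at rest: no spikes
print(hh_spike_count(10.0))  # suprathreshold current: repetitive firing
```

With no applied current the membrane sits at rest, while a sustained suprathreshold current produces the repetitive firing discussed in the text.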

The Hodgkin and Huxley model has been, and remains, extremely fruitful for studies of neurons, as it reproduces with great accuracy the behavior of excitable cells, such as their firing threshold, steady-state activation and inactivation, bursting properties and bistability, to name a few of their characteristics. For example, it has been successfully used, with some required adjustments of the rate constants of specific conductances, to reproduce the action potentials of cardiac cells (whether nodal or myocardial) and of cerebellar Purkinje cells (for details about authors and equations, see [26]). However, its implementation requires an exact prior knowledge of the kinetics of each of the numerous conductances acting in a given set of cells. Furthermore, the diversity of ionic currents in various cell types, coupled with the complexity of their distribution over the cell, implies that a large number of parameters are involved in the different neuronal compartments, for example in dendrites (see [23,27]). This diversity can preclude simple analytic solutions and further understanding of which parameter is critical for a particular function.

To avoid these drawbacks and to reduce the number of parameters, global macroscopic models have been constructed by taking advantage of the theory of dynamical systems. One can then highlight the qualitative features of the dynamics shared by numerous classes of neurons and/or ensembles of cells, such as their bistability, their responses to applied currents or synaptic inputs, their repetitive firing and their oscillatory processes. This topological approach yields geometrical solutions expressed in terms of limit cycles, basins of attraction and strange attractors, as defined in [1]. For more details, one can consult several other comprehensive books and articles written for physiologists [18,28,29].

2.1.2 The FitzHugh–Nagumo model: phase space analysis

A simplification of the Hodgkin and Huxley model is justified by the observation that changes in the membrane potential related to (i) sodium activation, and (ii) sodium inactivation and potassium activation, evolve during a spike on a fast and a slow time course, respectively. Thus the reduction consists of taking two variables into account instead of four, a fast one (V) and a slow one (W), according to:

dV/dt = V − V³/3 − W + I  (5)
dW/dt = φ(V + a − bW)  (6)

which, again, are nonlinear coupled differential equations, where Eq. (5) is polynomial and where the terms a, b and φ in Eq. (6) are dimensionless and positive [20,29].
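A minimal numerical sketch of Eqs. (5) and (6), using commonly quoted parameter values (a = 0.7, b = 0.8, φ = 0.08, which are assumptions here rather than values given in the text), shows the transition from a stable resting state to limit-cycle oscillations as I is increased:

```python
def fhn_amplitude(i_app, t_max=400.0, dt=0.05):
    """Euler integration of the FitzHugh-Nagumo equations; returns the
    peak-to-peak amplitude of V over the second half of the run."""
    a, b, phi = 0.7, 0.8, 0.08   # commonly used illustrative values
    v, w = -1.2, -0.625          # near the resting state for I = 0
    vs = []
    steps = int(t_max / dt)
    for k in range(steps):
        dv = v - v**3 / 3.0 - w + i_app   # Eq. (5)
        dw = phi * (v + a - b * w)        # Eq. (6)
        v += dt * dv
        w += dt * dw
        if k > steps // 2:                # discard the transient
            vs.append(v)
    return max(vs) - min(vs)

print(fhn_amplitude(0.0))  # near zero: the rest state is stable
print(fhn_amplitude(0.5))  # large: sustained limit-cycle oscillations
```

Scanning I in this way traces out exactly the kind of bifurcation diagram discussed below (Fig. 2): the oscillation amplitude jumps from near zero to a finite value as I crosses a critical value.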

An important aspect of the FitzHugh–Nagumo formalism is that, being a two-variable model, it is well suited for phase-plane studies in which the variables V and W can be plotted against each other (note, however, that although models based on the Hodgkin and Huxley equations can generate chaos, a single two-dimensional FitzHugh–Nagumo neuron cannot). These plots, called ‘phase plane portraits’, provide a geometrical representation which illustrates qualitative features of the solutions of the differential equations. The basic relationships were derived by Van der Pol [30], who was interested in nonlinear oscillators, and they were first applied to the cardiac pacemaker [31]. It is therefore not surprising that this model was later used to study the bursting behavior of neurons, sometimes linked with the Hodgkin and Huxley equations in the form of a mosaic, as proposed by Morris and Lecar [32] to describe the excitability of the barnacle muscle fiber (see [18]). Specifically, when an appropriate family of currents is injected into the simulated ‘neurons’, the behavior of the evoked spike trains appears in the phase space to undergo a transition from a steady state to a repetitive limit cycle via Hopf bifurcations, which can be smooth (supercritical, Fig. 2A) or abrupt (subcritical, Fig. 2B), or via homoclinic bifurcations, i.e. at saddle nodes and regular saddles (not shown, see [33]), with possible hysteresis when the current I varies from one side to the other of its critical values (Fig. 2C).

Fig. 2

Transitions from a steady state to an oscillatory firing mode. (A) Left. Bifurcation diagram of a supercritical Hopf bifurcation. The abscissa represents the intensity of the control parameter, in this case an ‘intracellularly applied current’, I. The ordinate is the membrane potential. The repetitive firing state is indicated by the maximal (max) and minimal (min) amplitudes of the oscillations. Note that for a critical value of I (arrow) the system shifts from a steady state to an oscillatory mode (solid curve) on either side of an unstable point (dashed line). Right. Corresponding firing pattern of a neuron (upper trace) produced by a current, I, of constant intensity (lower trace). (B) Left. Same presentation as above in the case of a subcritical Hopf bifurcation. The stable oscillatory branch is preceded by an unstable phase (vertical dashed line in shaded area) during which the steady state and the oscillations coexist. Right. Current pulses of low amplitude can reset the oscillations during this unstable state (bistability). (C) Plot of the frequency of firing (f, ordinate) versus the intensity of the applied current (I, abscissa). (Adapted from [33], with permission of the MIT Press.)

2.1.3 Definitions

A few definitions of some of the events observed in the phase space become necessary. Their description is borrowed from Hilborn [34]. A bifurcation is a sudden change in the dynamics of the system; it occurs when a parameter used for describing it takes a characteristic value. At bifurcation points the solutions of the time-evolution equations are unstable, and in many ‘real’ systems (as opposed to mathematical ones) these points can be missed because they are perturbed by noise. There are several types of fixed points (that is, of points at which the trajectory of a system tends to stay). Among them, nodes (or sinks) attract nearby trajectories, while saddle points attract them on one side of the space but repel them on the other (see also Section 6.1 for the definition of a saddle). There are also repellors (sources), which keep away nearby trajectories. When, for a given value of the parameter, a point gives birth to a limit cycle, it is called a Hopf bifurcation, a common bifurcation which can be supercritical if the limit cycle takes its origin at the point itself (Fig. 2A), or subcritical if the solution of the equation is at a finite distance (Fig. 2B), due to amplification of instabilities by the nonlinearities [35]. To get a feeling for what homoclinic and heteroclinic bifurcations and orbits are, one has to refer to the invariant stable (insets) and unstable (outsets) manifolds and to the saddle cycles which are formed by trajectories as they head, according to strict mathematical rules, toward and away from saddle points, respectively (for more details see [1] and Section 6.1). Specifically, a homoclinic intersection appears on Poincaré maps when a control parameter is changed and the insets and outsets of a saddle point intersect; there is a heteroclinic intersection when the unstable manifold of one saddle point intersects the stable manifold of another one.
Once these intersections occur, they repeat infinitely, and the connected points form homoclinic or heteroclinic orbits that eventually lead to chaotic behavior.
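For a two-dimensional flow, these categories of fixed points can be read directly off the trace and determinant of the Jacobian matrix of the linearization at the fixed point. The sketch below is generic linear stability analysis, not tied to any particular neuron model:

```python
def classify_fixed_point(jacobian):
    """Classify a fixed point of a 2-D flow from its Jacobian matrix,
    using the standard trace/determinant criteria."""
    (a, b), (c, d) = jacobian
    tr, det = a + d, a * d - b * c
    if det < 0:
        return "saddle"            # attracts on one side, repels on the other
    if det == 0:
        return "degenerate"
    if tr == 0:
        return "center"            # purely imaginary eigenvalues
    disc = tr * tr - 4.0 * det     # sign decides real vs complex eigenvalues
    kind = "node" if disc >= 0 else "spiral"
    return ("stable " if tr < 0 else "unstable ") + kind

print(classify_fixed_point([[-1.0, 0.0], [0.0, -2.0]]))  # a sink
print(classify_fixed_point([[1.0, 1.0], [4.0, 1.0]]))    # a saddle
print(classify_fixed_point([[0.5, 1.0], [-1.0, 0.5]]))   # a source
```

In a Hopf bifurcation, a pair of complex-conjugate eigenvalues crosses the imaginary axis: in this classification, a "stable spiral" turns into an "unstable spiral" as the control parameter passes its critical value.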

2.1.4 The Hindmarsh and Rose model of bursting neurons

This model has become increasingly popular in neuroscience. It is derived from the two-variable model of the action potential presented earlier by the same authors [36], which was a modified version of the FitzHugh–Nagumo model, and it has the important property of generating oscillations with long interspike intervals [37,38]. It is one of the simplest mathematical representations of the widespread phenomenon of oscillatory burst discharges that occur in real neuronal cells. The initial Hindmarsh and Rose model has two variables, one for the membrane potential, V, and one for the ionic channels subserving accommodation, W. The basic equations are:

dV/dt = W − f(V) + I  (7)
dW/dt = g(V) − W  (8)

where I is the applied current, α, β, γ and δ are rate constants, and where f(V) = αV³ − βV² is cubic and g(V) = γ − δV² is not a linear function. This model allows actual data to be taken into account: its right-hand side (vector field) can be fitted to the observed current/voltage relationship of the cells it describes. Hence, it is possible to determine how many degrees of freedom are needed for a polynomial fit to the I/V characteristics. These equations generate bistability and, to produce bursting, a slow adaptation current, z, which moves the voltage in and out of the bistable regime and which terminates spike discharges, is added. Changing variables V and W into x and y (details in [37]), one obtains the three-variable model:

dx/dt = y − αx³ + βx² + I − z  (9)
dy/dt = γ − δx² − y  (10)
dz/dt = r[h1(x − x0) − z]  (11)

where r is the time scale of the slow adaptation current, x0 a reference membrane level, and h1 is the scale of the influence of the slow dynamics [39], which determines whether the neuron fires in a tonic or in a burst mode when it is exposed to a sustained current input [38].
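The resulting burst dynamics can be reproduced with a short numerical sketch. The parameter values below (α = 1, β = 3, γ = 1, δ = 5, h1 = 4, x0 = −1.6, r = 0.005, I = 3.25) are values commonly used in the modeling literature to obtain chaotic bursting; they, and the reference level x0 itself, are assumptions of this sketch rather than values given in the text:

```python
def hindmarsh_rose(i_app=3.25, t_max=2000.0, dt=0.005):
    """Euler integration of the three-variable Hindmarsh-Rose model.
    Returns the list of spike times (upward crossings of x = 1)."""
    # Commonly used parameter values for chaotic bursting (assumed here):
    alpha, beta, gamma, delta = 1.0, 3.0, 1.0, 5.0
    r, h1, x0 = 0.005, 4.0, -1.6
    x, y, z = -1.0, 0.0, 0.0
    spike_times, above = [], False
    for k in range(int(t_max / dt)):
        dx = y - alpha * x**3 + beta * x**2 + i_app - z
        dy = gamma - delta * x**2 - y
        dz = r * (h1 * (x - x0) - z)   # slow adaptation current
        x += dt * dx
        y += dt * dy
        z += dt * dz
        if x > 1.0 and not above:
            spike_times.append(k * dt)
        above = x > 1.0
    return spike_times

times = hindmarsh_rose()
isis = [t2 - t1 for t1, t2 in zip(times, times[1:])]
print(len(times))             # many spikes over the run
print(max(isis) / min(isis))  # large ratio: bursts separated by long pauses
```

The strongly bimodal interspike-interval distribution (short intervals within a burst, long silences between bursts) is the signature of the bursting regime described in the text.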

Despite some limitations in describing every property of spike-bursting neurons, for example the relation between bursting frequency and the amplitude of the rebound potential versus current observed in some real data [40], the Hindmarsh and Rose model has major advantages for studies of: (i) spike trains in individual cells, and (ii) the cooperative behavior of neurons that arises when cells belonging to large assemblies are coupled with each other [40,41].

First, as shown in Fig. 3A, and depending on the values of the parameters in the equations above, the neurons can be in a steady state or they can generate periodic low-frequency repetitive firing, chaotic bursts or high-frequency discharges of action potentials (an example of period-doubling of the spike discharges of a Hindmarsh and Rose neuron as a function of the injected current is illustrated in Fig. 14 of Part I of this review [1]).

Fig. 3

Different firing patterns of Rose and Hindmarsh neurons. (A) For increasing values of an injected current, I (as indicated, from top to bottom), the model cell produced short, long and irregular (chaotic) bursts of action potentials. (B) Example of out-of-phase sequences of bursts generated by two strongly reciprocally coupled inhibitory neurons (after Faure and Korn, unpublished).

Fig. 14

Synchronization of two electrically coupled Rose and Hindmarsh neurons. The membrane potentials x1(t) and x2(t) are in antiphase for a low value of the coupling parameter ε which is a conductance of the ‘wire’ connecting them (A1, ε=0.02). As this parameter is increased, the synchronization is incomplete and nearly in phase (A2, ε=0.4) and it is finally complete and in phase (A3, ε>0.5). (From [40], with permission of Neural Computation.)

Second, Rose and Hindmarsh neurons can easily be linked using equations accounting for electrical and/or chemical junctions (the latter can be excitatory or inhibitory), which underlie synchronization in theoretical models as they do in experimental material (references in [39]). Such a linkage can lead to out-of-phase (Fig. 3B) or in-phase bursting in neighboring cells, or to chaotic behavior, depending on the degree of coupling between the investigated neurons.
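A minimal sketch of such electrical coupling adds a diffusive term ε(x2 − x1) to the first equation of each of two Hindmarsh and Rose model neurons. All numerical values below, including the coupling strengths, are illustrative assumptions in the spirit of Fig. 14, not the original simulation:

```python
def coupled_hr(eps, t_max=1000.0, dt=0.005):
    """Two electrically coupled Hindmarsh-Rose neurons; returns the mean
    absolute difference |x1 - x2| over the last quarter of the run."""
    alpha, beta, gamma, delta = 1.0, 3.0, 1.0, 5.0   # assumed common values
    r, h1, x0, i_app = 0.005, 4.0, -1.6, 3.25
    s1 = [-1.0, 0.0, 0.0]    # x, y, z of neuron 1
    s2 = [-1.3, 0.1, 0.0]    # neuron 2: slightly different initial conditions
    steps = int(t_max / dt)
    diffs = []
    for k in range(steps):
        new_states = []
        for (x, y, z), (x_other, _, _) in ((s1, s2), (s2, s1)):
            dx = (y - alpha * x**3 + beta * x**2 + i_app - z
                  + eps * (x_other - x))   # diffusive electrical coupling
            dy = gamma - delta * x**2 - y
            dz = r * (h1 * (x - x0) - z)
            new_states.append([x + dt * dx, y + dt * dy, z + dt * dz])
        s1, s2 = new_states
        if k > 3 * steps // 4:
            diffs.append(abs(s1[0] - s2[0]))
    return sum(diffs) / len(diffs)

d_uncoupled = coupled_hr(0.0)
d_strong = coupled_hr(0.6)
print(d_uncoupled, d_strong)  # strong coupling pulls the membranes together
```

As Fig. 14 illustrates for the published simulations, weakly coupled neurons can burst out of phase while sufficiently strong electrical coupling drives them toward complete in-phase synchronization.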

2.2 Experimental data from single cells

2.2.1 Isolated axons

The nonlinear behavior of axons and the potential for deterministic chaos of excitable cells have been well documented, both experimentally and with extensive theoretical models of the investigated systems. The results obtained with intracellular recordings of action potentials in the squid giant axon are particularly convincing. Specifically, by changing the external concentration of sodium (Na), it is possible to produce a switch from the resting state to a state characterized by (i) self-sustained oscillations and (ii) repetitive firing of action potentials that mimics the activity of a pacemaker neuron (Fig. 4A). The resting and oscillatory states were found to correspond to an asymptotically stable equilibrium point and a stable limit cycle, respectively, with an unstable equilibrium point between them (Fig. 4B). Simulations based upon modified Hodgkin and Huxley equations successfully predicted the range of external ionic concentrations accounting for the bistable regime and the transition between the unstable and the stable periodic behavior via a Hopf bifurcation [42,43].

Fig. 4

Periodic and non-periodic behavior of a squid giant axon. (A) Periodic oscillations (left) and membrane potential at rest (right) after exposure of the preparation for 0.25 and 6.25 min to an external solution containing the equivalent of 530 and 550 mM NaCl, respectively. (B) Bistable behavior of an axon placed in a 1:3.5 mixture of natural seawater (NSW) and 550 mM NaCl. Note the switch from subliminal (left) to supraliminal (right) self-sustained oscillations, produced by a stimulating pulse of increasing intensity. (C) Chaotic oscillations in response to sinusoidal currents. The values of the natural oscillating frequency and of the stimulating frequency (fn) were 136 and 328 Hz (left) and 228 and 303 Hz (right). In each panel, the upper and lower traces represent the membrane potential and the activating current, respectively. (A and B are modified from [42], C is from [44], with permission of the Journal of Theoretical Biology.)

Extending their work on the squid giant axon, Aihara and his collaborators [43,44] studied the membrane response of both this preparation and a Hodgkin and Huxley oscillator to an externally applied sinusoidal current, with the amplitude and the frequency of the stimulating current taken as bifurcation parameters. The forced oscillations were analyzed with stroboscopic and Poincaré plots. In agreement with the experimental results, the forced oscillator exhibited not only periodic but also non-periodic motions (i.e. quasi-periodic or chaotic oscillations), depending on the amplitude and frequency of the applied current (Fig. 4C). Further, several routes to chaos were distinguished, such as successive period-doubling bifurcations or intermittent chaotic waves (as defined in Part I, [1]).
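The stroboscopic-map construction used in these forcing experiments can be sketched with a minimal stand-in model. The following Python sketch is not the authors' code: the FitzHugh–Nagumo equations replace the full Hodgkin and Huxley system, and all parameter values are illustrative. It drives an excitable membrane with a sinusoidal current and samples the state once per forcing cycle; a periodic response collapses onto a few points, whereas a quasi-periodic or chaotic response spreads over a curve or a scattered set.

```python
import numpy as np

def fhn_forced(t, state, i0=0.4, amp=0.3, freq=0.127):
    # FitzHugh-Nagumo membrane driven by a sinusoidal current
    # (a minimal stand-in for the forced Hodgkin-Huxley oscillator)
    v, w = state
    dv = v - v**3 / 3.0 - w + i0 + amp * np.sin(2.0 * np.pi * freq * t)
    dw = 0.08 * (v + 0.7 - 0.8 * w)
    return np.array([dv, dw])

def rk4_step(f, t, state, dt):
    # One fourth-order Runge-Kutta step
    k1 = f(t, state)
    k2 = f(t + dt / 2.0, state + dt / 2.0 * k1)
    k3 = f(t + dt / 2.0, state + dt / 2.0 * k2)
    k4 = f(t + dt, state + dt * k3)
    return state + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

def stroboscopic_map(n_cycles=200, steps_per_cycle=400, freq=0.127, discard=50):
    # Integrate the forced system and keep one (v, w) sample per forcing period
    dt = (1.0 / freq) / steps_per_cycle
    state = np.array([-1.0, 1.0])
    t = 0.0
    samples = []
    for cycle in range(n_cycles):
        for _ in range(steps_per_cycle):
            state = rk4_step(lambda tt, s: fhn_forced(tt, s, freq=freq),
                             t, state, dt)
            t += dt
        if cycle >= discard:          # drop the initial transient cycles
            samples.append(state.copy())
    return np.array(samples)

strobe = stroboscopic_map()
```

Plotting `strobe[:, 0]` against `strobe[:, 1]` gives the stroboscopic portrait; sweeping `amp` and `freq` as bifurcation parameters reproduces the general logic of the analysis, not its specific results.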

With a somewhat similar rationale, Hayashi and Ishizuka [45] used as a control parameter a dc current applied intracellularly through a single electrode to study the dynamical properties of the discharges of the membrane of the pacemaker neuron of a marine mollusk, Onchidium verruculatum. Here again, a Hodgkin and Huxley type model showed a sequence of period-doubling bifurcations from a beating mode to a chaotic state as the intensity of the inward current was modified. The different patterns closely resembled those observed experimentally under the same conditions (Fig. 5A1–A3).

Fig. 5

Discharge patterns of a pacemaker neuron caused by a dc current (A1–A3) representative samples of the recorded membrane potential. (B1–B3) One-dimensional Poincaré maps of the corresponding sequence of spikes constructed using the delay method (see [1] for explanations). (A1–B1) Regular discharges of action potentials. (A2–B2) Periodic firing with two spikes per burst. (A3–B3) Chaotic bursting discharges. (Adapted from [45], with permission of the Journal of Theoretical Biology.)

Another interesting report, by Jianxue et al. [46], showing that action potentials along a nerve fiber can be encoded chaotically, needs to be mentioned. 'Spontaneous' spikes produced by injured fibers of the sciatic nerve of anaesthetized rats were recorded and studied with different methods. Spectral analysis and calculations of correlation dimensions were implemented first, but with limited success due to the influence of spurious noise. Other approaches, however, turned out to be more reliable and fruitful. Based on a study of interspike intervals (ISIs), they included return (or Poincaré) maps (ISI(n+1) versus ISI(n); Fig. 5B1–B3) and a nonlinear forecasting method combined with Gaussian-scaled surrogate data. The conclusion that the time series were chaotic found additional support in calculations of Lyapunov exponents after adjusting the parameters in the program of Wolf et al. [47], which is believed to be relatively insensitive to noise.
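The ISI return map mentioned above is simple to construct. The sketch below (illustrative only; the synthetic intervals are generated by the chaotic logistic map rather than taken from recorded fibers, and the 10–110 ms scaling is an arbitrary choice) builds the pairs (ISI(n), ISI(n+1)) from a list of spike times:

```python
import numpy as np

def isi_return_map(spike_times):
    # First-return pairs (ISI_n, ISI_{n+1}) from a sequence of spike times
    isi = np.diff(np.asarray(spike_times, dtype=float))
    return np.column_stack([isi[:-1], isi[1:]])

# Synthetic spike train whose intervals follow the chaotic logistic map,
# rescaled into an arbitrary 10-110 ms range (illustrative values only)
x = 0.4
intervals = []
for _ in range(500):
    x = 4.0 * x * (1.0 - x)          # logistic map at r = 4
    intervals.append(10.0 + 100.0 * x)
spike_times = np.concatenate([[0.0], np.cumsum(intervals)])

pairs = isi_return_map(spike_times)
# On such a map, a deterministic (here parabolic) relation between
# successive intervals stands out, whereas a stochastic train fills a cloud
```

Scattering `pairs[:, 0]` against `pairs[:, 1]` makes the low-dimensional structure, if any, directly visible.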

2.2.2 Chaos in axonal membranes: comments

The general self-criticism of Aihara et al. [44], who noted that the 'chaos' in their experiments (with strange attractors of dimension between 2 and 3) was observed under rather artificial conditions, is important. This criticism applies to all forms of nonlinear behavior reported above: in every instance the stimulations, whether electrical or chemical, were far from physiological. However, chaotic oscillations can be produced by both the forced Hodgkin and Huxley oscillator and the giant axon when a pulse train [44] or a sinusoidal current [43] is used. This already implies that, as will be confirmed below, nonlinear neuronal oscillators connected by chemical or electrical synapses can supply macroscopic fluctuations of spike trains in the brain.

2.3 Single neurons

Electrophysiologists are well aware that neuronal cells possess a large repertoire of firing patterns. A single cell can behave in different modes, i.e. as a generator of single or repetitive pulses or of bursts of action potentials, or as a beating oscillator, to name a few. This richness of behavioral states, which is controlled by external inputs such as variations in the ionic environment caused by synaptic drives and by neuromodulators, has prompted a number of neurobiologists to investigate whether, in addition to these patterns, chaotic spike trains can also be produced by individual neurons. If so, such spike trains would become serious candidates for neural codes, as postulated previously for other forms of signals thought to play a major role as information carriers in the nervous system [48,49]. Analytical 'proof' that this hypothesis is well grounded has been presented for the McCulloch and Pitts neuron model [50].

Puzzled by the variability of activities in the buccal-oral neurons of the sea slug Pleurobranchaea californica, Mpitsos et al. [51] recorded from individual cells with standard techniques and analyzed the responses generated in deafferented preparations, in order to study the temporal patterns of signals produced by the central nervous system itself. The recorded cells, called BCNs (for buccal-cerebral neurons), were particularly interesting since they can act either as an autonomous group or as part of a network that produces the coordinated rhythmic movements of all buccal-oral behaviors. Several criteria of chaos were apparently satisfied by the analysis of the spike trains. These tests included the organization of the phase portraits and Poincaré maps, which revealed attractors with clear expansions and contractions of the trajectories in state space, positive Lyapunov exponents (assessed with the program of Wolf et al. [47]) and relatively constant correlation dimensions. The authors recognized, however, the limitations of these conclusions, since their time series were quite short and often non-stationary. In addition, surrogates were not used in their study.

Chaotic regimes were described with mathematical models of neuron R15 of another mollusk, Aplysia californica, but their reality was only confirmed later, directly, with recordings from the actual cell. Neuron R15 has long been known to fire in a normal, endogenous, bursting mode [52], and in a beating (i.e. tonic) mode if a constant depolarizing current is injected into the cell or if the sodium-potassium pump is blocked. These activities were first mimicked qualitatively by Plant and Kim [53] with the help of a modified version of the Hodgkin and Huxley model. When extended by Canavier et al. [54–56] to incorporate additional conductances and their dynamics, the algorithms predicted different modes of activity and, more importantly, that a chaotic regime exists between the bursting and beating modes of firing. That is, chaotic activity could well be the result of intrinsic properties of individual neurons and need not be an emergent property of neural assemblies. Furthermore, the model approached chaos from both regimes via period-doubling bifurcations. It was also suggested that these, as well as other modes of firing such as periodic bursting (bursts of spikes separated by regular periods of silence), correspond, in a phase space, to stable multiple attractors (Fig. 6A1–A3 and B1–B3). These attractors coexisted for given sets of parameters for which there was more than one mathematical solution (bistability). Finally, it was predicted that variations in external ionic concentrations (of sodium or calcium), transient synaptic inputs and modulatory agents (serotonin) can switch the activity of the cell from one stable firing pattern to another.
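The transition between bursting, chaotic and beating regimes as the injected current varies can be illustrated with the Hindmarsh and Rose model introduced in Section 1. The sketch below uses the commonly cited textbook parameter set (a = 1, b = 3, c = 1, d = 5, r = 0.006, s = 4, x0 = −1.6); it is a didactic simulation, not a reproduction of the R15 models of Canavier et al., and the current values are illustrative.

```python
import numpy as np

def hindmarsh_rose(state, current):
    # Classic three-variable Hindmarsh-Rose bursting neuron
    x, y, z = state
    dx = y - x**3 + 3.0 * x**2 - z + current
    dy = 1.0 - 5.0 * x**2 - y
    dz = 0.006 * (4.0 * (x + 1.6) - z)
    return np.array([dx, dy, dz])

def simulate_x(current, t_max=2000.0, dt=0.05):
    # Fixed-step RK4 integration; returns the fast (membrane) variable x(t)
    state = np.array([-1.0, 0.0, 2.0])
    n = int(t_max / dt)
    xs = np.empty(n)
    for i in range(n):
        k1 = hindmarsh_rose(state, current)
        k2 = hindmarsh_rose(state + dt / 2.0 * k1, current)
        k3 = hindmarsh_rose(state + dt / 2.0 * k2, current)
        k4 = hindmarsh_rose(state + dt * k3, current)
        state = state + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
        xs[i] = state[0]
    return xs

def count_spikes(xs, threshold=1.0):
    # Upward crossings of a threshold on the fast variable
    above = xs > threshold
    return int(np.sum(~above[:-1] & above[1:]))

x_chaotic = simulate_x(current=3.25)  # regime commonly cited as chaotic bursting
x_low = simulate_x(current=0.5)       # weak drive: quiescent or sparse firing
```

Sweeping `current` over a grid and plotting, for instance, the interspike intervals at each value gives a rudimentary bifurcation diagram showing the period-doubling route between bursting and beating.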

Fig. 6

Sensitivity of bursts to external stimuli. (A1–A3 and B1–B3) Control of model responses. (A1–A3) A short (1 s, 4 Hz) train of 'synaptic' inputs (arrow) delivered immediately after a burst (A1) induces, after a brief initial transient, a transition to a beating mode (A2) which persists for an hour. In the phase plane projection (A3), the original pattern is shown in cyan and the final attractor is in red. (B1–B3) Identical initial conditions as in A1, but the stimulus, delivered earlier (B1), induces a prolonged shift into a new mode of firing (apparently chaotic according to the authors) (B2). In the corresponding phase plane (B3), the initial attractor is cyan and the final one is magenta. (C1–C2) Bistability in an intracellularly recorded R15 neuron. Shift of the cell from a bursting to a beating mode of activity. (C1) A brief current pulse (bottom) delivered during an interburst hyperpolarization is followed by a sustained beating, after which the spiking activity returns to the original bursting pattern. (C2) Successive transitions between identical bursting and beating episodes (above), whether the current pulses (bottom) are in the depolarizing or the hyperpolarizing direction. (A1–B3 from [56]; C1–C2 from [60], with permission of the Journal of Neurophysiology.)

Experiments confirmed these predictions in part. For example, transitions between bursting and beating had already been observed in R15 in response to the application of the blocker 4-aminopyridine (4-AP), suggesting that potassium channels may act as a bifurcation parameter [57]. Also, transitions from beating to doublet and triplet spiking, and finally to a bursting regime, were described in response to another K+ channel blocker, tetraethylammonium, which, in addition to this pharmacological property, was reported to induce 'chaotic-like' discharges in identified neurons of the mollusc Lymnaea stagnalis [58,59]. More critically, recordings from R15 were performed by Lechner et al. [60] to determine whether multistability is indeed an intrinsic property of the cell and whether it can be regulated by serotonin. It was found that R15 cells could exhibit two modes of oscillatory activity (instead of eight in the models) and that brief perturbations, such as current pulses, induced abrupt and instantaneous transitions from bursting to beating which lasted from several seconds to tens of minutes (Fig. 6C1 and C2). In the presence of low concentrations of serotonin, the probability of occurrence of these transitions and the duration of the resulting beating periods were gradually increased.

The contribution of ionic channels to the dynamic properties of isolated cells has been demonstrated by important studies of the anterior burster (AB) neuron of the stomatogastric ganglion of the spiny lobster, Panulirus interruptus. In contrast to 'constitutive' bursters, which continue to fire rhythmic impulses when completely isolated from all synaptic input, this neuron is a 'conditional' burster, meaning that the ionic mechanisms that generate its rhythmic firing must be activated by some modulatory input. It is the primary pacemaker neuron in the central pattern generator (see Section 3.2) for the pyloric rhythm in the lobster stomach. With the help of intracellular recordings, Harris-Warrick and Flamm [61] have shown that the monoamines dopamine, serotonin and octopamine convert silent AB neurons into bursting ones, the first two amines acting primarily upon Na+ entry and the latter on calcium currents, although each cell can burst via more than one ionic channel (see also [61,62]). These experimental results were exploited by Guckenheimer et al. [63], who characterized the basic properties of the involved channels in a model combining the formulations of Hodgkin and Huxley and of Rinzel and Lee [64]. Specifically, changes in the intrinsic firing and oscillatory properties of the model AB neuron were correlated with the boundaries of Hopf and saddle-node bifurcations on two-dimensional maps for specific ion conductances. Complex rhythmic patterns, including chaotic ones, were observed in conditions matching those of the experimental protocols.
In addition to demonstrating the efficacy of dynamical systems theory as a means of describing the various oscillatory behaviors of neurons, the authors proposed that there may be evolutionary advantages for a nerve cell to operate in such regions of the parameter space: bifurcations then locate sensitive points at which small alterations in the environment result in qualitative changes in the system's behavior. Thus, using a notion introduced by Thom [65], the nerve cell can function as a sensitive signal detector when operating at a point corresponding to an 'organizing center'.

The above-mentioned studies met a rewarding conclusion when Abarbanel et al. [40] analyzed the signals produced by isolated LP cells from the lobster stomatogastric ganglion. The data consisted of intracellularly recorded voltage traces from neurons subjected to applied currents of different amplitudes. As the intensity of the current was varied, the pattern of firing shifted, via bifurcations, from a periodic (Fig. 7A and B) to a chaotic-like (Fig. 7C–E) structure. The authors could not mathematically distinguish chaotic behavior from a nonlinear amplification of noise. Yet several arguments strongly favored chaos, such as the robust substructure of the attractors in Fig. 7C and D. The average mutual information and the test of false nearest neighbors made it possible to distinguish between noise (high-dimensional) and chaos (low-dimensional). This procedure was found to be more adequate than the Wolf method, which is only reliable for the largest exponent.
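The average mutual information criterion mentioned above is also the standard prescription for choosing the embedding delay in phase space reconstructions. A minimal histogram-based estimate can be sketched as follows (a didactic version, not the estimator actually used in [40]; the bin count, lag range and test signal are illustrative choices):

```python
import numpy as np

def average_mutual_information(series, lag, bins=16):
    # Histogram estimate (in bits) of the mutual information
    # between x(t) and x(t + lag)
    x = np.asarray(series, dtype=float)
    joint, _, _ = np.histogram2d(x[:-lag], x[lag:], bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0.0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px * py)[nz])))

def first_ami_minimum(series, max_lag=50):
    # Common prescription: embed with the delay at the first local
    # minimum of the average mutual information
    ami = [average_mutual_information(series, lag)
           for lag in range(1, max_lag + 1)]
    for i in range(1, len(ami) - 1):
        if ami[i] < ami[i - 1] and ami[i] < ami[i + 1]:
            return i + 1                  # lags are 1-based
    return max_lag

# Example: for a sine wave the AMI decays from short lags toward a first
# minimum near a quarter of the period (~24 samples here)
t = np.arange(4000)
signal = np.sin(2.0 * np.pi * t / 97.3)
delay = first_ami_minimum(signal)
```

The same routine applied to a recorded voltage trace gives the delay to use when building the delay vectors on which the false-nearest-neighbors test then estimates the embedding dimension.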

Fig. 7

Dynamic changes of the membrane potential of a LP neuron. Left column: intracellularly monitored slow oscillations and spikes in the presence of the indicated values of directly applied currents. Right column: corresponding phase space reconstructions obtained with the time-delay method (see text for explanations). The original coordinates are rotated so that the fast spiking motion takes place in the xy plane and the slow bursting motion moves along the z-axis. (Modified from [40], with permission of the Journal of Neurophysiology.)

Recent investigations of isolated cells have shown that dynamical information can be preserved when a chaotic input, such as a 'Rössler' signal, is converted into a spike train [66]. Specifically, the recorded cells were in vitro sensory neurons of the rat skin, subjected to a stretch driven by a Rössler system and, for the sake of comparison, to a stochastic signal consisting of phase-randomized surrogates. The determinism of the resulting interspike intervals (monitored in the output nerve) was tested with a nonlinear prediction algorithm, as described in [1]. The results indicated that a chaotic signal could be distinguished from a stochastic one (Fig. 8). That is, and quoting the authors, for prediction horizons of up to 3–6 steps, the normalized prediction error (NPE) values for the stochastically evoked ISI series were all near 1.0, as opposed to significantly smaller values for the chaotically driven ones. Thus sensory neurons are able to encode the structure of high-dimensional external stimuli into distinct spike trains.
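The logic of the NPE test can be sketched in a few lines. The version below is a simplified nearest-neighbor forecaster in the spirit of the method, not the published implementation: the 'chaotic ISI series' is a logistic-map stand-in rather than a Rössler-driven recording, and the embedding dimension, horizon and neighbor count are illustrative.

```python
import numpy as np

def normalized_prediction_error(series, dim=3, horizon=1, k=4):
    # Predict each second-half point from the futures of its k closest
    # delay vectors in the first half. NPE near 1: no better than chance;
    # NPE well below 1: deterministic structure.
    x = np.asarray(series, dtype=float)
    n = len(x)
    half = n // 2
    m = half - dim - horizon + 1
    library = np.array([x[i:i + dim] for i in range(m)])
    futures = np.array([x[i + dim - 1 + horizon] for i in range(m)])
    preds, truth = [], []
    for i in range(half, n - dim - horizon + 1):
        d = np.linalg.norm(library - x[i:i + dim], axis=1)
        nb = np.argsort(d)[:k]
        preds.append(futures[nb].mean())
        truth.append(x[i + dim - 1 + horizon])
    preds, truth = np.array(preds), np.array(truth)
    rmse = np.sqrt(np.mean((preds - truth) ** 2))
    return float(rmse / np.std(truth))

def phase_randomized_surrogate(series, rng):
    # Keep the power spectrum, scramble the Fourier phases
    spectrum = np.fft.rfft(series)
    phases = rng.uniform(0.0, 2.0 * np.pi, spectrum.size)
    phases[0] = 0.0
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=len(series))

# Stand-in chaotic ISI series (logistic map) versus its surrogate
rng = np.random.default_rng(1)
x, values = 0.3, []
for _ in range(1000):
    x = 4.0 * x * (1.0 - x)
    values.append(x)
chaotic = np.array(values)
npe_chaos = normalized_prediction_error(chaotic)
npe_surr = normalized_prediction_error(phase_randomized_surrogate(chaotic, rng))
```

Running the forecaster over a population of surrogates, as in the original study, then yields the significance threshold against which the real series' NPE is judged.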

Fig. 8

Normalized prediction error as a function of the prediction horizon for chaotic and random (surrogate) signals, in a sensory neuron. An embedding dimension of three was used; significance was assessed with two-tailed tests. Inset: results from the statistical analysis: σs is the standard deviation of the normalized prediction error (NPE) for the surrogate trials; dashed line: significance level corresponding to the indicated p value. (From [66], with permission of Physical Review Letters.)

Although based on studies of non-isolated cells recorded in vitro, another report can be mentioned here, at least as a reminder of the pitfalls facing the analysis of large neuronal networks with nonlinear mathematical tools. It represents an attempt to characterize chaos in the dynamics of spike trains produced by the caudal photoreceptor in the sixth ganglion of the crayfish Procambarus clarkii subjected to visual stimuli. The authors [67] relied on the sole presence in their time series of first-order unstable periodic orbits, statistically confirmed with Gaussian surrogates, despite evidence that this criterion alone is far from convincing [68].

3 Pairs of neurons and ‘small’ neuronal networks

A familiar observation to most neurobiologists is that ensembles of cells often produce synchronized action potentials and/or rhythmical oscillations. Experimental data and realistic models have indicated that, for some geometrical connectivities of the network (closed topologies) and for given values of the synaptic parameters linking the involved neurons, the cooperative dynamics of cells can take the form of a low-dimensional chaos. Yet a direct confirmation of this notion, validated by unambiguous measures of chaos, has only been obtained in a limited sample of neural circuits. In principle, as noted by Selverston et al. [69], network operations depend upon the interactions of numerous geometrical, synaptic and cellular factors, many of which are inherently nonlinear. But since these properties vary among different classes of neurons, a 'reductionist' determination of their implementation, although often taken as an endpoint in itself, can be useful for a complete description of the network's global activity patterns. So far, such a detailed analysis has only been achieved successfully in but a few invertebrate and lower vertebrate preparations.

3.1 Principles of network organization

In an extensive review of the factors that govern network operations, Getting [70] remarked that individual conductances are not as important as the properties that they impart. Instead, he insisted on two main sets of elements. The first defines the 'functional connectivity'. It includes the sign (excitatory or inhibitory) and the strength of the synaptic connections, their relative placement on the postsynaptic cell (soma or dendritic tree), and the temporal properties of these junctions. The second, i.e. the 'anatomical connectivity', determines the constraints on the network and 'who talks to whom'. Despite the complexity and the vast number of possible pathways between large groups of neurons, several elementary anatomical building blocks that contribute to the nonlinearity of the networks can be encountered in both invertebrate and vertebrate nervous systems. Such simple configurations include mutual (or recurrent) excitation (Fig. 9A), which produces synchrony in firing, and reciprocal (Fig. 9B) or recurrent (Fig. 9C) inhibition, which regulate excitability and can produce patterned outputs. Recurrent cyclic inhibition corresponds to a group of cells interconnected by inhibitory synapses (Fig. 9D), and it can generate oscillatory bursts with as many phases as there are cells in the ring [71]. In addition, cells can be coupled by electrical junctions, either directly (Fig. 9E) or by way of presynaptic fibers (Fig. 9F). Such electrotonic coupling favors synchrony between neighboring and/or synergistic neurons [72].

Fig. 9

Simple ‘building blocks’ of connectivity. (A) Recurrent excitation. (B) Mutual inhibition. (C) Recurrent inhibition. (D) Cyclic inhibition. (E) Coupling by way of directly apposed electrotonic junctions. (F) Electrical coupling via presynaptic fibers. Symbols: triangles and dots indicate excitatory and inhibitory synapses, respectively; resistors correspond to electrical junctions.

A number of systems can be simplified according to these restricted schemes [73], which remain conserved throughout phylogeny. As described below, such is the case for the Central Pattern Generators (CPGs) involved in specific behaviors that include rhythmic discharges of neurons acting in concert when animals are feeding, swimming or flying. One prototype is the lobster stomatogastric ganglion [74], in which extensive studies have indicated that (i) a single network can subserve several different functions and participate in more than one behavior, (ii) the functional organization of a network can be substantially modified by modulatory mechanisms within the constraints of a given anatomy, and (iii) neural networks acquire their potential by combining sets of 'building blocks' into new configurations which, however, remain nonlinear and are still able to generate oscillatory antiphasic patterns [75]. These three features run contrary to the classical view of neural networks.

3.2 Coupled neurons

When they are coupled, oscillators such as electronic devices, pendula and chemical reactions can generate nonlinear deterministic behavior ([7,76]), and this property extends to oscillating neurons, as shown by models (Fig. 10) and by some experimental data.

Fig. 10

Deterministic behavior of coupled formal neurons. (A) Two excitable cells modeled according to Rose and Hindmarsh generate periodic action potentials at rates of 26 and 33 Hz, respectively (left), and each of these frequencies is visualized on a return map (right). (B) Same presentation as above, showing that when the neurons are coupled, for example by an electrotonic junction, their respective frequencies are modified and the map exhibits a chaotic-like pattern. Note the driving effect of the faster cell on the less active one. (After Faure and Korn, unpublished.)

Makarenko and Llinas [77] provided one of the most compelling demonstrations of chaos in the central nervous system. Their experimental material, i.e. guinea-pig inferior olivary (IO) neurons, was particularly favorable for such a study. These cells give rise to the climbing fibers that mediate a complex activation of the distant Purkinje cells of the cerebellum. They are coupled by way of electrotonic junctions, and slices of the brainstem containing their somata can be maintained in vitro for intracellular recordings. Subthreshold oscillations resembling sinusoidal waveforms, with a frequency of 4–6 Hz and an amplitude of 5–10 mV, were found to occur spontaneously in the tested cells and to be the main determinant of spike generation and collective behavior in the olivo-cerebellar system [78]. Nonlinear analysis of prolonged and stationary segments of these oscillations, monitored in single and/or in pairs of IO neurons, was achieved with strict criteria based on the average mutual information and on calculations of the global embedding dimension and of the Lyapunov exponents. It unambiguously indicated chaos, with a dimension of ∼2.85, and a chaotic phase synchronization between coupled adjacent cells, which presumably accounts for the functional binding of these neurons when they activate their cerebellar targets.

Rather than concentrating on chaos per se, Elson et al. [79] clarified how two neurons that can individually generate slow oscillations underlying bursts of spikes (that is, spiking-bursting and seemingly chaotic activities) may or may not synchronize their discharges when they are coupled. For this purpose they investigated two electrically connected neurons (the pyloric dilators, PDs) from the pyloric CPG of the lobster stomatogastric ganglion (STG). In parallel with the natural coupling linking these cells, they established an artificial coupling using a dynamic clamp device that enabled the direct injection of equal and opposite currents into the recorded neurons. This differed from the procedure described in [80] in that an active analog device allowed the coupling conductance, including its sign, to be changed, thus varying the total conductance between the neurons. The neurons had been isolated from their inputs as described by Bal et al. [81]. The authors found that with natural coupling, slow oscillations and fast spikes are synchronized in both cells despite complex dynamics (Fig. 11A). But in confirmation of earlier predictions from models [40], uncoupling with an additional negative current (taken as representing an inhibitory synaptic conductance) produced bifurcations and desynchronized the cells (Fig. 11B). Adding further negative coupling conductance caused the neurons to become synchronized again, but in antiphase (Fig. 11C). Similar bifurcations occurred for the fast spikes and the slow oscillations, but at different thresholds for the two types of signals. The authors concluded from these observations that the mechanism for the synchronization of the slow oscillations resembled that seen in dissipatively coupled chaotic circuits [82], whereas the synchronization of the faster-occurring spikes was comparable to the so-called 'threshold synchronization' in the same circuits [83]. The same experimental material and protocols were later exploited by Varona et al. [87,91], who suggested, after using a model developed by Falcke et al. [84], that slow subcellular processes, such as the release of endoplasmic calcium, could also be involved in the synchronization and regularization of otherwise individually chaotic activities. It can be noted here that the role of synaptic plasticity in the establishment and enhancement of robust neural synchronization has recently been explored in detail [85] with Hodgkin and Huxley models of coupled neurons, showing that synchronization is more rapid and more robust against noise in the case of spike-timing plasticity of the Hebbian type [86] than for connections with constant strength.

Fig. 11

Phase portraits of the slow oscillations in two coupled PD neurons as a function of the indicated external conductance ga. The projections on the planes of variables VF1(t), VF2(t) in the left column, that is, of the low-pass filtered (5 Hz) membrane potentials V of cells 1 and 2, and of VF1(t), VF1(t+td) in the right column, characterize the level of synchrony of the bursts in the two neurons and the complexity of the burst dynamics, respectively. (From [79], with permission of the Physical Review Letters.)

Conversely, isolated, non-regular and chaotic neurons can again produce regular rhythms once their connections with their original networks are fully restored. This was demonstrated by Szucs et al. [87], who used an analog electronic neuron (EN) that mimicked the firing patterns observed in the lobster pyloric CPG. This EN was a three-degrees-of-freedom analog device built according to the model of Hindmarsh and Rose. When the anterior burster (AB), which is one of the main pacemakers of the STG, was photoinactivated and the synaptic connections between the cells were blocked pharmacologically, the PD neurons fired irregularly (Fig. 12A1 and A2) and nonlinear analysis indicated high-dimensional chaotic dynamics. However, synchronized bursting, at a frequency close to that seen in physiological conditions, appeared immediately after bidirectional coupling was established (as with an electrotonic junction) between the pyloric cells and the EN, previously set to behave as a replacement pacemaker neuron (Fig. 12B1). Furthermore, switching the sign of the coupling to produce a negative conductance that mimicked inhibitory chemical connections resulted in an even more regular and robust antiphasic bursting, which was indistinguishable from that seen in the intact pyloric network (Fig. 12B2). These data confirmed earlier predictions obtained with models suggesting a regulatory role of inhibitory coupling once chaotic cells become members of larger neuronal assemblies [88,89].

Fig. 12

Connecting an electronic neuron to isolated neurons via artificial synapses restores regular bursting. Above: experimental setup. The pyloric pacemaker group of the lobster consists of four electrically coupled neurons: the anterior burster (AB), which organizes the rhythm, two coupled pyloric dilator (PD) cells and the ventricular dilator (VD). Here AB is replaced by an electronic neuron (EN), set to behave in a state of chaotic oscillations. (A1–A2) Disconnected neurons exhibit chaotic discharges of action potentials. (B1–B2) Generation of a bursting pattern in the mixed network after coupling the cells. IPD is the current flowing into PD from the EN. Note that the bursts are in phase, or out of phase, in EN and PD depending on whether the coupling conductance is positive (B1) or negative (B2), respectively. (Adapted from Szucs et al. [87], with permission of NeuroReport.)

The LP neuron receives strong inhibitory inputs from three electrically coupled pacemaker neurons of the STG, namely the anterior burster (AB) and two pyloric dilator (PD) cells. As shown above, this setting had already been exploited by Elson et al. [79] to strengthen the notion that the intrinsic instabilities of circuit neurons may be regulated by inhibitory afferents. Furthermore, in control conditions [90], the spontaneous bursts generated by the LP neuron are irregular, as illustrated by the superimposed traces of Fig. 13A. However, forcing inhibitory inputs had a strong stabilizing effect: when the latter were activated at 65 Hz, the bursts were relatively stable and periodic, and their timing and duration were both affected (Fig. 13B). This means that, in small assemblies of cells, inhibition is essential for the regulation of the chaotic oscillations prevalent in the dynamics of the isolated neurons (see also [91]). Equally important, and in confirmation, when the cells were isolated from all their synaptic inputs their 'free-running' activity resembled that of a typical nonlinear dynamical system showing chaotic oscillations with some additive noise, a property that could account for the exponential tail of their computed variance (Fig. 13C).

Fig. 13

Control of bursting by inhibitory inputs in a LP neuron. Left column: superimposed traces of individual bursts (n ∼ 30% of a total of 165); sweeps are aligned at time 0 ms, which corresponds to the point of minimum variance. Negative times indicate the hyperpolarizing phase preceding each burst's onset. Right column: variance as a function of time between the voltage traces, calculated from the entire sample of recordings in each condition. (A–C) See text for explanations. Open circles mark the mean time of burst termination. The arrow in C signals the exponential tail of the plot. (From Elson et al. [90], with permission of the Journal of Neurophysiology.)

3.3 Lessons from modeling minimal circuits (CPGs)

The role of the different forms of coupling between two chaotic neurons has been carefully dissected by Abarbanel et al. [40] in studies based on results obtained with the Hindmarsh and Rose model. Although the values of some of the coupling parameters may be out of physiological range, interesting insights emerged from this work: for a high value of the coupling coefficient ε, synchronization of identical chaotic motions can occur. This proposition has been verified for coupling via electrical synapses (Fig. 14A1–A3) with measurements of the mutual information and of the Lyapunov exponents. Similarly, progressively higher values of symmetrical inhibition, or of excitatory coupling, lead to in-phase and out-of-phase synchronization of the bursts of the two generators, which can then exhibit the same chaotic behavior as a single one. This phenomenon is called 'chaotic synchronization' [82,92]. The authors extended these conclusions to moderately 'noisy' neurons and to non-symmetrically coupled and non-identical chaotic neurons.
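The effect of the electrical coupling coefficient ε can be sketched with two identical Hindmarsh and Rose cells joined by a diffusive (gap-junction-like) term. This is a didactic simulation in the spirit of the studies above, not their actual code: the coupling values, initial conditions and integration settings are illustrative.

```python
import numpy as np

def coupled_hr_deriv(state, current, eps):
    # Two identical Hindmarsh-Rose cells joined by a gap junction:
    # cell 1 receives eps * (x2 - x1), cell 2 the opposite current
    x1, y1, z1, x2, y2, z2 = state

    def hr(x, y, z, drive):
        return (y - x**3 + 3.0 * x**2 - z + drive,
                1.0 - 5.0 * x**2 - y,
                0.006 * (4.0 * (x + 1.6) - z))

    d1 = hr(x1, y1, z1, current + eps * (x2 - x1))
    d2 = hr(x2, y2, z2, current + eps * (x1 - x2))
    return np.array(d1 + d2)

def sync_error(eps, current=3.25, t_max=2000.0, dt=0.05, discard=800.0):
    # RMS difference between the two membrane variables after a transient
    state = np.array([-1.0, 0.0, 2.0, -0.8, 0.1, 2.1])  # detuned initial states
    n, n0 = int(t_max / dt), int(discard / dt)
    acc, count = 0.0, 0
    for i in range(n):
        k1 = coupled_hr_deriv(state, current, eps)
        k2 = coupled_hr_deriv(state + dt / 2.0 * k1, current, eps)
        k3 = coupled_hr_deriv(state + dt / 2.0 * k2, current, eps)
        k4 = coupled_hr_deriv(state + dt * k3, current, eps)
        state = state + dt / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
        if i >= n0:
            acc += (state[0] - state[3]) ** 2
            count += 1
    return float(np.sqrt(acc / count))

err_uncoupled = sync_error(eps=0.0)   # independent chaotic trajectories
err_coupled = sync_error(eps=2.0)     # strong gap-junction coupling
```

With zero coupling the two chaotic trajectories diverge and the error stays large; above a sufficiently strong ε the difference collapses, which is the 'chaotic synchronization' regime discussed above. Sweeping ε between these extremes locates the synchronization threshold.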

Sensory-dependent dynamics of neural ensembles have been explored by Rabinovich et al. [93], who described the behavior of individual neurons present in two distinct circuits, modeled by conductance-based equations of the Hodgkin–Huxley type. These formal neurons belonged to the already mentioned CPG, the STG (Section 3.2), and to coupled pairs of interconnected thalamic reticular (RE) and thalamocortical (TC) neurons previously investigated by Steriade et al. [94]. Although the functional roles played by these networks are very different (the latter passes information to the cerebral cortex), both of them are connected by antagonistic coupling (Fig. 15A1 and A2), and they exhibit bistability and hysteresis over a wide range of coupling strengths. The authors investigated the response of both circuits to trains of excitatory spikes with varying interspike intervals, Tp, taken as simple representations of inputs generated in external sensory systems. They found different responses in the connected cells, depending upon the value of Tp. That is, variations in interspike intervals led to changes from in-phase to out-of-phase oscillations, and vice versa (Fig. 15B1 and B2). These shifts happened within a few spikes and were maintained in the reset state until a new input signal was received.

Fig. 15

Responses of formal circuits to trains of excitatory inputs. (A1–A2) Diagrams of the STG circuit (A1) and of the thalamo-cortical network (A2). Solid and empty dots indicate inhibitory and excitatory connections, respectively. The resistor symbol in the CPG denotes a gap junction between the two neurons. External signals were introduced at the loci indicated by arrows. (B1–B2) Time series (upper traces) showing the effect of external forcing by 1-s period trains of spikes (lower traces) at the indicated interspike intervals (Tp), in the CPG (B1) and in the RE–TC (B2) circuits. Action potentials from each of the two cells are indicated by solid and dashed vertical lines, respectively. (Adapted from [93], with permission of Physical Review E.)

Since bistability occurs in the CPG when there are two distinct solutions to the conductance-based equations within a given range of electrical coupling [93], the authors further investigated the range of inhibitory coupling strengths over which the RE–TC cells behave in the same fashion. It turned out that there were two distinct phase portraits in the state space, one for each solution set (Fig. 16). They illustrate two distinct attractors, and the one that ‘wins’ depends on the initial conditions of the system. The two basins of attraction are close to each other, which explains why a switch between them can easily be produced by new spike trains. This behavior corresponds to what the authors call ‘calculation with attractors’ [91].

Fig. 16

State space portrait of two coexisting attractors of the RE–TC system. The solid line is the orbit in [V(t),IT(t),Ih(t)] space of the in-phase oscillations. The dotted line is the path taken in the same state space by the out-of-phase oscillations. Note that the two attractors are close to each other, supporting the notion that spike trains with appropriate intervals can induce transitions between them. Abbreviations: V, membrane potential; IT and Ih, activation and inactivation of ionic channels. (From [93], with permission of Physical Review E.)

Larger assemblies aimed at mimicking cortical networks were also modeled in order to characterize the irregularities of spike patterns in a target neuron subjected to balanced excitatory and inhibitory inputs [95]. The model neurons were simple two-state units, sparsely connected by strong synapses, that became active or inactive depending on whether their summed inputs exceeded a fixed threshold. Despite the absence of noise in the system, the resulting state was highly irregular, with a disorderly appearance strongly suggestive of deterministic chaos. This feature was in good agreement with experimentally obtained histograms of firing rates of neurons in the monkey prefrontal cortex.
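A deliberately reduced sketch can convey the flavor of such a model: deterministic two-state units, sparsely connected by strong synapses of mixed sign and updated by a fixed threshold. The network size, connectivity and weight scaling below are arbitrary illustrative choices, not those of [95].

```python
import random

random.seed(1)
N, K = 120, 10  # units and incoming connections per unit (illustrative values)

# each unit receives K strong synapses of random sign, scaled by 1/sqrt(K)
conn = [[(j, random.choice((1.0, -1.0)) / K ** 0.5)
         for j in random.sample(range(N), K)] for _ in range(N)]

state = [random.randint(0, 1) for _ in range(N)]
activity = []
for t in range(200):
    # fully deterministic update: a unit is active iff its summed input exceeds 0
    inputs = [sum(w * state[j] for j, w in conn[i]) for i in range(N)]
    state = [1 if h > 0.0 else 0 for h in inputs]
    activity.append(sum(state))
```

Although no noise source is present anywhere in the update rule, the population activity keeps fluctuating irregularly instead of settling into silence or saturation.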

3.4 Comments on the role of chaos in neural networks

Most of the above-reported data pertain to CPGs in which every neuron is reciprocally connected to the other members of the network. This is a ‘closed’ topology, as opposed to an ‘open’ geometry in which one or several cells receive inputs but do not send outputs back to the others, so that some cells lack feedback. This case was examined theoretically by Huerta et al. [96] using a Hindmarsh and Rose model. Taking as a criterion the ability of a network to perform a given biological function, such as that of a CPG, they found that although open topologies of neurons that exhibit regular voltage oscillations can achieve such a task, this functional criterion ‘selects’ a closed one when the model cells are replaced by chaotic neurons. This is consistent with previous claims that (i) a fully closed set of interconnections is well suited to regularizing the chaotic behavior of individual components of CPGs [41] and (ii) real networks, even if open, have evolved to exploit mechanisms revealed by the theory of dynamical systems [97].

What is the fate of chaotic neurons, which oscillate in a regular and predictable fashion once they are incorporated in the nervous system? Rather than concentrating on the difficulties of capturing the dynamics of neurons in three or four degrees of freedom, Rabinovich et al. [89] addressed a broader and more qualitative issue in a ‘somewhat’ opinionated fashion. That is, they asked how chaos is employed by natural systems to accomplish biologically important goals or, otherwise stated, why ‘evolution has selected chaos’ as a typical pattern of behavior in isolated cells. They argue that the instability inherent in chaotic motions facilitates the ability of neural systems to adapt rapidly and to make transitions from one pattern to another when the environment is altered. According to this viewpoint, chaos is ‘required’ to maintain the robustness of CPGs while they are connected to each other, and it is most likely suppressed in the collective action of a larger assembly, generally by inhibition alone.

4 Neural assemblies: studies of synaptic noise

In all central neurons, the summation of intermittent inputs from presynaptic cells, combined with the unreliability of synaptic transmission, produces continuous variations of the membrane potential called ‘synaptic noise’ [98]. Little is known about this disconcerting process, except that it contributes to shaping the input–output relation of neurons (references in [99,100]). It was first attributed to a ‘random synaptic bombardment’ of the neurons, and the view that it degrades their function has remained prevalent over the years [101]. More importantly, it has been commonly assumed to be stochastic [102–104] and is most often modeled as such [95,105,106]. Therefore the most popularized studies of synaptic noise have mostly concentrated on whether or not, and under which conditions, such a Poisson process contributes to the variability of neuronal firing [107–109]. Yet recent data, which are summarized below, suggest that synaptic noise can be deterministic and reflect the chaotic behavior of inputs afferent to the recorded cells. These somewhat ‘unconventional’ studies were motivated by a notion which has been, and remains, too often overlooked by physiologists, namely that deterministic processes can take on the appearance of stochasticity at first glance, particularly in high-dimensional systems. This question is addressed in detail in [1]. As will be shown in the remaining sections of this review, this notion brings about fundamental changes to our most common views of the mechanisms underlying brain functions.

4.1 Chaos in synaptic noise

Conventional histograms of the time intervals separating synaptic potentials and/or currents comprising synaptic noise suggest random distributions of this measure. However since a chaotic process can appear stochastic at first glance (see [1]), the tools of nonlinear dynamics have been used to reassess the temporal structure of inhibitory synaptic noise recorded, in vivo, in the Mauthner (M-)cell of teleosts, the central neuron which triggers the animal's vital escape reaction.

Several features of chaos were extracted from the differentiated representation of the original time series (Fig. 17A). Recurrence plots obtained with the time-delay method already suggested the existence of non-random motion [110]. Return (or Poincaré) maps were also constructed with subsets of events selected according to their amplitude by varying a threshold θ (Fig. 17B) and plotting each interval (n) against the next one (n+1). As θ was progressively lowered, the maps first disclosed a striking configuration in the form of a triangular motif, with its apex indicating a dominant frequency, fp, of the inhibitory post-synaptic potentials (IPSPs) that build up synaptic noise (Fig. 17C1). Subtracting the events associated with fp from the initial time series further revealed at least three populations of IPSPs of progressively smaller amplitudes having, in consecutive return maps, distinct periodicities πp, πs, πt (Fig. 17C1 and C2), all in the so-called gamma range commonly observed in higher vertebrates. Two series of observations were compatible with chaotic patterns: (i) mutual interactions and correlations between the events associated with these frequencies were consistent with a weak coupling between the underlying generating oscillators and, (ii) unstable periodic orbits (Fig. 17D) as well as period-1, -2 and -3 orbits (see also Section 6.1) were detected in the return maps [39]. The notion of a possible ‘chaos’ was strengthened by the results of measures such as the percentage of determinism and the Kolmogorov–Sinai entropy [111], combined with the use of surrogates, which confirmed the nonlinear properties of synaptic noise (Fig. 17E).
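The threshold-gated return maps described above are easy to reproduce on synthetic data. The toy event train below (a large-amplitude oscillator firing at a 16-ms period, mixed with small irregular events) is invented for illustration and is not the M-cell recording.

```python
import random

def return_map(events, theta):
    """Pairs (I(n), I(n+1)) of consecutive intervals between events above theta.

    events: time-ordered list of (time_ms, amplitude) tuples.
    """
    times = [t for t, amp in events if amp >= theta]
    intervals = [t1 - t0 for t0, t1 in zip(times, times[1:])]
    return list(zip(intervals, intervals[1:]))

# toy 'synaptic noise': large IPSPs every 16 ms plus small irregular events
random.seed(0)
events = [(16.0 * k, 1.0) for k in range(100)]
t = 0.0
while t < 1600.0:
    t += random.uniform(2.0, 8.0)
    events.append((t, 0.3))
events.sort()

# a high threshold isolates the large events and reveals their periodicity
pairs = return_map(events, theta=0.5)
```

With `theta=0.5` every pair sits at (16, 16), the period of the large oscillator; lowering the threshold mixes in the small irregular events and scatters the map, which is the behavior exploited in the analysis above.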

Fig. 17

Evidence for non-random patterns in the synaptic noise of a command neuron. (A) Consecutive IPSPs observed as depolarizing potentials (dots) recorded at a fast sweep speed (V(t), above) and their derivative (dV/dt, below). The dashed line delineates the background instrumental noise. (B) Derivative of a segment of synaptic noise, recorded at a slow sweep speed: fast events, each corresponding to an IPSP, were selected by a threshold θ having different values (from top to bottom, θ1, θ2, θ3); the intervals between selected events are labeled I(n) and I(n+1). (C1–C2) Return maps constructed with events selected by θ2, i.e. above a level corresponding to an intermediate value of the threshold. The density was calculated by partitioning the space into 50×50 square areas (i.e. with a resolution of 0.42×0.42 ms) and counting the number of points in each of these boxes. Areas in blue, green, red and yellow indicate regions containing fewer than 4 points, between 4 and 8, between 8 and 12, or more than 12 points, respectively. (C1) The principal and secondary periods, πP=16.25 ms and πS=14.4 ms, fit the highest density of points at the lower edge of the triangular pattern. (C2) A third period, πT=13.3 ms, is unmasked at the base of another triangle obtained after the events used to construct C1 have been excluded. (D) Unstable periodic orbits (n=5) with stable and unstable manifolds determined by sequences of points that converge towards, and then diverge from, the period-1 orbit (labeled 2), in the indicated order. (E) Variations of the significance level of two measures of determinism, the %det and μ(ε), as a function of θ (see [1] for definitions). The vertical dashed line indicates a confidence level of 2e−5 after comparison with surrogates. (Modified from [39], with permission of the Journal of Neurophysiology.)

A model of coupled Hindmarsh and Rose neurons generating low-frequency periodic spikes at the same frequencies as those detected in synaptic noise (Fig. 18A) produced return maps with features similar to those of the actual time series, provided, however, that their terminal synapses had different quantal contents (Fig. 18C1 and C2 versus B1 and B2). In these simulations the quantal content varied in the range determined experimentally for a representative population of the presynaptic inhibitory interneurons that generate synaptic noise in the M-cell [112]. The involvement of synaptic efficacies in the transmission of dynamical patterns from the pre- to the postsynaptic side was verified experimentally, taking advantage of the finding that the strength of the M-cell's inhibitory junctions is modified, in vivo, by long-term tetanic potentiation (LTP), a classical paradigm of learning that can be induced in teleosts by repeated auditory stimuli. It was found (not illustrated here) that this increase of synaptic strength enhances measures of determinism in synaptic noise without affecting the periodicity of the presynaptic oscillators [39].

Fig. 18

Contribution of synaptic properties to the transmission of presynaptic complex patterns. (A) Modeled Hindmarsh and Rose neurons (labeled 1 to 4), coupled by way of inhibitory junctions and set to fire at 57, 63, 47 and 69 Hz, respectively. (B1–B2) Analysis of postsynaptic potentials produced by uniform junctions. (B1) Top: same neurons as above, implemented with terminal synapses having different releasing properties but the same quantal content, np. Bottom: superimposed IPSPs generated by each of the presynaptic cells and fluctuating in the same range. As a consequence they are equally selected by the threshold, θ. (B2) The resulting return map appears random. (C1–C2) Same presentation as above, but the terminal synapses now have distinct quantal contents. Thus θ detects preferentially the IPSPs from oscillators 1 and 2 (C1), and the corresponding map exhibits a triangular pattern centered on the frequency of the larger events (C2). Note also the presence of UPOs (n=6). (From [39], with permission of the Journal of Neurophysiology.)

4.2 ‘Chaos’ as a neural code

The nature of the neural code has been the subject of countless speculations (for reviews, see [113–115]) and, despite innumerable schemes, it remains an almost intractable notion (for a definition of the term and its history, see [116]). For example, it has been proposed [48,117–119] that the coding of information in the Central Nervous System (CNS) emerges from different firing patterns. As noted by Perkel [49], ‘non-classical’ codes involve several aspects of the temporal structure of impulse trains (including burst characteristics), and some cells are measurably sensitive to variations of such characteristics, implying that the latter can be ‘read’ by neurons (review in [120]). Also, a rich repertoire of discharge forms, including chaotic ones, has been disclosed by applying nonlinear analysis (dimensionality, predictability) to different forms of spike trains (references in [121]). Putative codes may include the rate of action potentials [104,122], well-defined synchronous activities of the ‘gamma’ type (40 Hz), particularly during binding [123], and more complex temporal organizations of firing in large networks [124,125]. The role of chaos, as well as the reality of a code ‘itself’, will be further questioned in Section 8.3.

Relevant to this issue, it has been suggested that chaos, found in several areas of the CNS [67,126], may contribute to the neuronal code [95,127–129]. But validating this hypothesis required a demonstration that deterministic patterns can be effectively transmitted along neuronal chains. The results summarized in the preceding section indicate that, surprisingly, the fluctuating properties of synapses favor, rather than hamper, the degree to which complex activities in presynaptic networks are recapitulated postsynaptically [39]. Furthermore, they demonstrate that the emergence of deterministic structures in a postsynaptic cell with multiple inputs is made possible by the non-uniform values of synaptic weights and the stochastic release of quanta.

4.3 Stochastic resonance and noise

The concept of stochastic resonance (SR), which is now emerging in the neurosciences and assigns a useful role to random fluctuations, must be mentioned. It refers to a cooperative phenomenon in nonlinear systems, whereby an intermediate level of noise improves the detection of subthreshold signals (and their timing reliability [130]) by maximizing the signal-to-noise ratio (references in [131]). The theory of SR has mostly been developed under the simplifying assumption of a discrete two-state model [132,133]. It is described as the consequence of interactions between nonlinearity, stochastic fluctuations and a periodic (i.e. sinusoidal) force [134], and it applies to the case of integrate-and-fire models (references in [135]). The basic concepts underlying this process are illustrated in Fig. 19A and B.
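This two-state threshold picture can be sketched directly: a subthreshold sinusoid plus Gaussian noise emits a ‘spike’ at each upward threshold crossing, and the phase coherence of the spikes measures how well the signal is detected. All numerical values below (amplitude, period, noise levels) are arbitrary illustrative choices.

```python
import math
import random

def phase_locking(noise_sd, amp=0.8, threshold=1.0, n=20_000, seed=2):
    """Vector strength of threshold-crossing spikes relative to the sine phase.

    Returns (vector_strength, spike_count); vector strength is 1 for spikes
    perfectly locked to one phase and near 0 for uniformly scattered spikes.
    """
    rng = random.Random(seed)
    prev_above = False
    cx = cy = spikes = 0
    for i in range(n):
        phase = 2.0 * math.pi * i / 100.0      # one sine cycle per 100 samples
        v = amp * math.sin(phase) + rng.gauss(0.0, noise_sd)
        above = v >= threshold
        if above and not prev_above:           # upward crossing -> spike
            cx += math.cos(phase)
            cy += math.sin(phase)
            spikes += 1
        prev_above = above
    if spikes == 0:
        return 0.0, 0
    return math.hypot(cx, cy) / spikes, spikes
```

With no noise the subthreshold sine is never detected; a moderate noise level produces spikes tightly locked to the sine peaks; excessive noise fires everywhere and degrades the locking, reproducing the non-monotonic detection curve of Fig. 19B.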

Fig. 19

Modulation of a periodic signal by stochastic resonance. (A) Modeled subthreshold sinusoidal signal with added Gaussian noise. Each time the sum of the two voltages crosses the threshold (horizontal line), a spike is initiated (above) and a pulse is added to the time series, as indicated by vertical bars (below). (B) Power spectral density (ordinates) versus signal frequency (abscissae), with a sharp peak located at 0.5 kHz (arrow). Inset: signal-to-noise ratio (SNR, ordinates) as a function of noise intensity (abscissae), showing that the ability to detect the frequency of the sine wave is optimized at intermediate values of noise. (Modified from [131], with permission of Nature.)

Data from several experimental preparations have confirmed that SR can influence firing rates in sensory systems, such as crayfish [136] and rat [137] mechanoreceptors, the electrosensory apparatus of the paddlefish [138], and frog cochlear hair cells [139]. It can also play a positive role in rat hippocampal slices [140] and in human muscle spindles [141], tactile sensation [142] and vision [143]. SR is also likely to occur at the level of ionic channels [144], and it could favor the synchronization of neuronal oscillators [145].

Several aspects of SR call for deeper investigation, particularly since noise, a ubiquitous phenomenon at all levels of signal transduction [146], may embed non-random fluctuations [147]. Enhancement of SR has been demonstrated in a FitzHugh–Nagumo model neuron driven by colored (1/f) noise [148], while periodic perturbations of the same cells generate phase-locking, quasiperiodic and chaotic responses [149]. In addition, a Hodgkin and Huxley model of mammalian peripheral cold receptors, which naturally exhibit SR in vitro, has revealed that noise smooths the nonlinearities of deterministic spike trains, suggesting its influence on the system's encoding characteristics [150]. A SR effect termed ‘chaotic resonance’ appears in the standard Lorenz model in the presence of a periodic time variation of the control parameters above and below the threshold for the onset of chaos [151]. It also appears in the KIII model [152], which involves a discrete implementation of partial differential equations. Here noise not only stabilizes aperiodic orbits: an optimum noise level can also act as a control parameter producing chaotic resonance [153], which is believed to be a feature of self-organization [153]. Finally, SR has been reported in a simple noise-free model of paired inhibitory–excitatory neurons with a piecewise-linear function [154].

5 Early EEG studies of cortical dynamics

An enormous amount of effort has been directed over the last three decades towards characterizing cortical signals in terms of their dimension in order to ascertain chaos. However, with time, the mathematical criteria for obtaining reliable conclusions on this matter became more stringent, particularly with the advent of surrogates aimed at distinguishing random from deterministic time series [155]. Therefore, despite the astonishing insights of their authors, who opened new avenues for research, the majority of the pioneering works (only some of which will be alluded to below) are outdated today and far from convincing.

5.1 Cortical nets

Most of the initial investigations relied upon the analysis of single-channel electroencephalographic (EEG) signals, with attempts to estimate dimension using the Grassberger–Procaccia algorithm, the average pointwise dimension [156], the Lyapunov exponent, the fractal dimension [157] or the mutual information content [158]. But in addition to the conflict between the requirement of long time series and the non-stationarity of actual data, serious difficulties with such measures (such as artefacts or possible misinterpretations) have been pointed out [159,160]. That is, refined tests comparing measures of segments of EEGs led to the conclusion [160] that the actual data could not be distinguished from Gaussian random processes, lending support to the view that EEGs are linearly filtered noise [161], either because they are not truly chaotic or because they are high dimensional and their determinism is difficult to detect with current methods. This rather strong and negative statement was later moderated by evidence that, as pointed out by Theiler [155], despite the lack of proof for a low-dimensional chaos, a nonlinear component is apparent in all analyzed EEG records [160,162–167]. This notion is illustrated in Fig. 20, where recordings obtained from a human EEG were analyzed with a method that combined the redundancy approach (where the redundancy is a function of the ‘Kolmogorov–Sinai’ entropy [166]) with the surrogate-data technique. The conclusion of this study was that, at the least, nonlinear measures can be employed to explore the dynamics of cortical signals [168]. This view has been strongly vindicated by later investigations ([169], see Section 6.2.2).
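The principle of the Grassberger–Procaccia estimate, central to these early studies, can be sketched on synthetic data: the correlation sum C(r) is the fraction of point pairs closer than r, and the slope of log C(r) versus log r approximates the correlation dimension. The data sets, sample sizes and radii below are illustrative choices.

```python
import math
import random

def correlation_dimension(points, r1, r2):
    """Slope of log C(r) between two radii, where C(r) is the fraction of
    point pairs closer than r (the Grassberger-Procaccia correlation sum)."""
    def corr_sum(r):
        n, count = len(points), 0
        for i in range(n):
            xi, yi = points[i]
            for j in range(i + 1, n):
                if math.hypot(xi - points[j][0], yi - points[j][1]) < r:
                    count += 1
        return 2.0 * count / (n * (n - 1))
    return ((math.log(corr_sum(r2)) - math.log(corr_sum(r1)))
            / (math.log(r2) - math.log(r1)))

random.seed(3)
line = [(random.random(), 0.0) for _ in range(600)]               # a 1-D set
square = [(random.random(), random.random()) for _ in range(600)]  # a 2-D set

d_line = correlation_dimension(line, 0.05, 0.1)
d_square = correlation_dimension(square, 0.05, 0.1)
```

The estimate recovers a dimension near 1 for points scattered on a segment and near 2 for points filling a square; for an EEG the same computation must be preceded by delay embedding, and it is precisely this step that raised the difficulties discussed above.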

Fig. 20

Nonlinearity of the human EEG. (A,B) Redundancy assessed on a 90-s recording session (during the sleep state) at one location of the scalp (A), and on its surrogates (B), as a function of the time lag. The four curves in each panel correspond to different embedding dimensions, n=2 to 5 (from bottom to top). Note the lack of qualitative differences between the tests computed from the EEG and its surrogates. (C–D) Linear (C) and nonlinear (D) redundancy statistics for the same EEG record. Note that several highly significant differences (tens of SDs) were detected in the nonlinear statistics, in contrast with the low differences in the linear ones. Note also the different scales in C and D. (From [166], with permission of Biological Cybernetics.)

5.2 Experimental background

Freeman and his collaborators took advantage of the relative simplicity of olfactory bulb electrogenesis and of the ability to insert arrays of electrodes into this structure in conscious rabbits to (i) search for EEG patterns during odor recognition and discrimination, and (ii) investigate the effects of learning. Both were expected to stabilize the distribution of the recorded electrical activities. In the presence of a learned odor, a distinctive pattern was observed over the entire bulbar surface, suggesting that each neuron in this structure participated in every discriminative response [170,171]. Furthermore, a mathematical model of the bulb was constructed with nonlinear differential equations [172,173], possibly because dimensionality could not be measured due to limited data sets. It generated time series that resembled the surface EEG obtained experimentally (for details, see Section 7.1). These belonged to four classes, illustrated in Fig. 21A, namely: (i) total silence, as in deep anaesthesia; (ii) a ‘normal’ state, with fast and desynchronized traces, recorded in waking but unmotivated animals, suggesting a chaotic activity with a correlation dimension of 5.5 (in the model) and 5.9 (in the experimental data); (iii) in reaction to a learned odor, an EEG characterized by inspiratory bursts of oscillations that disappeared during expiration, i.e. an irregular pattern interrupted by oscillatory bursts following activation of the olfactory receptors; simulations suggested that this state corresponds to a limit-cycle attractor [173] specific to a given odor, with a dimension decreasing from 2.3 to 1.13 during its presentation; and finally, (iv) a last type of activity resembling that of an epileptic seizure, which occurred after an intense electrical stimulation of the lateral olfactory tract and had a dimension of ∼2.6 in both experimental and simulated data [174]. The corresponding attractor was toroidal in shape.
The authors believed that the shift from one state to the next could occur abruptly, via bifurcations, and they concluded that when placed in a given learned input ‘domain’, the neural system has a tendency to generate a qualitatively distinctive form of ordered behavior that emerges from a chaotic background state [127].

Fig. 21

Dynamical states identified in the rat olfactory bulb by EEG recordings. (A) From bottom to top: recordings obtained in the indicated behavioral phases (see text for explanations). (B) Comparison of attractors computed using experimental data (left) and using a model (right) from the prepyriform cortex (CP) and the olfactory bulb (BO), suggesting a torus. Note that although related, the two maps are not identical. (From [173], with permission of Biological Cybernetics.)

Other analyses of experimental data include those of Rapp et al. [175], who reported low-dimensional firing rates of (sometimes injured) neurons in the anaesthetized squirrel monkey, and of Röschke and Basar [176], who observed low (between 4 and 5) correlation dimensions of slow waves recorded with chronically implanted electrodes in the auditory cortex, the hippocampus and the reticular formation of sleeping cats. Studies of extracellularly recorded spike trains obtained in the optic tectum of the awake pigeon [177] and in the thalamus and substantia nigra of anaesthetized rats [178] suggested chaos, with evidence that sensory (auditory) stimulations strongly affected the ‘chaotic’ behavior in the latter preparation. ‘High-dimensional’ nonlinear structures of interspike intervals, with predictability, were reported in nigral dopaminergic neurons [179].

5.3 First struggles with human data

Babloyantz and her collaborators [180] were the first to study the human EEG with the tools of nonlinear dynamics, which they applied to recordings obtained during sleep. Chaos was assumed on the basis of low dimensions (4–5) and positive Lyapunov exponents computed during stages 2 and 4, characterized by α and γ waves, respectively. In contrast, no attractor was detected in the awake state or during the rapid-eye-movement (REM) phases of sleep. Numerous reports followed this observation, but they drew strong criticism even though the number of subjects examined increased and the algorithms as well as the comparative statistics became more rigorous [181]. Conflicting conclusions were also reached concerning (i) whether the dimension is higher when the eyes are open than when they are closed and the α rhythm is more pronounced [182–184], and (ii) the definition of a ‘resting’ state in the sole presence of a low dimensionality [185,186]. On the other hand, results obtained with a variety of tasks cutting across different sensory modalities and various states of attention supported the idea that nonlinear analysis is a valid approach for characterizing aspects of brain dynamics that cannot be seen with classical spectral methods (references in [168]; see also [187,188]).

5.4 Pathological processes and chaos

Although models of neural networks had already indicated that bifurcation sequences were involved in transitions from steady states to chaotic activities [189], the first dimensional analysis of an epileptic (petit mal) EEG was, again, provided by Babloyantz [190], who postulated that the existence of a chaotic attractor is a direct consequence of the “deterministic nature of brain activity”. Phase portraits of attractors were constructed (Fig. 22), and measures of the dimensionality (which was low) and of the Lyapunov exponent, evaluation of the autocorrelation function, and comparisons of the derived values with those of ‘normal’ EEGs seemed to be in agreement with the author's conclusions. This work was followed by investigations of human [191,192] and rat [193] epileptic EEGs with measures of Lyapunov exponents and of correlation dimensions, which suggested the emergence of chaotic attractors during seizures.

Fig. 22

Human epileptic seizure activity. The illustrated petit mal episode, lasting ∼5 s, was the longest and the least contaminated by noise during a 24-hour recording session. Channels 1 and 3, which measured potential drops between the frontal and parietal regions of the scalp, were used to construct the phase space trajectories and for further analysis (see text) suggesting chaotic-like components in the signals. Inset: phase portrait constructed with channel 1. (Adapted from [190], with permission of the Proceedings of the National Academy of Sciences (USA).)

Investigations of other diseases such as Creutzfeldt–Jakob disease, schizophrenia and tinnitus were inconclusive (see details in [168]), but they reinforced the belief in ‘dynamical diseases’ [7] and in the potential usefulness of a nonlinear approach for diagnostic purposes (see also Section 6.3.2).

6 Recent approaches of cortical dynamics

Despite serious pitfalls and limitations that have been dissected in several reports [164,194–196], studies of brain signals have greatly benefited from the method of surrogate-data testing for nonlinearity [155]. As detailed in Part I of this review [1], the basic principle is that nonlinearity can be established by comparing a nonlinearity measure computed from the data with the same measure computed from a collection of surrogate data sets that share the data's linear properties but are otherwise random [196]. Although the null hypothesis of linearity can still be falsely rejected with noisy or intrinsically unstable time series [197], the availability of surrogates opened a new era in the study of signals generated by single neurons and/or by neuronal assemblies.
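The construction of such surrogates can be sketched in a few lines: Fourier-transform the series, randomize the phases while keeping the amplitudes (and the Hermitian symmetry that keeps the inverse transform real), and transform back. A naive O(n²) DFT is used below to stay dependency-free; the logistic-map test series is an illustrative choice.

```python
import cmath
import random

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def phase_randomized_surrogate(x, rng):
    """Surrogate sharing the power spectrum (hence the linear autocorrelation)
    of x, with Fourier phases drawn uniformly at random."""
    n = len(x)
    X = dft(x)
    for k in range(1, (n + 1) // 2):
        phi = rng.uniform(0.0, 2.0 * cmath.pi)
        X[k] = abs(X[k]) * cmath.exp(1j * phi)
        X[n - k] = X[k].conjugate()   # Hermitian symmetry -> real signal
    # inverse DFT; the imaginary part is numerical noise only
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]
```

By construction each surrogate has exactly the spectrum of the original series, so any statistic that differs significantly between the data and an ensemble of such surrogates points to structure beyond linear correlations.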

6.1 Unstable periodic orbits

An alternative to conventional measures of chaos has been to search the reconstructed phase space, or the return maps, for geometric figures called unstable periodic orbits (UPOs), which constitute the skeleton of chaotic systems [198,199]. Chaotic trajectories wander endlessly around unstable fixed points in sequences of close approaches to, and departures from, them, along characteristic directions called the stable and unstable manifolds, respectively. The structure of this dynamics is known mathematically as that of a ‘saddle point’. This analogy refers to the behavior of a ball placed on a saddle. Specifically, if placed at its center, the ball will remain there until a small perturbation displaces it to one side or the other, but always in the transverse direction towards one of the stirrups (the unstable manifold). Conversely, if the ball is placed at the front or the back of the saddle, it will roll along the center line (the stable manifold) towards the unstable equilibrium point located at the center of the saddle. This simple metaphor, proposed by Weiss et al. [200], helps to understand a basic property of chaotic systems: owing to their critical sensitivity to initial conditions, they can be controlled by a minimal external perturbation [200–202]. Several methods are available for this purpose [203], and they all take advantage of the fact that a chaotic trajectory can be stabilized on a desired orbit or, otherwise stated, that the ‘ball’ can be pushed back to the center of the saddle, near an unstable equilibrium point (see also Fig. 17 of Faure et al. [1]).
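As a toy illustration of how an unstable fixed point can be located from a time series alone, consider the logistic map, whose chaotic trajectory makes repeated close approaches to the diagonal of its return map. The tolerance and exclusion window below are ad hoc simplifications of the published detection criteria, not the actual UPO algorithms cited above.

```python
def estimate_fixed_point(series, tol=0.02, exclude_below=0.2):
    """Average the points of the return map that fall near the diagonal
    x(n+1) = x(n): repeated close approaches betray an unstable fixed point."""
    hits = [a for a, b in zip(series, series[1:])
            if abs(b - a) < tol and a > exclude_below]
    return sum(hits) / len(hits), len(hits)

# chaotic logistic-map trajectory, x(n+1) = r x(n) (1 - x(n)) with r = 3.9
r, x = 3.9, 0.3
series = []
for _ in range(20_000):
    x = r * x * (1.0 - x)
    series.append(x)

estimate, n_hits = estimate_fixed_point(series)
# the unstable fixed point of the map is x* = 1 - 1/r
```

Although the orbit never settles on the fixed point, its close approaches cluster around x* = 1 − 1/r ≈ 0.744, and averaging them recovers this value; control methods exploit exactly such close approaches to apply their corrective perturbations.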

Thus, theoretically, one could detect chaos in natural systems by combining (i) the search for UPOs and (ii) chaos-control techniques, as pioneered in arrhythmic cardiac tissue by Garfinkel et al. [204]. Such a control was successfully achieved in rat hippocampal slices exposed to a high external K+ perfusion, which triggered burst discharges of pyramidal cells occurring at irregular intervals and resembling epileptic interictal spike foci [126]. The bursts became increasingly periodic following electrical stimulation of the Schaffer collaterals at frequencies determined by the previous identification of unstable fixed points, using extracellularly recorded time series converted into first return maps that exhibited well-defined UPOs. An inverse procedure termed ‘anticontrol’ was also effective in moving the system away from these orbits, thereby reducing its periodicity.

This interpretation of the data by Schiff et al. [126] has been strongly challenged by Christini and Collins [205]. Implementing the FitzHugh–Nagumo model, these authors demonstrated that chaos criteria of the form used in [126] could be reproduced by a noise-driven, non-chaotic neuronal system. They obtained similar results when they applied chaos control to a single stochastic system, suggesting that this procedure can be applied to a wider range of experimental systems than previously assumed.

Statistically significant evidence (as demonstrated with surrogates) of the existence of UPOs has been obtained in time-interval scatter plots of spike discharges of the caudal photoreceptor of the crayfish [67]. Rules for a strict definition of UPOs were in fact established in this preparation, where their presence was taken as an indicator of low-dimensional dynamics. UPOs were also found in the inhibitory synaptic noise of the teleost M-cell [110] and in periodic bursts of action potentials recorded with EMGs in intact swimming lampreys, or with suction electrodes during the lamprey's fictive swimming [206]. In addition, UPOs were carefully tested against surrogates in series of time intervals between successive spike discharges recorded with EEG electrodes on the scalp of a patient suffering from epileptic focal seizures [207]. The recordings were taken in three consecutive experimental conditions, i.e. at rest, and during the performance of a visual or an auditory discrimination task requiring a finger response. Only one UPO was detected at rest, whereas two specific ones emerged, in a striking one-to-one association, following a given sensory stimulus presentation (visual or auditory). Finally, So et al. [208] applied the transform technique previously described by them (which uses the local dynamics of the system such that the transformed data are concentrated about distinct UPOs and identify complex higher-period orbits) to (i) single-cell and (ii) network burst-firing activity recorded in rat hippocampal slices, as well as to (iii) digitized human EEGs collected from epileptic patients. They were able to unravel the ‘hierarchy’ of low periodic orbits (Fig. 23A and B) present in dynamical systems [209,210] and to establish that the estimated dynamics near the UPOs have predictive properties, thus confirming that close trajectories near them have similar behavior (Fig. 23B).

Fig. 23

Hierarchy of UPOs from a single hippocampal cell. (A) Above: sample recorded sweep with consecutive action potentials. Below: return maps with a family of period-2 (left) and period-3 (right) orbits. Colors indicate the probability of the transform data density being outside the distribution of maximum peaks observed from 100 transformed surrogate densities. The most significant spots, with probability ⩾0.95, are shown in red; note the presence of two strongly significant period-2 orbits (out of 6 possible) and three period-3 orbits (out of 5 possible). (B) Local dynamics around three representative orbits. They are illustrated by green and red lines, which indicate their stable and unstable manifolds, respectively. From left to right: period-1, -2, and -3 orbits. Deviations between pairs of representative trajectories are plotted in successive time steps and indicated by connecting lines. Symbols: ▵, initial points; ∘, subsequent steps; +, predicted positions. (Adapted from [129], with permission of The Biophysical Journal.)

6.2 Period-doubling bifurcations

Some neuronal systems can undergo transitions from a steady state to an oscillatory firing mode (Fig. 2), or from producing periodic spikes to bursting clusters of action potentials (Figs. 5 and 6). A universal signature of chaos is provided by complete sequences of period-doubling cascades, as first evidenced by Feigenbaum [211] (references in [1]). This ‘road’ to chaos can be induced by intracellular injection of DC currents of various intensities, which act as a ‘control parameter’, as in the case of the squid giant axon illustrated in Fig. 4. Similarly, noise-mediated oscillators contained in the electrosensory organs of the paddlefish Polyodon spathula can be forced to generate nearly periodic spiking patterns, with frequency locking in different modes, by external periodic electric fields [212]. More physiological stimuli, such as sensory inputs, can act in the same way and produce profound changes in the behavior of ‘integrative’ structures of the brain.
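The Feigenbaum cascade is easy to reproduce on the simplest system with a control parameter, the logistic map; the helper below measures the period of the attractor, which doubles along the classical sequence 1, 2, 4, 8, … before becoming aperiodic as r grows. This is an illustrative sketch only; in the experiments described above, the control parameter is the injected current, not r.

```python
import numpy as np

def attractor_period(r, max_period=64, tol=1e-6, transient=2000):
    """Iterate the logistic map past its transient, then find the smallest
    p with |x_{n+p} - x_n| < tol; returns None for aperiodic (chaotic) r."""
    x = 0.5
    for _ in range(transient):
        x = r * x * (1.0 - x)
    orbit = [x]
    for _ in range(2 * max_period):
        x = r * x * (1.0 - x)
        orbit.append(x)
    for p in range(1, max_period + 1):
        if all(abs(orbit[i + p] - orbit[i]) < tol for i in range(max_period)):
            return p
    return None

# the period-doubling route to chaos as the control parameter r increases
periods = [attractor_period(r) for r in (2.8, 3.2, 3.5, 3.55, 3.9)]
# periods == [1, 2, 4, 8, None]: doubling cascade, then chaos
```

The successive doublings accumulate geometrically (at Feigenbaum's universal rate δ ≈ 4.669), which is why only a short parameter interval separates the first doubling from fully developed chaos.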

A striking example of this scheme is found in the work of Crevier and Meister [213], who subjected the retina of the larval tiger salamander and the human visual system to flicker stimuli of varying frequency and contrast and recorded in vivo the accompanying electroretinograms (ERGs). They found that “during rapid flicker, the eye reports to the brain only every other flash of light”: retinal ganglion cells fire spikes on alternating flashes, resulting in a period-doubling bifurcation in visual processing. Specifically, at slow frequencies, a volley of spikes was observed at both the onset and the offset of each flash. As the frequency increased above 4 Hz, the ‘on’ volley disappeared (Fig. 24A1) and, above 9 Hz, every other flash was followed by a volley of action potentials whereas the intervening ones failed to elicit a response (Fig. 24A2). Another bifurcation occurred at 12 Hz, with no more than one response every four stimulations. Finally, above 15 Hz, a seemingly chaotic pattern was recognized in the ERG and in the nerve fibers (Fig. 24A3). More strikingly, the entire population of retinal cells acted in synchrony rather than ‘choosing’ flashes independently of each other. At higher frequencies, this ‘synchronous period-doubling’ reversed until the signal was again periodic for values >30 Hz (Fig. 24B1). The authors observed a similar scenario when they varied the contrast of the flashes while keeping the frequency constant (Fig. 24B2).

Fig. 24

Synchronous period-doubling of responses of the salamander retina. (A1–A3) In each panel, recordings of the ERG (above) and optic nerve fibers (middle) in response to uniform flash stimuli (below) delivered at the indicated frequencies. Filled circles designate response magnitude; note that the response amplitude repeats on every cycle (A1), every two cycles (A2) and becomes chaotic (A3) as the stimulus frequency increases from 4 to 15 Hz. (B1–B2) Bifurcation plots of ERG amplitude as a function of flash frequency at constant contrast (B1) and as a function of contrast at constant flash frequency (B2). (C1–C3) Period-doubling in a model of nonlinear feedback where ‘input impulses’ are scaled by a variable gain g(y) that depends on the time constant τ of the decay of the amplitude B of a preceding impulse (C1). Bifurcation plots of the model output pulse amplitude, displayed as in B1 and B2, are shown in C2 and C3, respectively. (Adapted from [213], with permission of the Journal of Neurophysiology.)

Accelerating sequences of period-doubling lead to chaotic regimes in neural models [54], and many nonlinear systems that exhibit such behavior contain some form of negative feedback by which responses affect subsequent ones. Thus Crevier and Meister constructed a simple model of nonlinear feedback in which the peak amplitude of the ERG was taken to be proportional to the contrast of the light flash, C, and to a gain factor, g(y) (Fig. 24C1). With just two parameters (i.e. the gain and its exponential decay τ as a function of the recent response), the predicted bifurcation plots matched the approximate locations of the experimental branch points for both flash frequency (Fig. 24C2) and contrast (Fig. 24C3). This confirmed experimental evidence that the critical feedback interactions require only cone photoreceptors and off-bipolar cells.
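The flavor of such a two-parameter feedback loop can be captured in a toy discrete-time version: each flash response is scaled by a gain that is depressed by recent responses and recovers between flashes with time constant τ. The gain function g(y) = 1/(1 + y⁸) and all constants below are our assumptions for illustration, not the published model; in this sketch the bifurcation appears as the contrast C is raised at a fixed flash interval, mirroring Fig. 24B2/C3.

```python
import numpy as np

def erg_model(C, T=0.7, tau=2.0, n_flashes=3000):
    """Toy gain-feedback loop inspired by the Crevier-Meister model.
    Each response b = C * g(y); the adaptation variable y sums past
    responses and decays between flashes (assumed, illustrative form)."""
    d = np.exp(-T / tau)            # decay of adaptation between flashes
    y, b_list = 0.0, []
    for _ in range(n_flashes):
        b = C / (1.0 + y**8)        # flash response scaled by current gain
        y = (y + b) * d             # response feeds back into adaptation
        b_list.append(b)
    return np.array(b_list)

def period(seq, max_p=16, tol=1e-6):
    """Period of the steady-state response sequence (None if aperiodic)."""
    tail = seq[-2 * max_p:]
    for p in range(1, max_p + 1):
        if np.all(np.abs(tail[p:] - tail[:-p]) < tol):
            return p
    return None

low, high = period(erg_model(0.5)), period(erg_model(5.0))
```

At low contrast every flash evokes the same response (period 1); at high contrast the feedback is strong enough to destabilize the steady response, and the amplitude sequence alternates or becomes irregular, the discrete analogue of the branch points in the bifurcation plots.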

The possible mechanisms of this period-doubling (which may involve synaptic and/or presynaptic conductances) remain unclear, and the chaotic sequences of the bifurcation plots were not fully characterized with measures of chaos. Finally, Crevier and Meister demonstrated an analogous regime of period-doubling in human ERGs, occurring between 30 and 70 Hz. When measured by the power in the subharmonic components of the flash frequency, the degree of period-doubling was even greater in the cortical visual evoked potentials than in the ERGs. The authors further suggested that, at the retinal level, this process is related to illusory flicker patterns.

6.3 Epileptic activities and electroencephalograms

EEGs represent the integral output of a large number of neurons with complex underlying dynamics or, otherwise stated, of subsystems with numerous degrees of freedom. In addition, the presence of noise of unknown origin makes it hopeless to interpret the data within the framework of chaos theory. This was conclusively demonstrated by Theiler [214], who reexamined a published case of an epileptic EEG that had previously yielded ‘evidence’ of chaos: the measures (Lyapunov exponent and correlation dimension) turned out to be closely matched by surrogates. Accordingly, most authors have failed to demonstrate ‘true’ chaos in the resting brain on the basis of EEG recordings, which are barely distinguishable from linearly filtered noise [166,215–218].
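The surrogate logic invoked throughout this section is straightforward to implement: Fourier-transform the series, randomize the phases, and transform back. The surrogate then shares the linear properties (power spectrum, hence autocorrelation) of the original, so any statistic that reliably separates data from surrogates must reflect nonlinear structure. A minimal sketch, applied here to a linear stochastic (AR(2)) series of the kind that embodies the null hypothesis:

```python
import numpy as np

def phase_surrogate(x, rng):
    """Phase-randomized surrogate: same power spectrum (hence the same
    autocorrelation) as x, but any nonlinear structure is destroyed."""
    n = len(x)
    X = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, len(X))
    phases[0] = 0.0                      # keep the zero-frequency term real
    if n % 2 == 0:
        phases[-1] = 0.0                 # the Nyquist bin must stay real too
    return np.fft.irfft(np.abs(X) * np.exp(1j * phases), n=n)

rng = np.random.default_rng(1)
x = np.zeros(1024)                       # a linear stochastic AR(2) process
for t in range(2, 1024):
    x[t] = 1.6 * x[t - 1] - 0.8 * x[t - 2] + rng.standard_normal()
s = phase_surrogate(x, rng)
```

For a series like x, which looks deceptively oscillatory, nonlinear measures computed on data and surrogates should not differ significantly; this is exactly the test that the early EEG ‘chaos’ claims failed.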

6.3.1 Epilepsy and chaos-indicators

Despite the above-mentioned difficulties in characterizing EEGs, which ruined initial hopes (see also Section 5.3), epilepsy continued to be considered a privileged material for ascertaining chaos, because it is a widely recognized model of neuronal synchronization and because seizure episodes are commonly believed to be characterized by bifurcations to system states of low complexity [219,220]. In line with this notion, a number of available data indicate that decreasing dimensionality is an essential characteristic of sensory information processing [221]. The complexity of the EEG is decreased during the P3 event-related potential in a task-dependent and area-specific way [222], as shown with the method of ‘point-correlation’ dimension [223], which is capable of tracking changes in epochs with nonstationary features. Thus, whatever their conclusions about ‘chaos’ per se, the results of studies of animal and human epileptic brains deserve particular attention because they shed new light on the relevance of nonlinear tools for comparative, state-by-state studies of neuronal dynamics. This concept is further discussed in Section 8.4.

Finally, epileptic bursts were produced in the CA3 region of rat hippocampal slices exposed to a K+-enriched extracellular medium by electrical stimulation of the mossy fiber inputs. Time series of the evoked field potentials were analyzed and the conclusion was that of ‘undoubted’ evidence for chaos [224]. The evidence comprised bifurcations leading to a chaotic state, strange attractors in the three-dimensional phase space, positive Lyapunov exponents estimated with Wolf's algorithm, and non-invertible strobomaps with unstable fixed points (Fig. 25A1 and A3). In addition, the phase diagrams delineated regions with several values of phase-locking and irregular responses having features of intermittency (Fig. 25B). Given the strength of these findings, it is unfortunate that the authors did not validate their conclusions with surrogates, particularly since Schiff et al. [225] had not long before applied tests for determinism to time series of population spikes, also monitored in the rat CA3 region in the presence of a high-K+ medium; three tests were applied (an adapted version of the local flow approach, a local dispersion measure and a nonlinear prediction), and in all instances the surrogates pleaded in favor of stochasticity.

Fig. 25

‘Chaotic’ responses of epileptiform bursts. (A1–A3) Left: sample recordings of field potentials evoked in a slice of the CA3 region (upper traces) by mossy fiber stimulation with successive current pulses (lower traces) of constant amplitude and duration. Note the amplitude fluctuations of the responses, with 1:1 (A1) and 1:2 (A2) phase locking and a chaotic pattern (A3) when the stimulation rates were 0.5, 0.9 and 1.9 Hz, respectively. Middle: attractors reconstructed from the corresponding time series of 40 s each (delay time: τ=10 ms). Right: one-dimensional strobomaps constructed with 71, 201 and 201 evoked responses; the maps show a non-invertible function and a slope at their fixed points more negative than −1. (B) Phase diagram of the field potential responses. In areas marked 1:n, one burst occurs every n=1,2,3,4 current pulses, and in the regions marked by a filled triangle, a square and a diamond, intermittency occurs. In the area signaled by a star, the responses are not deterministic (abscissa: inverse of the interpulse interval, 1/T; ordinate: amplitude of the current pulses, I). (Adapted from [224], with permission of Brain Research.)

6.3.2 Prediction of epileptic seizures

Past efforts to identify systematic changes in the preictal-to-ictal EEG with linear measures [226,227] were revived by the presumption that such recordings have nonlinear properties. This prompted numerous groups to search for preseizure patterns that could precede the onset of electrical and/or clinical manifestations of the disease and help localize the epileptic focus [193,219,220,228–238]. As expected, several nonlinear measures, particularly the effective correlation dimension, Deff, and the Lyapunov exponents, can help ‘anticipate’ the occurrence of seizures up to about twenty minutes before their onset. These results were taken as justifying hopes that one could construct an in vivo ‘warning device’ to help control drug-resistant epilepsies. On the experimental side, intracellular recordings of neurons from guinea pig CA3 hippocampal slices indicated that the slow depolarization which precedes the critical ‘paroxysmal depolarizing shift’ is accompanied by clear modifications of Deff in some models of epilepsy (i.e. induced by xanthine or penicillin), whereas a similar loss of complexity could not be detected in low-veratridine models [239]. However, as in the case of other complex functions and syndromes, subsequent reports have tempered this optimism. For example, warnings were issued about artifacts that appear at high values of the correlation integral and about erroneous interpretations of the phase randomization of the data, encountered during a study where only one out of six seizures yielded ‘high-quality’ attractors [240].
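The correlation dimension invoked by these studies derives from the Grassberger–Procaccia correlation sum, i.e. the fraction of pairs of phase-space points closer than a radius r, whose log-log slope estimates the dimension. A bare-bones estimate on a known chaotic map is sketched below (the radii and the Theiler window are illustrative choices); on real EEGs the same slope is notoriously sensitive to noise and nonstationarity, which is the crux of the debate above.

```python
import numpy as np

def correlation_sum(pts, r, theiler=1):
    """Grassberger-Procaccia correlation sum: fraction of point pairs
    closer than r, excluding temporally adjacent pairs (Theiler window)."""
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    i, j = np.triu_indices(n, k=theiler + 1)
    return np.mean(d[i, j] < r)

# the Henon map as a stand-in chaotic series (its D2 is about 1.2)
x, y, pts = 0.1, 0.1, []
for k in range(2100):
    x, y = 1.0 - 1.4 * x * x + y, 0.3 * x
    if k >= 100:                           # drop the transient
        pts.append((x, y))
pts = np.array(pts)

# two-point slope of log C(r) vs log r over a decade of radii
r1, r2 = 0.01, 0.1
d2 = (np.log(correlation_sum(pts, r2)) - np.log(correlation_sum(pts, r1))) \
     / np.log(r2 / r1)
```

With a clean, stationary, low-dimensional series the slope settles near the true fractal dimension; the EEG controversies arise precisely because finite, noisy data can produce spuriously low ‘dimensions’ unless checked against surrogates.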
Furthermore, careful comparisons of methods (power spectral density, cross-correlation, principal components analysis, phase correlation, wavelet packet analysis, correlation integral and mutual prediction) for detecting, in intracranially recorded EEGs, the earliest dynamical changes leading to seizures found no predictive advantage of the nonlinear measures over the linear ones, and wisely recommended ‘addressing the problem from a variety of viewpoints’ [241]. A rather similar conclusion had been reached by Schreiber [197], who used the time-reversibility test in one case of epilepsy and found no definite proof of chaos despite the ‘rejection’ of the null hypothesis by surrogates.

7 Dynamics of large scale networks

7.1 Possible role of chaos

Application of the tools and concepts of nonlinear dynamics to studies of the neural correlates of ‘higher’ brain functions has motivated several hypotheses regarding biological attractors and their role in information processing, perception, motor behavior, memory and cognition. These functions involve enormous populations of cells and multiple positive as well as negative feedbacks. These features, together with the striking variability of the signals obtained by recording neuronal activity [242], have been taken as arguments favoring the notion that the dynamics of the nervous system are nonlinear, and even chaotic. This hypothesis has continued to attract numerous researchers despite unconvincing experimental results, since there are no definitive tests for chaos when it comes to analyzing multidimensional and fluctuating biological data ([9], see also [1]). It has also been suggested that, rather than chaotic, some experimental series may be better described in other terms (1/f long-range scaling, fractal, multifractal) which, however, carry no clear implications with respect to underlying mechanisms [9].

A number of formal models with dynamic and/or chaotic properties have served as analogs for, and as alternatives to, physiological networks. Their efficiency in pattern recognition, storage and retrieval, and their effectiveness, for example in employing Cantor sets for coding information, have been the subject of extensive research based on various mathematical tools [10,89,95,114,243–250]. Yet none of these descriptions fully captures the features of complex systems. As this highly specialized field is rapidly evolving, only the most relevant propositions of some of these models will be considered.

The first, and perhaps strongest, advocates of chaos in the brain, namely Freeman and his collaborators, have replicated features of the EEG of the olfactory system and of its dynamics (see also Section 5.2), including during perceptual processing and recognition of a known odor or the learning of a new one [127,173,174,251–255]. Their model of the olfactory bulb, denoted KIII, is composed of nonlinear ordinary differential equations (ODEs) with feedback delays. Each ODE stands for a neural mass having either an excitatory or an inhibitory output to the ODEs with which it connects. Mutually connected inhibitory or excitatory pairs mimic populations of similar neurons that form one structure (called a KI set). A KII set of excitatory or inhibitory KI sets portrays one of the relay stations of the olfactory system (the olfactory bulb, the anterior olfactory nucleus or the prepyriform cortex). Both basal and stimulated states were mimicked and the results were compared with actual recordings from rats and rabbits, using amplitude histograms, power spectra, attractor reconstruction, visual inspection of traces and correlations. ‘With proper settings’, the model yielded sustained chaotic activity that was indistinguishable from the background EEG of resting animals. Furthermore, with stimulation, it produced ‘bursts’ of oscillations that forced the system through a change from the aperiodic basal state to a more periodic pattern similar to that observed during inhalation (see also Fig. 21).

These experiments led Freeman's group to postulate specific roles for chaos in memory and brain functions, apparently refuting the classical views of information processing and representation advocated by connectionist neuroscientists. Rather, internal states would correspond to a low-dimensional attractor with multiple ‘wings’ [255–257]. According to this scheme, the central part of the attractor is its basal chaotic activity; it provides the system with a ‘ready’ state, and each of the wings may be either a near-limit cycle or a broadband chaos, standing for the many templates or neural cell assemblies. Pattern recognition is then considered as the transition from one wing to another, whereas a novel input (with no template) drives the system to a non-reproducible near-limit-cycle wing, or to a new broadband one. The scale invariance of the system and its independence of initial conditions at the transitions enable it to classify an uninterrupted sequence of stimuli. Taking a specific example, a chaotic well provides an escape from an established attractor, so that an animal can recognize an odorant as novel with no greater delay than for any known sample, while retaining the freedom to maintain its activity as it builds the new attractor [127]. Chaos thus confers on the system a deterministic ‘I don't know’ state within which new activity patterns can emerge.

The strong and weak points of the model have been repeatedly debated, by the authors themselves and by others. Interestingly, the main uncertain ‘physiological’ requirements of the algorithm postulated at the onset of this work [127] have largely been validated by recent electrophysiological studies. These were (i) excitation of the periglomerular cells by each other and of the mitral cells [258], (ii) mutual inhibitory interneurons in the olfactory bulb [259] and cortex, and (iii) mutual excitatory connections among mitral cells in the bulb [260]. Conversely, it is now demonstrated that, contrary to the authors' earlier beliefs [255], inhibitory synapses can undergo changes with learning.

The endless controversy about the respective virtues of chaotic models versus the connectionist ones developed by others [249,261–266] remains unresolved. According to Freeman [127], connectionist algorithms tend to be ‘pattern completion devices’, whose task can only succeed when the interactions between units are appropriately weighted. Chaotic systems, by contrast, would be well designed to prevent convergence and to allow an easy ‘destabilization’ of their activity by a novel input (odor). They could also be ideally suited for confronting the neural networks with a new and still unlearned stimulus. In an extension of this model, and to account for phase modulations of chaotic waves in the EEG gamma range, Freeman borrowed from physics the term of intervening ‘mesoscopic’ domains extending over large areas of the cortex [10,152].

7.2 More about olfaction and neural networks: winnerless competition

Working primarily on the processing of olfactory information at the first two stages of its transformation, that is at the level of the receptors and their postsynaptic targets, G. Laurent and his collaborators sought to understand how the brain solves the double and contradictory task of recognizing odors as unitary percepts (in a synthetic sense) and of categorizing them (with the ability to recognize, in a noisy environment, small differences between odors). Their credo was that these early olfactory circuits (and other sensory networks as well) should be viewed as a system, and that our current thinking about sensory integration in terms of ‘responses’ following a stimulus is too often linear and passive: one should rather consider integration as an active and internal process in which a major role is played by the dynamics of the brain circuits themselves [267,268]. Specifically, two objectives are accomplished in parallel by the nervous system. The first is to create, ‘through spatio-temporal patterns of neuronal activation’, a large coding space in which representational clusters allowing the storage and recall of unpredictable ‘items’ can spread. The second is to confer stability on this space in the face of noise, and to optimize it [267].

This group's research has focused on the dynamical properties of individual neurons and of neuronal ensembles firing in response to odor presentations, in vivo, in both insects (the locust Schistocerca americana) and zebrafish (Danio rerio). In insects, the relevant neurons are the antennal lobe (AL) projection neurons (PNs), which are activated by the olfactory receptor neurons (ORNs) and whose signals (triggered by odorants in broad and overlapping peripheral regions) are transmitted to the mushroom body (a center for learning and memory). The AL and its PNs are organized according to the same anatomical principles as the olfactory bulb and the mitral-tufted cells of vertebrates, respectively. In addition, local GABAergic inhibitory neurons, i.e. periglomerular or granule cells (GCs) in vertebrates and local neurons (LNs) in insects, can act on local or distant connections between the ORNs and the mitral cells, or their equivalents in other species.

According to these authors, several dynamical properties of the olfactory system justify the choice of the so-called winnerless competition model, which will be defined below. First, as in other species (references in [267]), it was found that individual odors evoke complex temporal response patterns in many (but not all) of the insect PNs [269] and in zebrafish mitral cells (MCs) [270]. The responses differ across odors for a given neuron and across neurons for a given odor, thereby causing, in each case, the formation of specific neural assemblies. Fig. 26A1 illustrates these findings and shows that, in addition, some neurons respond with a period of inhibition preceding delayed spiking. All these responses were stable and reliable across repeated stimulations (Fig. 26A2). They were superimposed on one of several epochs of the oscillations of the extracellular local field potential (LFP), which signals coherent and synchronized population activity (Fig. 26B), with reproducible and reliable periods of phase-locking for each neuron. But unlike the firing patterns of individual units, the oscillation frequency (20–30 Hz) is independent of odor identity. One way to summarize these data is to consider [271] that the macroscopic oscillation carries a stimulus-specific ‘message’ distributed in space (the odor-specific sets of synchronized cells) and in time (the periods when the cells synchronize and desynchronize in an odor-particular manner), reflecting the fact that the odor representation in the olfactory bulb is distributed and combinatorial. Clustering (correlation) followed by declustering of cells during odor representation changes continuously throughout a stimulus, in a manner that progressively reduces the similarities between ensembles coding for related odors [270].

Fig. 26

Odor-evoked responses and their behavior in the phase space. (A1–A2) Firing patterns of three simultaneously recorded locust PNs labeled 1 to 3. (A1) Spike discharges and inhibitory postsynaptic potentials (arrows) evoked by the presentation of two different odors, at times indicated by the horizontal bars. (A2) Reliability and specificity of these dynamic responses shown by the poststimulus time histograms of successive events in PN1 and PN2 (n=12 successive trials). (B) Hypothesis for odor coding in the olfactory system. Above: local field potential (LFP) in the mushroom body (MB). Below: temporal changes in activity of 16 projection neurons (PNs, circles) in the antennal lobe (AL). A PN can be silent or inhibited (white), spiking but not synchronized (barred), or active and phase-locked with the LFP (dark). Note that in response to an odor pulse (on), the LFP oscillations are caused by ensembles of synchronized cells. (C1–C4) Schematic illustrations, in a 3D phase space, of four types of dynamical behavior produced by systems with symmetrical (C1) or asymmetrical (C2–C4) coupling: (C1) multistability; (C2) weak competition; (C3) winner-take-all; (C4) winnerless competition (see text). (Modified from, A1, A2: [275], with permission of Neuron; B: [271], with permission of the Journal of Neuroscience; C1–C4: [267], with permission of the Annual Review of Neuroscience.)

Second, synaptic inhibition plays a major role in the patterning of the odor-evoked neural assemblies. Experimental evidence indicates that blocking the fast inhibition mediated by LNs with GABA antagonists abolishes the oscillatory synchronization [272–274] without, however, affecting the slow phases of inhibition observed before, during and after bursting in individual neurons, or impairing the ability to discriminate stimuli. Accordingly, simulations with a Hodgkin–Huxley-type model [275,276] clarified the respective roles of the fast and of the still unidentified slow inhibitory mechanisms in forming the dynamical PN assemblies illustrated in Fig. 26A1 and A2.

A common nonlinear model of olfactory processing is coding with attractors, as in Hopfield nets [263,277], where each odor is represented by an autonomous and specific attractor. Each attractor has its own basin and is created through training with a stimulus, which modifies a particular set of connections in the olfactory bulb until a steady state is obtained. The resulting picture of this process is that of several coexisting attractors in a multistable system, as illustrated in Fig. 26C1. It was argued [267] that although Hopfield systems are dynamic, they become static after convergence and, furthermore, they have hard capacity limits. Freeman's models, on the other hand, do not ‘explicitly’ take individual neurons or network topology into consideration. Thus another paradigm, called the ‘winnerless competition’ (WLC) model, is advocated by G. Laurent and his collaborators [268,278].

Like other nonlinear models, WLC is based on simple nonlinear equations of the Lotka–Volterra type, in which (i) the functional unit is the neuron or a small group of synchronized cells and (ii) the neurons interact through inhibitory connections. Several dynamics can then arise, depending in large part on the nature of this coupling and on the strength of the inhibitory connections. If the connections are symmetrical, and under some conditions of coupling [267,278], the system behaves as a Hopfield network (Fig. 26C1), or it has only one favored attractor if all the neurons are active (Fig. 26C2). If the connections are only partly asymmetrical, one attractor (which often corresponds to the activity of a single neuron) will emerge, in a ‘winner-takes-all’ type of circuit (Fig. 26C3). Finally, a ‘weakly chaotic’ WLC arises when all the inhibitory connections are nonsymmetrical; the system, with N competitive neurons, then has different heteroclinic orbits (see Section 2.1.3) in the phase space (Fig. 26C4). In this case, and for various values of the inhibitory strengths, the system's activity ‘bounces off’ [267] between groups of neurons: if the stimulus is changed, another orbit in the vicinity of the heteroclinic orbit becomes a global attractor. In this manner, WLC can encode many stimuli: the capacity of an N-node Hopfield network is about N/7, while that of a WLC network is e(N−1)!. Furthermore, the latter is strongly dissipative (i.e. it quickly forgets its initial state) and represents information by transient orbits rather than by attractors per se.
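The heteroclinic switching at the core of WLC can be reproduced with just three Lotka–Volterra units and an asymmetric inhibition matrix (a May–Leonard-type sketch; the matrix entries and the small regenerative floor eps, which stands in for a weak stimulus, are illustrative choices, not the published parameters). Activity visits each unit in turn, lingering near one saddle state before ‘bouncing off’ to the next:

```python
import numpy as np

# Asymmetric competition matrix: rho[i][j] is how strongly unit j inhibits
# unit i; the asymmetry (alpha < 1 < beta) is what produces winnerless
# switching instead of convergence to a single winner.
alpha, beta = 0.5, 2.0
rho = np.array([[1.0, alpha, beta],
                [beta, 1.0, alpha],
                [alpha, beta, 1.0]])

def simulate(a0, T=200.0, dt=0.01, eps=1e-4):
    """Euler integration of Lotka-Volterra rates; eps is a small floor,
    standing in for a weak stimulus, so activity can regrow after decay."""
    a = np.array(a0, dtype=float)
    winners = []
    for _ in range(int(T / dt)):
        a = a + dt * (a * (1.0 - rho @ a) + eps)
        winners.append(int(np.argmax(a)))
    return winners

winners = simulate([0.6, 0.3, 0.1])
# compress runs: the sequence of successively 'winning' units
seq = [w for k, w in enumerate(winners) if k == 0 or w != winners[k - 1]]
```

Each entry of `seq` corresponds to a passage near one saddle fixed point, and the sequence cycles through all three units: information is carried by the transient itinerary between saddles, not by convergence to any single attractor, which is precisely the WLC claim.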

Rabinovich et al. [278] implemented a WLC network of nine FitzHugh–Nagumo neurons with synaptic inhibitory currents modeled by first-order kinetics. Their numerical simulations indicated that the network produced firing patterns that differed for different stimuli in the simulated PNs, in a manner that was, furthermore, consistent with experimental observations [267,269,279]. Full information about the inputs (their ‘representation’) was found in the output sequences. In dynamical terms, the WLC networks “produce identity-temporal or spatio-temporal coding” in the form of deterministic trajectories moving along heteroclinic orbits that connect saddle fixed points or limit cycles in the phase space. These ‘saddle states’ correspond to the activity of specific neurons or groups of cells, with sequential switching from one state to another. The advantages of this model are global stability, sensitivity of the dynamics to the forcing stimulus, insensitivity to noise, and a larger capacity than that of other classical models.

The same chaotic model has been hypothesized to underlie the hunting behavior of the mollusk Clione limacina. This predator is a gastropod that lacks a visual system and finds its prey during a search behavior characterized by a circular motion whose plane and radius change in a chaotic-like manner [280], producing random changes of direction in the gravitational field. Clione has been used extensively for studies of the basic mechanisms underlying orientation and locomotion (references in [281,282]). It swims using rhythmical oscillations (about 1 Hz) of two wings, and its direction in three-dimensional space is governed by the bending of its tail (Fig. 27A). When swimming, it maintains a vertical, head-up position [281]. Driven by signals from the gravity-sensing organs, the statocysts, the network corrects deviations from this position by producing flexions of the tail. The statocyst (which contains a stone-like structure, the statolith) has an internal wall bearing about ten mechanoreceptors (statocyst receptor neurons, SRNs) that are excited by the statolith. The SRNs send axons to the pathways controlling wing and tail motions [283], and they form a network in which a fraction of them (30%) are coupled by GABAergic, inhibitory, nonsymmetrical connections.

Fig. 27

Winnerless competition between sensory neurons. (A) Above: spatial orientation of free-swimming Cliones. Body profiles during passive sinking (a and b) and when locomotion is resumed (c and d). Below: drawings of the animal excited by a contact with its prey (e) and swimming against the bottom (f). Note that a, b, e and c, d, f are dorsal and lateral views of Clione, respectively. (B) Schematic representation of the model network with six sensory receptor neurons (SRNs) and their inhibitory connections (thicker lines indicate stronger ‘synaptic’ strengths). (C) Computer-modeled time series of activities in the SRNs (labeled a1 to a6) under the action of the hunting neuron (dimensionless units). (D) Three-dimensional projection in the phase space of the activities in neurons a2, a4 and a6, which were linked by the strongest inhibitory connections. (A from [283], with permission of the Journal of Neurophysiology; B, C and D: adapted from [280], with permission of Chaos.)

The direction of movement or orientation of Clione changes over time in an irregular and unpredictable manner as the animal searches for its food, the small mollusk Limacina helicina [284], which triggers this behavior. Two large cells, the cerebral hunting neurons (CHNs), excite the SRNs and control the activation of the network. That is, the behavior can be caused by (i) external sensory stimulations, or (ii) internal signals via the CHNs. The intrinsic mechanism, which is essential to the model of Varona et al. [280], was analyzed during in vitro experiments, which showed that the isolated nervous system can indeed produce fictive hunting behavior and generate chaotic-like motor outputs to the tail muscle [278].

The model comprises a statocyst with six SRNs having Lotka–Volterra-type dynamics and inhibitory connections (Fig. 27B). Depending on the latter, the system can exhibit the different dynamics illustrated in Fig. 26C1–C4. However, based on experimental data, it was reasonable to assume that the inhibitory connections are asymmetrical, three of them being strong, three others moderate, and the rest weak.
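A minimal sketch of such Lotka–Volterra-type competition is given below, with six units inhibiting each other asymmetrically around a ring; the weight pattern (strong on one side, weak on the other) is an illustrative stand-in for the strong/moderate/weak connections of the model, not the actual values used by Varona et al.

```python
import numpy as np

def wlc_lotka_volterra(N=6, T=400.0, dt=0.01, seed=0):
    """Generalized Lotka-Volterra rate model with asymmetric
    inhibition (illustrative weights, not those of the Clione model)."""
    rng = np.random.default_rng(seed)
    rho = np.ones((N, N))               # baseline mutual inhibition + self term
    for i in range(N):
        rho[i, (i - 1) % N] = 1.5       # strong inhibition from one neighbor
        rho[i, (i + 1) % N] = 0.5       # weak inhibition from the other
    a = rng.uniform(0.01, 0.1, N)       # small positive initial activities
    hist = []
    for _ in range(int(T / dt)):
        a = a + dt * a * (1.0 - rho @ a)
        a = np.clip(a, 1e-12, None)     # keep activities strictly positive
        hist.append(a.copy())
    return np.array(hist)

acts = wlc_lotka_volterra()
```

With these asymmetric weights no neuron can win permanently: activity visits one unit after another along a heteroclinic-like sequence, which is the essence of winnerless competition.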

When there was no activation of the sensory neurons, the statolith induced a high rate of firing in one of them (which may organize the head-up position) and the others were quiet. But the winnerless competition between sensory neurons could override the effects of the statolith for given stimulations of the CHNs. The neurons then displayed a chaotic behavior, with activities switching among the receptors (Fig. 27C and D) and with positive Lyapunov exponents. These results support the notion that in the presence of a prey, the SRN network generates new information, i.e. chaotic outputs with positive Kolmogorov–Sinai entropy, which can organize, via the motoneurons, the apparently random hunting behavior of Clione [280].
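The positive-exponent criterion invoked here can be illustrated on a textbook system. The fragment below estimates the largest Lyapunov exponent of the logistic map from the average log-derivative along an orbit (the general method discussed in [1]); it is a generic illustration, not the computation performed on the Clione model.

```python
import math

def lyapunov_logistic(r=4.0, n=100000, x0=0.123):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    averaged from the log of the local derivative |r*(1-2x)|."""
    x, acc = x0, 0.0
    for _ in range(1000):               # discard the transient
        x = r * x * (1 - x)
    for _ in range(n):
        acc += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return acc / n

lam = lyapunov_logistic()               # theory: ln 2 ~ 0.693 for r = 4
```

A positive value of this exponent, as obtained here, is the quantitative signature of sensitive dependence on initial conditions.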

7.3 Chaotic itinerancy

In a series of important articles, Tsuda and his collaborators give critical arguments favoring the notion that there is chaos in the brain and that it participates in perception and memory. Their demonstration is based on mathematical models of associative memories constructed according to the known anatomy of the neural circuits of the cerebral cortex and the hippocampus, as described by Szentagothai [285,286].

Similar to those of McCulloch and Pitts [287], the formal neurons of [288] have two states, +1 (firing) and −1 (reset). If a neuron exceeds the threshold at a given time, it fires with a probability p, which is independent of the activity level; below threshold the firing probability is set to zero, for convenience. The network is made of pyramidal (excitatory) and stellate or basket (inhibitory) cells; the former send fibers to all the pyramidal cells whereas the latter make synaptic contacts with only one of them (Fig. 28A). Hebbian synaptic learning is assumed. More generally, this non-equilibrium model consists of two blocks, I and II, each containing a recurrent net and positive feedback connections whose strengths are fixed; they differ only by the addition of a negative feedback connection in block II.
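The firing rule can be sketched as follows; the random coupling matrix merely stands in for the Hebbian weights, and the update scheme is a simplified assumption rather than the exact model of [288].

```python
import numpy as np

def step(states, W, theta, p, rng):
    """One update of a +/-1 threshold network: a unit whose summed
    input exceeds its threshold fires (+1) with probability p,
    otherwise it is reset to -1 (illustrative, not Tsuda's exact rule)."""
    h = W @ states
    fire = (h > theta) & (rng.random(states.size) < p)
    return np.where(fire, 1, -1)

rng = np.random.default_rng(1)
N = 16
W = rng.normal(0, 1 / np.sqrt(N), (N, N))   # random stand-in for Hebbian weights
s = rng.choice([-1, 1], N)
for _ in range(50):
    s = step(s, W, theta=0.0, p=0.8, rng=rng)
```

Setting p below 1 is what injects stochasticity into the recall dynamics: a supra-threshold neuron occasionally fails to fire.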

Fig. 28

Transitions between states through chaotic itinerancy. (A) Architecture of the system for associative memory. Shaded triangles and large circles denote pyramidal and stellate neurons, respectively; synapses are symbolized by small filled circles (see text for explanations). (B) Schematic representation of a chaotic itinerancy. The trajectory wanders among attractor ruins; it is attracted by one of them but leaves it via an unstable direction toward another attractor. (C–D) Two-dimensional views of classical attractors, illustrating from left to right (in C) fixed points, a limit cycle and chaotic orbits, and (in D) a ‘ruin’ attractor where the dynamical orbits approach a fixed point but then escape from it; this fixed point can be considered as a ruin of a Milnor attractor. (A from [290], with permission of Neural Networks; B and D from Tsuda [291], with permission of Behavioral and Brain Sciences.)

The successive recalls of stored memories and the consequences of the interplay between the dynamical system and noise were systematically studied. Two types of noise were implemented. One (called dendritic) results from electric currents randomly leaking from neighboring cells. The other is equivalent to the synaptic noise produced by quantal release of synaptic vesicles (whether spontaneous or spike-triggered) by incoming fibers, as defined in Section 4.1; it was injected into the network to produce a ‘stochastic renewal of dynamics’, since in this model a neuron does not always fire when the sum of its inputs crosses the threshold. Rather, at that point in time, either a threshold dynamics is selected or the previous dynamics is used again. This results [289] in an iterated function system (IFS). The overall dynamics is determined by the instability of the IFS, which is due, in this type of network, to the reset caused by specific inhibitory neurons.

Depending on pre-determined probabilities, the emerging dynamics was that of a ‘chaotic itinerancy’ (Fig. 28B), either between attractors in the usual sense (fixed points, limit cycles, tori or strange attractors – see Fig. 28C) or, due to the activation of the inhibitory neurons (particularly in block II), between ‘exotic’ Milnor attractors [290,291].

Chaotic itinerancy is a particular form of transitions between states and of chaotic behavior. It has been proposed as a universal class of behavior of high-dimensional dynamical systems after it was found in a model of neural dynamics. As explained by Tsuda [248,291], in a multi-state high-dimensional system, each state corresponds to an attractor, but in the case of weak instability only a ‘trace’ of it remains in the system and unstable directions appear in its neighborhood. Once destabilized, it becomes an attractor ‘ruin’ (Fig. 28D), which is related to a Milnor attractor. Milnor attractors may possess unstable manifolds and a system can escape from them as a consequence of small perturbations. This nice feature may not be sufficient to make them biologically relevant [292], but it is important to note that chaotic itinerancy has been reported in in vivo preparations [257,293]. Therefore the model can be used, according to Tsuda [291], for the interpretation of cognitive processes, particularly given some of its striking properties. These include the retention of information [294], the capability to learn and recognize patterns [288], and the ability to simultaneously process learning and recall [295].

In thermodynamic models such as that of Hopfield [263], external noise is essential for helping a system escape local minima (references in [99]) and undergo transitions in the landscape between peaks and valleys (Fig. 29A). In Tsuda's model, instability is intrinsic to the system (Fig. 29B). This represents an interesting challenge in brain studies. Specifically, it could be important to determine whether, and how, chaotic behavior is generated by noise, as in the case of the ‘noise-induced order’ and chaos of Matsumoto and Tsuda [296]. That is, even if low-dimensional chaos (as strictly defined by mathematicians) does not exist in the nervous system, the interplay of the latter with noise could be responsible for a topologically and functionally similar behavior.

Fig. 29

Schematic illustration of the difference between transitions produced by chaotic itinerancy and by addition of external noise in an attractor landscape. (A) In a formal model, noise allows the system to escape local minima; as the noise level increases, so does the probability of a jump between energy levels. (B) In chaotic itinerancy, the instability is intrinsic to the system and transitions from one orbit to the next are autonomous (from [291]).

Another issue raised by Tsuda [291] is (again) that of the coding of information in neural sets, particularly in the hippocampus and in olfactory networks driven by a chaotic input. Such ‘chaos-driven contracting’ systems possess attractors represented by so-called SCND functions (for singular-continuous but nowhere differentiable). These functions, which can be viewed [297,298] as ‘fractal images’ with Cantor sets, are related to the formation of “episodic memory and primitive thought processes” [291]. It has been predicted that they will be found in the membrane potential of inhibitory neurons driven by chaotic afferents [297]. Despite their attractiveness, these proposals are still grounded on mathematical perspectives alone, and one can wonder to what extent they are implemented neurologically [299]. On the other hand, elegant studies [300] strongly advocate that chaotic itinerancy and coding could be more effective in solving the ‘binding’ problem during perception and attention than, as believed by so many authors, spike coincidence and neural oscillations.

8 General conclusions

The reluctance of many physiologists to adopt chaos theory has been justified, at least in part, by evidence that there still is a large gap between the use of topological profiles in a state space to characterize a neural system or function and their use to identify the underlying physical correlates. One has to recognize indeed that, so far, the main application of nonlinear methods in neurobiology has been to reconstruct, from a single measurement of the membrane voltage, a ‘proxy’ state-space description [301] that gives access to the number of ‘degrees of freedom’, i.e. of dynamical variables involved in the studied process. This strategy nevertheless gives a firm experimental basis to the size (that is, the degrees of freedom) of the models describing this process [301], and one must bear in mind that this approach has proved quite successful for understanding the dynamics of higher brain functions, as will be discussed below.

8.1 Reality of ‘neurochaos’

Like other complex systems, the brain is constructed along several overlapping spatial (here anatomical) hierarchies. These range from molecules and neurons to small and large networks. Also, it operates within a broad range of time scales, including milliseconds (duration of spikes and synaptic potentials), seconds (network operations), and hours and more (LTP). Probably relating to this multiplicity of scales, chaos has been reported at almost all levels of the CNS, that is, in invertebrate neurons and squid giant axons, in central pattern generators, in vertebrate olivary and hippocampal neurons, in the olfactory bulb, and at the level of the human brain. Therefore it has been proposed [248] to designate this class of chaotic phenomena under the term ‘neurochaos’. Yet, as repeatedly noted in this review, clear and convincing demonstrations that there is chaos in the nervous system are still scarce, because the results of specific measures of invariants such as Lyapunov exponents, entropy and correlation integrals [1] become less reliable as one investigates progressively higher levels of the neural hierarchy and high-dimensional biological systems.

This paucity of firm experimental foundations has been compensated for by theoretical studies, most often originating from modified (but still realistic) Hodgkin–Huxley equations, which have predicted and/or confirmed neurochaos. These studies helped to dissect its underlying components (coupled neuronal oscillators and synchronized clusters of cells) and to obtain modifications of the membrane potential, including bifurcations and trains of bursting spikes similar to those recorded from the living brain. Several mechanisms at the origin of neurochaos have been considered [173,248,291,302]. Among them the most common are (i) the presence of slow channels or of a delay (refractory period) that affect the input–output relation of neurons, (ii) feedback and coupling between excitatory and inhibitory drives at the cellular level or in the design of networks, (iii) neuronal circuits behaving as damped nonlinear oscillators [173], and (iv) the more theoretical noise-induced chaos already mentioned in Section 4.3 of this review (references in [248]). Strictly speaking, chaos has been unambiguously demonstrated in only a few privileged cases, particularly given the presence of bifurcations, at the axonal and single-cell levels (Sections 2.2 and 2.3) and in pairs of coupled neurons or small circuits (Section 3.2).

In contrast, if viewed in a broader perspective than just the identification of chaos as strictly defined mathematically, the use of nonlinear tools has been quite fruitful at both extremes of the scale of complexity during investigations of neural functions.

8.2 Functions of inhibition in chaotic systems

At the elementary or ‘reductionist’ level we have learned a few lessons. One is a confirmation of the critical role played by inhibitory processes in the dynamics of neuronal assemblies. Inhibitory interneurons couple other neurons, both anatomically and functionally, and therefore they participate in the shaping of dynamical assemblies and/or oscillators that can generate chaos. This has been demonstrated (i) in the case of the presynaptic neurons that produce in the Mauthner cell the non-random components of its inhibitory synaptic noise (Section 4.1), (ii) in hippocampal and olfactory systems, where inhibition is essential for the formation of clusters of oscillatory cells, for their transitions to new states along their road to chaos, and for the temporal patterning of activity in neural assemblies or the formation of dynamic ‘memories’ (Sections 5.2 and 7.2), as well as (iii) in the dynamics and the resetting of models of higher brain functions (Section 7.3). Inhibition has the opposite effect in invertebrate CPGs, where it contributes to the stabilization of the spontaneous chaotic oscillations that prevail in the behavior of isolated neurons: this chaotic behavior disappears once the cells are re-embedded in the entire network (Sections 3.2 and 3.3).

Another lesson is that, as shown in CPGs and in the olfactory bulb, neither a neuron nor a cell assembly is designed to serve a single purpose. Rather, they can implement several functions depending upon the internal states of the brain and the constraints placed upon it by environmental factors.

8.3 Chaos, information theory and the neural code

In a more global and ‘integrative’ perspective, nonlinear studies may seriously challenge some of our strongest beliefs in neurosciences. In addition, there could be considerable benefits for the nervous system in choosing chaotic regimes, given their wide range of behaviors and their aptitude to react quickly to changing conditions [1].

Important recent progress has been made in studies of the relationship between chaos and information theory, i.e. the possible role of chaotic dynamics in ‘information’ processing and coding in the brain (see also Section 4.2). This issue was already raised by pioneers in the early eighties [303–307], and subsequent theoretical studies have suggested that a dynamic preservation of information in the brain can be achieved in the presence of chaotic activities in neurons as well as in coupled chaotic systems [248]. Specifically, and although the Lyapunov exponents and other measures of invariants such as entropy indicate a loss of information or unidirectional transmission, information is preserved by a process named information-mixing, whereby at least part of this information survives from one time step to the next, even if most of the content included in previous digits or sequences has been lost.

Particular attention has been paid, with the help of information theory, to the description and quantification of the nature and quality of information in linear and nonlinear (but not chaotic) neurons and in neuronal networks [301,308–311], see also [312]. A major concern has been to assess rigorously the relationship between a stimulus set and the subsequent neural response and to characterize the behavior of dynamic systems under continuously varying input conditions. This problem had been considered quite difficult since identical stimuli can yield variable trains of spikes in the recorded axons [311]. Further developments based on the Wiener–Volterra methods for measuring response functions have been obtained [212] during investigations aimed at clarifying how the brain recovers (i.e. ‘decodes’) information about events in the world transmitted (or ‘coded’) by spike trains. Such studies are certainly fundamental for our understanding of the nonlinear input–output functions that prevail in the nervous system [301], although the question has also been raised whether there truly is a need for decoding and binding instead of simply recreating distinct sets of states of dynamical assemblies ([313], see also below). Finally, information-theoretic methods have proven useful for identifying system nonlinearities and for the validation of nonlinear models and their expectations [310].

8.4 Representation of the outside world

Chaos theory is the most spectacular branch of dynamical systems theory, which may prove, in the near future, to be a most fruitful avenue for understanding brain operations and higher functions. Both theories are based on the use of the same nonlinear tools and language, starting with the construction of a phase space that provides a topological description of the system under scrutiny and of its behavior (see [1]). Dynamical theory is a serious and almost irreducible challenge to the computational framework still favored by the majority of neuroscientists, who believe that neuronal networks are internally organized to function as ‘computational’ devices [314–316].

The notion that the brain is a device that generates outputs resulting from transformations of its inputs (or symbolic messages) according to given rules (or algorithms) dates back to the first models of neurons by McCulloch and Pitts [287], the advent of Wiener's cybernetics [317] and Shannon's information theory [318]. In its simplest version, this view states that the sensory organs deliver messages to the nervous system, which ‘computes’ the appropriate output after a series of manipulations carried out in successive and arbitrary time steps. The assumption that the nervous system acts as a machine, or as a ‘digital brain’, has been pursued by numerous studies aimed at solving the coding problem and by decades of modeling of neural networks (often of the connectionist type). Representative models of this kind take the form of layers of neuron-like elements that are trained to deal with numerical input–output transformations; here the critical factors are the network's architecture and the learning algorithms [316,319–322]. Cognitive and decision-making ‘computational’ processes are treated as a succession of calculations of the value of each alternative outcome, or choice, after which the system (or the brain) ‘chooses’ the best value out of all possible ones. A rather static and passive idea of internal representations that would be ‘carved’ in the brain by actions of the outside world is inherent to this traditional belief: some symbols encode (or ‘represent’) information, and a given input is systematically associated with a specific output. The related models have a variety of anatomo-functional architectures and they use synaptic and cellular constraints, such as Hebb's rules or conditions for LTP induction, to produce the increases of synaptic efficacy that are required for creating novel representations in neural nets.

Serious objections have been advanced against computationalism. They posit that the brain and the cognitive system are not computers that can be neatly divided into distinct modules, each of which accounts for a different symbol-processing task and communicates its solution to other modules [323]. In contrast, and rather than manipulating frozen and formal representational units at whatever level (be it a spike code or an architecture), the nervous system evolves continuously and in real time in conjunction with changes in the surrounding world. A pioneer of this way of thinking has been Freeman [127], for whom studying which patterns the olfactory system generates (and how) was a necessary prelude to understanding the higher-order processes by which “they are assembled into Gestalts in the larger apparatus of the forebrain” [10]. In contrast to man-made systems designed to perform particular tasks, the brain would rely on self-organized (and chaotic) processes to generate relevant activity patterns, and perception would be a creative course of actions “rather than a look-up table of imprinted data”. Similarly, the more recent dynamical systems theory seeks to understand the unfolding of cognitive processes over time and to identify what external and internal factors influence this unfolding [314].

As documented by van Gelder and Port [324], dynamicists are concerned with the behavior of large ensembles of neurons. Whereas classical neurobiologists focus their attention on single cells or on small networks, dynamicists construct low-dimensional models suggesting that the CNS is a single system with numerous (and interactive) state variables. When it is analyzed with the mathematical tools of dynamics, the behavior of such an ensemble is the totality of its overall states, i.e. that of a sequence of points that delineate attractors, trajectories, bifurcations and various geometrical landscapes in the phase space. Remarkably, all the components of the system are modified at the same time and according to well-defined rules. This evolution must be understood as a process during which changes depend on forces that (i) operate within the system, and (ii) are best described with differential equations. Inputs cease to uniquely determine the internal state of the brain; they merely perturb its intrinsic dynamics, and in this context the term ‘representation’ no longer applies to the well-defined cast of an internal or external scenario but rather may include internal dynamical states (as defined above, namely attractors, limit cycles and so on).

The history and the main principles of the theory of dynamical systems, their connections with other fields of neuroscience and, more recently, with the self-organization and emergence of new patterns in the so-called complex systems, can be found in several reports and books [324–328]. This theory has quickly proven successful for dealing with, for example, neural circuits [18], the control of movement [329,330], language [331], perception and action [332], cognition [322,333,334], operations leading to decision-making [335], and the successful implementation of autonomous agents (references in [314]).

The relations between chaos theory and the dynamical systems theory receive increasing attention. As pointed out by van Gelder and Port [324] when they mention the work of Pollack [336] on the structure of natural languages (in their introductory chapter to the book Mind as Motion), “there has been fascinating initial explorations of the idea that highly distinctive kinds of complexity in cognitive performance, such as the productivity of linguistic capacities, might be grounded in chaotic or near chaotic behavior”. Similar remarks certainly apply to Chaotic Itinerancy (see Section 7.3): as explained in [248], implementation of this model was preceded by the introduction of Coupled Map Lattices (CMLs) by Kaneko [337–339] for the study of spatio-temporal chaos. A CML is a dynamical system with discrete time steps (‘maps’), a discrete space (‘lattice’) and a continuous state. It consists of dynamical elements (for example, neurons) on a lattice, each of which interacts (is ‘coupled’) with other sets of elements. It can display switching between clusters of activity that has been interpreted as equivalent to ‘choosing’ among several perspectives in a sensory field [340].
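A minimal CML in this spirit can be written in a few lines, here with logistic maps diffusively coupled on a ring; the map parameter and coupling strength are illustrative choices, not Kaneko's specific settings.

```python
import numpy as np

def cml(N=64, steps=500, eps=0.3, r=3.9, seed=2):
    """Diffusively coupled map lattice of logistic maps (Kaneko-style):
    x_i <- (1-eps)*f(x_i) + (eps/2)*(f(x_{i-1}) + f(x_{i+1}))."""
    rng = np.random.default_rng(seed)
    x = rng.random(N)                    # random initial continuous states
    f = lambda y: r * y * (1 - y)        # local chaotic map
    for _ in range(steps):
        fx = f(x)
        x = (1 - eps) * fx + 0.5 * eps * (np.roll(fx, 1) + np.roll(fx, -1))
    return x

state = cml()
```

Depending on the map parameter and the coupling strength, such lattices exhibit frozen clusters, traveling structures or fully developed spatio-temporal chaos.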

Despite obvious progress, numerous and fundamental topics remain under discussion. Among them, and in addition to those mentioned in the course of this review, Werner [116] notes “whether problems generally classified as undecidable can be approached by means of chaotic processors. Would computing with chaos be capable of dealing with mathematically undecidable functions? What kind of functionality might chaotic regimes have? Can they outperform non-chaotic regimes?…” These questions are close to those raised by Tsuda (references in [248]) when he proposes that there is a chaotic “Hermeneutics of the Brain” that allows us to ‘know’ the activity of the nervous system, and that the ways we recognize the latter as well as the outside world are both interpretative. This view, which was inspired by Marr's theory [341] of the internal representation of visual information with symbols, means that the brain does not directly map its environment, nor free itself from it. Rather, it interprets symbols or states produced within it, and chaotic dynamical systems are well fitted for this purpose. The brain not only perceives but also creates new realities; this is why it is a hermeneutic device [342]. This possible interface between logic and the self-organization of symbol sequences has been considered elsewhere [305,343] (references in [288]).


We thank H. Abarbanel (Institute for Nonlinear Science, University of California San Diego, La Jolla) and D.S. Faber (Department of Neuroscience, Einstein College of Medicine, Bronx, NY) for their critical reading of the manuscript and for their precious scientific comments, and R. Miles (INSERM, EMI 224, Cortex et Épilepsie, CHU Pitié-Salpêtrière) for his generous help and patient assistance. This work was supported in part by the Defense Advanced Research Projects Agency (DARPA), contract No. 66001-00-C-8012.


[1] P. Faure; H. Korn Is there chaos in the brain? I. Concepts of nonlinear dynamics and methods of investigation, C. R. Acad. Sci. Paris, Ser. III, Volume 324 (2001), pp. 773-793

[2] E. Ott Chaos in Dynamical Systems, Cambridge University Press, Cambridge, UK, 2002

[3] S. Strogatz Nonlinear Dynamics and Chaos: With Applications in Physics, Biology, Chemistry, and Engineering, 1st ed., Studies in Nonlinearity, Perseus Publishing, 1994

[4] H. Kantz; T. Schreiber Nonlinear Time Series Analysis, Cambridge Nonlinear Science Series, 7, Cambridge University Press, Cambridge, UK, 1997

[5] H. Abarbanel Analysis of Observed Chaotic Data, Springer Verlag, 1996

[6] P. Grobstein Variability in brain function and behavior (V.S. Ramachandran, ed.), The Encyclopedia of Human Behavior, Academic Press, 1994, pp. 447-458

[7] L. Glass; M.C. Mackey From Clocks to Chaos, Princeton University Press, 1988

[8] P. Grobstein Directed movement in the frog: motor choice, spatial representation, free will? (J. Kien; C.R. McCrohan; W. Winlow, eds.), Neurobiology of Motor Program Selection, Pergamon Press, New York, 1992, pp. 250-279

[9] L. Glass Chaos in biological systems (M.A. Arbib, ed.), Handbook of Brain Theory and Neural Networks, MIT Press, 2003, pp. 205-208

[10] W.J. Freeman Neurodynamics: An Exploration in Mesoscopic Brain Dynamics, Perspectives in Neural Coding, Springer, 2000

[11] S.L. Bressler; W.J. Freeman Frequency analysis of olfactory system EEG in cat, rabbit and rat, EEG Clin. Neurophysiol., Volume 50 (1980), pp. 9-24

[12] L.J. DeFelice; J.R. Clay Electrophysiological recordings from Xenopus oocytes (B. Sakmann; E. Neher, eds.), Single-Channel Recording, Kluwer Academic, New York, 1983 (Ch. 15, pp. 323–342)

[13] D. Colquhoun; A.G. Hawkes A Q-matrix cookbook: How to write only one program to calculate the single-channel and macroscopic predictions for any kinetic mechanism (B. Sakmann; E. Neher, eds.), Single Channel Recording, Kluwer Academic, New York, 1995, pp. 397-482

[14] L.J. DeFelice; A. Isaac Chaotic states in a random world: relationships between the nonlinear differential equations of excitability and the stochastic properties of ion channels, J. Stat. Phys., Volume 70 (1993), pp. 339-354

[15] J.B. Bassingthwaighte; L.S. Liebovitch; B.J. West Fractal Physiology, Oxford University Press, New York, 1994

[16] W.R. Foster; L.H. Ungar; J.S. Schwaber Significance of conductances in Hodgkin–Huxley models, J. Neurophysiol., Volume 70 (1993), pp. 2502-2518

[17] J. Rinzel Bursting oscillations in an excitable membrane model (B.D. Sleeman; R.J. Jarvis, eds.), Ordinary and Partial Differential Equations: Proc. 8th Dundee Conference, Lecture Notes in Math., 1151, 1985, pp. 304-316

[18] J. Rinzel; G.B. Ermentrout Analysis of neural excitability and oscillations (C. Koch; I. Segev, eds.), Methods in Neuronal Modeling: From Synapses to Networks, MIT Press, Cambridge, MA, USA, 1989, pp. 135-169

[19] R. FitzHugh Impulses and physiological states in theoretical models of nerve membrane, Biophys. J., Volume 1 (1961), pp. 445-466

[20] J.S. Nagumo; S. Arimoto; S. Yoshizawa An active pulse transmission line simulating nerve axon, Proc. IRE (1962), pp. 2061-2070

[21] A.L. Hodgkin; A.F. Huxley A quantitative description of membrane current and its application to conduction and excitation in nerve, J. Physiol., Volume 117 (1952), pp. 500-544

[22] A.L. Hodgkin The Conduction of Nerve Impulses, Liverpool University Press, UK, 1967

[23] C. Koch; O. Bernander Axonal modeling (M.A. Arbib, ed.), Handbook of Brain Theory and Neural Networks, MIT Press, 1998, pp. 129-134

[24] J.J.B. Jack; D. Noble; R.W. Tsien Electric Current Flow in Excitable Cells, Clarendon Press, Oxford, 1983

[25] Methods in Neuronal Modeling (C. Koch; O. Bernander, eds.), MIT Press, Cambridge, MA, USA, 1989

[26] J. Keener; J. Sneyd Mathematical Physiology, Springer, New York, 1998

[27] R.R. Llinas The intrinsic electrophysiological properties of mammalian neurons: insights into central nervous system function, Science, Volume 242 (1988), pp. 1654-1664

[28] L. Glass Chaos in neural systems (M.A. Arbib, ed.), Handbook of Brain Theory and Neural Networks, MIT Press, 1998, pp. 186-189

[29] C. Koch Biophysics of Computation. Information Processing in Single Neurons, Oxford University Press, Oxford, UK, 1999

[30] B. Van der Pol On relaxation oscillations, Phil. Mag., Volume 2 (1926), pp. 922-978

[31] B. Van der Pol; J. Van der Mark The heartbeat considered as a relaxation oscillation and an electrical model of the heart, Phil. Mag. (Suppl.), Volume 6 (1928), pp. 763-775

[32] C. Morris; H. Lecar Voltage oscillations in the Barnacle giant muscle fiber, Biophys. J., Volume 35 (1981), pp. 193-213

[33] X.J. Wang; J. Rinzel Oscillatory and bursting properties of neurons (M. Arbib, ed.), Handbook of Brain Theory and Neural Networks, MIT Press, 1998, pp. 686-691

[34] R.C. Hilborn Chaos and Nonlinear dynamics: An Introduction for Scientists and Engineers, Oxford University Press, Oxford, New York, 1994

[35] P. Bergé; Y. Pomeau; C. Vidal L'ordre dans le chaos, Hermann, Paris, 1984

[36] J.L. Hindmarsh; R.M. Rose A model of the nerve impulse using two first-order differential equations, Nature, Volume 296 (1982), pp. 162-164

[37] J.L. Hindmarsh; R.M. Rose A model of neuronal bursting using three coupled first order differential equations, Proc. R. Soc. Lond. B Biol. Sci., Volume 221 (1984), pp. 87-102

[38] R.M. Rose; J.L. Hindmarsh A model of a thalamic neuron, Proc. R. Soc. Lond. B Biol. Sci., Volume 225 (1985), pp. 161-193

[39] P. Faure; D. Kaplan; H. Korn Probabilistic release and the transmission of complex firing patterns between neurons, J. Neurophysiol., Volume 84 (2000), pp. 3010-3025

[40] H.D.I. Abarbanel; R. Huerta; M.I. Rabinovich; N.F. Rulkov; P.F. Rowat; A.I. Selverston Synchronized action of synaptically coupled chaotic model neurons, Neural Comput., Volume 8 (1996), pp. 1567-1602

[41] M. Bazhenov; R. Huerta; M.I. Rabinovich; T. Sejnowski Cooperative behavior of a chain of synaptically coupled chaotic neurons, Physica D, Volume 116 (1998), pp. 392-400

[42] K. Aihara; G. Matsumoto Temporally coherent organization and instabilities in squid giant axons, J. Theor. Biol., Volume 95 (1982), pp. 697-720

[43] K. Aihara; G. Matsumoto Chaotic oscillations and bifurcations in squid giant axons (A.V. Holden, ed.), Chaos, Princeton University Press, Princeton, NJ, 1986, pp. 257-269

[44] K. Aihara; G. Matsumoto; Y. Ikegaya Periodic and non-periodic responses of a periodically forced Hodgkin–Huxley oscillator, J. Theor. Biol., Volume 109 (1984), pp. 249-269

[45] H. Hayashi; S. Ishizuka Chaotic nature of bursting discharges in the Onchidium pacemaker neuron, J. Theor. Biol., Volume 156 (1992), pp. 269-291

[46] X. Jianxue; G. Yunfan; R. Wei; H. Sanjue; W. Fuzhou Propagation of periodic and chaotic action potential trains along nerve fibers, Physica D, Volume 100 (1997), pp. 212-224

[47] A. Wolf; J.B. Swift; H.L. Swinney; J.A. Vastano Determining Lyapunov exponents from a time series, Physica D, Volume 16 (1985), pp. 285-317

[48] D.H. Perkel; T.H. Bullock Neural coding, Neurosci. Res. Progr. Bull., Volume 6 (1968) no. 3, pp. 221-347

[49] D. Perkel Spike trains as carriers of information (F. Schmitt, ed.), The Neurosciences Second Study Program, The Rockefeller University Press, 1970, pp. 587-596

[50] L. Andrey Analytical proof of chaos in single neurons and consequences (K. Lehnertz; J. Arnhold; P. Grassberger; C. Elger, eds.), Chaos in Brain?, World Scientific, 1999, pp. 247-250

[51] G.J. Mpitsos; R.M. Burton; H.C. Creech; S.O. Soinila Evidence for chaos in spike trains of neurons that generate rhythmic motor patterns, Brain Res. Bull., Volume 21 (1988), pp. 529-538

[52] W.T. Frazier; E.R. Kandel; I. Kupferman; R. Waziri; R. Coggeshall Morphological and functional properties of identified neurons in the abdominal ganglion of Aplysia californica, J. Neurophysiol., Volume 30 (1967), pp. 1288-1351

[53] R.E. Plant; M. Kim Mathematical description of a bursting pacemaker neuron by a modification of the Hodgkin–Huxley equations, Biophys. J., Volume 16 (1976), pp. 227-244

[54] C.C. Canavier; J.W. Clark; J.H. Byrne Routes to chaos in a model of a bursting neuron, Biophys. J., Volume 57 (1990), pp. 1245-1251

[55] C.C. Canavier; J.W. Clark; J.H. Byrne Simulation of the bursting activity of neuron R15 in Aplysia: role of ionic currents, calcium balance, and modulatory transmitters, J. Neurophysiol., Volume 66 (1991), pp. 2107-2124

[56] C.C. Canavier; D.A. Baxter; J.W. Clark; J.H. Byrne Nonlinear dynamics in a model neuron provide a novel mechanism for transient synaptic inputs to produce long-term alterations of postsynaptic activity, J. Neurophysiol., Volume 69 (1993), pp. 2252-2257

[57] A. Hermann; A.L.F. Gorman Effects of tetraethylammonium on potassium currents in a molluscan neuron, J. Gen. Physiol., Volume 78 (1981), pp. 87-110

[58] A.V. Holden; W. Winlow; P.G. Haydon The induction of periodic and chaotic activity in a molluscan neurone, Biol. Cybern., Volume 43 (1982), pp. 169-173

[59] A.V. Holden; W. Winlow Bifurcation of periodic activity from periodic activity in a molluscan neurone, Biol. Cybern., Volume 42 (1981), pp. 189-194

[60] H.A. Lechner; D.A. Baxter; J.W. Clark; J.H. Byrne Bistability and its regulation by serotonin in the endogenously bursting neuron R15 in Aplysia, J. Neurophysiol., Volume 75 (1996), pp. 957-962

[61] R.M. Harris-Warrick; R.E. Flamm Multiple mechanisms of bursting in a conditional bursting neuron, J. Neurosci., Volume 7 (1987), pp. 2113-2128

[62] R.M. Harris-Warrick; E. Marder Modulation of neural networks for behavior, Annu. Rev. Neurosci., Volume 14 (1991), pp. 39-57

[63] J. Guckenheimer; S. Gueron; R.M. Harris-Warrick Mapping the dynamics of a bursting neuron, Phil. Trans. R. Soc. Lond. B, Volume 341 (1993), pp. 345-359

[64] J. Rinzel; Y.S. Lee Dissection of a model for neuronal parabolic bursting, J. Math. Biol., Volume 25 (1987), pp. 653-675

[65] R. Thom Structural Stability and Morphogenesis, W.A. Benjamin, 1975

[66] K.A. Richardson; T.T. Imhoff; P. Grigg; J.J. Collins Encoding chaos in neural spike trains, Phys. Rev. Lett., Volume 80 (1998), pp. 2485-2488

[67] X. Pei; F. Moss Characterization of low-dimensional dynamics in the crayfish caudal photoreceptor, Nature, Volume 379 (1996), pp. 618-621

[68] D.J. Christini; J.J. Collins Using noise and chaos control to control nonchaotic systems, Phys. Rev. E, Volume 52 (1995), pp. 5806-5809

[69] A.I. Selverston; J.P. Miller; M. Wadepuhl Cooperative mechanisms for the production of rhythmic movements (A. Roberts; B. Roberts, eds.), Neural Origin of Rhythmic Movements, Cambridge University Press, London, 1983, pp. 55-88

[70] P.A. Getting Emerging principles governing the operation of neural networks, Annu. Rev. Neurosci., Volume 12 (1989), pp. 185-204

[71] W.O. Friesen; G.S. Stent Neural circuits for generating rhythmic movements, Annu. Rev. Biophys. Bioeng., Volume 7 (1978), pp. 37-61

[72] H. Korn; D.S. Faber Electrical interactions between vertebrate neurons: field effects and electrotonic coupling (F. Schmitt; F.G. Worden, eds.), The Neurosciences, 4th Study Program, 1, MIT Press, 1979, pp. 333-358

[73] P.A. Getting; M.S. Dekin Mechanisms of pattern generation underlying swimming in Tritonia. IV. Gating of central pattern generator, J. Neurophysiol., Volume 53 (1985), pp. 466-480

[74] J.P. Miller; A.I. Selverston Mechanisms underlying pattern generation in lobster stomatogastric ganglion as determined by selective inactivation of identified neurons. IV. Network properties of pyloric system, J. Neurophysiol., Volume 48 (1982), pp. 1416-1432

[75] P.A. Getting Comparative analysis of invertebrate central pattern generators (A. Cohen; S. Rossignol; S. Grillner, eds.), Neural Control of Rhythmic Movements, John Wiley, New York, 1985, pp. 101-128

[76] G.N. Borisyuk; R.M. Borisyuk; A.I. Khibnik; D. Roose Dynamics and bifurcations of two coupled neural oscillators with different connection types, Bull. Math. Biol., Volume 57 (1995), pp. 809-840

[77] V. Makarenko; R.R. Llinas Experimentally determined chaotic phase synchronization in a neuronal system, Proc. Natl Acad. Sci. USA, Volume 95 (1998), pp. 15747-15752

[78] R. Llinas; Y. Yarom Oscillatory properties of guinea-pig inferior olivary neurones and their pharmacological modulation: an in vitro study, J. Physiol., Volume 376 (1986), pp. 163-182

[79] R.C. Elson; A.I. Selverston; R. Huerta; N. Rulkov; M.I. Rabinovich; H.D.I. Abarbanel Synchronous behavior of two coupled biological neurons, Phys. Rev. Lett., Volume 81 (1998), pp. 5692-5695

[80] A.A. Sharp; L.F. Abbott; E. Marder Artificial electrical synapses in oscillatory networks, J. Neurophysiol., Volume 67 (1992), pp. 1691-1694

[81] T. Bal; F. Nagy; M. Moulins The pyloric central pattern generator in crustacea: a set of conditional neuronal oscillators, J. Comp. Physiol. A, Volume 163 (1988), pp. 715-727

[82] V.S. Afraimovich; N.N. Verichev; M.I. Rabinovich General synchronization, Radiophys. Quantum Electron., Volume 29 (1986), p. 747

[83] N.F. Rulkov; A.R. Volkovskii; A. Rodriguez-Lozano; E. del Rio; M.G. Velarde Mutual synchronization of chaotic self-oscillators with dissipative coupling, Int. J. Bifurc. Chaos, Volume 2 (1992), pp. 669-676

[84] M. Falcke; R. Huerta; M.I. Rabinovich; H.D.I. Abarbanel; R.C. Elson; A.I. Selverston Modeling observed chaotic oscillations in bursting neurons: the role of calcium dynamics and IP3, Biol. Cybern., Volume 82 (2000), pp. 517-527

[85] V.P. Zhigulin; M.I. Rabinovich; R. Huerta; H. Abarbanel Robustness and enhancement of neural synchronization by activity-dependent coupling, Phys. Rev. E, Volume 67 (2003), p. 021901

[86] G. Bi; M. Poo Synaptic modification by correlated activity: Hebb's postulate revisited, Annu. Rev. Neurosci., Volume 24 (2001), pp. 139-166

[87] A. Szucs; P. Varona; A.R. Volkovskii; H.D. Abarbanel; M.I. Rabinovich; A.I. Selverston Interacting biological and electronic neurons generate realistic oscillatory rhythms, NeuroReport, Volume 11 (2000), pp. 563-569

[88] M.I. Rabinovich; H.D.I. Abarbanel; R. Huerta; R. Elson; A.I. Selverston Self-regularization of chaos in neural systems: Experimental and theoretical results, IEEE Trans. Circuits and Systems: Fundamental Theory and Applications, Volume 44 (1997), pp. 997-1005

[89] M.I. Rabinovich; H.D.I. Abarbanel The role of chaos in neural systems, Neuroscience, Volume 87 (1998), pp. 5-14

[90] R.C. Elson; R. Huerta; H.D.I. Abarbanel; M.I. Rabinovich; A.I. Selverston Dynamic control of irregular bursting in an identified neuron of an oscillatory circuit, J. Neurophysiol., Volume 82 (1999), pp. 115-122

[91] M.I. Rabinovich; P. Varona; H.D. Abarbanel Nonlinear cooperative dynamics of living neurons, Int. J. Bifurc. Chaos, Volume 10 (2000), pp. 913-933

[92] J.F. Heagy; T.L. Carroll; L.M. Pecora Synchronous chaos in coupled oscillator systems, Phys. Rev. E, Volume 50 (1994), pp. 1874-1884

[93] M.I. Rabinovich; R. Huerta; M. Bazhenov; A.K. Koslov; H.D.I. Abarbanel Computer simulations of stimulus-dependent state switching in basic circuits of bursting neurons, Phys. Rev. E, Volume 58 (1998), pp. 6418-6430

[94] M. Steriade; D.A. McCormick; T.J. Sejnowski Thalamocortical oscillations in the sleeping and aroused brain, Science, Volume 262 (1993), pp. 679-685

[95] C. van Vreeswijk; H. Sompolinsky Chaos in neuronal networks with balanced excitatory and inhibitory activity, Science, Volume 274 (1996), pp. 1724-1726

[96] R. Huerta; P. Varona; M.I. Rabinovich; H.D. Abarbanel Topology selection by chaotic neurons of a pyloric central pattern generator, Biol. Cybern., Volume 84 (2001), pp. L1-L8

[97] J. Guckenheimer; P. Rowat Dynamical analysis of real neuronal networks (P.S.G. Stein; S. Grillner; A.I. Selverston; D.G. Stuart, eds.), Neurons, Networks, and Motor Behavior, MIT Press, London, 1997, pp. 151-163

[98] L. Brock; J. Coombs; J. Eccles The recording of potentials from motoneurones with an intracellular electrode, J. Physiol. Lond., Volume 117 (1952), pp. 431-460

[99] Y. Burnod; H. Korn Consequences of stochastic release of neurotransmitters for network computation in the central nervous system, Proc. Natl Acad. Sci. USA, Volume 86 (1989), pp. 352-356

[100] H. Korn; D.S. Faber Transmission at a central inhibitory synapse. IV. Quantal structure of synaptic noise, J. Neurophysiol., Volume 63 (1990), pp. 198-222

[101] D. Ferster Is neural noise just a nuisance?, Science, Volume 273 (1996), p. 1812

[102] W. Calvin; C. Stevens Synaptic noise as a source of variability in the interval between action potentials, Science, Volume 155 (1967), pp. 842-844

[103] W.R. Softky; C. Koch The highly irregular firing of cortical cells is inconsistent with temporal integration of random EPSPs, J. Neurosci., Volume 13 (1993), pp. 334-350

[104] M.N. Shadlen; W.T. Newsome The variable discharge of cortical neurons: implications for connectivity, computation, and information coding, J. Neurosci., Volume 18 (1998), pp. 3870-3896

[105] Z.F. Mainen; T.J. Sejnowski Reliability of spike timing in neocortical neurons, Science, Volume 268 (1995), pp. 1503-1506

[106] C.F. Stevens; A.M. Zador Input synchrony and the irregular firing of cortical neurons, Nat. Neurosci., Volume 1 (1998), pp. 210-217

[107] M.N. Shadlen; W.T. Newsome Noise, neural codes and cortical organization, Curr. Opin. Neurobiol., Volume 4 (1994), pp. 569-579

[108] W.R. Softky Simple codes versus efficient codes, Curr. Opin. Neurobiol., Volume 5 (1995), pp. 239-247

[109] M.N. Shadlen; W.T. Newsome Is there a signal in the noise?, Curr. Opin. Neurobiol., Volume 5 (1995), pp. 248-250

[110] P. Faure; H. Korn A nonrandom dynamic component in the synaptic noise of a central neuron, Proc. Natl Acad. Sci. USA, Volume 94 (1997), pp. 6506-6511

[111] P. Faure; H. Korn A new method to estimate the Kolmogorov entropy on recurrence plots: its application to neuronal signals, Physica D, Volume 122 (1998), pp. 265-279

[112] H. Korn; D.S. Faber; A. Triller Probabilistic determination of synaptic strength, J. Neurophysiol., Volume 55 (1986), pp. 402-421

[113] G. Buzsaki; R. Llinas; W. Singer; A. Berthoz Temporal Coding in the Brain, Research and Perspectives in Neurosciences – Fondation IPSEN, Springer-Verlag, 1994

[114] H. Fujii; H. Ito; K. Aihara; N. Ichinose; M. Tsukada Dynamical cell assembly hypothesis – Theoretical possibility of spatio-temporal coding in the cortex, Neural Networks, Volume 9 (1996), pp. 1303-1350

[115] J.J. Eggermont Is there a neural code?, Neurosci. Biobehav. Rev., Volume 22 (1998), pp. 355-370

[116] G. Werner, Computation in nervous systems, 2000, http://www.ece.utexas.edu/werner/neuralcomputation.html

[117] D. Hebb The Organization of Behavior – A Neuropsychological Theory, John Wiley, New York, 1949

[118] C. Von der Malsburg The correlation theory of brain function, internal report 81-2, Max Planck Institute for Biophysical Chemistry, 1981

[119] J.J. Hopfield Pattern recognition computation using action potential timing for stimulus representation, Nature, Volume 376 (1995), pp. 33-36

[120] J.P. Segundo; D.H. Perkel The nerve cell as an analyser of spike trains (M.A.B. Brazier, ed.), UCLA Forum in Medical Sciences No. 11, The Interneurons, University of California Press, Berkeley, USA, 1969, pp. 349-390

[121] J.P. Segundo; G. Sugihara; P. Dixon; M. Stiber; L.F. Bersier The spike trains of inhibited pacemaker neurons seen through the magnifying glass of nonlinear analysis, Neuroscience, Volume 87 (1998), pp. 741-766

[122] A.P. Georgopoulos; A.B. Schwartz; R.E. Kettner Neuronal population coding of movement direction, Science, Volume 233 (1986), pp. 1416-1419

[123] W. Singer Synchronization of cortical activity and its putative role in information processing and learning, Annu. Rev. Physiol., Volume 55 (1993), pp. 349-374

[124] M.A. Nicolelis; L.A. Baccala; R.C. Lin; J.K. Chapin Sensorimotor encoding by synchronous neural ensemble activity at multiple levels of the somatosensory system, Science, Volume 268 (1995), pp. 1353-1358

[125] A. Riehle; S. Grun; M. Diesmann; A. Aertsen Spike synchronization and rate modulation differentially involved in motor cortical function, Science, Volume 278 (1997), pp. 1950-1953

[126] S.J. Schiff; K. Jerger; D.H. Duong; T. Chang; M.L. Spano; W.L. Ditto Controlling chaos in the brain, Nature, Volume 370 (1994), pp. 615-620

[127] C. Skarda; W.J. Freeman How brains make chaos in order to make sense of the world, Behav. Brain Sci., Volume 10 (1987), pp. 161-195

[128] C. Skarda; W. Freeman Chaos and the new science of the brain, Concepts in Neurosci., Volume 1 (1990), pp. 275-285

[129] P. So; J.T. Francis; T.I. Netoff; B.J. Gluckman; S.J. Schiff Periodic orbits: a new language for neuronal dynamics, Biophys. J., Volume 74 (1998), pp. 2776-2785

[130] K. Pakdaman; S. Tanabe; T. Shimokawa Coherence resonance and discharge time reliability in neurons and neuronal models, Neural Networks, Volume 14 (2001), pp. 895-905

[131] K. Wiesenfeld; F. Moss Stochastic resonance and the benefits of noise: from ice ages to crayfish and squids, Nature, Volume 373 (1995), pp. 33-36

[132] A. Bulsara; L. Gammaitoni Tuning into noise, Phys. Today, Volume 49 (1996), pp. 39-45

[133] A. Longtin; A. Bulsara; F. Moss Time-interval sequences in bistable systems and the noise-induced transmission of information by sensory neurons, Phys. Rev. Lett., Volume 67 (1991), pp. 656-659

[134] R.D. Chialvo; A. Longtin; J. Muller-Gerking Stochastic resonance in models of neuronal ensembles, Phys. Rev. E, Volume 55 (1997), pp. 1798-1808

[135] F. Chapeau-Blondeau Comparison between spike and rate models in networks of integrate-and-fire neurons (R.R. Poznanski, ed.), Biophysical Neural Networks, Mary Ann Liebert, 2000, pp. 303-341

[136] J.K. Douglass; L. Wilkens; E. Pantazelou; F. Moss Noise enhancement of information transfer in crayfish mechanoreceptors by stochastic resonance, Nature, Volume 365 (1993), pp. 337-340

[137] J. Collins; T. Imhoff; P. Grigg Noise-enhanced information transmission in rat SA1 cutaneous mechanoreceptors via aperiodic stochastic resonance, J. Neurophysiol., Volume 76 (1996), pp. 642-645

[138] J.E. Levin; J.P. Miller Broadband neural encoding in the cricket cercal sensory system enhanced by stochastic resonance, Nature, Volume 380 (1996), pp. 165-168

[139] F. Jaramillo; K. Wiesenfeld Mechanoelectrical transduction assisted by Brownian motion: a role for noise in the auditory system, Nat. Neurosci., Volume 1 (1998), pp. 384-388

[140] B.J. Gluckman; P. So Stochastic resonance in mammalian neuronal networks, Chaos, Volume 8 (1998), pp. 588-598

[141] P. Cordo; J.T. Inglis; S. Verschueren; J.J. Collins; S. Merfeld; S. Rosenblum; S. Buckley; F. Moss Noise in human muscle spindles, Nature, Volume 383 (1996), pp. 769-770

[142] J.J. Collins; T.T. Imhoff; P. Grigg Noise-enhanced tactile sensation, Nature, Volume 383 (1996), p. 770

[143] E. Simonotto; M. Riani; C. Seife; M. Roberts; J. Twitty; F. Moss Visual perception of stochastic resonance, Phys. Rev. Lett., Volume 78 (1997), pp. 1186-1189

[144] S.M. Bezrukov; I. Vodyanoy Noise-induced enhancement of signal transduction across voltage-dependent ion channels, Nature, Volume 378 (1995), pp. 362-364

[145] L. Glass Synchronization and rhythmic processes in physiology, Nature, Volume 410 (2001), pp. 277-284

[146] J.P. Segundo; J.-F. Vibert; K. Pakdaman; M. Stiber; O. Diez Martinez Noise and the neurosciences: a long history, a recent revival and some theory (K. Pribram, ed.), Brain and Self-Organization, Lawrence Erlbaum Associates, 1994, pp. 299-331

[147] P. Faure; H. Korn Synaptic noise and chaos in a vertebrate neuron (M.A. Arbib, ed.), Handbook of Brain Theory and Neural Networks, MIT Press, 2002, pp. 1130-1133

[148] D. Nozaki; Y. Yamamoto Enhancement of stochastic resonance in a FitzHugh–Nagumo neuronal model driven by colored noise, Phys. Lett. A, Volume 243 (1998), pp. 281-287

[149] K. Pakdaman; D. Mestivier External noise synchronizes forced oscillators, Phys. Rev. E, Volume 64 (2001), p. 030901(R)

[150] M.T. Huber; J.C. Krieg; M. Dewald; H.A. Braun Stochastic encoding in sensory neurons: impulse patterns of mammalian cold receptors, Chaos, Solitons and Fractals, Volume 11 (2000), pp. 1895-1903

[151] A. Crisanti; M. Falcioni; G. Paladin; A. Vulpiani Stochastic resonance in deterministic chaotic systems, J. Phys. A: Math. Gen., Volume 27 (1994), p. L597

[152] R. Kozma; W.J. Freeman A possible mechanism for intermittent oscillations in the KIII model of dynamic memories – the case study of olfaction, IEEE/INNS Int. Joint Conf. Neural Networks, 1999, pp. 52-57

[153] R. Kozma; W.J. Freeman Chaotic resonance – Methods and applications for robust classification of noisy and variable patterns, Int. J. Bifurc. Chaos, Volume 11 (2001), pp. 1607-1629

[154] S. Sinha Noise-free stochastic resonance in simple chaotic systems, Physica A, Volume 270 (1999), pp. 204-214

[155] J. Theiler; S. Eubank; A. Longtin; B. Galdrikian; J.D. Farmer Testing for nonlinearity in time series: the method of surrogate data, Physica D, Volume 58 (1992), pp. 77-94

[156] J. Holzfuss; G. Mayer-Kress An approach to error estimation in the application of dimension algorithms (G. Mayer-Kress, ed.), Dimension and Entropies in Chaotic Systems, Springer, Berlin, 1986, pp. 114-121

[157] P.E. Rapp; T.R. Bashore; J.M. Martinerie; A.M. Albano; I.D. Zimmerman; A.I. Mees Dynamics of brain electrical activity, Brain Topogr., Volume 2 (1989), pp. 99-118

[158] G. Mayer-Kress; S.P. Layne Dimensionality of the human electroencephalogram, Ann. NY Acad. Sci., Volume 504 (1987), pp. 62-87

[159] A.M. Albano; P.E. Rapp On the reliability of dynamical measures of EEG signals (B.H. Jansen; M.E.B. Brandt, eds.), The 2nd Annual Conference on Nonlinear Dynamics Analysis of the EEG, World Scientific, Singapore, 1993, pp. 117-139

[160] L. Glass; D.T. Kaplan; J.E. Lewis Test for deterministic dynamics in real and model neural networks (B.H. Jansen; M.E.B. Brandt, eds.), The 2nd Annual Conference on Nonlinear Dynamics Analysis of the EEG, World Scientific, Singapore, 1993, pp. 223-249

[161] J.A. McEwen; C.B. Anderson Modelling the stationarity and Gaussianity of spontaneous electroencephalographic activity, IEEE Trans. Biomed. Eng., Volume 22 (1975), pp. 363-369

[162] M. Palus Testing for nonlinearity in the EEG (B. Jansen; M. Brandt, eds.), Proc. 2nd Annual Conference on Nonlinear Dynamical Analysis of the EEG, World Scientific, Singapore, 1993, pp. 100-114

[163] D. Prichard; J. Theiler Generating surrogate data for time series with several simultaneously measured variables, Phys. Rev. Lett., Volume 73 (1994), pp. 951-954

[164] D. Prichard; J. Theiler Generalized redundancies for time series analysis, Physica D, Volume 84 (1995), pp. 476-493

[165] M. Palus Testing for nonlinearity using redundancies: quantitative and qualitative aspects, Physica D, Volume 80 (1995), pp. 186-205

[166] M. Palus Nonlinearity in normal human EEG: cycles, temporal asymmetry, nonstationarity and randomness, not chaos, Biol. Cybern., Volume 75 (1996), pp. 389-396

[167] J. Theiler; P.E. Rapp Re-examination of the evidence for low-dimensional non-linear structure in the human electroencephalogram, EEG Clin. Neurophysiol., Volume 98 (1996), pp. 213-222

[168] T. Elbert; W.J. Ray; Z.J. Kowalik; J.E. Skinner; K.E. Graf; N. Birbaumer Chaos and physiology: deterministic chaos in excitable cell assemblies, Physiol. Rev., Volume 74 (1994), pp. 1-47

[169] K. Lehnertz Non-linear time series analysis of intracranial EEG recordings in patients with epilepsy – an overview, Int. J. Psychophysiol., Volume 34 (1999), pp. 45-52

[170] G. Viana Di Prisco; W.J. Freeman Odor-related bulbar EEG spatial pattern analysis during appetitive conditioning in rabbits, Behav. Neurosci., Volume 99 (1985), pp. 964-978

[171] W.J. Freeman; G. Viana Di Prisco Spatial pattern differences with discriminated odors manifest chaotic and limit cycle attractors in olfactory bulb of rabbits (G. Palm; A. Aartsen, eds.), Brain Theory, Springer, Berlin, 1986, pp. 97-119

[172] W.J. Freeman EEG analysis gives model of neuronal template-matching mechanism for sensory search with olfactory bulb, Biol. Cybern., Volume 35 (1979), pp. 221-234

[173] W.J. Freeman Simulation of chaotic EEG patterns with a dynamic model of the olfactory system, Biol. Cybern., Volume 56 (1987), pp. 139-150

[174] W.J. Freeman Strange attractors that govern mammalian brain dynamics shown by trajectories of electroencephalography (EEG) potentials, IEEE Trans. CAS, Volume 35 (1988), pp. 781-784

[175] P.E. Rapp; I.D. Zimmermann; A.M. Albano; C. Deguzman; N.N. Greenbaum Dynamics of spontaneous neural activity in the simian motor cortex: the dimension of chaotic neurons, Phys. Lett. A, Volume 110 (1985), pp. 335-338

[176] J. Röschke; E. Basar The EEG is not a simple noise: strange attractors in intracranial structures (E. Basar, ed.), Dynamics of Sensory and Cognitive Processing by the Brain, Springer Series in Brain Dynamics, 1, Springer-Verlag, Berlin, 1988, pp. 203-216

[177] S. Neuenschwander; J. Martinerie; B. Renault; F.J. Varela A dynamical analysis of oscillatory responses in the optic tectum, Brain Res./Cognitive Brain Res., Volume 1 (1993), pp. 175-181

[178] A. Celletti; A.E.P. Villa Low-dimensional chaotic attractors in the rat brain, Biol. Cybern., Volume 74 (1996), pp. 387-393

[179] R. Hoffman; W. Shi; B. Bunney Nonlinear sequence-dependent structure of nigral dopamine neurone interspike interval firing patterns, Biophys. J., Volume 69 (1995), pp. 128-137

[180] A. Babloyantz; J.M. Salazar; G. Nicolis Evidence of chaotic dynamics of brain activity during the sleep cycle, Phys. Lett. A, Volume 111 (1985), pp. 152-156

[181] I. Dvorak; A.V. Holden Mathematical Approaches to Brain Functioning Diagnostics, Manchester University Press, Manchester, UK, 1991

[182] K.E. Graf; T. Elbert Dimensional analysis of the waking EEG (E. Basar; T.H. Bullock, eds.), Brain Dynamics. Progress and Perspectives, Springer-Verlag, Berlin, 1989, pp. 174-191

[183] W.S. Pritchard; D.W. Duke Dimensional analysis of no-task human EEG using the Grassberger–Procaccia method, Psychophysiol., Volume 29 (1992), pp. 182-192

[184] T. Elbert; W. Lutzenberger; B. Rockstroh; P. Berg; R.B. Cohen; R. Cohen Physical aspects of the EEG in schizophrenics, Biol. Psychiatry, Volume 32 (1992), pp. 595-606

[185] W.S. Pritchard; D.W. Duke; K.L. Coburn Dimensional analysis of topographic EEG: some methodological considerations (D. Duke; W. Pritchard, eds.), Measuring Chaos in the Human Brain, World Scientific, Singapore, 1991, pp. 181-198

[186] W. Lutzenberger; N. Birbaumer; H. Flor; B. Rockstroh; T. Elbert Dimensional analysis of the human EEG and intelligence, Neurosci. Lett., Volume 143 (1992), pp. 10-14

[187] W.S. Pritchard; K.K. Krieble; D.W. Duke Dimensional analysis of resting human EEG II: surrogate-data testing indicates nonlinearity but not low-dimensional chaos, Psychophysiol., Volume 32 (1995), pp. 486-491

[188] J.J. Wright; D.T.J. Liley Dynamics of the brain at global and microscopic scales: neural networks and the EEG, Behav. Brain Sci., Volume 19 (1996), pp. 285-320

[189] G.B. Ermentrout; J.D. Cowan Large-scale spatially organized activity in neural nets, SIAM J. Appl. Math., Volume 39 (1980), pp. 323-340

[190] A. Babloyantz; A. Destexhe Low-dimensional chaos in an instance of epilepsy, Proc. Natl Acad. Sci. USA, Volume 83 (1986), pp. 3513-3517

[191] L.D. Iasemidis; J.C. Sackellares; H.P. Zaveri; W.J. Williams Phase space topography and the Lyapunov exponent of electrocorticograms in partial seizures, Brain Topogr., Volume 2 (1990), pp. 187-201

[192] L.D. Iasemidis; J.C. Sackellares The evolution with time of the spatial distribution of the largest Lyapunov exponent on the human epileptic cortex (D. Duke; W. Pritchard, eds.), Measuring Chaos in the Human Brain, World Scientific, Singapore, 1991, pp. 49-82

[193] J.P. Pijn; J. Van Neerven; A. Noest; F.H. Lopes da Silva Chaos or noise in EEG signals; dependence on state and brain site, EEG Clin. Neurophysiol., Volume 79 (1991), pp. 371-381

[194] P.E. Rapp; A.M. Albano; I.D. Zimmerman; M.A. Jiménez-Montano Phase-randomized surrogates can produce spurious identifications of non-random structure, Phys. Lett. A, Volume 192 (1994), pp. 27-33

[195] T. Schreiber; A. Schmitz Improved surrogate data for nonlinearity tests, Phys. Rev. Lett., Volume 77 (1996), pp. 635-638

[196] T. Schreiber; A. Schmitz Surrogate time series, Physica D, Volume 142 (2000), pp. 346-382

[197] T. Schreiber Is nonlinearity evident in time series of brain electrical activity? (K. Lehnertz; J. Arnhold; P. Grassberger; C. Elger, eds.), Chaos in Brain? Interdisc. Workshop, World Scientific, Singapore, 1999, pp. 13-22

[198] D. Auerbach; P. Cvitanovic; J.-P. Eckmann; G. Gunaratne; I. Procaccia Exploring chaotic motion through periodic orbits, Phys. Rev. Lett., Volume 58 (1987), pp. 2387-2389

[199] P. Cvitanovic Invariant measurement of strange sets in terms of cycles, Phys. Rev. Lett., Volume 61 (1988), pp. 2729-2732

[200] J.N. Weiss; A. Garfinkel; M.L. Spano; W.L. Ditto Chaos and chaos control in biology, J. Clin. Invest., Volume 93 (1994), pp. 1355-1360

[201] E. Ott; C. Grebogi; J.A. Yorke Controlling chaos, Phys. Rev. Lett., Volume 64 (1990), pp. 1196-1199

[202] W.L. Ditto; S.N. Rauseo; M.L. Spano Experimental control of chaos, Phys. Rev. Lett., Volume 65 (1990), pp. 3211-3214

[203] S. Boccaletti; C. Grebogi; Y.-C. Lai; H. Mancini; D. Maza The control of chaos: theory and applications, Phys. Rep., Volume 329 (2000), pp. 103-197

[204] A. Garfinkel; M.L. Spano; W.L. Ditto; J.N. Weiss Controlling cardiac chaos, Science, Volume 257 (1992), pp. 1230-1235

[205] D.J. Christini; J.J. Collins Controlling neuronal noise using chaos control, Phys. Rev. Lett., Volume 75 (1995), pp. 2782-2785

[206] S. Lesher; M.L. Spano; N.M. Mellen; L. Guan; S. Dykstra; A.H. Cohen Evidence for unstable periodic orbits in intact swimming lampreys, isolated spinal cord, and intermediate preparations, Ann. NY Acad. Sci., Volume 860 (1998), pp. 486-491

[207] M. Le Van Quyen; C. Adam; J.-P. Lachaux; J. Martinerie; M. Baulac; B. Renault; F.J. Varela Temporal patterns in human epileptic activity are modulated by perceptual discriminations, NeuroReport, Volume 8 (1997), pp. 1703-1710

[208] P. So; E. Ott; T. Sauer; B.J. Gluckman; C. Grebogi; S.J. Schiff Extracting unstable periodic orbits from chaotic time series data, Phys. Rev. E, Volume 55 (1997), pp. 5398-5417

[209] D. Ruelle What are the measures describing turbulence, Prog. Theor. Phys. (Suppl.), Volume 64 (1978), pp. 339-345

[210] R. Artuso; E. Aurell; P. Cvitanovic Recycling of strange sets: I. Cycle expansions, Nonlinearity, Volume 3 (1990), pp. 325-359

[211] M.J. Feigenbaum Universal behaviour in nonlinear systems, Los Alamos Science, Volume 1 (1980), pp. 4-27

[212] A. Neiman; L. Schimansky-Geier; F. Moss; B. Shulgin; J.J. Collins Synchronization of noisy systems by stochastic signals, Phys. Rev. E, Volume 60 (1999), pp. 284-292

[213] D.W. Crevier; M. Meister Synchronous period-doubling in flicker vision of salamander and man, J. Neurophysiol., Volume 79 (1998), pp. 1869-1878

[214] J. Theiler On the evidence for low-dimensional chaos in an epileptic electroencephalogram, Phys. Lett. A, Volume 196 (1995), pp. 335-341

[215] L. Pezard; J. Martinerie; J. Muller-Gerking; F. Varela; B. Renault Entropy quantification of human brain spatio-temporal dynamics, Physica D, Volume 96 (1996), pp. 344-354

[216] A. Meyer-Lindenberg The evolution of complexity in human brain development: an EEG study, EEG Clin. Neurophysiol., Volume 99 (1997), pp. 405-411

[217] C. Ehlers; J. Havstad; D. Prichard; J. Theiler Low doses of ethanol reduce evidence for nonlinear structure in brain activity, J. Neurosci., Volume 18 (1998), pp. 7474-7486

[218] C.J. Stam; J.P.M. Pijn; P. Suffczynski; F.H.L. da Silva Dynamics of the human alpha rhythm: evidence for non-linearity?, EEG Clin. Neurophysiol., Volume 110 (1999), pp. 1801-1813

[219] K. Lehnertz; C.E. Elger Spatio-temporal dynamics of the primary epileptogenic area in temporal lobe epilepsy characterized by neuronal complexity loss, EEG Clin. Neurophysiol., Volume 95 (1995), pp. 108-117

[220] K. Lehnertz; C.E. Elger Can epileptic seizures be predicted? Evidence from nonlinear time series analysis of brain electrical activity, Phys. Rev. Lett., Volume 80 (1998), pp. 5019-5022

[221] M. Molnar Commentary on Ichiro Tsuda: Low-dimensional versus high-dimensional chaos in brain function – is it an and/or issue?, Behav. Brain Sci., Volume 24 (2001), pp. 823-824

[222] M. Molnar The dimensional complexity of the P3 event-related potential: area-specific and task-dependent features, EEG Clin. Neurophysiol., Volume 110 (1999), pp. 31-38

[223] J.E. Skinner; M. Molnar; C. Tomberg The point correlation dimension: performance with nonstationary surrogate data and noise, Integrative Physiol. Behav. Sci., Volume 29 (1994), pp. 217-234

[224] H. Hayashi; S. Ishizuka Chaotic responses of hippocampal CA3 region to a mossy fiber stimulation in vitro, Brain Res., Volume 686 (1995), pp. 194-206

[225] S.J. Schiff; K. Jerger; T. Chang; T. Sauer; P. Aitken Stochastic versus deterministic variability in simple neuronal circuits. II. Hippocampal slice, Biophys. J., Volume 67 (1994), pp. 684-691

[226] Z. Rogowski; I. Gath; E. Bental On the prediction of epileptic seizures, Biol. Cybern., Volume 42 (1981), pp. 9-15

[227] H. Lange; J. Lieb; J.J. Engel; P. Crandall Temporo-spatial patterns of pre-ictal spike activity in human temporal lobe epilepsy, EEG Clin. Neurophysiol., Volume 56 (1983), pp. 543-555

[228] D. Lerner Monitoring changing dynamics with correlation integrals: case study of an epileptic seizure, Physica D, Volume 97 (1996), pp. 563-576

[229] M. Casdagli; L. Iasemidis; R. Gilmore; S. Roper; R. Savit; J. Sackellares Non-linearity in invasive EEG recordings from patients with temporal lobe epilepsy, EEG Clin. Neurophysiol., Volume 102 (1997), pp. 98-105

[230] J. Martinerie; C. Adam; M. Le Van Quyen; M. Baulac; S. Clémenceau; B. Renault; F.J. Varela Epileptic seizures can be anticipated by non-linear analysis, Nat. Med., Volume 4 (1998), pp. 1173-1176

[231] M. Feucht; U. Moller; H. Witte; F. Benninger; S. Asenbaum; D. Prayer; M. Friedrich Application of correlation dimension and pointwise dimension for nonlinear topographical analysis of focal onset seizures, Med. Biol. Eng. Comp., Volume 37 (1999), pp. 208-217

[232] M.J. van der Heyden; D.N. Velis; B.P.T. Hoekstra; J.P. Pijn; W.V. Boas; C.W.M. van Veelen; P.C. van Rijen; F.H.L. da Silva; J. DeGoede Non-linear analysis of intracranial human EEG in temporal lobe epilepsy, EEG Clin. Neurophysiol., Volume 110 (1999), pp. 1726-1740

[233] M. Le Van Quyen; J. Martinerie; C. Adam; F. Varela Nonlinear analyses of interictal EEG map the brain interdependences in human focal epilepsy, Physica D, Volume 127 (1999), pp. 250-266

[234] J.C. Sackellares; L.D. Iasemidis; D.S. Shiau; R.L. Gilmore; S.N. Roper Epilepsy – When chaos fails, Chaos in Brain? Interdisc. Workshop, 10–12 March 1999, Bonn, Germany, 1999, pp. 112-133

[235] M. Le Van Quyen; J. Martinerie; V. Navarro; M. Boon; P. D'Have; C. Adam; B. Renault; F. Varela; M. Baulac Anticipation of epileptic seizures from standard EEG recordings, Lancet, Volume 357 (2001), pp. 183-188

[236] Z. Kowalik; A. Schnitzler; H. Freund; O. Witte Local Lyapunov exponents detect epileptic zones in spike-less interictal MEG recordings, Clin. Neurophysiol., Volume 112 (2001), pp. 60-67

[237] R. Ferri; M. Elia; S.A. Musumeci; C.J. Stam Non-linear EEG analysis in children with epilepsy and electrical status epilepticus during slow-wave sleep (ESES), EEG Clin. Neurophysiol., Volume 112 (2001), pp. 2274-2280

[238] C.J. Stam; B.W. van Dijk Synchronization likelihood: an unbiased measure of generalized synchronization in multivariate data sets, Physica D, Volume 163 (2002), pp. 236-251

[239] G. Widman; D. Bingmann; K. Lehnertz; C. Elger Reduced signal complexity of intracellular recordings: a precursor for epileptiform activity?, Brain Res., Volume 836 (1999), pp. 1546-1630

[240] R. Cerf; M. El-Amri; E. El-Ouasdad; E. Hirsch Non-linear analysis of epileptic seizures – I. Correlation-dimension measurements for absence epilepsy and near-periodic signals, Biol. Cybern., Volume 80 (1999), pp. 247-258

[241] K. Jerger; T. Netoff; J. Francis; T. Sauer; S. Pecora; L. Weinstein; S.J. Schiff Early seizure detection, J. Clin. Neurophysiol., Volume 18 (2001), pp. 259-268

[242] A. Arieli; A. Sterkin; A. Grinvald; A. Aertsen Dynamics of ongoing activity: explanation of the large variability in evoked cortical responses, Science, Volume 273 (1996), pp. 1868-1871

[243] K. Aihara; T. Takabe; M. Toyoda Chaotic neural networks, Phys. Lett. A, Volume 144 (1990), pp. 333-340

[244] J.E. Lewis; L. Glass Steady states, limit cycles, and chaos in models of complex biological networks, Int. J. Bifurc. Chaos, Volume 1 (1991), pp. 477-483

[245] S. Nara; P. Davis; M. Kawachi; H. Totsuji Chaotic memory dynamics in a recurrent neural network with cycle memories embedded by pseudo-inverse method, Int. J. Bifurc. Chaos, Volume 5 (1995), pp. 1205-1212

[246] L. Chen; K. Aihara Chaotic simulated annealing by a neural network model with transient chaos, Neural Networks, Volume 8 (1995), pp. 915-930

[247] P.A. Robinson; C.J. Rennie; J.J. Wright Propagation and stability of waves of electrical activity in the cerebral cortex, Phys. Rev. E, Volume 56 (1997), pp. 826-840

[248] K. Kaneko; I. Tsuda Complex Systems: Chaos and Beyond: A Constructive Approach with Applications in Life Sciences, Springer, 2001

[249] J.J. Wright Integrative Neuroscience, Harwood Academic Publishers, 2000

[250] K. Aihara Chaos engineering and its application to parallel-distributed processing with chaotic neural networks, Proc. IEEE, Volume 90 (2002), pp. 919-930

[251] W.J. Freeman; D.P.G. Viana Relation of olfactory EEG to behavior: time-series analysis, Behav. Neurosci., Volume 100 (1986), pp. 753-763

[252] W.J. Freeman On the problem of anomalous dispersion in chaoto-chaotic phase transitions of neural masses, and its significance for the management of perceptual information in brain (H. Haken; M. Stadler, eds.), Synergetics of Cognition, Springer, Berlin, 1990, pp. 126-142

[253] W.J. Freeman Neurodynamics: An Exploration in Mesoscopic Brain Dynamics, Springer-Verlag, London, 2000

[254] W.J. Freeman Mesoscopic neurodynamics: from neuron to brain, J. Physiol., Volume 94 (2000), pp. 303-322

[255] Y. Yao; W.J. Freeman Model of biological pattern recognition with spatially chaotic dynamics, Neural Networks, Volume 3 (1990), pp. 153-170

[256] W.J. Freeman; J.M. Barrie Chaotic oscillations and the genesis of meaning in cerebral cortex (G. Buzsaki, ed.), Temporal Coding in the Brain, Springer-Verlag, Berlin, 1994, pp. 13-37

[257] L. Kay; K. Shimoide; W.J. Freeman Comparison of EEG time series from rat olfactory system with model composed of nonlinear coupled oscillators, Int. J. Bifurc. Chaos, Volume 5 (1995), pp. 849-858

[258] N.E. Schoppa; G.L. Westbrook AMPA autoreceptors drive correlated spiking in olfactory bulb glomeruli, Nat. Neurosci., Volume 5 (2002), pp. 1194-1202

[259] P.E. Castillo; A. Carleton; J.D. Vincent; P.M. Lledo Multiple and opposing roles of cholinergic transmission in the main olfactory bulb, J. Neurosci., Volume 19 (1999), pp. 9180-9191

[260] J.S. Isaacson Glutamate spillover mediates excitatory transmission in the rat olfactory bulb, Neuron, Volume 23 (1999), pp. 377-384

[261] S. Grossberg Adaptive pattern classification and universal recoding. II. Feedback, expectation, olfaction, illusions, Biol. Cybern., Volume 23 (1976), pp. 187-202

[262] J.A. Anderson; J.W. Silverstein; S.A. Ritz; R.S. Jones Distinctive features, categorical perception, and probability learning. Some applications of a neural model, Psychol. Rev., Volume 84 (1977), pp. 413-451

[263] J.J. Hopfield Neural networks and physical systems with emergent collective computational abilities, Proc. Natl Acad. Sci. USA, Volume 79 (1982), pp. 2554-2558

[264] T. Kohonen Self-organization and Associative Memory, Springer-Verlag, New York, 1984

[265] D.E. Rumelhart; G.E. Hinton; R.J. Williams Learning representations by backpropagating errors, Nature, Volume 323 (1986), pp. 533-536

[266] S. Grossberg The Adaptive Brain. I. Cognition, Learning, Reinforcement, and Rhythm, Elsevier, North-Holland, 1987

[267] G. Laurent; M. Stopfer; R. Friedrich; M. Rabinovich; A. Volkovskii; H. Abarbanel Odor encoding as an active, dynamical process: experiments, computation and theory, Annu. Rev. Neurosci., Volume 24 (2001), pp. 263-297

[268] G. Laurent Olfactory network dynamics and the coding of multidimensional signals, Nat. Rev. Neurosci., Volume 3 (2002), pp. 884-895

[269] G. Laurent Dynamical representation of odors by oscillating and evolving neural assemblies, Trends Neurosci., Volume 19 (1996), pp. 489-496

[270] R.W. Friedrich; G. Laurent Dynamic optimization of odor representations by slow temporal patterning of mitral cell activity, Science, Volume 291 (2001), pp. 889-894

[271] G. Laurent; M. Wehr; H. Davidowitz Temporal representations of odors in an olfactory network, J. Neurosci., Volume 16 (1996), pp. 3837-3847

[272] K. MacLeod; G. Laurent Distinct mechanisms for synchronization and temporal patterning of odor-encoding neural assemblies, Science, Volume 274 (1996), pp. 976-979

[273] K. MacLeod; A. Backer; G. Laurent Who reads temporal information contained across synchronized and oscillatory spike trains?, Nature, Volume 395 (1998), pp. 693-698

[274] M. Stopfer; S. Bhagavan; B.H. Smith; G. Laurent Impaired odour discrimination on desynchronization of odour-encoding neural assemblies, Nature, Volume 390 (1997), pp. 70-74

[275] M. Bazhenov; M. Stopfer; M. Rabinovich; H.D. Abarbanel; T.J. Sejnowski; G. Laurent Model of cellular and network mechanisms for odor-evoked temporal patterning in the locust antennal lobe, Neuron, Volume 30 (2001), pp. 569-581

[276] M. Bazhenov; M. Stopfer; M. Rabinovich; R. Huerta; H.D. Abarbanel; T.J. Sejnowski; G. Laurent Model of transient oscillatory synchronization in the locust antennal lobe, Neuron, Volume 30 (2001), pp. 553-567

[277] M. Cohen; S. Grossberg Neural networks and physical systems with emergent computational abilities, Proc. Natl Acad. Sci. USA, Volume 79 (1982), pp. 2554-2558

[278] M.I. Rabinovich; A. Volkovskii; P. Lecanda; R. Huerta; H.D.I. Abarbanel; G. Laurent Dynamical encoding by networks of competing neuron groups: winnerless competition, Phys. Rev. Lett., Volume 87 (2001), p. 068102

[279] M. Wehr; G. Laurent Odour encoding by temporal sequences of firing in oscillating neural assemblies, Nature, Volume 384 (1996), pp. 162-166

[280] P. Varona; M. Rabinovich; A.I. Selverston; Y.I. Arshavsky Winnerless competition between sensory neurons generates chaos, Chaos, Volume 12 (2002), pp. 672-677

[281] Y.V. Panchin; Y.I. Arshavsky; T.G. Deliagina; L.B. Popova; G.N. Orlovsky Control of locomotion in marine mollusk Clione limacina. IX. Neuronal mechanisms of spatial orientation, J. Neurophysiol., Volume 73 (1995), pp. 1924-1937

[282] T.G. Deliagina; Y.I. Arshavsky; G.N. Orlovsky Control of spatial orientation in a mollusc, Nature, Volume 393 (1998), pp. 172-175

[283] Y.V. Panchin; L.B. Popova; T.G. Deliagina; G.N. Orlovsky; Y.I. Arshavsky Control of locomotion in marine mollusk Clione limacina. VIII. Cerebropedal neurons, J. Neurophysiol., Volume 73 (1995), pp. 1912-1923

[284] Y.I. Arshavsky; T.G. Deliagina; G.N. Gamkrelidze; G.N. Orlovsky; Y.V. Panchin; L.B. Popova Pharmacologically induced elements of the hunting and feeding behavior in the pteropod mollusk Clione limacina. II. Effects of physostigmine, J. Neurophysiol., Volume 69 (1993), pp. 522-532

[285] J. Szentagothai The Ferrier Lecture, 1977. The neuron network of the cerebral cortex: a functional interpretation, Proc. R. Soc. Lond. B: Biol. Sci., Volume 201 (1978), pp. 219-248

[286] J. Szentagothai The neuronal architectonic principle of the neocortex, Ann. Acad. Bras. Cienc., Volume 57 (1985), pp. 249-259

[287] W.S. McCulloch; W.H. Pitts A logical calculus of the ideas immanent in neural nets, Bull. Math. Biophys., Volume 5 (1943), pp. 115-133

[288] I. Tsuda; E. Körner; H. Shimizu Memory dynamics in asynchronous neural networks, Prog. Theor. Phys., Volume 78 (1987), pp. 51-71

[289] I. Tsuda Chaotic itinerancy as a dynamical basis of Hermeneutics in brain and mind, World Futures, Volume 32 (1991), pp. 167-185

[290] I. Tsuda Dynamic link of memory–chaotic memory map in nonequilibrium neural networks, Neural Networks, Volume 5 (1992), pp. 313-326

[291] I. Tsuda Toward an interpretation of dynamic neural activity in terms of chaotic dynamical systems, Behav. Brain Sci., Volume 24 (2001), pp. 793-810

[292] M. Quoy; J. Blanquet; E. Dauce Commentary on Ichiro Tsuda: Learning and control with chaos: from biology to robotics, Behav. Brain Sci., Volume 24 (2001), pp. 824-825

[293] W.J. Freeman The creation of perceptual meanings in cortex through chaotic itinerancy and sequential state transitions induced by sensory stimuli (P. Kruse; M. Stadler, eds.), Ambiguity in Mind and Nature, Springer-Verlag, 1995, pp. 421-437

[294] K. Matsumoto; I. Tsuda Extended information in one-dimensional maps, Physica D, Volume 26 (1987), pp. 347-357

[295] I. Tsuda Can stochastic renewal of maps be a model for cerebral cortex?, Physica D, Volume 75 (1994), pp. 165-178

[296] K. Matsumoto; I. Tsuda Noise-induced order, J. Stat. Phys., Volume 31 (1983), pp. 87-106

[297] I. Tsuda A new type of self-organization associated with chaotic dynamics in neural networks, Int. J. Neural Syst., Volume 7 (1996), pp. 451-459

[298] I. Tsuda; A. Yamaguchi Singular-continuous nowhere-differentiable attractors in neural systems, Neural Networks, Volume 11 (1998), pp. 927-937

[299] J.K. Foster Commentary on Ichiro Tsuda: Cantor coding and chaotic itinerancy: relevance for episodic memory, amnesia, and the hippocampus, Behav. Brain Sci., Volume 24 (2001), pp. 815-816

[300] A. Raffone; C. van Leeuwen Commentary on Ichiro Tsuda: Chaos and neural coding: is the binding problem a pseudo-problem?, Behav. Brain Sci., Volume 24 (2001), pp. 826-827

[301] H.D. Abarbanel; M.I. Rabinovich Neurodynamics: nonlinear dynamics and neurobiology, Curr. Opin. Neurobiol., Volume 11 (2001), pp. 423-430

[302] C.C. King Fractal and chaotic dynamics in nervous systems, Prog. Neurobiol., Volume 36 (1991), pp. 279-308

[303] J.S. Nicolis Should a reliable information processor be chaotic?, Kybernetes, Volume 11 (1982), pp. 269-274

[304] J.S. Nicolis The role of chaos in reliable information processing, J. Franklin Inst., Volume 317 (1984), pp. 289-307

[305] J.S. Nicolis; I. Tsuda Chaotic dynamics of information processing: the ‘magic number seven plus-minus two’ revisited, Bull. Math. Biol., Volume 47 (1985), pp. 343-365

[306] P. Grassberger Toward a quantitative theory of self-generated complexity, Int. J. Theor. Phys., Volume 25 (1986), pp. 907-938

[307] P. Grassberger Information content and predictability of lumped and distributed dynamical systems, Physica Scripta, Volume 40 (1989), pp. 107-111

[308] R.R. de Ruyter van Steveninck; G.D. Lewen; S.P. Strong; R. Koberle; W. Bialek Reproducibility and variability in neural spike trains, Science, Volume 275 (1997), pp. 1805-1808

[309] F. Rieke; D. Warland; R.R. de Ruyter van Steveninck; W. Bialek Spikes: exploring the neural code, Computational Neuroscience, MIT Press, Cambridge, MA, USA, 1997

[310] A. Borst; F.E. Theunissen Information theory and neural coding, Nat. Neurosci., Volume 2 (1999), pp. 947-957

[311] G.T. Buracas; T.D. Albright Gauging sensory representations in the brain, Trends Neurosci., Volume 22 (1999), pp. 303-309

[312] W. Bialek; F. Rieke; R.R. de Ruyter van Steveninck; D. Warland Reading a neural code, Science, Volume 252 (1991), pp. 1854-1857

[313] L. Kay Commentary on Ichiro Tsuda: Chaotic itinerancy: insufficient perceptual evidence, Behav. Brain Sci., Volume 24 (2001), pp. 819-820

[314] R.D. Beer Computational and dynamical languages for autonomous agents (R.F. Port; T. Van Gelder, eds.), Exploration in the Dynamics of Cognition: Mind as Motion, MIT Press, 1995, pp. 121-147

[315] R.D. Beer Framing the debate between computational and dynamical approaches to cognitive science, Behav. Brain Sci., Volume 21 (1998), p. 630

[316] R.D. Beer Dynamical approaches to cognitive science, Trends Cogn. Sci., Volume 4 (2000), pp. 91-99

[317] N. Wiener Cybernetics: Or Control and Communication in the Animal and the Machine, Wiley, New York, 1948

[318] C.E. Shannon A mathematical theory of communication, Bell Syst. Tech. J., Volume 27 (1948), pp. 379-423 (623–656)

[319] J.A. Fodor The Language of Thought, Harvard University Press, USA, 1975

[320] Z.W. Pylyshyn Computation and Cognition, MIT Press, 1984

[321] D. Rumelhart; J.L. McClelland Foundations, Parallel Distributed Processing, 1, MIT Press, Cambridge, MA, USA, 1986

[322] P. Smolensky On the proper treatment of connectionism, Behav. Brain Sci., Volume 11 (1988), pp. 1-74

[323] T. Van Gelder The dynamical hypothesis in cognitive science, Behav. Brain Sci., Volume 21 (1998), pp. 615-665

[324] T. Van Gelder; R.F. Port It's about time: an overview of the dynamical approach to cognition (R.F. Port; T. Van Gelder, eds.), Exploration in the Dynamics of Cognition: Mind as Motion, MIT Press, 1995, pp. 1-43

[325] R.H. Abraham; C.D. Shaw Dynamics – The Geometry of Behavior, Addison-Wesley, Redwood City, CA, USA, 1992

[326] J.L. Elman Language as dynamical system (R.F. Port; T. Van Gelder, eds.), Exploration in the Dynamics of Cognition: Mind as Motion, MIT Press, 1995, pp. 195-225

[327] J.A.S. Kelso Dynamic Patterns: The Self-Organization of Brain and Behavior, MIT Press, Cambridge, MA, USA, 1995

[328] J.P. Crutchfield Is anything ever new? Considering emergence (G. Cowan; D. Pines; D. Meltzer, eds.), Complexity: Metaphors, Models, and Reality, SFI Series in the Sciences of Complexity XIX, Addison-Wesley, Redwood City, CA, USA, 1994, pp. 479-497

[329] G. Schoner; J.A. Kelso Dynamic pattern generation in behavioral and neural systems, Science, Volume 239 (1988), pp. 1513-1520

[330] M.T. Turvey Coordination, Am. Psychol., Volume 45 (1990), pp. 938-953

[331] J.L. Elman Distributed representations, simple recurrent networks and grammatical structure, Mach. Learn., Volume 7 (1991), pp. 195-225

[332] M.T. Turvey; C. Carello Some dynamical themes in perception and action (R.F. Port; T. Van Gelder, eds.), Exploration in the Dynamics of Cognition: Mind as Motion, MIT Press, 1995, pp. 373-401

[333] J.B. Pollack On wings of knowledge: a review of Allen Newell's unified theories of cognition, Artif. Intell., Volume 59 (1992), pp. 355-369

[334] M. Giunti Dynamical models of cognition (R.F. Port; T. Van Gelder, eds.), Exploration in the Dynamics of Cognition: Mind as Motion, MIT Press, 1995, pp. 549-571

[335] J.T. Townsend; J. Busemeyer Dynamic representation of decision-making (R.F. Port; T. Van Gelder, eds.), Exploration in the Dynamics of Cognition: Mind as Motion, MIT Press, 1995, pp. 101-120

[336] J.B. Pollack The induction of dynamical recognizers (R.F. Port; T. Van Gelder, eds.), Exploration in the Dynamics of Cognition: Mind as Motion, MIT Press, 1995, pp. 283-312

[337] K. Kaneko Period-doubling of kink-antikink patterns, quasiperiodicity in antiferro-like structures, and spatial intermittency in coupled logistic lattice, Prog. Theor. Phys., Volume 72 (1984), p. 480

[338] K. Kaneko Simulating physics with coupled map lattices (K. Kawasaki; A. Onuki; M. Suzuki, eds.), Formation, Dynamics and Statistics of Patterns, World Scientific, 1990, pp. 1-52

[339] K. Kaneko Overview of coupled map lattices, Chaos, Volume 2 (1992), pp. 279-283

[340] D. DeMaris Attention, depth gestalts, and spatially extended chaos in the perception of ambiguous figures (D. Levine; V. Brown; T. Shirey, eds.), Oscillations in Neural Systems, L. Erlbaum Associates, 2000, pp. 239-258

[341] D. Marr Vision, W.H. Freeman and Company, San Francisco, 1982

[342] P. Erdi Commentary on Ichiro Tsuda: How to construct a brain theory?, Behav. Brain Sci., Volume 24 (2001), p. 815

[343] I. Tsuda; K. Tadaki A logic-based dynamical theory for a genesis of biological threshold, Biosystems, Volume 42 (1997), pp. 45-64
