The Chaos Theory
The aim of non-linear methodologies is the description of complexity and the exploration of the
multidimensional interactions within and among the components of a given system. An important
concept here is that of chaotic behaviour: the behaviour of a system is defined as chaotic if
trajectories issuing from arbitrarily close points in phase space diverge exponentially over time.
In detail, the basic principles may be summarized as follows:
1) Non-linear systems may, under certain conditions, exhibit chaotic behaviour;
2) The behaviour of a chaotic system can change drastically in response to small changes
in the system's initial conditions;
3) A chaotic system is deterministic;
4) In chaotic systems the system output is no longer proportional to the system input.
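Principle 2, sensitivity to initial conditions, can be illustrated by a minimal numerical sketch (our own illustration, not taken from the cited references), using the chaotic regime of the logistic map:

```python
# Sensitivity to initial conditions in the logistic map x -> r*x*(1-x), r = 4 (chaotic regime).
def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000)
b = logistic_trajectory(0.400001)  # initial condition perturbed by one part in a million

# The gap between the two trajectories grows from 1e-6 to order one.
initial_gap = abs(a[0] - b[0])
max_gap = max(abs(x - y) for x, y in zip(a, b))
print(initial_gap, max_gap)
```

Two trajectories that start a millionth apart become macroscopically different within a few dozen iterations, which is exactly the exponential divergence referred to above.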
Chaos may also be identified in systems once the requirement of determinism is dropped. The standard
approach to classical dynamics adopts the Laplacian point of view that the time evolution of a
system is uniquely determined by its initial conditions. The existence and uniqueness theorems for
differential equations require that the equations of motion everywhere satisfy the Lipschitz
condition. It has long been tacitly assumed that Nature is deterministic and that, correspondingly,
the equations of motion describing physical systems are Lipschitz. However, there is no a priori
reason to believe that Nature is unfailingly Lipschitzian. In various conditions of interest,
some systems exhibit physical solutions corresponding to equations of motion that violate the
Lipschitz condition. This point is of particular interest: if a dynamical system is non-Lipschitz at a
singular point, several solutions may intersect at this point. Such a singularity is a
common point among many trajectories, and the dynamics of the system, after the singular point
is intersected, is not in any way determined by the dynamics before it. Hence the term
non-deterministic dynamics.

(Journal of Consciousness Exploration & Research, December 2010, Vol. 1, Issue 9, pp. 1070-1138. Conte, E., Todarello, O., Conte, S., Mendolicchio, L. & Federici, A., Methods and Applications of Non-Linear Analysis in Neurology and Psycho-physiology. ISSN: 2153-8212. Published by QuantumDream, Inc.)

For a non-deterministic system, it is entirely possible (if not
likely) that as the various solutions move away from the singularity, they will evolve very
differently, and tend to diverge. Several solutions coincide at the non-Lipschitz singularity, and
therefore whenever a phase space trajectory comes near this point, any arbitrarily small
perturbation may push the trajectory on to a completely different solution. As noise is intrinsic to
any physical system, the time evolution of a non-deterministic dynamical system will consist of
a series of transient trajectories, with a new one being chosen randomly whenever the solution (in
the presence of noise) nears the non-Lipschitz point. We term such behaviour non-deterministic
chaos. This approach to chaos theory was initiated by Zak, Zbilut and Webber [1]; rather
recently we have given several theoretical examples and experimental verifications of this
important chaotic behaviour [2].

2.1 Embedding time series in phase space
The notion of phase space is well known in physics. Let us consider a system determined by the
set of its variables: the values of these variables specify the state of the system at any time.
We may represent one set of such values as a point in a space whose coordinates correspond to
those variables. This space is called the phase space, and the set of states of the system is
represented by a set of points in it. The point of interest is that, by analyzing the topological
properties of the phase space, we obtain as a counterpart insights into the dynamic nature of the
system. In experimental conditions, and especially in experimental clinical studies, we are unable
to measure all the variables of the system. In this case we may nevertheless reconstruct a phase
space from experimental data in which only one of the variables (characterizing the whole system)
is actually measured. The phase space is realized by a set of independent coordinates. Generally
speaking, the attractor is the phase-space set generated by a dynamical system represented by a
set of difference or differential equations. In the present case, let us take a non-linear
dynamical system represented by three independent variables X(t), Y(t), Z(t), functions of time.
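In practice, when only one variable, say X(t), is measured, the phase space can be reconstructed by the standard time-delay embedding. The following sketch (the delay and embedding dimension are illustrative choices, not values prescribed by this paper) builds delay vectors from a scalar series:

```python
import numpy as np

def delay_embed(series, dim=3, delay=5):
    """Build delay vectors (x[i], x[i+delay], ..., x[i+(dim-1)*delay])."""
    series = np.asarray(series)
    n = len(series) - (dim - 1) * delay
    return np.column_stack([series[i * delay : i * delay + n] for i in range(dim)])

# Example: a scalar signal sampled from a sine wave.
t = np.linspace(0.0, 20.0, 500)
x = np.sin(t)
points = delay_embed(x, dim=3, delay=5)
print(points.shape)  # (490, 3): 490 reconstructed phase-space points
```

Each reconstructed point collects the measured variable at three time-shifted instants, so the topology of the original attractor can be studied from the single measured coordinate.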

Fractality refers to a feature of a given stochastic time series: temporal self-similarity.
A time series is said to be self-similar if its amplitude distribution remains unchanged, up to a
constant factor, when the sampling rate is changed. In the time domain one observes similar patterns
at different time scales; in the frequency domain the basic feature of a fractal time series is its
power-law spectrum on the proper logarithmic scale. Fractals and chaos have many points in common.
When the phase-space set is fractal, the system that generated the time series is chaotic, and
chaotic systems can be constructed that generate a phase-space set of a given fractal form.
However, the systems and the processes studied by fractal and chaos analysis are essentially
different. Fractals must be considered processes in which a small section resembles the whole; the
point of fractal analysis is to determine whether the given experimental time series contains
self-similar features. Deterministic chaos means that the output of a non-linear deterministic
system is so complex that it in some manner mimics random behaviour; the point of
deterministic-chaos analysis is to investigate whether the given experimental time series arises
from a deterministic process and to understand in some manner the mathematical features of such a
process. For a chaotic time series, this means that the corresponding system has sensitivity to
initial conditions. When we speak of strange attractors, this means that the attractor is fractal
[for details see 4]. It is very important to account for such properties, since there are chaotic
systems that are not strange, in the sense that they are exponentially sensitive to initial
conditions but do not have a fractal attractor; conversely, there are non-chaotic systems that are
strange, in the sense that they are not sensitive to initial conditions but do have a fractal
attractor. In conclusion, we must be careful when considering fractal and non-linear approaches,
since they are very different from each other; often, instead, we are induced to erroneously mix
different things, with serious consequences.
The geometry of the attractors is frequently examined by calculating the so-called correlation
dimension [7].
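A minimal sketch of the correlation sum underlying the correlation dimension (a standard Grassberger-Procaccia-type construction; the data and radii below are illustrative) may clarify the idea; in a real analysis one fits the slope of log C(r) against log r over a whole scaling region:

```python
import numpy as np

def correlation_sum(points, r):
    """Fraction of distinct point pairs closer than r (the correlation integral C(r))."""
    points = np.asarray(points)
    n = len(points)
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    close = np.sum(dists < r) - n          # subtract the n zero-distance self-pairs
    return close / (n * (n - 1))

# Points uniformly filling a line segment should have correlation dimension about 1.
rng = np.random.default_rng(0)
line = np.column_stack([rng.uniform(0.0, 1.0, 400), np.zeros(400)])
r1, r2 = 0.05, 0.1
slope = (np.log(correlation_sum(line, r2)) - np.log(correlation_sum(line, r1))) / np.log(r2 / r1)
print(slope)  # close to 1, the dimension of a line
```

For a fractal attractor the same slope converges to a non-integer value as the embedding dimension is increased.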


Let us start with a brief discussion of the use of non-linear methodologies in psychology.
In past psychological studies, data were usually collected to assess differences
between individuals or groups which were considered stable over time (1,2). A
further approach has instead gained relevance in the past decade, aimed at an intensive
time sampling of the psychological variables of individuals or groups at regular intervals, in
order to study the time oscillations of the collected data (1). In this way human behavior has been
investigated to analyze, for instance, the impact of everyday experience on well-being (3), the
after-effects of negative events (4), or the association between emotions and behavioral settings
(5). These studies were often aimed at analyzing the nature of rhythmical oscillations in the mood
and performance of human beings (6). Such an approach leads to progressive changes not only in the
methods of sampling psychological data but also in our way of thinking about many psychological
variables, which may be considered expressions of mind entities unfolding over time (1). A
reason to stress the importance of this approach is to acknowledge the role of human
interactions in governing the transitions which continuously take place in mind entities.
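As an illustration of such intensive time sampling (the sampling scheme and the simulated daily mood rhythm below are our own assumptions, not data from the cited studies), a rhythmical oscillation in regularly sampled scores can be exposed by a simple periodogram:

```python
import numpy as np

# Illustrative data: mood scores sampled 4 times a day for 30 days,
# with a daily rhythm plus measurement noise.
rng = np.random.default_rng(1)
samples_per_day, days = 4, 30
t = np.arange(samples_per_day * days) / samples_per_day      # time in days
mood = 5.0 + np.sin(2.0 * np.pi * t) + 0.3 * rng.normal(size=t.size)

# Periodogram: the dominant frequency reveals the oscillation period.
spectrum = np.abs(np.fft.rfft(mood - mood.mean())) ** 2
freqs = np.fft.rfftfreq(t.size, d=1.0 / samples_per_day)     # cycles per day
dominant = freqs[np.argmax(spectrum)]
print(dominant)  # ~1 cycle per day
```

The dominant spectral peak recovers the one-cycle-per-day rhythm buried in the noisy self-report series; the same computation applied to real sampled data is a first step before any non-linear analysis.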
The notion is becoming relevant that our mind, our ideas and our convictions are all formed as the
result of interactive changes, and that they possibly all follow a quantum-like behavior. Let us
explain in detail what we mean by this statement (7,8). For certain questions, individuals have
predefined opinions, thoughts, feelings or behaviors. This kind of condition may be
considered stable in time, in the sense that an intensive time sampling of data, consisting for
example of questions posed to an individual by an outside observer, or by himself, at regular
time intervals, will simply record a predefined answer, one that is never determined and
actualized at the moment the question is posed. In this case, we have a stable dynamic pattern
for individuals or groups. The intensive time sampling of data will only confirm information
on a time dynamics that is stable, reporting a fixed pattern in self-report or performance measures
with regard to the behavior in time of the involved individuals. It has been evidenced (7,8) that,
from the standpoint of statistical analysis, cases such as those just mentioned, in which
individuals have a predefined opinion or thought that cannot be changed at the very moment the
questions are posed, correspond to a kind of classical dynamics that may be analyzed in terms of
classical statistical approaches, since they are not context dependent (7,8). There are situations
in which, instead, a person who is being questioned, by himself or by an outside observer, has no
predefined opinion, thought, feeling or behavior regarding the given question. The opinion, for
example, is formed (that is to say, it is actualized) only at the moment the question itself is
posed, and it is formed on the basis of the context in which the question is posed. This is the
case of quantum-like behavior for a
cognitive entity. The core of the difference resides in the fact that in the case of quantum-like
behavior we are dealing with the actualization of a certain property that depends on the
instant of time at which the question is posed and thus, in particular, also on the
context in which it is posed, while in the classical case all properties are assumed to have
a definite connotation before the question itself is posed, and thus they are time and context
independent. Processes of the first kind are said to be quantum-like, and they follow a quantum-like
statistics (7,8). The basic content of such a quantum probability approach is the calculation of the
probability of actualization of one among different potentialities, as a result of individual
inspection or of outside observation. New paradigms are thus emerging in studies
regarding mind behavior: one is the concept of potentiality, linked to the concept of actualization.
We also have the concept of dynamic pattern, linked to the observation of change in time
as a result of the interactive transitions (potentiality-actualization) which take place in human
interactions. The case of quantum-like behavior is one of the manifold situations in which an
intensive time sampling of psychological data may give important information on the dynamic
patterns in self-report and performance measures.
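A toy sketch may clarify the quantum-like scheme (this two-state model and its numbers are purely illustrative, not a model taken from references (7,8)): an undecided opinion is a superposition of 'yes' and 'no', the context of the question fixes the measurement basis, and the probability of each answer being actualized follows the Born rule:

```python
import numpy as np

# Cognitive state: a normalized superposition a|yes> + b|no>.
state = np.array([0.6, 0.8])          # |a|^2 = 0.36, |b|^2 = 0.64

def answer_probabilities(state, context_angle):
    """Born-rule probabilities of 'yes'/'no' when the question's context
    rotates the measurement basis by context_angle (illustrative model)."""
    c, s = np.cos(context_angle), np.sin(context_angle)
    basis_yes = np.array([c, s])
    basis_no = np.array([-s, c])
    p_yes = np.dot(basis_yes, state) ** 2
    p_no = np.dot(basis_no, state) ** 2
    return p_yes, p_no

# The same cognitive state yields different statistics in different contexts:
print(answer_probabilities(state, 0.0))    # (0.36, 0.64)
print(answer_probabilities(state, 0.5))    # context-dependent probabilities
```

The answer is not stored in advance: only the probabilities are, and the context (here, the basis angle) changes them, which is precisely what distinguishes the quantum-like case from the classical one.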
It is noteworthy that people do not have a defined “sense of self”, with accompanying memories,
from a very early age. This may be due to the fact that the “attractor” of personality (as
developed by the brain) has not yet established a sufficiently defined probability distribution of
neuronal connections: if neuronal connections are essentially uniform in their shape, it is
questionable whether an attractor is defined at all. With repetitive learning inputs, the
probability distributions become established (narrowed) and “personality” emerges. The learning of
skills proceeds along similar lines: repetitive “habits” further narrow the probability
distributions so as to make a particular action more refined, to the point of not requiring active
effort. Both personality and learning, however, depend upon the genetics which establish the basic
physiology of the neuronal machinery. Predictability regarding personality and activity is, by the
very definition of the singular dynamics, a stochastic process: no matter how narrow the
probability distributions, a level of uncertainty always remains.
The performance of current neural networks is still too “rigid” in comparison with even the
simplest biological systems. This rigidity follows from the fact that the behavior of a dynamical
system is fully prescribed by its initial conditions. The system never “forgets” these conditions:
it carries their
“burden” all the time. In contrast to this, biological systems are much more flexible: they can
forget (if necessary) the past, adapting their behavior to environmental changes.
The thrust here is to discuss a substantially new type of dynamical system for modeling
biological behavior, introduced as non-deterministic dynamics. The approach is motivated by an
attempt to remove one of the most fundamental limitations of current models of artificial neural
networks: their “rigid” behavior compared with biological systems. As previously
exposed in detail, the mathematical roots of the rigid behavior of dynamical systems lie in the
uniqueness of their solutions subject to prescribed initial conditions. Such uniqueness was
very important for modeling energy transformations in the mechanical, physical and chemical
systems which inspired progress in the theory of differential equations. This is why the first
concern in the theory of differential equations, as well as in dynamical system theory, was for the
existence of a unique solution, provided by the so-called Lipschitz conditions. On the contrary,
for information processing in brain-style fashion, the uniqueness of solutions for the underlying
dynamical models becomes a heavy burden which locks their performance into a single-choice mode.
A new architecture for neural networks (which model the brain and its processes) is suggested
which exploits a novel paradigm in nonlinear dynamics based upon the concept of non-Lipschitz
singularities [7, 8]. Due to violations of the Lipschitz conditions at certain critical points, the
neural network forgets its past as soon as it approaches these points; the solution at these points
branches, and the behavior of the dynamical system becomes unpredictable. Since any
vanishingly small input applied at critical points causes a finite response, such an unpredictable
system can be controlled by a neurodynamical device which operates by noise and uniquely
defines the system behavior by specifying the direction of motion at the critical points. The
super-sensitivity of the critical points to external inputs appears to be an important tool for
creating chains of coupled subsystems of different scales, whose range is theoretically unlimited.
Due to the existence of the critical points, the neural network becomes a weakly coupled dynamical
system: its neurons (or groups of neurons) are uncoupled (and can therefore perform parallel
tasks) within the periods between the critical points, while the coordination between the
independent units (i.e., the collective part of the performance) is carried out at the critical
points, where the neural network is fully coupled. As a part of this architecture, weakly coupled
neural networks acquire the ability to be activated not only by external inputs but also by
internal periodic rhythms. (Such spontaneous performance resembles brain activity.) It must be
stressed, however, that behavior may be predicted in the sense of establishing a probability
distribution of choices. Thus behavior is not determined, but ‘guessed’ within the bounds of that
probability distribution.
In its simplest form, consider, for example, an equation without uniqueness:

dx/dt = x^(1/3) cos(ωt).

At the singular solution x = 0 (which is unstable, for instance at t = 0), a small noise drives the
motion to one of the regular solutions, x = ±((2/(3ω)) sin(ωt))^(3/2), with equal probabilities.
Indeed, any prescribed distribution can be implemented by using non-Lipschitz dynamics. It is
important to emphasize, however, the fundamental difference between the probabilistic properties of
these non-Lipschitz dynamics and those of traditional stochastic or chaotic differential equations: the
randomness of stochastic differential equations is caused by random initial conditions, a random
force or random coefficients; in chaotic equations, small (but finite) random changes of the
initial conditions are amplified by a mechanism of instability. In both cases, however, the
differential operator itself remains deterministic. Thus, there develops a set of “alternating”,
“deterministic” trajectories.
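The random branch selection can be checked numerically. In the following sketch (our own, with illustrative step size and noise level), the equation dx/dt = x^(1/3) cos(ωt) is integrated by the Euler method from x = 0 with a tiny additive noise; the accumulated noise near the singular point decides which regular solution is followed:

```python
import math
import random

def simulate(seed, omega=1.0, dt=1e-3, steps=1500, noise=1e-9):
    """Euler integration of dx/dt = x^(1/3) cos(omega*t) plus a tiny additive noise."""
    rng = random.Random(seed)
    x, t = 0.0, 0.0
    for _ in range(steps):
        cube_root = math.copysign(abs(x) ** (1.0 / 3.0), x)  # real cube root, valid for x < 0
        x += cube_root * math.cos(omega * t) * dt + rng.uniform(-noise, noise)
        t += dt
    return x

# Different noise realizations pick the + or - branch at random:
outcomes = [simulate(seed) for seed in range(20)]
print(outcomes)
```

Each run ends near one of the two regular solutions, roughly ±((2/3) sin 1.5)^(3/2) ≈ ±0.54 at t = 1.5, with the sign chosen at random by the noise near x = 0: the differential operator is deterministic, yet the observed trajectory is one of a set of alternating deterministic branches.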
We would now like to discuss the reasons for the terminology we are delineating. As said, the
analogy is with physics. The state s(t) of a physical entity S at time t represents the reality of
this physical entity at that time. In classical physics the state is represented by a point in
phase space, while in quantum physics it is represented by a unit vector in Hilbert space. In
classical terms the state s(t) of the physical entity S determines the values of all the observable
quantities connected to S at time t. The state q(t) of a quantum entity is instead represented by a
unit vector of Hilbert space, the so-called normalized wave function ψ(r,t). For a quantum entity
in state ψ(r,t) the values of the observable quantities are potential: this is to say that a
quantum entity never has, for example, simultaneously a definite position and a definite momentum,
and this represents the intrinsic quantum indeterminism that affects reality at this level. Here we
have the relevant concept of potentiality: a quantum entity has the potentiality to realize some
definite value for some of its observable quantities. This happens only at the moment of
observation or of measurement, and it is this mechanism that realizes a transition from a pure
condition of potentiality to a pure condition of actualization. A definite value is not actually
realized in the potential state ψ(r,t); a definite value is actualized only at the moment of the
direct observation of some property of the given entity, through the very mechanism of observation
during the act of measurement. The novel feature is the transition potentiality → actualization
that characterizes the mechanism of observation and measurement.
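The potentiality → actualization transition can be sketched operationally (a standard textbook account of projective measurement, with illustrative numbers): the normalized state assigns Born-rule probabilities to the potential outcomes, and the act of measurement actualizes one of them, collapsing the state:

```python
import numpy as np

rng = np.random.default_rng(2)

# Potential state: a normalized superposition over three possible outcomes.
psi = np.array([1.0, 2.0, 1.0], dtype=complex)
psi /= np.linalg.norm(psi)
probs = np.abs(psi) ** 2                      # Born rule: the potentialities

# Actualization: the measurement picks one outcome with these probabilities,
# and the state collapses onto the corresponding basis vector.
outcome = rng.choice(len(psi), p=probs)
collapsed = np.zeros_like(psi)
collapsed[outcome] = 1.0
print(probs, outcome)
```

Before the measurement only the probability distribution exists; after it, one definite value is actualized and the others are discarded, which is the collapse the text refers to.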
We have to make here a long digression in order to clarify in detail this point, which appears to
us of fundamental importance.
As we know, all of quantum mechanics is based on this binomial conceptualization of potentiality on
the one hand and actualization on the other. In particular, the actualization corresponds
to the observation and measurement or, that is to say, to the moment in which we become
conscious that some kind of measurement has happened (collapse of the wave function), since we
read its result on some device. Generally speaking, a system is in a superposition of possible
states (superposition principle, potentiality), and this superposition principle is violated in a
measurement. This led von Neumann to postulate that we have two fundamentally different types
of time evolution for a quantum system. First, there is the causal Schrödinger-equation evolution.
Second, there is the non-causal change due to a measurement, and this second type of evolution
(the passage from potentiality to actualization) seems incompatible with the Schrödinger form. This
situation forced von Neumann to introduce what is usually called the von Neumann postulate of
quantum measurement. This happened around 1932. Rather recently, one of us (EC), using two
theorems in Clifford algebra, has been able to give a complete justification of the von Neumann
postulate. The result has appeared in the International Journal of Theoretical Physics and is
available online [8]. Thus we have given proof of a statement that for nearly eighty years remained
a postulate, often discussed and widely questioned. This new result, at least under an algebraic
profile, explains the wave function collapse and gives a total justification of it, also giving
quantum mechanics an arrangement as a self-consistent theory that in the past was often
questioned as incomplete, this supposed incompleteness being taken as a proof of the weakness of
the theory. In conclusion, the passage potentiality → actualization now seems a demonstrated
transition, to which we have to attribute the greatest importance if we do not aim to remain linked
to too limited a vision of our reality. On the other hand, there is no reason to continue an
endless discussion on a possible link between quantum mechanics and cognition: we have unequivocal
results that demonstrate this point in detail. It is universally accepted that J. von Neumann
showed that projection operators represent logical statements; in brief, von Neumann showed
that we may construct logic starting from quantum mechanics. According to the fundamental
papers published by the great logician Yuri Orlov and in the light of the results that, we repeat,
one of us has recently obtained, it may be unequivocally shown that the inverse passage is also
possible: not only may we derive logic on the basis of quantum mechanics, we may also derive
quantum mechanics from logic. So the ring is closed; the link between quantum mechanics and
cognition is strongly established. The split that occurred between psychology and the physical
sciences after the establishment of psychology as an independent discipline cannot continue to
encourage a delay in acknowledging this thesis. We may be convinced that there are levels of our
reality at which the fundamental features of logic, and thus of cognition, acquire the same
importance as the features of what is being described. Here we can no longer separate “matter per
se”, in Orlov's words, from the features of logic and cognition used to describe it. We lose the
possibility of unconditionally defining the truth, as we explained previously, since the definition
of truth now depends on how we observe (and thus have cognition of) the physical reality.
Obviously such relativism does not exist in classical mechanics, while quantum mechanics instead
gives us a Janus-like picture, able to look simultaneously to the left and to the right, at the
cognitive as well as the physical level.
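Von Neumann's identification of logical statements with projection operators can be made concrete (a standard textbook fact, sketched here with illustrative matrices): a projector P satisfies P² = P, so its eigenvalues are 0 and 1, the two classical truth values, while two non-commuting projectors cannot be jointly assigned truth values, which is the algebraic face of context dependence:

```python
import numpy as np

# Projector onto |0>: the statement "the system is in state 0".
P = np.array([[1.0, 0.0],
              [0.0, 0.0]])

# Projector onto the rotated state (|0>+|1>)/sqrt(2): a statement in another context.
v = np.array([1.0, 1.0]) / np.sqrt(2.0)
Q = np.outer(v, v)

# Idempotence: asking the same question twice gives the same answer.
print(np.allclose(P @ P, P), np.allclose(Q @ Q, Q))   # True True

# Eigenvalues 0 and 1 play the role of "false" and "true".
print(sorted(np.linalg.eigvalsh(P)))                  # [0.0, 1.0]

# Non-commutativity: the two statements cannot be jointly actualized.
print(np.allclose(P @ Q, Q @ P))                      # False
```

The 0/1 spectrum is what lets projectors stand in for propositions, and the failure of commutativity is exactly the context dependence of truth discussed above.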
Let us now return to the central problem under discussion.


