1. The primordial experience of reality is one of non-separateness. According to Nishida, for instance, pure experience refers to a "unity encompassing the universe." It is the beginning of the world, as it is the beginning of consciousness. Cognition and its object are one. What prevails is a current
awareness of the factual as such, without any meaning attached. A
differentiation between inside (dream) and outside (perception) has not yet
taken place. The experience of human history is repeated in the ontogenesis of
the individual. "The consciousness of the newborn infant is an undefined muddled
wholeness in which even light and darkness are not yet separated. Out of this
wholeness, individual states of consciousness evolve by a process of
differentiation." "The noumenon of nature is ultimately the fact of direct experience in which subject and object have not yet been separated."
2.
The separation, the first original decision, leads to self-experience and
knowledge of nature. According to Nishida, for instance, the subject and object
dissociate into opposing entities. Thus, the subject is primordially split, having evolved out of a process of separation. The object, i.e., the phainomenon, "signifies the condition of contradiction in the differentiating evolution of reality." The field of tension between noumenon and phainomenon is the realm
where spirits, gods, the One God, metaphysics, nominalism and realism thrive.
Their function always includes that of a return link to the origin, whether in
the form of myth or the Big Bang theory. The subject strives for reunification.
Nature now signifies "what remains when the subjective side (but this is the
unifying function!) is deducted from concrete reality. The laws of nature
derived by the so-called inductive method terminate in the hypothesis that one thing is the cause of another. Natural science cannot get beyond this explanation, no matter what progress it may yet make. All it can do is become more exact and more general."
3. An objective truth process from which the subject has
dissociated itself marks the beginning of modern natural science. According to Robert Boyle (Latour), for instance, empirical science knows things because it is able to produce them under controlled conditions, i.e., conditions from which any accidental or subjective element has been removed. The laboratory
experiment forces the objects themselves to display their inherent laws under
repeatable conditions and in the witnessing presence of qualified subjects. The
role of the human observer is merely to testify to the staged, self-evident
occurrence. The actual protagonists in the theater of proof are non-human
actors. These actors, which have been produced and mobilized by empiricism
within a network of standardized practices, follow only their inherent laws. The
latter are identifiable; subjects are thus not only capable of knowing (i.e., seeing), but also of desiring. The proof of the analysis lies in
the synthesis. That which is known can be made.
4. We are seeing double.
We see things as they cross our eye, and we see them through a proliferating
fluff of signifiers. These symbols, concepts, metaphors are used to catch
objects as the latter are progressively isolated from the continuum of nature.
They have been the contingency condition of the history of thought since its very
origin. According to Flusser, for instance, signifiers have passed through a
sequence of universes of increasing abstraction and decreasing dimensionality,
e.g., "the universe of sculpture (timeless bodies), that of images (depthless
surfaces), that of text (surfaceless lines), and that of computation (lineless
points)." The individual origins are marked by the Venus of Willendorf, the cave
paintings of Lascaux, the writing system of Ugarit, and the technical images
created by photography, motion pictures, TV and computers. For numbers, the organs supporting our notions in counting are the fingers, through which the calculi run; for concepts in their linear structure, it is the "inner" (theoretical) eye. The one-dimensional world of the line of text and the fabric made
up from it and with it form the basis for logos, truth, logic, causality, and
history, i.e., the Gutenberg Galaxy.
5. Signifiers detach themselves
from objects and become independent and self-sufficient. According to Kittler, for
instance, Leibniz was the take-off point. "Never before had anybody made a
systematic attempt to manipulate not things, nor words, nor humans, but
naked and mute symbols." While for Gauss numbers still worked as the lever
applied to objects, they soon afterwards began a life of their own. Augustus De Morgan wrote of Euler's imaginary number i that its "impossibility" (i.e.,
inconceivability) vanished "as soon as one has developed the habit of accepting
symbols and combination laws without attaching any meaning whatsoever to them."
At the peak of the Gutenberg Galaxy, signifiers shrink to their minimum scope,
relying on the sole distinction between two signs, i.e., the bits (basic, indivisible information units) of Boole's binary notation based on the truth values of formal logic, 0 and 1. Some 90 years later, Shannon proved that these truth operations can be translated into on and off states of electrical
components. And finally (according to Hodges, for instance), Gödel responded to
Hilbert's postulate of completeness, freedom from contradiction and
decidability of mathematics by demonstrating not only that number-theoretic statements exist which can be neither proven nor refuted, but also that all the "proving" operations, the "chess-like" rules of logical inference, are themselves arithmetic in nature and "that the formulas of this system can be encoded by means of numbers, so that the numbers obtained represent statements about numbers." Since the disappearance of this last distinction
between numbers and numeric operations, it has been possible to apply mechanical
(i.e., meaningless) procedures to auto-referential statements.
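A minimal sketch may make this encoding concrete. The following fragment (Python, with a toy alphabet and a toy formula chosen purely for illustration, not Gödel's own system) uses the standard prime-power construction: every symbol receives a number, and a string of symbols, i.e., a statement about numbers, thereby becomes itself a single number on which purely arithmetic, i.e., mechanical, operations can act.

```python
"""Sketch of a Goedel-style encoding: formulas become numbers.

The symbol table and the example formula are illustrative assumptions;
only the prime-power construction itself is the standard device.
"""

# Assign a positive integer to every symbol of the toy formal language.
SYMBOLS = {'0': 1, 'S': 2, '=': 3, '+': 4, '(': 5, ')': 6, 'x': 7}

def primes():
    """Yield 2, 3, 5, 7, ... by trial division (sufficient for short formulas)."""
    candidate, found = 2, []
    while True:
        if all(candidate % p for p in found):
            found.append(candidate)
            yield candidate
        candidate += 1

def goedel_number(formula: str) -> int:
    """Encode a symbol string as the product p_i ** code(symbol_i)."""
    n = 1
    for p, ch in zip(primes(), formula):
        n *= p ** SYMBOLS[ch]
    return n

def decode(n: int) -> str:
    """Recover the symbol string from its Goedel number by factoring."""
    inverse = {v: k for k, v in SYMBOLS.items()}
    out = []
    for p in primes():
        if n == 1:
            break
        exp = 0
        while n % p == 0:
            n //= p
            exp += 1
        out.append(inverse[exp])
    return ''.join(out)

if __name__ == '__main__':
    formula = 'S(0)+0=S(0)'        # a statement *about* numbers ...
    g = goedel_number(formula)     # ... is now itself a single number
    print(g)
    assert decode(g) == formula    # arithmetic alone recovers the formula
```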
6. In the
next and last step, this mechanical process is implemented in an automatic
machine. Addressing the last of Hilbert's questions left open by Gödel (i.e., that of decidability), Turing conceived a machine, out there on the Grantchester grasslands. It was a special
machine which, as he was able to prove, emulates every phenomenon and every
process lending itself to a complete and unambiguous description (this, by the
way, is the definition both of the algorithm/automaton and of the concept of
intersubjectively verifiable knowledge as used in science). From this point on, the problem of building new machines, of whatever size or complexity, is replaced by the problem of writing a finite set of instructions for the universal machine which turns the latter into the new machine. At the zero point
of the signifier dimensions, we witness a type of Big Bang marking what Flusser
calls a revolutionary 180-degree turn. The Turing machine is the simplest
conceivable automaton, producing systems of any imaginable degree of complexity,
e.g., our knowledge of nature. What is more, a Turing-informed view of the functional components of the human brain made it possible to conceive of it as a comprehensive, yet finite automaton which, as such, would in principle lend itself to emulation by a universal Turing machine. Modelling, logical inference, pattern recognition, etc., i.e., all the processes conventionally called 'thinking', have thus been placed in the machine.
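The reduction of machine-building to instruction-writing can be sketched in a few lines. In the following fragment (Python; the unary increment table is an illustrative assumption, not a machine from Turing's paper), the interpreter plays the part of the universal machine, while the finite transition table handed to it is the set of instructions that turns it into a particular machine.

```python
"""Sketch of Turing's idea: one fixed interpreter, arbitrarily many machines."""

def run(table, tape, state='start', blank='_', max_steps=10_000):
    """Execute a transition table: (state, symbol) -> (new_state, write, move)."""
    tape = dict(enumerate(tape))   # sparse tape, indexed by integer positions
    head = 0
    for _ in range(max_steps):
        if state == 'halt':
            break
        symbol = tape.get(head, blank)
        state, write, move = table[(state, symbol)]
        tape[head] = write
        head += {'R': 1, 'L': -1, 'N': 0}[move]
    cells = [tape[i] for i in sorted(tape)]
    return ''.join(cells).strip(blank)

# A finite instruction set: scan right over '1's, append one more '1', halt.
INCREMENT = {
    ('start', '1'): ('start', '1', 'R'),
    ('start', '_'): ('halt',  '1', 'N'),
}

if __name__ == '__main__':
    print(run(INCREMENT, '111'))   # -> '1111': unary 3 becomes unary 4
```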
7. With this
step, the conditions of our thought and existence have undergone a fundamental
change. We exist in an environment of signifiers in which hypotheses are
automatically executable. It is an environment where, as in Boyle's theater of proof, objects appear in their determining interrelationships. Scientists
engage in simulation and scientific visualization in the belief that they are
following the classical model of science, wherein essentially hypothetical
approximations to a subject-independent world are tentatively anticipated at the
theory formation level and subsequently tested by experiments and measurements.
An example is Knowbotic Research's computer-aided South Pole, where Antarctica is a topos defined almost exclusively through science. Yet scientists state that they would see nothing if they ever went there. For this reason, all that exists at this location are automatons and robots feeding measurement data via the Internet into laboratories, where they are then used for simulation purposes. Knowbotic
Research call the resulting product Computer Aided Nature (CAN). Our perception
of nature, which long ago left behind the limits of the immediately perceivable (whether Antarctica, the quantum world or the realm of the stars), yet which according to conventional wisdom is a prerequisite for cognition, is taken over, on the one hand, by measuring equipment, scanners, satellites and telescopes (the assumption being that the data model will become more "truthful" with increasing scanning density and surface area covered). On the other hand, it is replaced by calculation, i.e., auto-referential operations performed on the existing data space. This, ultimately, cuts the ground from under the feet of realism (whether naive or critical). Simulations are not experiments; the
artificial second-order visibility does not provide the visual evidence Boyle was striving for; the bit signifiers are not indicative, but imperative in nature (the program as prescription); autonomous models running outside human cognitive capability are not thought models. The contingency conditions of cognition, i.e., the perceivability of the object and the perceiving power of the subject, have entered the machine. In the bit space of scanned and computed CAN, autonomous agents (knowbots) move about as first rudimentary approaches
towards the realization of Turing's brain machine. It appears that they are on
the side of scientific truth, yet with their status as epistemological double agents, chimeras, signifier-generated beings, they always already belong to the
side of pure code.
8. When we talk about CAD/CAM, we refer to the computer-supported, purpose-oriented generation, design and production of artefacts. What is the meaning of a computer-assisted existence in a computer-based nature? The matrix contains nothing that has not been written or has not written itself. There is no background of non-signifiers, nothing accidental that serves no design intention to convey content. Where everything has
been made, there are no unmotivated objects. According to Brenda Laurel, for
instance, "The representation is all there is. Think of it as an existential
WYSIWYG." In a pure signifier space the question is not what something is, but
what should be - the question is not for truth, but for design. Models instead
of concepts, magic instead of logos, stories instead of history, aesthetics
instead of epistemology (e.g., Flusser or Feyerabend). Today we are at this zero
point of dimensions, the world of points which are "unmeasurable, a nothing, but
at the same time, immeasurable, an everything" (Flusser). "The universe of
points is empty because it contains nothing except possibilities, and the fact
that it contains all these possibilities makes it a full universe." From this,
Flusser derives his postulate that we must learn to "think, feel and act in a
category of 'possibility'." The Turing machine, which may be any machine, constitutes, in precisely Flusser's sense, this space of all-encompassing possibility.
9. The term "second nature" or CAN today refers to the Turing Galaxy
with its memories, computers, and networks. The world of instant answers,
location-less gods, and omniscience is based on the original division of zero and
one. Inside it, the world coincides again with what is known about it. Cognition
and its object are once again one. There is "unity encompassing the universe",
no differentiation between inside (dream) and outside (perception). Having
passed through history from animism to animation (animateness of the first, and
secondary animation of the second nature), the subject, in its quest for
reunification, has returned to its self-made paradise. Flusser's demand for
thought, feeling and action in the 'possibility' category would thus mean
getting involved with the computer. The machine, called host and server as it is, is only too ready to let us get involved. It is up to us to let ourselves be invited and served by these machines, to settle into them.
Literature:
- Nishida, Kitarō, Über das Gute. Eine Philosophie der reinen Erfahrung, transl. and introduced by Peter Pörtner, Frankfurt am Main 1989
- Latour, Bruno, We Have Never Been Modern, Cambridge, Mass. 1993
- Kittler, Friedrich, Draculas Vermächtnis. Technische Schriften, Leipzig 1993
- Hodges, Andrew, Alan Turing: Enigma, Berlin 1989
- Flusser, Vilém, Lob der Oberflächlichkeit. Für eine Phänomenologie der Medien, Bensheim and Düsseldorf 1993
- Unverzagt, Christian, Das verschwiegene Buch Meta-Realismus (unpublished)
- Laurel, Brenda, Computers as Theatre, Reading, Mass. 1991
- Grassmuck, Volker, Vom Animismus zur Animation, Hamburg 1998