Monday, August 11, 2014

Information

http://www.informationphilosopher.com/introduction/information/

The simple definition of information is the act of informing - the communication of knowledge from a sender to a receiver that informs (literally shapes) the receiver. By information we mean a quantity that can be understood mathematically and physically. It corresponds to the common-sense meaning of information, in the sense of communicating or informing. It is like the information stored in books and computers. But it also measures the information in any physical object, like a recipe, blueprint, or production process, as well as the information in biological systems, including the genetic code, the cell structure, and the developmental learning of the phenotype.
A message that is certain to tell you something you already know contains no new information.
If everything that happens were certain to happen, as determinist philosophers claim, no new information would ever enter the universe. Information would be a universal constant. There would be "nothing new under the sun." Every past and future event could in principle be known (as Pierre-Simon Laplace suggested) by a super-intelligence with access to such a fixed totality of information.
It is of the deepest philosophical significance that information is based on the mathematics of probability. If all outcomes were certain, there would be no “surprises” in the universe. Information would be conserved and a universal constant, as some mathematicians mistakenly believe. Information philosophy requires the ontological uncertainty and probabilistic outcomes of modern quantum physics to produce new information.
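A minimal sketch of this probabilistic measure, in Python (an illustration using Shannon's self-information, not code from the article): the "surprise" of an outcome is -log2 of its probability, so a certain outcome carries zero bits and improbable outcomes carry many.

    import math

    def surprisal_bits(p: float) -> float:
        """Shannon self-information of an outcome with probability p, in bits."""
        if not 0 < p <= 1:
            raise ValueError("probability must be in (0, 1]")
        return -math.log2(p)

    print(surprisal_bits(1.0))    # 0.0   - a certain message tells you nothing new
    print(surprisal_bits(0.5))    # 1.0   - a fair coin flip delivers one bit
    print(surprisal_bits(0.001))  # ~9.97 - rare outcomes are highly informative

If every outcome had probability 1, as strict determinism would have it, every message would score zero bits here, which is just the claim above that no new information could ever enter a fully determined universe.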
But at the same time, without the extraordinary stability of quantized information structures over cosmological time scales, life and the universe we know would not be possible. Quantum mechanics reveals the architecture of the universe to be discrete rather than continuous, to be digital rather than analog.
Creation of information structures means that in parts of the universe the local entropy is actually going down. Creation of a low-entropy system is always accompanied by radiation of energy and entropy away from the local structure to distant parts of the universe, to the night sky and the cosmic background.
From Newton’s time to the start of the 19th century, the Laplacian view coincided with the notion of the divine foreknowledge of an omniscient God. On this view, complete, perfect and constant information exists at all times that describes the designed evolution of the universe and of the creatures inhabiting the world.
In this God’s-eye view, information is a constant of nature. Some mathematicians argue that information must be a conserved quantity, like matter and energy. They are wrong. In Laplace's view, information would be a constant straight line over all time.
If information were a universal constant, there would be "nothing new under the sun." Every past and future event can in principle be known by Laplace's super-intelligent demon, with its access to such a fixed totality of information.
Midway through the 19th century, Lord Kelvin (William Thomson) realized that the newly discovered second law of thermodynamics required that information could not be constant, but would be destroyed as the entropy (disorder) increased. Hermann Helmholtz described this as the “heat death” of the universe.

Mathematicians who are convinced that information is always conserved argue that macroscopic order is disappearing into microscopic order, but the information could in principle be recovered, if time could only be reversed.
This raises the possibility of some connection between the increasing entropy and what Arthur Stanley Eddington called “Time’s Arrow.”
Kelvin’s claim that information must be destroyed when entropy increases would be correct if the universe were a closed system. But in our open and expanding universe, my Harvard colleague David Layzer showed that the maximum possible entropy is increasing faster than the actual entropy. The difference between maximum possible entropy and the current entropy is called negative entropy, opening the possibility for complex and stable information structures to develop.
It is not only entropy that increases in the direction of the arrow of time, but also the information content of the universe. We can describe the new information as "emerging."

Despite the second law of thermodynamics, stable and lawlike information structures evolved out of the initial chaos. First, quantum processes formed microscopic particulate matter – quarks, baryons, nuclei, and electrons. Eventually these became atoms. Later, under the influence of gravitation, macroscopic galaxies, stars, and planets formed.
Every new information structure reduces the entropy locally, so the second law requires an equal (or generally much greater) amount of entropy to be carried away. Without the expansion of the universe, this would be impossible. The positive entropy carried away is always greater than, and generally orders of magnitude larger than, the negative entropy in the created information structure.
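Stated as minimal second-law bookkeeping (standard notation, not taken from the original text): if the new structure's entropy change ΔS_structure is negative, the total for structure plus surroundings must still satisfy ΔS_structure + ΔS_radiated ≥ 0, so ΔS_radiated ≥ |ΔS_structure|. The entropy carried away must at least compensate for, and in practice far exceeds, the local decrease.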
See the cosmic creation process for the negative entropy flows that lead to human life.
Information is emergent, because the universe began in a state of minimal information (thermodynamic equilibrium, maximum disorder - or "entropy").
And there are three distinct kinds of information emergence:
  1. the "order out of chaos" when the matter in the universe forms information structures
    (this is Prigogine's chaos and complexity theory)
  2. the "order out of order" when the material information structures form self-replicating biological information structures
    (this is Schrödinger's definition of life as "feeding on negative entropy")
  3. the pure "information out of order" when organisms with minds process and externalize information, communicating it to other minds and storing it in the environment
    (this is our information theory of mind)
Information philosophy explains how new information is constantly being created, by nature and by humanity. We are co-creators of our universe.
Information theory is the mathematical quantification of communication to describe how information is transmitted and received, in human language, for example.
Information science is the study of the categorization, classification, manipulation, storage, and retrieval of information.
Cognitive science is the study of the mental acquisition, retention, and utilization of knowledge, which we can describe as actionable information.
Information philosophy is an attempt to examine some classic problems in philosophy from the standpoint of information.
What is information that merits its use as the foundation of a new philosophical method of inquiry? Abstract information is neither matter nor energy, yet it needs matter for its concrete embodiment and energy for its communication. Information is immaterial.
It is the modern spirit, the ghost in the machine.
Immaterial information is perhaps as close as a physical or biological scientist can get to the idea of a soul or spirit that departs the body at death. When a living being dies, it is the maintenance of biological information that ceases. The matter remains.
Biological systems are different from purely physical systems primarily because they create, store, and communicate information. Living things store information in a memory of the past that they use to shape their future. Fundamental physical objects like atoms have no history.
And when human beings export some of their personal information to make it a part of human culture, that information moves closer to becoming immortal.
Human beings differ from other animals in their extraordinary ability to communicate information and store it in external artifacts. In the last decade the amount of external information per person may have grown to exceed an individual's purely biological information.
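A rough back-of-the-envelope comparison in Python (every figure here is an order-of-magnitude assumption for illustration, not data from the article), taking the genome as a proxy for an individual's biological information:

    # Order-of-magnitude sketch; all figures are rough assumptions for illustration.
    GENOME_BASE_PAIRS = 3.2e9        # approximate human genome length
    BITS_PER_BASE = 2                # four possible bases -> 2 bits each
    genome_bytes = GENOME_BASE_PAIRS * BITS_PER_BASE / 8                  # ~0.8 GB

    GLOBAL_DIGITAL_BYTES = 1e21      # ~1 zettabyte, a commonly cited 2010-era estimate
    WORLD_POPULATION = 7e9
    external_bytes_per_person = GLOBAL_DIGITAL_BYTES / WORLD_POPULATION   # ~140 GB

    print(f"genomic information:  ~{genome_bytes / 1e9:.1f} GB")
    print(f"external information: ~{external_bytes_per_person / 1e9:.0f} GB per person")

On these rough numbers, the external store already exceeds the purely genetic one by a couple of orders of magnitude.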
Information is an excellent basis for philosophy, and for science as well, capable of answering questions about metaphysics (the ontology of things themselves), epistemology (the existential status of ideas and how we know them), idealism (pure information), the mind-body problem, the problem of free will, and the "hard" problem of consciousness.

Actionable information has pragmatic value.
In our information philosophy, knowledge is the sum of all the information created and preserved by humanity. It is all the information in human minds and in artifacts of every kind - from books and internetworked computers to our dwellings and managed environment.
We shall see that all information in the universe is created by a single two-part process, the only one capable of generating and maintaining information in spite of the dread second law of thermodynamics, which describes the irresistible increase in disorder or entropy. We call this anti-entropic process ergodic. It should be appreciated as the creative source of everything we can possibly value, and of everything distinguishable from chaos and therefore interesting.
Enabled by the general relativistic expansion of the universe, the cosmic creative process has formed the macrocosmos of galaxies, stars, and planets. It has also generated the particular forms of microscopic matter - atoms, molecules, and the complex macromolecules that support biological organisms. It includes all quantum cooperative phenomena.
Quantum phenomena control the evolution of life and human knowledge. They help bring new information into the universe in a fundamentally unpredictable way. They drive biological speciation. They facilitate human creativity and free will.
Although information philosophy looks at the universe, life, and intelligence through the single lens of information, it is far from mechanical and reducible to a deterministic physics. The growth of information over time - our principle of increasing information - is the essential reason why time matters and individuals are distinguishable.
Information is the principal reason that biology is not reducible to chemistry and physics. Increasing information (a combination of perfect replication with occasional copying errors) explains all emergent phenomena, including many "laws of nature."
In information philosophy, the future is unpredictable for two basic reasons. First, quantum mechanics shows that some events are not predictable. The world is causal, but not pre-determined. Second, the early universe does not contain the information of later times, just as early primates do not contain the information structures for intelligence and verbal communication, and infants do not contain the knowledge and remembered experience they will have as adults.
In the naive world of Laplace's demon and strict determinism, all the information in the universe is constant at all times. "Determinism" itself is an emergent idea, realized only when large numbers of particles assemble into bodies that can average over the irreducible microscopic indeterminacy of their component atoms.

Information and Entropy
In our open and expanding universe, the maximum possible entropy is increasing faster than the actual entropy. The difference between maximum possible entropy and the current entropy is called negative entropy. There is an intimate connection between the physical quantity negative entropy and information.
To give this very positive quantity of "negative" entropy a positive name, we call it "Ergo," and we describe processes capable of generating negative entropy as "ergodic."
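A minimal illustration in Python (using the Shannon entropy of a discrete distribution as a stand-in for the thermodynamic quantity, an assumption of this sketch rather than a claim of the article): the maximum possible entropy of an N-state system is log2(N), reached by the uniform distribution, and the negative entropy of an actual distribution is the gap between that maximum and its actual entropy.

    import math

    def shannon_entropy_bits(probs):
        """Entropy of a discrete probability distribution, in bits."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    def negative_entropy_bits(probs):
        """Gap between the maximum possible entropy and the actual entropy."""
        max_entropy = math.log2(len(probs))   # the uniform distribution maximizes entropy
        return max_entropy - shannon_entropy_bits(probs)

    uniform = [0.25, 0.25, 0.25, 0.25]      # maximum disorder: no room for structure
    ordered = [0.85, 0.05, 0.05, 0.05]      # concentrated, low-entropy "structure"

    print(negative_entropy_bits(uniform))   # 0.0 bits
    print(negative_entropy_bits(ordered))   # ~1.15 bits of negative entropy

In an expanding universe, the first term (the maximum) can grow faster than the second (the actual entropy), which is how this gap can widen even while total entropy increases.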
Ergodic processes provide room to increase the information structures in the universe. As pointed out by David Layzer, the Arrow of Time points not only to increasing disorder but also to increasing information.
The increase of biological information is primarily by perfect replication of prior existing information, but it is critically important that replication errors occur from time to time. They are the source of new species and creative new ideas.
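A toy sketch of this replication-with-errors mechanism in Python (the sequence, alphabet, and error rate are illustrative assumptions, not from the article): most copies are perfect, but occasional errors introduce variant sequences that did not exist before.

    import random

    def replicate(genome: str, error_rate: float = 0.01, alphabet: str = "ACGT") -> str:
        """Copy a genome faithfully, except for rare random copying errors."""
        copy = []
        for base in genome:
            if random.random() < error_rate:
                copy.append(random.choice([b for b in alphabet if b != base]))  # a rare copying error
            else:
                copy.append(base)                                               # a faithful copy
        return "".join(copy)

    random.seed(42)
    parent = "ACGTACGTACGTACGTACGT"
    offspring = [replicate(parent) for _ in range(5)]
    variants = [g for g in offspring if g != parent]
    print(f"{len(variants)} of {len(offspring)} copies carry a variant sequence - new information")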
The universe is creative. Information structures and processes are emergent. Some laws of nature are emergent. Adequately deterministic phenomena are emergent. The very idea of determinism is emergent. Knowledge of the present did not all exist in the past. We have only a rough idea of the exact future.
The creative process continues. Life and humanity are a part of the process. What gets created is in part our responsibility. We can choose to help create and preserve information. Or we can choose to destroy it.
