In the Beginning Was Information
Author: Werner Gitt
Figure 8: Information and the fourth fundamental entity, "will" (volition).
What is the causative factor for the existence of information? What prompts us to write a letter, a postcard, a note of congratulation, a diary entry, or a comment in a file? The most important prerequisite is our own volition, or that of a supervisor. By analogy with the material side, we now introduce a fourth fundamental entity, namely "will" (volition), W. Information and volition are closely linked, but this relationship cannot be expressed in a formula, because both are of a nonmaterial (mental, intellectual, spiritual) nature. The connecting arrows in Figure 8 indicate the following: information is always based on the will of a sender who issues it, and it is a variable quantity that depends on the sender's intentions. Will itself is also not constant, but can in its turn be influenced by information received from another sender. Conclusion:
Theorem 2: Information only arises through an intentional, volitional act.
It is clear from Figure 8 that the nonmaterial entity, information, can influence the material quantities. Electrical, mechanical, or chemical quantities can be steered, controlled, utilized, or optimized by means of intentional information. The strategy for achieving such control is always based on information, whether it is a cybernetic manufacturing technique, instructions for building an economical car, or the utilization of electricity for driving a machine. In the first place, there must be the intention to solve a problem, followed by a conceptual construct for which the information may be coded in the form of a program, a technical drawing, or a description, etc. The next step is then to implement the concept. All technological systems as well as all constructed objects, from pins to works of art, have been produced by means of information. None of these artifacts came into existence through some form of self-organization of matter, but all of them were preceded by establishing the required information. We can now conclude that information was present in the beginning, as the title of this book states.
Theorem 3: Information comprises the nonmaterial foundation for all technological systems and for all works of art.
What is the position in regard to biological systems? Does theorem 3 also hold for such systems, or is there some restriction? If we could successfully formulate the theorems in such a way that they are valid as laws of nature, then they would be universally valid according to the essential characteristics of the laws of nature, N2, N3, and N4.
Chapter 4: The Five Levels of the Information Concept
Figure 9: Egyptian hieroglyphics.
Figure 9 is a picture of icons cut in stone as they appear in the graves of pharaohs and on obelisks of ancient Egypt. The question is whether these pictures represent information or not. So let us check them against the three necessary conditions (NC) for identifying information (discussed in more detail in paragraph 4.2):
NC 1: A number of symbols are required to establish information. This first condition is satisfied because we have various different symbols like an owl, water waves, a mouth, reeds, etc.
NC 2: The sequence of the symbols must be irregular. This condition is also satisfied, as there are no regularities or periodic patterns.
NC 3: The symbols must be written in some recognizable order, such as drawn, printed, chiseled, or engraved in rows, columns, circles, or spirals. In this example, the symbols appear in columns.
It now seems possible that the given sequence of symbols might comprise information because all three conditions are met, but it could also be that the Egyptians simply loved to decorate their monuments. They could have decorated their walls with hieroglyphics,[7] just as we often hang carpets on walls. The true nature of these symbols remained a secret for 15 centuries because nobody could assign meanings to them. This situation changed when one of Napoleon’s men discovered a piece of black basalt near the town of Rosetta on the Nile in July 1799. This flat stone was the size of an ordinary dinner table, and it was exceptional because it contained inscriptions in three scripts: 54 lines of Greek, 32 lines of Demotic, and 14 lines of hieroglyphics. The total of 1,419 hieroglyphic symbols includes 166 different ones, and there are 468 Greek words. This stone, known as the Rosetta Stone (Figure 10), is now in the possession of the British Museum in London. It played a key role in the deciphering of hieroglyphics, and its first success was the translation of an Egyptian pictorial text in 1822.[8]
Figure 10: The Rosetta Stone.
Because the meaning of the entire text was found, it was established that the hieroglyphics really represented information. Today, the meanings of the hieroglyphic symbols are known, and anybody who knows this script is able to translate ancient Egyptian texts. Since the meaning of the codes is known, it is now possible to transcribe English text into hieroglyphics, as is shown in Figure 11, where the corresponding symbols have been produced by means of a computer/plotter system.
Figure 11: English text transcribed into hieroglyphics.
This illustrative example has now clarified some basic principles about the nature of information. Further details follow.
4.1 The Lowest Level of Information: Statistics
When considering a book B, a computer program C, or the human genome (the totality of genes), we first discuss the following questions:
– How many letters, numbers, and words make up the entire text?
– How many single letters does the employed alphabet contain (e.g., a, b, c, …, z, or G, C, A, T)?
– How frequently do certain letters and words occur?
To answer these questions, it is immaterial whether we are dealing with actual meaningful text, with pure nonsense, or with random sequences of symbols or words. Such investigations are not concerned with the contents, but only with statistical aspects. These topics all belong to the first and lowest level of information, namely the level of statistics.
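The three statistical questions above can be answered purely mechanically, without any reference to meaning. A minimal Python sketch (the sample text is arbitrary and chosen only for illustration):

```python
from collections import Counter

text = "in the beginning was information"

# Drop spaces and treat the remaining characters as the symbol sequence.
symbols = text.replace(" ", "")

counts = Counter(symbols)
print("total symbols:   ", len(symbols))    # length of the sequence
print("distinct symbols:", len(counts))     # size of the alphabet actually used
print("most frequent:   ", counts.most_common(3))
```

Exactly the same counts could be produced for a meaningless random string; at the statistical level the two are indistinguishable.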
As explained fully in appendix A1, Shannon’s theory of information is suitable for describing the statistical aspects of information, e.g., those quantitative properties of languages which depend on frequencies. Nothing can be said at this level about whether a given sequence of symbols is meaningful; the question of grammatical correctness is also completely excluded. Conclusions:
Definition 1: According to Shannon’s theory, any random sequence of symbols is regarded as information, without regard to its origin or whether it is meaningful or not.
Definition 2: The statistical information content of a sequence of symbols is a quantitative concept, measured in bits (binary digits).
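Definition 2 can be illustrated with a short sketch. Assuming (as in appendix A1) that relative symbol frequencies serve as probability estimates, the statistical information content of a sequence in bits might be computed like this; the function name `shannon_bits` is merely illustrative:

```python
import math
from collections import Counter

def shannon_bits(sequence):
    """Total statistical information content in bits:
    entropy per symbol (from relative frequencies) times length."""
    n = len(sequence)
    counts = Counter(sequence)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return entropy * n

# Per Definition 1, any sequence qualifies, meaningful or not:
print(round(shannon_bits("abababab"), 2))  # → 8.0  (two equiprobable symbols, 1 bit each)
print(round(shannon_bits("abcdefgh"), 2))  # → 24.0 (eight distinct symbols, 3 bits each)
```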
According to Shannon’s definition, the information content of a single message (which could be one symbol, one sign, one syllable, or a single word) is a measure that depends on the probability of its occurrence: the lower the probability, the higher the information content. Since probabilities range from 0 to 1, this measure is always non-negative. The information content of a number of messages (signs, for example) is found by adding the individual information contents, as required by the condition of summability. An important property of information according to Shannon is:
Theorem 4: A message which has been subject to interference or "noise," in general comprises more information than an error-free message.
This theorem follows from the larger number of possible alternatives in a distorted message, and Shannon states that the information content of a message increases with the number of symbols (see equation 6 in appendix A1). It should be clear from the following example that the actual meaning of a message cannot be described in such terms: when somebody uses many words to say practically nothing, this message is accorded a large information content because of the large number of letters used; if somebody else, who is really knowledgeable, concisely expresses the essentials, his message has a much lower information content.
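The point can be made concrete with a deliberately crude sketch. Under the simplifying assumption that every symbol is drawn equiprobably from a 27-character alphabet (26 letters plus the space), each symbol carries log2(27) ≈ 4.75 bits, so Shannon's measure depends on length alone, not on meaning:

```python
import math

def bits(message, alphabet_size=27):
    # Shannon content under the equiprobable-alphabet assumption:
    # every symbol contributes log2(alphabet_size) bits.
    return len(message) * math.log2(alphabet_size)

rambling = ("what i would like to say at this point in time is "
            "basically that in actual fact nothing much has happened")
concise = "nothing happened"

# The rambling message scores far higher, although it says no more:
print(round(bits(rambling)), "bits vs.", round(bits(concise)), "bits")
```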
Some quotations concerning this aspect of information: French President Charles de Gaulle (1890–1970) remarked, "The Ten Commandments are so concise and plainly intelligible because they were compiled without first having a commission of inquiry." Another philosopher said, "There are about 35 million laws on earth to validate the Ten Commandments." A certain representative in the American Congress concluded, "The Lord’s Prayer consists of 56 words, and the Ten Commandments contain 297 words. The Declaration of Independence contains 300 words, but the recently published ordinance about the price of coal comprises no fewer than 26,911 words."