We are now in a position to formulate some fundamental empirical theorems [12]:
Theorem 6:
A code is an essential requirement for establishing information.
Theorem 7:
The allocation of meanings to the set of available symbols is a mental process depending on convention. [13]
Theorem 8:
If a code has been defined by a deliberate convention, it must be strictly adhered to afterward.
Theorem 9:
If the information is to be understood, the particular code must be known to both the sender and the recipient.
Theorem 10:
According to Theorem 6, only structures which are based on a code can represent information. This is a necessary but not sufficient condition for the establishment of information.
Theorem 11:
A code system is always the result of a mental process; it requires an intelligent origin or inventor (see footnote 14).
Figure 13: The expression "rejoice" in different languages and coding systems.
The expression "rejoice" appears in different languages and coding systems in Figure 13. This leads to another important empirical theorem:
Theorem 12:
Any given piece of information can be represented by any selected code.
Comment: Theorem 12 does not state that a complete translation is always possible. It is an art to translate metaphors, twists of logic, ambiguities, and special figurative styles suitably into the target language.
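Theorem 12 is easy to demonstrate in practice. The short sketch below, assuming Python and a small, hand-picked subset of the international Morse alphabet, renders the same word "rejoice" in three arbitrarily chosen coding systems; the information is unchanged, only the code differs.

```python
# Illustration of Theorem 12: one message, several codes.
# The Morse table is a small illustrative subset, not a complete reference.

MORSE = {
    'r': '.-.', 'e': '.', 'j': '.---', 'o': '---', 'i': '..', 'c': '-.-.',
}

def to_morse(text: str) -> str:
    """Encode lowercase letters as Morse code, separated by spaces."""
    return ' '.join(MORSE[ch] for ch in text)

def to_hex(text: str) -> str:
    """Encode the text as the hexadecimal form of its UTF-8 bytes."""
    return text.encode('utf-8').hex()

def to_binary(text: str) -> str:
    """Encode the text as 8-bit binary groups of its UTF-8 bytes."""
    return ' '.join(f'{b:08b}' for b in text.encode('utf-8'))

message = 'rejoice'
print(to_morse(message))   # .-. . .--- --- .. -.-. .
print(to_hex(message))     # 72656a6f696365
print(to_binary(message))  # 01110010 01100101 ...
```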
It is possible to formulate fundamental principles of information even at the relatively low level of codes by means of the above theorems. If, for example, one finds a code underlying any given system, then one can conclude that the system had a mental origin. In the case of the hieroglyphics, nobody suggested that they were caused by a purely physical process like random mechanical effects, wind, or erosion; Theorem 11 is thus validated.
The following is a brief list of some properties common to all coding systems:
– A code is a necessary prerequisite for establishing and storing information.
– Every choice of code must be well thought out beforehand in the conceptual stage.
– Devising a code is a creative mental process.
– Matter can be a carrier of codes, but it cannot generate any codes.
B) The Actual Syntax
Definition 4:
The actual syntax describes the construction of sentences and phrases, as well as the structural media required for their formation. The set of possible sentences of a language is defined by means of a formalized or formalizable assemblage of rules. This comprises the morphology, phonetics, and vocabulary of the language.
The following questions are relevant:
a) Concerning the sender:
– Which of the possible combinations of symbols are actual defined words of the language (lexicon and notation)?
– How should the words be arranged (sentence construction, word order, and stylistics), linked with one another, and inflected to form a sentence (grammar)?
– What language should be used for this information?
– Which special modes of expression are used (stylistics, aesthetics, precision of expression, and formalisms)?
– Are the sentences syntactically correct?
b) Concerning the recipient:
– Does the recipient understand the language? (Understanding the contents is not yet relevant.)
The following two sample sentences illustrate the syntax level once again:
A: The bird singed the song.
B: The green freedom prosecuted the cerebrating house.
Sentence B is perfectly correct syntactically, but it is semantically meaningless. In contrast, the semantics of sentence A is acceptable, but its syntax is erroneous.
The syntax of a language comprises all the rules that describe how individual language elements may and should be combined. The syntax of natural languages is much more complex (see appendix A2) than that of formal artificial languages. The syntactic rules of an artificial language must be complete and unambiguous because, for example, a compiler program which translates written programs into computer code cannot consult the programmer to clarify what was meant.
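The distinction drawn by sentences A and B can be reproduced in a formal artificial language. The following sketch, assuming Python and its standard ast module, shows the parser accepting a statement that is syntactically well formed yet semantically odd, while rejecting one that violates the grammar; both strings are invented here purely for illustration.

```python
import ast

semantically_odd = "freedom = green('cerebrating house')"  # well formed, meaningless
syntactically_bad = "bird singed the = song"               # violates the grammar

for source in (semantically_odd, syntactically_bad):
    try:
        ast.parse(source)  # checks syntax only, never meaning
        print(f"accepted by the parser: {source!r}")
    except SyntaxError as err:
        print(f"rejected by the parser: {source!r} ({err.msg})")
```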
Knowledge of the conventions applying to the actual encoding as well as to the allocation of meanings is equally essential for both the sender and the recipient. This knowledge is either transferred directly (e.g., by being introduced into a computer system or by being inherited in the case of natural systems), or it must be learned from scratch (e.g., mother tongue or any other natural language).
No person enters this world with the inherited knowledge of some language or some conceptual system. Knowledge of a language is acquired by learning the applicable vocabulary and grammar as they have been established in the conventions of the language concerned.
4.3 The Third Level of Information: Semantics
When we read the previously mentioned book B, we are not interested in statistics about the letters, nor are we concerned with the actual grammar; we are interested in the meaning of the contents. Symbol sequences and syntactic rules are essential for the representation of information, but the essential characteristic of the conveyed information is not the selected code, nor the size, number, or form of the letters, nor the method of transmission (in writing, or as optical, acoustic, electrical, tactile, or olfactory signals); it is the message being conveyed, the conclusions, and the meanings (semantics). This central aspect of information plays no role in storage and transmission, since the cost of a telegram, for example, does not depend on the importance of the message but only on the number of letters or words. Both the sender and the recipient are mainly interested in the meaning; it is the meaning that changes a sequence of symbols into information. So now we have arrived at the third level of information, the semantic level (Greek semantikós = characteristic, significance, aspect of meaning).
Typical semantic questions are:
a) Concerning the sender:
– What are the thoughts in the sender’s mind?
– What meaning is contained in the information being formulated?
– What information is implied in addition to the explicit information?
– What means are employed for conveying the information (metaphors, idioms, or parables)?
b) Concerning the recipient:
– Does the recipient understand the information?
– What background information is required for understanding the transmitted information?
– Is the message true or false?
– Is the message meaningful?
Theorem 13:
Any piece of information has been transmitted by somebody and is meant for somebody. A sender and a recipient are always involved whenever and wherever information is concerned.
Comment: Many kinds of information are directed to one single recipient (like a letter) and others are aimed at very many recipients (e.g., a book, or newspaper). In exceptional cases, the information never reaches the recipient (e.g., a letter lost in the mail).
It is only at the semantic level that we really have meaningful information; thus, we may establish the following theorem:
Theorem 14:
Any entity, to be accepted as information, must entail semantics; it must be meaningful.
Semantics is an essential aspect of information, because the meaning is the only invariant property. The statistical and syntactical properties can be altered appreciably when information is represented in another language (e.g., translated into Chinese), but the meaning does not change.
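The point can be made concrete with a small experiment. In the sketch below (assuming Python; the German rendering of the English sentence is supplied here purely for illustration), the same meaning expressed in two languages shows clearly different letter counts and letter frequencies, while the meaning itself is untouched.

```python
from collections import Counter

english = "The meaning does not change."
german = "Die Bedeutung ändert sich nicht."  # illustrative German rendering

for sentence in (english, german):
    letters = [ch.lower() for ch in sentence if ch.isalpha()]
    # Letter count and the three most frequent letters differ per language,
    # yet both sentences carry the same semantic content.
    print(len(letters), Counter(letters).most_common(3))
```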
Meanings always represent mental concepts; therefore we have:
Theorem 15:
When its progress along the chain of transmission events is traced backward, every piece of information leads to a mental source, the mind of the sender.
Sequences of letters generated by various kinds of statistical processes are shown in Figure 38 (appendix A1.5). The programs used for this purpose were partially able to reproduce some of the syntactic properties of the language, but in the light of Theorems 16 and 17 these sequences of letters do not represent information. The next theorem enables one to distinguish between information and non-information:
Theorem 16:
If a chain of symbols comprises only a statistical sequence of characters, it does not represent information.
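The kind of statistically generated letter sequence referred to above can be sketched as follows (assuming Python; the letter frequencies are rough English values chosen for illustration and are not taken from Figure 38). The output resembles English in its letter statistics, yet by Theorem 16 it carries no information.

```python
import random

# Rough relative frequencies (percent) of common English letters plus space.
FREQ = {
    ' ': 18.0, 'e': 10.2, 't': 7.6, 'a': 6.5, 'o': 6.2, 'i': 5.7,
    'n': 5.6, 's': 5.3, 'h': 4.9, 'r': 4.8, 'd': 3.4, 'l': 3.3, 'u': 2.3,
}

def statistical_sequence(length: int, seed: int = 0) -> str:
    """Return a character sequence drawn purely from letter statistics."""
    rng = random.Random(seed)
    letters = list(FREQ.keys())
    weights = list(FREQ.values())
    return ''.join(rng.choices(letters, weights=weights, k=length))

# Prints a statistically plausible but meaningless string of characters.
print(statistical_sequence(60))
```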
Information is essentially linked to a sender (a mental source of information) according to Theorems 13 and 15. This result is independent of whether the recipient understands the information or not. When researchers studied Egyptian obelisks, the symbols were seen as information long before they were deciphered, because it was obvious that they could not have resulted from random processes. The meaning of the hieroglyphics could not be understood by any contemporaries (recipients) until the discovery of the Rosetta Stone in 1799 opened the way to their decipherment, but even so, they were regarded as information. The same holds for the gyrations of bees, which were only understood by humans after being deciphered by Karl von Frisch. In contrast, the genetic code is still mostly unknown, except for the code allocations between the triplets and the amino acids.
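The one deciphered layer of the genetic code mentioned above, the fixed allocation between triplets (codons) and amino acids, can be written down as a simple lookup. The sketch below, assuming Python, lists only a handful of entries from the standard codon table.

```python
# A few entries of the standard genetic code: RNA codon -> amino acid.
CODON_TABLE = {
    'AUG': 'Met',  # methionine, also the start codon
    'UUU': 'Phe', 'UUC': 'Phe',  # phenylalanine
    'GGA': 'Gly', 'GGC': 'Gly',  # glycine
    'UAA': 'STOP', 'UAG': 'STOP', 'UGA': 'STOP',
}

def translate(rna: str) -> list[str]:
    """Translate an RNA string codon by codon until a stop codon."""
    amino_acids = []
    for i in range(0, len(rna) - 2, 3):
        residue = CODON_TABLE.get(rna[i:i + 3], '?')  # '?' = not in our subset
        if residue == 'STOP':
            break
        amino_acids.append(residue)
    return amino_acids

print(translate('AUGUUUGGAUAA'))  # ['Met', 'Phe', 'Gly']
```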
All suitable ways of expressing meanings (mental substrates, thoughts, or nonmaterial contents of consciousness) are called languages. Information can be transmitted or stored in material media only when a language is available. The information itself is totally invariant with regard to the transmission system (acoustic, optical, or electrical) as well as the system of storage (brain, book, data processing system, or magnetic tape). This invariance is a result of its nonmaterial nature. There are different kinds of languages, ranging from natural and artificial languages to the communication systems found in living organisms.
A common property of all languages is that defined sets of symbols are used, and that definite agreed-upon rules and meanings are allocated to the individual signs or language elements. Every language consists of units such as morphemes, lexemes, expressions, and entire sentences (in natural languages) that serve as carriers of meaning (formatives). Meanings are internally assigned to the formatives of a language, and both the sender and the recipient should be in accord about these meanings. The following can be employed for encoding meanings in natural languages: morphology, syntax (grammar and stylistics), phonetics, intonation, and gesticulation, as well as numerous supplementary devices such as homonyms, homophones, metaphors, synonyms, polysemes, antonyms, paraphrasing, anomalies, metonymy, and irony.
Every communication process between sender and recipient consists of formulating and understanding the sememes (Greek sema = sign) in one and the same language. In the formulation process, the information to be transmitted is generated in a suitable language in the mind of the sender. In the comprehension process, the symbol combinations are analyzed by the recipient and converted into the corresponding ideas. It is universally accepted that the sender and the recipient are both intelligent beings, or that a particular system must have been created by an intelligent being (Figures 23 and 24, chapter 7).
4.4 The Fourth Level of Information: Pragmatics