The grand theory of atheistic evolution posits that matter and energy alone have given rise to all things, including biological systems. Such a theory must attribute the existence of all information ultimately to the interaction of matter and energy, without reference to an intelligent or conscious source. Yet all biological systems depend upon the storage, transfer and interpretation of information for their operation, so the primary phenomenon that the theory of evolution must account for is the origin of biological information. I will argue that fundamental laws of information can be deduced from observations of the nature of information, and that these fundamental laws exclude the possibility that information, including biological information, can arise purely from matter and energy without reference to an intelligent agent. On that basis I will lay out why the grand theory of evolution cannot, in principle, account for the most fundamental biological phenomenon.
In the communication age information has become fundamental to everyday life. However, there is no binding definition of information that is universally agreed upon by practitioners of engineering, information science, biology, linguistics or philosophy. There have been repeated attempts to grapple with the concept of information. Because information itself is non-material, this would be the first time that a law of nature (scientific law) has been formulated for such a mental entity. Let’s first establish a universal definition for information; then state what the laws should be; and, finally, let’s draw some comprehensive conclusions.
If statements about the observable world can be consistently and repeatedly confirmed to be universally true, we refer to them as laws of nature. Laws of nature describe events, phenomena and occurrences that consistently and repeatedly take place; they are thus universally valid laws. They can be formulated for material entities in physics and chemistry (e.g. energy, momentum, electrical current, chemical reactions). Due to their explanatory power, laws of nature enjoy the highest level of confidence in science. Laws of nature exhibit the following attributes:
1) Laws of nature know no exceptions. This is perhaps the most important attribute: a natural law cannot be circumvented or overturned. A law of nature is thus universally valid, unchanging and immutable. It can, in principle, be refuted, since a single contrary example would end its status as a natural law.
2) Laws of nature are unchanging in time.
3) Laws of nature can tell us whether a process being contemplated is even possible or not. This is a practical application of the laws of nature.
4) Laws of nature exist prior to, and independent of, their discovery and formulation. They can be identified through research and then precisely formulated. Hypotheses, theories and models are fundamentally different: they are invented by people, not merely formulated by them.
5) Laws of nature can always be successfully applied to unknown situations. That is how we journeyed to the moon and back.
When we talk of the laws of nature, we usually mean the laws of physics (e.g. the second law of thermodynamics, the law of gravity, the law of magnetism, the law of nuclear interaction) and the laws of chemistry. All these laws relate exclusively to matter. But to claim that our world can be described solely in terms of material quantities is to fail to acknowledge the limits of one’s perception. The same scientific procedures used for identifying laws of nature can also be used for identifying laws governing non-material entities. Such laws exhibit the same attributes listed above for the laws of nature; they therefore fulfil the same conditions as the laws of nature for material quantities and consequently possess a similar power of inference.
The American mathematician Norbert Wiener made the oft-cited statement: “Information is information, not matter or energy.” With this he acknowledged something very significant: information is not a material entity. “Let me clarify this important property of information with an example. Imagine a sandy stretch of beach. With my finger I write a number of sentences in the sand. The content of the information can be understood. Now I erase the information by smoothing out the sand. Then I write another sentence in the sand. In doing so I am using the same matter as before to display this information. Despite this erasing and rewriting, displaying and destroying varying amounts of information, the mass of the sand did not alter at any time. The information itself is thus massless.” We know what information is not; the question now is what information really is.
Because information is a non-material entity, its origin is likewise not explicable by material processes. What causes information to come into existence at all—what is the initiating factor? Information always depends upon the will of a sender who issues the information. Information is not constant; it can be deliberately increased and can be distorted or destroyed (e.g. through disturbances in transmission). In summary: Information arises only through will (intention and purpose).
In scientific usage, the meaning of a term is in most cases stated considerably more narrowly than its range of meaning in everyday usage; that is, it is a subset of the everyday meaning (see my blog at http://www.intelligentdesign.blog.com/ for what set theory is and why it matters). In this way, a definition does more than just assign a meaning; it also acts to contain or restrict that meaning. A good “natural-law” definition is one that enables us to exclude all those domains (realms) in which laws of nature are not applicable. The more clearly one can establish the domain of definition, the more precise (and therefore certain) the conclusions that can be drawn. The following definition permits a secure allocation in all cases: information is always present when all of the following five hierarchical levels are observed in a system: statistics, syntax, semantics, pragmatics and apobetics. If this applies to a system in question, then we can be certain that the system falls within the domain of our definition of information. It therefore follows that the laws of nature about information, to be stated in the next instalment, will apply to this system.
1) Statistics. In considering a book, a computer program or the genome of a human being we can ask the following questions: How many letters, numbers and words does the entire text consist of? How many individual letters of the alphabet (e.g. a, b, c … z for the Roman alphabet, or G, C, A and T for the DNA alphabet) are utilized? What is the frequency of occurrence of certain letters and words? To answer such questions it is irrelevant whether the text contains anything meaningful, is pure nonsense, or just randomly ordered sequences of symbols or words. Such investigations do not concern themselves with the content; they involve purely statistical aspects.
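The purely statistical level can be made concrete with a short Python sketch that counts symbol frequencies in a string over the DNA alphabet. The sequence here is invented for illustration; as the text notes, this level works identically on meaningful text, nonsense, or random symbols:

```python
from collections import Counter

# A made-up sequence over the DNA alphabet {G, C, A, T}; any text would do,
# since the statistical level ignores content entirely.
sequence = "ATGGCATTACGATGA"

counts = Counter(sequence)   # frequency of each individual symbol
total = len(sequence)        # total number of symbols in the "text"

for symbol in sorted(counts):
    print(f"{symbol}: {counts[symbol]} ({counts[symbol] / total:.0%})")
```

Exactly the same counting could be applied to the letters and words of a book or a computer program; nothing in the code asks whether the sequence means anything.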
2) Syntax. If we look at a text in any particular language, we see that only certain combinations of letters form permissible words of that particular language. This is determined by a pre-existing, wilful convention. All other conceivable combinations do not belong to that language’s vocabulary. Syntax encompasses all of the structural characteristics of the way information is represented. This second level involves only the symbol system itself (the code) and the rules by which symbols and chains of symbols are combined (grammar, vocabulary).
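As a loose analogy (my own toy illustration, with an invented vocabulary), a syntax check asks only whether a string uses the agreed symbol system and the agreed word list; it never asks what a word means:

```python
# Toy illustration of the syntax level: permissibility, not meaning.
ALPHABET = set("GCAT")                     # the agreed symbol system (the code)
VOCABULARY = {"ATG", "TAA", "GCA", "TTA"}  # an invented set of permissible "words"

def is_valid_syntax(word: str) -> bool:
    """A word passes the syntax level if it uses only symbols from the
    alphabet and belongs to the conventionally agreed vocabulary."""
    return set(word) <= ALPHABET and word in VOCABULARY

print(is_valid_syntax("ATG"))  # permissible word
print(is_valid_syntax("AXG"))  # 'X' is not in the alphabet
print(is_valid_syntax("GGG"))  # valid symbols, but not in the vocabulary
```

Note that the check would accept a grammatically valid string of nonsense just as readily; meaning only enters at the next level, semantics.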
3) Semantics. Sequences of symbols and syntactic rules form the necessary pre-conditions for the representation of information. But the critical issue concerning information transmission is not the particular code chosen, nor the size, number or form of the letters—nor even the method of transmission. It is, rather, the semantics, i.e. the message it contains—the proposition, the sense, the meaning.
Information itself is never the actual object or act, nor is it a relationship (event or idea); it is encoded symbols which merely represent what is being discussed. Information is always an abstract representation of something quite different. The symbols in today’s newspaper represent an event that happened yesterday and is not at all present where and when the information is transmitted. The genetic words in a DNA molecule represent the specific amino acids that will be used at a later stage for synthesis of protein molecules.
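The DNA example can be sketched in Python using a small excerpt of the standard genetic code (only four codons are shown here; the codon-to-amino-acid assignments themselves are standard biology, read from the DNA coding strand). The three-letter symbols are not the amino acids; they merely represent them:

```python
# A small excerpt of the standard genetic code (DNA coding-strand codons).
# The three-letter "words" are abstract symbols; the amino acids they
# represent are produced only later, during protein synthesis.
CODON_TABLE = {
    "ATG": "Met",  # methionine (also the start signal)
    "TGG": "Trp",  # tryptophan
    "GCA": "Ala",  # alanine
    "AAA": "Lys",  # lysine
}

def translate(dna: str) -> list[str]:
    """Read the sequence three symbols at a time and look up what
    each codon represents."""
    codons = [dna[i:i + 3] for i in range(0, len(dna), 3)]
    return [CODON_TABLE[c] for c in codons]

print(translate("ATGGCAAAA"))  # ['Met', 'Ala', 'Lys']
```

The lookup table makes the abstraction explicit: the symbol "ATG" stands for methionine in the same way that newspaper symbols stand for yesterday’s event.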
4) Pragmatics. Information invites action. In this context it is irrelevant whether the receiver of information acts in the manner desired by the sender of the information, or reacts in the opposite way, or doesn’t do anything at all. Every transmission of information is nevertheless associated with the expectation, from the side of the sender, of generating a particular result or effect on the receiver.
5) Apobetics. We have already recognized that for any given information the sender is pursuing a goal. We have now reached the last and highest level at which information operates: namely, apobetics (the aspect of information concerned with the goal, the result itself). The outcome on the receiver’s side is predicated upon the goal demanded/desired by the sender—that is, the plan or conception. The apobetics aspect of information is the most important of the five levels because it concerns the question of the outcome intended by the sender.
Using the last four of the five levels, we can develop an unambiguous definition of information: an encoded, symbolically represented message conveying expected action and intended purpose. Any entity meeting the requirements of this definition can be termed “universal information.”
Ok, my head is hurting already. Digest that information. In the next instalment we will describe the four most important laws of nature about information. Our ultimate goal is to discredit the grand theory of atheistic evolution, which must attribute the origin of all information ultimately to the interaction of matter and energy, without reference to an intelligent or conscious source. We will show why that is not possible.
Wiener, N., Cybernetics, or Control and Communication in the Animal and the Machine, Hermann et Cie / The Technology Press, Paris, 1948.
Gitt, W., In the Beginning was Information, 3rd English ed., Christliche Literatur-Verbreitung, Bielefeld, Germany, 2001; Gitt, W., Am Anfang war die Information, 3rd revised and expanded edition, Hänssler Verlag, Holzgerlingen, 2002, pp. 47–49.