The Use of Information Theory in Biology : a Historical Perspective




Whether we consider the question raised by Giovanni Boniolo regarding a possible “biology without information” or the “definitions of information and meaning” proposed by Marcello Barbieri, we are faced respectively with a philosopher and a specialist of theoretical biology who speculate on the different possible uses of information theory in biology.

What can be the place of historians in such a contemporary scientific discussion? The lack of hindsight makes it from the outset a very difficult task to undertake. Most of those who regard themselves as historians will officially consider that assessing the value of current scientific research definitely does not belong among their goals. Marc Bloch, the father of the famous Annales school, criticized this tendency to judge sixty years ago in his Apologie pour l'histoire ou Métier d'historien (The Historian's Craft): “Unfortunately, by judging one finishes almost fatally by losing even the liking to explain” (Bloch 1949, p. 70). If understanding matters more than judging, Marc Bloch nevertheless accepted the idea of defining history as a science concerned with truth: “History might be proud to have (…) by a precise elaboration of its technique, opened for mankind a new way towards the truth, and consequently the right.” (p. 67)

The purpose of the following pages is thus to comment on the early uses of information theory in biology, in order to place more recent scientific work in historical perspective. A few considerations on the history of the use of this theory in physics may also give hints in this direction.

  1. The early use of information theory in biology


First of all, what is meant by “information theory”? It cannot be reduced to Shannon’s 1948 publications, as Boniolo and Barbieri seem to do (even if they refer to the book version, published with Weaver a year later).1 Shannon himself, the father of the mathematical theory of communication, almost never used the expression “information theory” (see Shannon 1956 on the “Bandwagon” that this theory represents). Information theory must be understood in its historical context, as a woolly theory close to cybernetics, the latter being defined as a general theory of control and communication that applies to living and technical systems as well as to the analysis of society (unfortunately, the word ‘cybernetics’ does not even appear in the papers by Boniolo and Barbieri). Significantly, the first bibliography published in the Institute of Radio Engineers (IRE) Transactions on Information Theory is entitled “A Bibliography on Information Theory (Communication Theory - Cybernetics)” (Stumpers 1955). Cybernetics is thus part of information theory, and for this reason information theory can be applied to biology without referring to Shannon’s work.

The identification of different kinds of feedback loops represents one of the key ideas of cybernetics. It is in this general frame that information is defined as a scientific notion, namely the quantity that circulates in these loops. Biology was present from the very beginning, since Norbert Wiener started working on cybernetics after his discussions with the cardiologist Arturo Rosenblueth, who told him about biological feedback problems (Heims 1991).

Therefore, there are many kinds of applications of information theory in biology. Some stem directly from Shannon’s work, while others are closer to cybernetics. As for the first case, one might think of Quastler’s symposium, organized in 1952. This émigré Viennese radiologist invited scientists from different horizons to discuss and apply Shannon’s model. Among them was Herman Branson, from the Howard University physics department, who calculated the quantity of information (H) contained in a man with the expression “H(food and environment) = H(biological function) + H(maintenance and repair) + H(growth, differentiation, memory)” (Quastler 1953, p. 39). Quastler came with Sidney M. Dancoff to the conclusion that “H(man)” was about 2 × 10^28 bits (p. 167). Fifty years later, such calculations might look rather useless for the development of biology, but they are intertwined with early reflections on the notion of information which are probably worth quoting in view of today’s discussions. Quastler declared for instance:

‘Information Theory’ is a name remarkably apt to be misunderstood. The theory deals, in a quantitative way, with something called ‘information’, which, however, has nothing to do with meaning. On the other hand, the “information” of the theory is related to such diverse activities as arranging, constraining, designing, determining, differentiating, messaging, ordering, organizing, planning, restricting, selecting, specializing, specifying, and systematizing; (…). (p. 41)

Quastler clearly explained that (contrary to what Boniolo claims) specification can also be analyzed by information theory, but that the information referred to in information theory has nothing to do with meaning (a point which had been clearly stated by Shannon and, before him, by R.V.L. Hartley, another Bell Labs engineer, as early as 1927).

In 1952, at the time of this symposium, the structure of DNA was still unknown. Quastler and Dancoff were nevertheless working under the assumption that “somewhere in the chromosomes [there] is a system of determiners, called the genome” and that “it contains a substantial fraction of all information about an adult organism” (p. 269). The amazing enthusiasm which accompanied the rapid development of information theory explains how a strong determinism arose in modern genetics. A kind of “fetishization” of genetic information occurred, which was later reinforced by the identification of a “genetic code”.

The “RNA Tie Club”, the informal group of scientists who worked from 1954 onwards with George Gamow to “solve the riddle of RNA structure, and to understand the way it builds proteins”, basically applied information theory, and more precisely the mathematical theory of communication. As Lily Kay has brilliantly shown, even though they did not succeed on the scientific level, this work enabled the constitution of a scientific community of people sharing the same vocabulary (Kay 2000). Information theory was at the origin of what Galison called a “trading zone”, an area where a new language is developed to permit commerce between cultures that lack a common language.

The cybernetic part of information theory has also been used in biology, for instance by Jacques Monod in the operon model, which included positive and negative feedback regulation (to represent enzyme induction and coordinate repression). A whole chapter of Chance and Necessity (Monod 1970) is even entitled “Microscopic Cybernetics”.

Nowadays, many different parts of information theory are used in biology: algorithmic complexity, quantum information, and informatics, among others. In order to gain a better understanding of this situation, a short excursion into the realm of physics might be useful.

  2. Considering the role of information theory in physics


During the 1920s, about twenty years before what we know today as “information theory”, information was defined independently as a scientific notion in three different domains: statistics (with the work of Fisher), communication engineering (mostly at the Bell Labs) and physics (in the German-speaking part of Europe). The Polish physicist Marian von Smoluchowski (1872-1917) had been the first, in 1912, to tackle this subject, almost indirectly. Connecting the problem of Maxwell’s Demon with that of Brownian motion, he wrote that in order to violate the second principle of thermodynamics, the Demon had to be “taught” (in German “unterrichtet”) about the speed of the molecules. Szilard referred to this result seventeen years later in a paper he published in German, and it is only in the 1964 English translation that we read that the Demon must be “informed”. Meanwhile, other physicists clearly recognised the importance of information. As early as 1930, Gilbert Lewis wrote: “Gain in entropy always means loss of information, and nothing more” (Lewis 1930, p. 573).

The relation between entropy and information started being of the utmost importance after Shannon decided to call “entropy” the quantity of information defined in the discrete case by H = − Σ pᵢ log pᵢ, where the pᵢ stand for the different selection probabilities. It seems that it was the mathematician John von Neumann who advised him to do so, aware of the possible ambiguity.2 Since the physical quantity, entropy, is wrongly connected to disorder (an eminently subjective notion), this formal analogy has been tremendously misleading, among other things in the development of the so-called ‘second cybernetics’ based on the idea that noise could create order.3 Norbert Wiener also played a decisive role in the development of information theory in physics when he wrote in Cybernetics that “Information is information, not matter or energy”. From the early 1950s, information started being used as a physical notion.
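To make the formula concrete in modern terms, here is a minimal sketch in Python (our own illustration, not part of the historical material; the function name and the example distributions are arbitrary choices) computing Shannon’s H for a discrete distribution:

```python
import math

def shannon_entropy(probabilities, base=2):
    """Shannon's H = -sum(p_i * log p_i) for a discrete distribution.

    The probabilities must be non-negative and sum to 1; terms with
    p_i = 0 contribute nothing, by the usual convention 0 log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# A fair coin carries 1 bit per selection, a biased coin less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # about 0.47
```

Measured in base 2, the quantity is expressed in bits, the unit used in the estimates quoted above.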

In the introduction to the proceedings of the 1952 symposium on the use of information theory in biology, Quastler recognized information as a new quantity. He started with these words:

One of the basic tools in natural science is the energy concept. In recent years, another concept has begun to attain comparable dignity. It is something more subtle and elusive than energy; it is derived from a desire for dealing methodically with problems of complexity, order, organization, specificity…

The French physicist Léon Brillouin decided at that time to reinterpret whole chapters of physics taking information as the central notion. He published his first paper on this topic in 1949 and, seven years later, a book entitled Science and Information Theory (Brillouin 1956).

Nowadays, the scientific notion of information still has an important heuristic power. Let us consider two examples. Firstly, G. Cohen-Tannoudji, a professor of theoretical particle physics, recently proposed to introduce a fifth fundamental constant after G (gravity), c (the speed of light), h (Planck’s quantum of action) and k (Boltzmann’s constant), namely b for a Brillouin constant, which should enable a new approach to complexity in physics (Cohen-Tannoudji 1991, 1993). Secondly, a professor in optical sciences, Roy Frieden, published a book in 1998 entitled Physics from Fisher Information to show that it is the scientific definition of information introduced in statistics in the 1920s by Ronald Fisher that could help derive physics. Regarding the connection to entropy, he wrote in a newsgroup:

Fisher information is not entropy. Entropy fails to be a unifier of physics principally because extremizing it does not lead to a differential equation (Schroedinger's, Dirac's, etc.). Also it's just the wrong information measure to use in physics overall (exception being statistical mechanics of course; but Fisher's applies there as well; see above article & book). The principle operation in physics is measurement of something. The information that measures the quality of a measurement of a parameter is Fisher's, not Boltzmann's. Boltzmann's was used by Shannon and Jaynes to count messages or states. The degeneracy of a message or a state does not measure how well a parameter can be measured or estimated. Fisher's does. This is the essential reason for it to work as a mechanism for deriving physics. In other words, physics arises out of measurement. That is, in fact, the thrust of the book. (message sent by Roy Frieden under sci.physics.research, 1999/02/26)
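For readers who want a concrete handle on the quantity Frieden invokes, here is a minimal numerical sketch (our own illustration, not Frieden’s; the function name, parameters and sample size are arbitrary choices) of the textbook Fisher information I(θ) = E[(∂ log f(x;θ)/∂θ)²], estimated for the location parameter of a Gaussian, where the exact value is 1/σ²:

```python
import numpy as np

def fisher_information_location(sigma, n_samples=200_000, eps=1e-3, seed=0):
    """Monte-Carlo estimate of the Fisher information I(theta) for the
    location parameter of a N(theta, sigma^2) density, using a central
    finite difference of the log-density; the exact value is 1/sigma^2."""
    rng = np.random.default_rng(seed)
    theta = 0.0
    x = rng.normal(theta, sigma, n_samples)

    def log_density(t):
        return -0.5 * ((x - t) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

    # Score function d/dtheta log f(x; theta), approximated numerically.
    score = (log_density(theta + eps) - log_density(theta - eps)) / (2 * eps)
    return np.mean(score ** 2)

sigma = 2.0
print(fisher_information_location(sigma))   # close to 1/sigma**2 = 0.25
```

The Cramér-Rao inequality then bounds the variance of any unbiased estimator of θ from below by 1/I(θ), which is the sense in which Fisher information “measures the quality of a measurement of a parameter”.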

Contrary to what Boniolo seems to claim, debates are still raging regarding the use of information theory, for instance in physics. In many other disciplines, information theory plays different kinds of roles: scientific, epistemological or sociological (see Segal 2003).


  3. Perspectives


Information theory today constitutes a tremendous opportunity to develop interdisciplinarity. We cannot assess now the scientific value of ongoing work, but one thing is sure: information theory offers an interdisciplinary language. A communication engineer like G. Battail, for instance, published in Europhysics Letters on the question “Does information theory explain biological evolution?”, using the theory of error-correcting codes (Battail 1997).

Almost all the main developments of information theory are used in biology, even the quite abstract theory of algorithmic complexity (developed by Kolmogorov, Solomonoff, Martin-Löf and Chaitin). The emerging field of biocomputing also uses a scientific definition of information, and this has been clearly stated by Boniolo in his paper.4 So maybe the question is not whether or not a “biology without information” would be suitable, but rather which part of information theory will have the most important effect in the life sciences.


Bibliography


G. Battail, “Does information theory explain biological evolution?”, Europhysics Letters, 40 (3), Nov. 1997, pp. 343-348.

M. Bloch, Apologie pour l'histoire ou Métier d'historien, Cahier des Annales, 3, 1949, Armand Colin, Paris (English : The Historian's Craft – Reflections on the Nature and Uses of History and the Techniques and Methods of Those Who Write It, Manchester: Univ. Press, 1954)

L. Brillouin, Science and Information Theory, New York: Academic Press, 1956

G. Cohen-Tannoudji, Universal Constants in Physics, New York: McGraw-Hill, 1993 (original in French: Les constantes universelles, Paris: Hachette, 1991)

B. Roy Frieden, Physics from Fisher Information – A Unification, Cambridge: Cambridge University Press, 1998

S.J. Heims, The Cybernetics Group, Cambridge, Mass.: MIT Press, 1991

L. Kay, Who Wrote the Book of Life? A History of the Genetic Code, Stanford: Stanford University Press, 2000

G.N. Lewis, “The Symmetry of Time in Physics”, Science, 71, 1930, pp. 569-576

J. Monod, Le hasard et la nécessité, Paris: Seuil, 1970 (Chance and Necessity, New York: Knopf, 1971)

H. Quastler (ed.), Essays on the Use of Information Theory in Biology, Univ. of Illinois Press, Urbana, 1953

J. Segal, Le Zéro et le Un - Histoire de la notion scientifique d'information, Paris : Syllepse, 2003

C.E. Shannon, “A Mathematical Theory of Communication”, Bell System Technical Journal, 27, 1948, pp. 379-423 and 623-656

C.E. Shannon, “The Bandwagon”, IRE Transactions on Information Theory, 2, March 1956, p. 3

C.E. Shannon and W. Weaver, The Mathematical Theory of Communication, Urbana, Illinois: University of Illinois Press, 1949

M. Smoluchowski, “Experimentell nachweisbare, der üblichen Thermodynamik widersprechende Molekularphänomene”, Physikalische Zeitschrift, 13, 1912, pp.1069-1079 and discussion p. 1080

F.L. Stumpers, “A Bibliography on Information Theory (Communication Theory - Cybernetics)”, I.R.E. Transactions on Information Theory, 1, Sept. 1955, pp.31-47

L. Szilard, “Über die Entropieverminderung in einem thermodynamischen System bei Eingriffen intelligenter Wesen”, Zeitschrift für Physik, 53, 1929, pp. 840-856

N. Wiener, Cybernetics, Paris: Hermann, 1948



1 It is to Weaver’s credit that he clearly showed, in his introductory paper to The Mathematical Theory of Communication (Shannon & Weaver 1949), the limits of Shannon’s formalism. He distinguished three levels with different questions: Level A, “How accurately can the symbols of communication be transmitted?” (the technical problem); Level B, “How precisely do the transmitted symbols convey the desired meaning?” (the semantic problem); and Level C, “How effectively does the received meaning affect conduct in the desired way?” (the effectiveness problem). Meaning is clearly excluded from the first level, and the history of information theory may be seen as a series of attempts to introduce meaning into Shannon’s theory.

2 On this point, see the different sources in Segal 2003, pp. 124-125. In his Mathematical Foundations of Quantum Mechanics, in 1932, von Neumann had already used an expression similar to the one that defines entropy in Boltzmann’s and Planck’s writings.

3 This is why we do not agree with Barbieri when he says that “physical information is an inverse function of physical disorder”.

4 The proceedings of the latest Pacific Symposium on Biocomputing, held January 6-10, 2004 in Hawaii, show many uses of information theory. See for example the paper on “Negative Information for Motif Discovery” (all the papers are available at http://www-smi.stanford.edu/projects/helix/psb04/). See also the symposium on “Thermodynamics and Information Theory in Biology” held in 1998 (papers available at http://www-lecb.ncifcrf.gov/~toms/aaas1998/).



