Saturday 7 November 2015

Sylvain Magne: information vs codes

This article is a brief commentary on Sylvain Magne's recent article "Objections to Daniel Dennett's informational meme".

Like Dennett, I endorse informational memes. However, Dennett has his own conception of 'information' which differs from that of Shannon. Dennett claims - rather curiously - that there's often no identifiable Shannon channel associated with much hereditary transmission. By contrast, I simply defer to Shannon/Weaver information theory when it comes to the definition of "information". If I want an observer-neutral concept, I simply specify a reference observer.

Sylvain's article rejects the memes-as-information position and proposes replacing it with memes-as-codes.

I didn't get on with Sylvain's article at all. I have a broader conception of "information" than the one in the paper, and a narrower conception of what constitutes a "code". The term "code" doesn't really have a strict technical definition - but for me a "code" is a collection of symbols - where a "symbol" is something that stands for something else. With the term "code" there's also a pretty strong implication of a symbol-to-symbol mapping. I don't like Sylvain's idea that 'code' is a synonym for 'pattern'. That conflicts too much with common usage. My objection to memes-as-codes is the same as the one Sylvain attributes to a Dennett email in his paper: not all culturally-transmitted phenomena consist of symbols. For example, a wheel is not composed of symbols.
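To make the symbol-to-symbol idea concrete, here is a toy sketch (my own illustration, not anything from Sylvain's paper): a "code" in the sense above is a mapping from source symbols to code symbols - a fragment of Morse code serves as a familiar example.

```python
# A "code" as a symbol-to-symbol mapping: each source symbol
# (a letter) stands for a code symbol (a dot-dash sequence).
# Toy fragment of Morse code, for illustration only.
MORSE = {"A": ".-", "B": "-...", "C": "-.-."}

def encode(text):
    """Map each source symbol to its corresponding code symbol."""
    return " ".join(MORSE[ch] for ch in text)

print(encode("CAB"))  # -.-. .- -...
```

A wheel, by contrast, has no such mapping: there is nothing in it that stands for something else - which is the point of the objection above.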

As for my broader conception of "information" - in my defense, my use of the term is the standard one from information theory, as used by scientists and engineers everywhere. This is absolutely applicable to heredity - no matter what Daniel Dennett and many other philosophers of biology seem to think.
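For readers unfamiliar with the standard definition, here is a minimal sketch (my own illustration): in Shannon's framework, an event with probability p carries -log2(p) bits of information, and the entropy of a source is the average of that quantity over its distribution.

```python
import math

def surprisal_bits(p):
    """Shannon information content of an event with probability p, in bits."""
    return -math.log2(p)

def entropy_bits(dist):
    """Shannon entropy of a discrete distribution, in bits."""
    return sum(-p * math.log2(p) for p in dist.values() if p > 0)

# A fair coin flip carries exactly one bit.
print(surprisal_bits(0.5))  # 1.0

# A biased source is less surprising on average than a uniform one.
print(entropy_bits({"heads": 0.5, "tails": 0.5}))  # 1.0
print(entropy_bits({"heads": 0.9, "tails": 0.1}))  # ~0.469
```

Nothing here depends on the substrate carrying the signal - which is why the same definition applies to DNA, speech, or imitation.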

I point out that academic students of cultural evolution also frequently define their cultural variants in terms of "information". Boyd and Richerson have been doing this pretty consistently since 1985, for example. In fact they even define 'culture' in terms of information. Here's what they said in 1985:

Culture is information capable of affecting individuals' phenotypes which they acquire from other conspecifics by teaching or imitation

These days, we would not confine culture to conspecifics or cultural inheritance to teaching and imitation - but that's a rather different topic.

The information-theoretic perspective on heredity has deep roots in evolutionary biology. George Williams pioneered the idea in 1966. He wrote:

In evolutionary theory, a gene could be defined as any hereditary information for which there is a favorable or unfavorable selection bias equal to several or many times the rate of endogenous change

- Williams 1966, page 25 - and later wrote:

A gene is not a DNA molecule; it is the transcribable information coded by the molecule

- Williams 1992, page 11.

Informational memetics thus has strong roots in evolutionary biology and firm foundations in information theory - I claim.

Sylvain mostly critiques Daniel Dennett's conception of "information" - which he says is not clearly specified. That may be so - but I don't think that can be said of my conception of "information". I'm just using the bog-standard scientific and engineering meaning of the term - from Shannon and Weaver.

IMO, if there's a problem with memes-as-information, it's that the concept is very broad. Saying memes are "cultural" is a lot narrower and constrains expectations much more. The idea of informational memes is a rather trivial one - which I mostly use to contrast with Aunger's position that memes are brain structures. That conception is too narrow to do the work required of it in a theory of cultural evolution.

Appendix

Here is Dennett distinguishing his concept of information from Shannon information (44 minutes in) - as I alluded to earlier in this post.

I'm not talking about bits when I'm talking about information; I'm talking about information in a more fundamental sense. Shannon information, measured in bits, is a recent and very important refinement of one concept of information, but it's not the concept I'm talking about. I'm talking about the concept of information where, when one chimpanzee learns how to crack nuts by watching his mother crack nuts, there's information passed from mother to offspring - and that is not in bits. That is an informational transfer, but it is not accomplished in any Shannon channel that is worth talking about.
This is a pretty embarrassing quotation from Dennett, IMO. Information which is not measured in bits - pah!

1 comment:

  1. Thanks a lot Tim for taking the time to read the document and for your valuable comments.
    Definitions will always be a source of disagreement, won't they?
    With regard to information, I always feel there is an unfortunate mismatch between what information is to information theory and what it is to most people.
    I have come to think that information theory is ill-named. Indeed information theory is not really about information but about codes and their structure. It is about codes' frequencies, fidelity and complexity. When people say that information is substrate neutral it really just means that a code can be transcribed.
    I really feel that information theory should be called code theory or even pattern theory. That would make a lot more sense in my view. If that were the case, there would be less confusion, I believe.
    Then Williams would have said instead:
    "a gene could be defined as any hereditary pattern or code for which there is a favorable or unfavorable selection bias equal to several or many times the rate of endogenous change"
    And everything would be simpler in my view.
