
October 29, 2007

Mathematical information theory and cybernetics

The reason for the limited activity on this diary at the moment is that I am working (hard) on my dissertation. I will however try to use this as a notebook of ideas, as I see myself drowning in the references, books and articles that I'm reading, and I'm spending more and more time trying to find what I read, where I read it, and where that book is now...

There's an interesting relationship between the mathematical theory of communication, cybernetics and semiology on the one hand, and my field of inquiry on the other. The question to me is whether the nature of the message is of relevance to the output, or to the stimuli caused by this message. An example would be: can a binary signal cause a relevant change (relevant as measured from the outputting system itself) in a continuous or analog system? Gregory Bateson [Bateson, 1972] would argue that since there is no such thing as a 'territory' without a 'map', or rather that every perception of the territory is a map, the signal transmitted will already be a transformation, a map in itself, causing change in another map. This would suggest that the change is a matter of the "mapping" of the data rather than of its transmission. From the point of view of the information theorist, the main issue is whether the signal-to-noise ratio is acceptable.
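As a point of reference (this is standard information theory, not something Bateson addresses): the information theorist's question can be made precise through the Shannon-Hartley theorem, which states that an analog channel of bandwidth B (in Hz) with signal-to-noise power ratio S/N can carry at most C = B log2(1 + S/N) bits per second. Below that rate a binary message can in principle pass through the continuous channel without error; whether the change it causes is relevant, in the sense above, is of course another matter.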

Obviously the problem is infinitely more complex than how it is stated above, especially when the semiologist is allowed to enter the discussion.

Posted by henrikfr at October 29, 2007 02:49 PM

Comments

Interesting stuff! Have you looked at Shannon Entropy at all? If I understand your post correctly, I think it would be quite relevant to the discussion. Good luck with your piles of papers and books - I know how you feel! Jamie

Posted by: Jamie Bullock [TypeKey Profile Page] at December 4, 2007 03:29 PM

I have looked at information entropy, but not for long. Hmmm, it is relevant to the discussion, though I'm not quite sure how right now. Shannon entropy relies on logic: a regular sequence (010101...) can only be said to have an entropy of 0 if we accept that 0 and 1 are two distinct symbols, hence it relies on our pre-understanding of the symbol, and with Bateson's strong emphasis on logic, maybe this is a point of entry. Another aspect that I just came to think of is the notion of the bit in entropy. We can reduce all digital information to its smallest common unit: the binary distinction between 0 and 1, which with a random distribution has an entropy of 1 bit. But as we increase the word size we also increase the entropy of the system. In other words, we also increase the information potential of the system in a way that cannot easily be abstracted back to smaller word sizes again. Hmmm... need to think about this some more I think. Thanks for the tip!
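A minimal sketch of the word-size point, in Python (the helper shannon_entropy is just my own illustrative name, and the distributions are toy examples):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: sum of -p * log2(p) over the nonzero probabilities."""
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# A fair binary distinction (0 or 1, equally likely) carries 1 bit.
print(shannon_entropy([0.5, 0.5]))       # 1.0

# A uniformly distributed 8-bit word has 256 equally likely symbols,
# so its entropy is 8 bits: the maximum grows with the word size.
print(shannon_entropy([1 / 256] * 256))  # 8.0

# A completely predictable source (only one possible symbol) has
# entropy 0, which is the sense in which 010101... carries nothing new.
print(shannon_entropy([1.0]))            # 0.0
```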

Posted by: henrikfr [TypeKey Profile Page] at December 17, 2007 01:01 PM
