but don't let that fool you. Instead, if you want to know the truth, read his "paper" on "Variational Information" and ask yourself the following questions:
- What's "new" in this paper? Dembski claims:
> This paper examines the basic account of information, which focuses on events, and reviews how it may be naturally generalized to probability distributions/measures. The resulting information measure is [a] special case of the Rényi information divergence (also known as the Rényi entropy). This information measure, herein dubbed the variational information, meaningfully assigns a numerical bit-value to arbitrary state transitions of physical systems.
Now, evidently Dembski has read Cosma Shalizi's critique of an earlier version of this paper (here). But for the life of me, other than the admission that Shalizi is right (Dembski's "variational information" really is a special case of the more general Rényi information), I can't find anything "revised" in this paper. Much of the mathematics seems cribbed from his references, especially Billingsley (I am very familiar with the first edition), and the lack of explicit references to them, other than footnotes, raises suspicions.
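For readers who want to see what "a special case of the Rényi information divergence" actually means, here is a minimal sketch of the general quantity for finite distributions. The distributions and the order α below are illustrative numbers of my own choosing, not anything taken from Dembski's paper; the α → 1 limit recovering the familiar Kullback-Leibler divergence is the standard fact.

```python
import math

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P||Q) in bits, for finite distributions
    given as lists of probabilities. For alpha != 1:
        D_alpha = (1 / (alpha - 1)) * log2( sum_i p_i^alpha * q_i^(1 - alpha) )
    The limit alpha -> 1 is the Kullback-Leibler divergence."""
    if alpha == 1.0:
        return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    s = sum(pi**alpha * qi**(1 - alpha) for pi, qi in zip(p, q) if pi > 0)
    return math.log2(s) / (alpha - 1)

# Illustrative distributions (not from the paper):
p = [0.5, 0.5]
q = [0.9, 0.1]
print(renyi_divergence(p, q, 2.0))    # order-2 divergence
print(renyi_divergence(p, q, 1.001))  # close to the KL divergence
print(renyi_divergence(p, q, 1.0))    # KL divergence itself
```

Note that each choice of α gives a different divergence; identifying which single α one's pet quantity corresponds to is exactly the kind of thing Shalizi's critique pins down, and it is why "a special case" is not a new information measure.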
- If Dembski seriously believes this "paper" has something new, why doesn't he submit it to a real journal, such as the IEEE Transactions on Information Theory?
BTW, the "Forum" of the International Society for Complexity, Information and Design on this topic is a hoot (see the last post), and it doesn't seem to allow people to reply.
BTW, anybody who wants to think critically about Dembski's paper would do well to read a real paper on information theory.
And google Rényi information, too.