
Shannon's formula implicitly states that if the probability of an event is zero, it carries infinite information entropy and is highly informative, but if the event is certain to occur, with probability 1, it carries no information entropy.
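To make the Shannon side of the comparison concrete, here is a small Python sketch of the standard self-information (surprisal) -log2(p); the function name and the choice of bits are mine, not from the post:

```python
import math

def surprisal(p):
    """Shannon self-information of an event with probability p, in bits."""
    if p == 1:
        return 0.0        # a certain event carries no information
    if p == 0:
        return math.inf   # an impossible event carries infinite information
    return -math.log2(p)

print(surprisal(1.0))  # certain event
print(surprisal(0.5))  # fair coin flip: 1 bit
print(surprisal(0.0))  # impossible event
```

This is the behavior the paragraph above describes: the surprisal diverges as p approaches 0 and vanishes at p = 1.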

This new law asserts that if the probability is zero or one, i.e., the event is completely impossible or absolutely certain to occur, it is highly informative and the information entropy is infinite. If the probability is 0.5, i.e., the occurrence is completely random, then the information is at a minimum. This latter formula makes more intuitive sense to me, but that is my personal opinion. We will have to let the population of information theorists decide.

...I believe the entropy of Shannon to be a contrived definition.

We can call it the State-Space Method for Information Theory Derived from Statistical Mechanics, or the Set-Theoretic approach, where

R = Ta*k*ln(Na) + Tb*k*ln(Nb) = k*ln(N^(Ta+Tb) * p^Ta * (1-p)^Tb)

or, when Ta = Tb = 1,

R = -k*ln(p*(1-p)).

It has a link to Control Theory as well: they will have to minimize p to get the certainty they need and maximize R simultaneously.
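As a numerical sanity check on the first identity, here is a short Python sketch. It assumes (my reading, not stated explicitly in the post) that the state counts split as Na = N*p and Nb = N*(1-p), in which case the sum of logs and the combined-log form are algebraically equal:

```python
import math

def R_sum(Ta, Tb, N, p, k=1.0):
    """Left-hand form: R = Ta*k*ln(Na) + Tb*k*ln(Nb), with Na = N*p, Nb = N*(1-p)."""
    Na, Nb = N * p, N * (1 - p)
    return Ta * k * math.log(Na) + Tb * k * math.log(Nb)

def R_combined(Ta, Tb, N, p, k=1.0):
    """Right-hand form: R = k*ln(N^(Ta+Tb) * p^Ta * (1-p)^Tb)."""
    return k * math.log(N ** (Ta + Tb) * p ** Ta * (1 - p) ** Tb)

# Spot-check a few parameter choices (Ta, Tb, N, p).
for Ta, Tb, N, p in [(1, 1, 10, 0.3), (2, 3, 50, 0.7), (1, 1, 1, 0.5)]:
    a, b = R_sum(Ta, Tb, N, p), R_combined(Ta, Tb, N, p)
    assert math.isclose(a, b), (a, b)
print("both forms of R agree on the sampled parameters")
```

Note this only checks the sum-of-logs identity under the Na, Nb assumption above; the reduction to R = -k*ln(p*(1-p)) at Ta = Tb = 1 additionally depends on how N is chosen.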