Entropy of a statistical distribution and information temperature
MaeveChondrally (OP)
« on: 18/12/2020 22:10:58 »
s = c*N*k*T * integral from -inf to +inf of ln( p(x)*(1 - p(x)) ) dx
Here c is a constant based on the normal distribution, N is the number of moles (or the number of data points in a numerical distribution), k is Boltzmann's constant, and T is the temperature in kelvin. This formula is derivable from statistical mechanics; it came from examining equations in Statistical Mechanics, 3rd Edition, by Raj Pathria.
If you just want to assess the entropy of the normal distribution, set N = 1 mol = 6.022x10^23 particles, k = Boltzmann's constant, and T = 298.15 K,
with p(x) = 1/sqrt(2*pi) * exp(-x^2/2), where x = tan(theta) and dx = sec^2(theta) d(theta).
The integral can then be discretized with step size h = (pi/2)/1025 and evaluated numerically with Simpson's method (5th order or 2nd order, as you wish), excluding the endpoints at +/- pi/2, where tan(theta) is undefined.
Under the substitution, the transformed density is
p(theta) = 1/sqrt(2*pi) * exp(-tan(theta)^2/2) * sec(theta)^2.
Evaluate the integrand at every grid point from -pi/2 to pi/2 except the two endpoints and sum it all up with Simpson's method to obtain s; a code sketch of this recipe follows.
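To make the recipe concrete, here is a minimal Python sketch of the integration as described above, assuming p is the standard normal density; the names (entropy_s and so on) and the placeholder value c = 1 are illustrative, not from the derivation. ln(p*(1-p)) is expanded as ln p + ln(1-p) so that the far tails do not underflow to log(0).

import numpy as np

kB = 1.380649e-23   # Boltzmann's constant, J/K
N = 6.022e23        # one mole of particles (Avogadro's number)
T = 298.15          # temperature, K
c = 1.0             # placeholder for the normal-distribution constant c

# Interior grid: h = (pi/2)/1025 gives 2049 points strictly between
# -pi/2 and pi/2 (endpoints excluded), i.e. 2048 Simpson intervals.
h = (np.pi / 2) / 1025
theta = np.linspace(-np.pi / 2 + h, np.pi / 2 - h, 2049)
x = np.tan(theta)

# ln p(x) for the standard normal, kept in log form to avoid underflow;
# the 1/cos^2(theta) factor is dx/dtheta = sec^2(theta).
log_p = -x**2 / 2 - 0.5 * np.log(2 * np.pi)
p = np.exp(log_p)
integrand = (log_p + np.log1p(-p)) / np.cos(theta)**2

# Composite Simpson weights: 1, 4, 2, 4, ..., 2, 4, 1.
w = np.ones_like(theta)
w[1:-1:2] = 4.0
w[2:-1:2] = 2.0
integral = (h / 3) * np.sum(w * integrand)

entropy_s = c * N * kB * T * integral
print(entropy_s)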
Numerical distributions can be assessed with this method as well, and so can any other distribution.
If the entropy value of a data set is within 1% of the entropy of the normal distribution computed above, then at the 1% level the data set is normally distributed. Differences in mean and standard deviation should be filtered out of numerical data sets by normalizing the data before testing its entropy to see which distribution it most resembles; a sketch of this comparison is given below.
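One way to realize the normalize-then-compare test, assuming (my assumption, not from the post) that a Gaussian kernel density estimate stands in for the numerical data set's density; scipy.stats.gaussian_kde and the synthetic data are illustrative.

import numpy as np
from scipy.stats import gaussian_kde

def entropy_integral(pdf, n=1025):
    # Integral of ln(p(x)*(1 - p(x))) dx via the tan substitution and
    # composite Simpson's rule, endpoints at +/- pi/2 excluded.
    h = (np.pi / 2) / n
    theta = np.linspace(-np.pi / 2 + h, np.pi / 2 - h, 2 * n - 1)
    x = np.tan(theta)
    p = np.clip(pdf(x), 1e-300, 1 - 1e-16)  # guard against log(0) underflow
    f = (np.log(p) + np.log1p(-p)) / np.cos(theta)**2
    w = np.ones_like(theta)
    w[1:-1:2] = 4.0
    w[2:-1:2] = 2.0
    return (h / 3) * np.sum(w * f)

def normal_pdf(x):
    return np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)

rng = np.random.default_rng(0)
data = rng.normal(5.0, 2.0, size=10_000)   # synthetic example data
z = (data - data.mean()) / data.std()      # filter out mean and std dev

s_ref = entropy_integral(normal_pdf)         # normal reference
s_data = entropy_integral(gaussian_kde(z))   # density estimate of the data

# Within 1% of the reference counts as "normal at the 1% level" here.
rel = abs(s_data - s_ref) / abs(s_ref)
print(f"relative difference: {rel:.4%} ->", "normal" if rel < 0.01 else "not normal")

Because both entropies pass through the same clipped integrator, the underflow guard affects reference and data identically, which keeps the 1% comparison consistent.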
best regards,
Maeve Chondrally
aka Chondrally
« Last Edit: 27/12/2020 04:53:19 by MaeveChondrally »