The Naked Scientists

finite average level of complexity??

  • 10 Replies
  • 7180 Views

Offline BenVitale (OP)

« on: 01/11/2007 16:07:58 »
Complexity can only be loosely defined. Anyway, the world is somewhat complex, but not so much; it could have been more intricate. We can make some fractal interpretations of nature (e.g. Fractals and Chaos in Geology and Geophysics, Turcotte, Cambridge University Press, Cambridge, 1997) and understand the complexity of some shapes from basic laws (e.g. basic erosion and tectonics yield the fractal behaviour of numerous landscapes).

Question: is there an approach for estimating the average global complexity?



another_someone

« Reply #1 on: 01/11/2007 16:55:19 »
Quote from: BenVitale on 01/11/2007 16:07:58
Question: is there an approach for estimating the average global complexity?

Is complexity the same thing as information entropy?

Offline BenVitale (OP)

« Reply #2 on: 01/11/2007 19:38:35 »
another_someone, could you expand on this?
Doesn't it depend on the whim of whoever is doing the calculation?
It is in no way obvious what is even meant by complexity in the absolute sense needed to calculate a total complexity level, right?
I watched the NOVA program on PBS (Public Broadcasting Service), which featured "Emergence of Complexity".
Did you watch it?

Offline JimBob

« Reply #3 on: 02/11/2007 01:20:12 »
I do not know if NOVA is broadcast in the UK. And perhaps this subject needs a different place to live?
The mind is like a parachute. It works best when open.  -- A. Einstein
 

another_someone

« Reply #4 on: 02/11/2007 01:59:16 »
Quote from: BenVitale on 01/11/2007 19:38:35
another_someone, could you expand on this?
Doesn't it depend on the whim of whoever is doing the calculation?
It is in no way obvious what is even meant by complexity in the absolute sense needed to calculate a total complexity level, right?
I watched the NOVA program on PBS (Public Broadcasting Service), which featured "Emergence of Complexity".
Did you watch it?

My understanding of information entropy is that it is a measure of the number of independent pieces of information within a system (i.e. the number of measurements within a system that cannot be predicted from other measurements taken upon that system).



Offline BenVitale (OP)

« Reply #5 on: 04/11/2007 00:05:31 »
Let's consider an image of a natural scene; we have a set of individuals (pixels) that may differ from one another (different colors).
The entropy E of the system is E = Σ(−p · log p), where the sum is computed over the colors and p is the probability of occurrence of a given color.
The issue here is that E simply does not depend on the location of the pixels within the image, and thus does not depend on the "shapes" or "objects" that one can perceive in the image (trees, etc.): E depends only on the histogram of the pixels, not on the geometry of the image...

Can anyone suggest a way to estimate the complexity of images of the world?
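A minimal sketch of the histogram entropy just described (in Python, with a toy flat list of color values standing in for an image): shuffling the pixels destroys every shape in the image, yet leaves E exactly unchanged.

```python
import math
import random
from collections import Counter

def histogram_entropy(pixels):
    """Shannon entropy E = sum over colors of -p * log2(p)."""
    counts = Counter(pixels)
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A tiny "image" as a flat list of color values.
image = [0, 0, 0, 0, 1, 1, 2, 3]

shuffled = list(image)
random.shuffle(shuffled)

# E depends only on the histogram, not on pixel positions, so the
# shuffled image has exactly the same entropy.
print(histogram_entropy(image))     # 1.75
print(histogram_entropy(shuffled))  # 1.75
```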

lyner

« Reply #6 on: 04/11/2007 00:07:31 »
Sorry - the question is too complex.

another_someone

« Reply #7 on: 04/11/2007 00:51:13 »
Quote from: BenVitale on 04/11/2007 00:05:31
Let's consider an image of a natural scene; we have a set of individuals (pixels) that may differ from one another (different colors).
The entropy E of the system is E = Σ(−p · log p), where the sum is computed over the colors and p is the probability of occurrence of a given color.
The issue here is that E simply does not depend on the location of the pixels within the image, and thus does not depend on the "shapes" or "objects" that one can perceive in the image (trees, etc.): E depends only on the histogram of the pixels, not on the geometry of the image...

Can anyone suggest a way to estimate the complexity of images of the world?

I am guessing that you are misinterpreting the equation:

E = Σ(−p_i · log p_i)

We are not interested specifically in the probability of a colour occurring, but in the probability of any particular attribute (for instance, the probability that two adjacent pixels have the same colour).  Clearly, if we look at the probability of a pixel having the same colour as an adjacent pixel, and we have large areas where pixels are of the same colour, then the entropy of the system is lower than it would be if the colours of adjacent pixels were independent of one another.
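A small sketch of that idea, using two made-up rows of pixels with identical histograms (one blocky, one scrambled): single-pixel entropy cannot tell them apart, but the entropy of adjacent-pixel pairs can.

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy of any sequence of hashable samples."""
    counts = Counter(samples)
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Two rows of pixels with the same histogram (eight 0s, eight 1s):
# one with large uniform runs, one scrambled.
blocky    = [0] * 8 + [1] * 8
scrambled = [0, 0, 1, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]

# Single-pixel entropy ignores position, so it cannot tell them apart.
assert entropy(blocky) == entropy(scrambled)

# The entropy of adjacent-pixel pairs can: the blocky row is dominated
# by repeated (0,0) and (1,1) pairs, so its pair entropy is lower.
print(entropy(list(zip(blocky, blocky[1:]))))        # ~1.29
print(entropy(list(zip(scrambled, scrambled[1:]))))  # ~1.91
```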

Another way of looking at this is to think of an ideal lossless compression algorithm.  If one passes the image through such an ideal compression algorithm, then the smaller the compressed image, the lower the entropy of the image.
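A rough sketch of the compression view, using zlib as an everyday stand-in for the ideal lossless compressor: a uniform block of pixels compresses to almost nothing, while a block of random noise barely compresses at all.

```python
import random
import zlib

random.seed(0)

# Two 64x64 single-channel "images" as raw byte strings.
flat_image = bytes([128] * 4096)  # one uniform colour everywhere
noisy_image = bytes(random.randrange(256) for _ in range(4096))  # noise

# The smaller the compressed output, the lower the estimated entropy.
flat_size = len(zlib.compress(flat_image, 9))
noisy_size = len(zlib.compress(noisy_image, 9))

print(flat_size, noisy_size)  # the flat image compresses vastly better
```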


Offline Atomic-S

« Reply #8 on: 05/11/2007 04:20:27 »
Would p_i in the said equation be not simply the total number of pixels of that colour divided by the total number of pixels, but rather the probability that THAT PARTICULAR PIXEL would be such and such a colour, given the circumstances surrounding the creation of the picture (I guess this would strictly be the sum of the probabilities that that pixel would be any of all possible colours)?



Offline Atomic-S

« Reply #9 on: 05/11/2007 04:22:05 »
Another way to say this is that, in a signal consisting of a sequence of symbols, the information value of each symbol is the logarithm of its improbability, −log p (at that particular position, I assume), and the entropy of the message is the sum of all those information values.
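A short sketch of that reading, assuming for simplicity that each symbol's probability is just its frequency in the message: the per-symbol information values −log2(p) sum to the total information of the message, and dividing by the message length recovers the per-symbol histogram entropy.

```python
import math
from collections import Counter

message = "abracadabra"
counts = Counter(message)
n = len(message)

# Information value of each distinct symbol: the logarithm of its
# improbability, -log2(p).
info = {sym: -math.log2(c / n) for sym, c in counts.items()}

# Total information of the message: the sum of the information values
# of all the symbols it contains.
total_bits = sum(info[sym] for sym in message)
print(total_bits)  # ~22.44 bits

# The average per symbol equals the histogram entropy E = sum(-p*log2 p).
print(total_bits / n)  # ~2.04 bits per symbol
```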

another_someone

« Reply #10 on: 05/11/2007 11:55:06 »
Quote from: Atomic-S on 05/11/2007 04:22:05
Another way to say this is that, in a signal consisting of a sequence of symbols, the information value of each symbol is the logarithm of its improbability, −log p (at that particular position, I assume), and the entropy of the message is the sum of all those information values.

I would not disagree with this, but the key point is that the information value of each symbol is not merely the value of that symbol in isolation, but the value of the symbol within the context of all the other symbols, and the probability of the symbol within that context is as important as the probability of the symbol just taken in isolation.

A simple example of this is if one looks at two words:

'q?een' and 't?n'

In the first word, adding the letter 'u' contributes no information at all (except to confirm that the word is a valid English word), whereas in the second word you have, within the English language, a choice of 'e', 'a', 'i', 'u', or 'o', so that letter carries considerably more information.  Thus, simply looking at the raw probability of a letter within the data stream is only meaningful when looking at it within its local context in the data stream.
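That example can be sketched with a toy dictionary (the word list below is an assumption for illustration): counting which letters can legally fill the gap gives the information the missing letter carries, assuming the candidate letters are equally likely.

```python
import math

# A toy dictionary; just enough words to make the point.
words = {"queen", "ten", "tan", "tin", "tun", "ton"}

def candidates(pattern):
    """Letters that can replace the '?' in pattern to form a word."""
    i = pattern.index("?")
    return {w[i] for w in words
            if len(w) == len(pattern)
            and all(p in ("?", c) for p, c in zip(pattern, w))}

def info_bits(pattern):
    """Bits carried by the missing letter, assuming the candidate
    letters are equally likely: log2 of the number of choices."""
    return math.log2(len(candidates(pattern)))

print(info_bits("q?een"))  # 0.0   -- only 'u' fits: no information
print(info_bits("t?n"))    # ~2.32 -- 'e', 'a', 'i', 'u', 'o' all fit
```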
« Last Edit: 05/11/2007 12:09:13 by another_someone »



©The Naked Scientists® 2000–2017 | The Naked Scientists® and Naked Science® are registered trademarks created by Dr Chris Smith. Information presented on this website is the opinion of the individual contributors and does not reflect the general views of the administrators, editors, moderators, sponsors, Cambridge University or the public at large.