ChatGPT seems like a long-winded philosopher rather than someone who actually understands information theory. Try Wikipedia - the entry seems to have been written by folk who know what they are talking about.
Last time I looked, I was definitely an old bloke, and AFAIK this Cambridge is cold, wet, and near the East Coast, just like the other one. You see, my friend, it all depends on what you think you mean by Cambridge. Is there a universal meme that can be deconstructed as a set of paradigms sufficiently delocalised in spacetime that your Cambridge and mine are the same but not identical, or identical but not, in whatever sense you think you are talking about, the same? In what sense does the Pythagorean essence of Cambridge heuristically or existentially conflict with the Aristotelian ur-Cambridge such that they cannot coexist?

I could say that in a few minutes I can walk across a bridge over the river Cam, but a philosopher would ask "how do you know that it is really you, and whilst the river was named by the Saxons, since the water that was there at the time is no longer there, is it still the Cam?"
Does the idea of a convention have import when discussing what "information" is?
PS: what on earth is ur-Cambridge when it is at home?
We decide when it looks like an interference pattern, or when there is enough information content.
Thus many people think there is a disjuncture between classical and quantum physics because you can't explain the gross interference pattern in terms of particles and you can't explain the distribution of individual events in terms of waves.
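To make that concrete, here is a minimal sketch of my own (not from any of the posts above) that draws individual "photon" hits from a two-slit fringe distribution by rejection sampling: each dot lands at random, and the wave-like pattern only emerges in the aggregate. The slit spacing, wavelength, and screen geometry are made-up values, and the single-slit envelope is ignored.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_hits(n, d=1e-3, lam=500e-9, L=1.0, width=0.01):
    """Draw n detection positions x (metres) on a screen of half-width `width`."""
    hits = []
    while len(hits) < n:
        x = rng.uniform(-width, width)
        # Two-slit fringe intensity, normalised to a maximum of 1
        p = np.cos(np.pi * d * x / (lam * L)) ** 2
        if rng.uniform() < p:
            hits.append(x)
    return np.array(hits)

# Each event is random; fringe structure only shows up with many dots.
for n in (3, 30, 3000):
    counts, _ = np.histogram(sample_hits(n), bins=20, range=(-0.01, 0.01))
    print(n, counts)
```

With three dots the histogram looks like noise; with thousands, the alternating bright and dark bands are unmistakable - which is exactly why "when is it an interference pattern?" is a judgement about accumulated statistics, not about any single event.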
Oh dear! I think you are still not distinguishing between observations and models. I'll allow "inject" as a colloquialism in this context: I think you mean "assign to the model of" an experiment. When we actually inject energy or stuff, the object is to see what happens next - understanding may come later. Procedures and algorithms are predetermined processes through which we force data or real stuff. There is nothing predetermined about the fate of a photon in the double-slit experiment, and there can't be: the outcome of any given event is essentially random.
What do random or regular patterns of dots mean?
One of the first observations, if not the very first, of the implications of quantum mechanics for computational complexity was made by a most famous physicist, Nobel Prize winner Richard P. Feynman, who proposed in his seminal article [8] that a quantum physical system of R particles cannot be simulated by an ordinary computer without an exponential slowdown in the speed of the simulation. On the other hand, the simulation of a system of R particles in classical physics is possible with only a polynomial slowdown. The main reason for this is that the mathematical description size of a particle system is linear in R in classical physics but exponential in R according to quantum physics. As Feynman himself expressed:

". . . But the full description of quantum mechanics for a large system with R particles is given by a function ψ(x1, x2, . . . , xR, t) which we call the amplitude to find the particles x1, . . ., xR, and therefore, because it has too many variables, it cannot be simulated with a normal computer with a number of elements proportional to R or proportional to N." [8]

The number N in the citation refers to the accuracy of the simulation: the number of points in the space, as Feynman formulates it. In the same article, Feynman considered the problem of negative probabilities, and returned to the same issue a couple of years later [9]. Feynman's approach may be among the earliest attempts to understand the role of interference in the probabilities induced by quantum mechanics.
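A quick back-of-the-envelope sketch of that scaling (my own illustration, assuming the quantum system is R two-level particles, i.e. qubits, rather than particles with continuous positions):

```python
# A classical configuration of R particles needs O(R) numbers,
# while a general quantum state of R two-level systems needs
# 2**R complex amplitudes - the gap Feynman is pointing at.
for R in (10, 20, 30, 40, 50):
    classical = 3 * R        # e.g. one 3D position per particle
    quantum = 2 ** R         # amplitudes in the state vector
    print(f"R={R:2d}  classical ~ {classical:>4} numbers   quantum ~ {quantum:,} amplitudes")
```

At R = 50 the classical description is 150 numbers while the quantum state vector already has over 10^15 amplitudes, which is why no polynomial-size classical memory can hold the full description.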
Following Feynman's idea and using quantum mechanical systems to carry the information and perform the computation, it is possible to design algorithms that benefit from interference: undesired computational paths may cancel each other, whereas desired ones may amplify. This phenomenon is generally believed to be the very source of the power of quantum computing.
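As a toy illustration of paths cancelling and amplifying (my own NumPy sketch, not taken from the cited articles [8, 9]): apply the Hadamard gate to |0> twice. After one application, both measurement outcomes are equally likely; after the second, the two computational paths leading to |1> carry opposite signs and cancel, while the paths to |0> reinforce.

```python
import numpy as np

# Hadamard gate: H|0> = (|0> + |1>)/sqrt(2), H|1> = (|0> - |1>)/sqrt(2)
H = np.array([[1,  1],
              [1, -1]]) / np.sqrt(2)
state = np.array([1.0, 0.0])          # |0>

after_one = H @ state
after_two = H @ after_one
print(np.abs(after_one) ** 2)         # [0.5 0.5] - looks random
print(np.abs(after_two) ** 2)         # ~[1. 0.]  - paths to |1> cancelled
```

The amplitude for |1> after two gates is (1/√2)(1/√2) + (1/√2)(-1/√2) = 0: destructive interference of computational paths, the same mechanism quantum algorithms exploit at scale.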
Is there any other way we can reproduce that effect? And what is "long enough"? Two dots? Three?