The Naked Scientists

The Naked Scientists Forum

Author Topic: What are weak measurements, and why do we expect them to be equivalent to HUP?  (Read 7762 times)

Offline yor_on

  • Naked Science Forum GOD!
  • *******
  • Posts: 12001
  • Thanked: 4 times
  • (Ah, yes:) *a table is always good to hide under*
    • View Profile
HUP, as Heisenberg stated it, was about an experiment done on one particle at one specific instant, as I understand it. Today we have exchanged that for a question of probability. From my view that means we now define HUP so that, using indirect information collected over a longer time period, we assume that what we then think we 'know' can circumvent HUP. All as I see it.

But that isn't Heisenberg's HUP, is it?

Isn't that a modern redefinition of HUP, to fit the idea of probability?
Do you agree? And if you don't, why?


 

Offline imatfaal

  • Neilep Level Member
  • ******
  • Posts: 2787
  • rouge moderator
    • View Profile
HUP has a mathematical basis - it arises from the non-commutative nature of matrices.  The idea of measurement of one particle with a certain frequency of light - the famous/infamous Heisenberg's Microscope - is a heuristic that, whilst useful, also creates problems.  I don't get the gist of your third sentence.

Weak measurements (which is what I was getting at in the NT thread) look for average values over a period of time and different particles - average values in this context have great uncertainty and thus do not challenge HUP. 

No, HUP remains rigid and unbending - if you use sophisticated methodology to arrive at a more and more accurate average value for a single particle at a point in time, you might eventually arrive at the level of uncertainty predicted, h/4pi, but you will not get any closer.  This is why I was getting in a state about the article from Nature that stated that these weak measurements were of a level to require a relaxation of HUP - they simply were not, and it was annoying journalistic hype.
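The non-commutativity imatfaal mentions can be made concrete with a small numerical sketch (my own illustration, not from the thread; it uses NumPy and the arbitrary choice of natural units ħ = m = ω = 1). In a truncated harmonic-oscillator basis, the position and momentum matrices satisfy [X, P] = iħ·I everywhere except at the truncation edge:

```python
import numpy as np

hbar = 1.0
N = 50
# Ladder operators in a truncated harmonic-oscillator basis
n = np.arange(1, N)
a = np.diag(np.sqrt(n), k=1)             # annihilation operator (superdiagonal)
X = np.sqrt(hbar / 2) * (a + a.T)        # position matrix (m = omega = 1)
P = 1j * np.sqrt(hbar / 2) * (a.T - a)   # momentum matrix

C = X @ P - P @ X                        # the commutator [X, P]
# Equals i*hbar*I in the upper-left block; the last row/column is a
# truncation artifact of cutting the basis off at N states
print(np.allclose(C[:-1, :-1], 1j * hbar * np.eye(N)[:-1, :-1]))
```

It is exactly this nonzero commutator that forces the product of the spreads of X and P to be at least ħ/2 for any state.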
 

Offline yor_on

  • Naked Science Forum GOD!
  • *******
  • Posts: 12001
  • Thanked: 4 times
  • (Ah, yes:) *a table is always good to hide under*
    • View Profile
Maybe that is to the heart of it Imatfaal. "Weak measurements (which is what I was getting at in the NT thread) look for average values over a period of time and different particles - average values in this context have great uncertainty and thus do not challenge HUP."

So, if they do not change HUP as such, then why do we search for them? Isn't that an expression of our belief in 'statistics' becoming a sort of reality? If they were of no importance we should stay with the original definition, should we not? But if we believe that this, as I see it, new definition of HUP makes a change to it, how can it be the same?
==

Let me put it another way.

Heisenberg agreed that "the uncertainty relation does not hold for the past". That is, if you already knew the velocity of an electron before measuring its position, you could actually say that you now 'knew' both qualities. That's also why I see it as important to define HUP as a measurement done in the same 'instant'. Only then will both qualities be impossible to measure.

But it's more than that. If you assume that 'time's arrow' does not have any meaning for a measurement, as the modern view of weak measurements seems to imply, then time's arrow shouldn't exist. So the question becomes, to me that is, does this new definition of HUP work? The other side of it becomes the question of whether this version of 'reality' would be a correct statement of free will.

In attributing probability an absolute certainty, as it seems to me, we also seem to define a new way of describing how 'things work'. And assuming that 'free will' is an expression of a choice taken in SpaceTime at a specific position and time - depending on the observer, naturally, but in all cases possible to define as a certain moment for each of the observers - then 'free will' from this point of view becomes no different from any other 'measurement'.
« Last Edit: 09/06/2011 18:16:25 by yor_on »
 

Offline imatfaal

  • Neilep Level Member
  • ******
  • Posts: 2787
  • rouge moderator
    • View Profile
Average values over time and over many particles are incredibly useful - temperature for instance. 

Statistics are a realistic representation of nature and in many cases it is only the average value that has real use (i.e. the KE of one atom cannot tell you the temp).  But that does not mean that we do not also seek to measure individual particles to great precision - and, more importantly, theorise about them.

And it is in this theorising that HUP is paramount - it isn't merely that you cannot measure a particle's momentum and position (ditto time-energy) to arbitrarily high accuracy; any hypothesis that implies you can (or requires it) will fail.

The HUP is not changed - the definition remains.
 

Offline CPT ArkAngel

  • Hero Member
  • *****
  • Posts: 584
  • Thanked: 3 times
    • View Profile
« Last Edit: 10/06/2011 05:23:53 by CPT ArkAngel »
 

Offline yor_on

  • Naked Science Forum GOD!
  • *******
  • Posts: 12001
  • Thanked: 4 times
  • (Ah, yes:) *a table is always good to hide under*
    • View Profile
It's about weak measurements all right, and it brings forward my question, but where Bohm and de Broglie expect the duality to coexist I'm still leaning towards the Copenhagen definition. In your universe, CPT, you expect things to 'exist', the same as we see it macroscopically.

Mine is indeterministic still. When I discuss HUP I believe I do it from Heisenberg's definition, where he says "It seems to be a general law of nature that we cannot determine position and velocity simultaneously with arbitrary accuracy". And "It has turned out that it is in principle impossible to know, to measure the position and velocity of a piece of matter with arbitrary accuracy."

So, when I discuss circumventing it here, I do it from another point of view.
==

"Although Heisenberg admits that we can consistently attribute values of momentum and position to an electron in the past, he sees little merit in such talk. He points out that these values can never be used as initial conditions in a prediction about the future behavior of the electron, or subjected to experimental verification. Whether or not we grant them physical reality is, as he puts it, a matter of personal taste. Heisenberg's own taste is, of course, to deny their physical reality. For example, he writes, "I believe that one can formulate the emergence of the classical ‘path’ of a particle pregnantly as follows: the ‘path’ comes into being only because we observe it" (Heisenberg, 1927, p. 185). Apparently, in his view, a measurement does not only serve to give meaning to a quantity, it creates a particular value for this quantity. This may be called the ‘measurement=creation’ principle.

It is an ontological (= The metaphysical study of the nature of being and existence) principle, for it states what is physically real.

This then leads to the following picture. First we measure the momentum of the electron very accurately. By ‘measurement= meaning’, this entails that the term "the momentum of the particle" is now well-defined. Moreover, by the ‘measurement=creation’ principle, we may say that this momentum is physically real. Next, the position is measured with inaccuracy δq. At this instant, the position of the particle becomes well-defined and, again, one can regard this as a physically real attribute of the particle. However, the momentum has now changed by an amount that is unpredictable by an order of magnitude | pf − pi | ∼ h/δq. The meaning and validity of this claim can be verified by a subsequent momentum measurement.

The question is then what status we shall assign to the momentum of the electron just before its final measurement. Is it real? According to Heisenberg it is not. Before the final measurement, the best we can attribute to the electron is some unsharp, or fuzzy momentum. These terms are meant here in an ontological sense, characterizing a real attribute of the electron."

The Uncertainty Principle, Stanford Encyclopedia of Philosophy.
==

("The ontology of de Broglie-Bohm theory consists of a configuration q(t) ∈ Q of the universe and a pilot wave ψ(q,t) ∈ ℂ. The configuration space Q can be chosen differently, as in classical mechanics and standard quantum mechanics.

Thus, the ontology of pilot wave theory contains both the trajectory q(t) ∈ Q we know from classical mechanics, and the wave function ψ(q,t) ∈ ℂ of quantum theory. So, at every moment of time there exists not only a wave function, but also a well-defined configuration of the whole universe.")
« Last Edit: 10/06/2011 09:49:29 by yor_on »
 

Offline CPT ArkAngel

  • Hero Member
  • *****
  • Posts: 584
  • Thanked: 3 times
    • View Profile
Quote
It's about weak measurements all right, and it brings forward my question, but where Bohm and de Broglie expect the duality to coexist I'm still leaning towards the Copenhagen definition. In your universe, CPT, you expect things to 'exist', the same as we see it macroscopically.

Mine is indeterministic still. When I discuss HUP I believe I do it from Heisenberg's definition, where he says "It seems to be a general law of nature that we cannot determine position and velocity simultaneously with arbitrary accuracy". And "It has turned out that it is in principle impossible to know, to measure the position and velocity of a piece of matter with arbitrary accuracy."

So, when I discuss circumventing it here, I do it from another point of view.
==
Yes, my theory is deterministic for matter-energy, but there is still the possibility of indeterministic input from biological entities. The uncertainty principle arises from a minimum spin of 1/2, or h/4pi, and it is true from my point of view too. But in my opinion, QM's probabilistic Copenhagen interpretation is simply due to the charges always moving at the speed of light, and to the entanglement phenomenon, which causes true uncertainty in my model too. If you take all the entanglement information of the universe, however, it becomes causal (and not admitting the possibility of a classical underlying model is not a valuable point of view!).
« Last Edit: 10/06/2011 10:14:19 by CPT ArkAngel »
 

Offline yor_on

  • Naked Science Forum GOD!
  • *******
  • Posts: 12001
  • Thanked: 4 times
  • (Ah, yes:) *a table is always good to hide under*
    • View Profile
It's okay if you have your own spin on it. But as it is then a hypothesis, and so neither Heisenberg's nor the de Broglie-Bohm theory, I suspect it belongs in 'New Theories'. I'm interested in the mainstream definitions of HUP and 'weak measurements' for this one, CPT.
 

Offline JP

  • Neilep Level Member
  • ******
  • Posts: 3366
  • Thanked: 2 times
    • View Profile
Yeah, it appears to be a trick of using a lot of measurements to get overall results that look better than the uncertainty principle would allow if you only made a single measurement.  Each measurement you make is limited by the uncertainty principle, but if you have thousands of identical photons and measure each one, you can build up a picture that appears to beat the uncertainty principle.
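JP's description can be sketched with a toy simulation (my own illustration, with made-up numbers, not a model of the actual experiment): every individual readout is smeared by the full single-shot uncertainty floor sigma, yet the ensemble average homes in on the mean to within roughly sigma/sqrt(N).

```python
import random
import statistics

random.seed(1)
true_value = 0.5        # hypothetical "true" ensemble mean
sigma = 1.0             # uncertainty floor of any single measurement
n_measurements = 10_000

# Each individual readout is smeared by the full uncertainty sigma...
readouts = [random.gauss(true_value, sigma) for _ in range(n_measurements)]

# ...but the ensemble average pins the mean down to ~ sigma/sqrt(N)
estimate = statistics.fmean(readouts)
standard_error = sigma / n_measurements ** 0.5

print(estimate, standard_error)
```

No single readout beats the floor; only the aggregate picture looks sharper, which is exactly why this does not challenge HUP.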
 

Offline yor_on

  • Naked Science Forum GOD!
  • *******
  • Posts: 12001
  • Thanked: 4 times
  • (Ah, yes:) *a table is always good to hide under*
    • View Profile
Yep, and it is actually this idea that brings me to the question of whether physics has changed its view of the arrow. Because, to make it work, it seems to me that instead of measuring as defined in an 'instant of time', one instead ignores the arrow of time, replacing it with probability and 'statistics'?

If you do so, it seems to me there need to be some limits defined, and a hypothesis of what the arrow should be seen as. So how do weak measurements define the 'arrow of time' nowadays? Only through probability? Then the more measurements the merrier, right?

Then there should be a definition of what can be considered the limits of a good 'probability run' over an experiment. It feels almost as if some inverted HUP were being put in place here. I'm not saying that it can't work; after all, a chair is, to its particles, also 'constructed' from HUP, so?

But it does not penetrate the mystery for me. It's a very clever piece of engineering, and once more we're copying nature? Which is a pretty smart thing to do, admittedly :)

Or maybe there are other rules for what can be defined as weak measurements too?
 

Offline JP

  • Neilep Level Member
  • ******
  • Posts: 3366
  • Thanked: 2 times
    • View Profile
Quote
Yep, and it is actually this idea that brings me to the question of whether physics has changed its view of the arrow. Because, to make it work, it seems to me that instead of measuring as defined in an 'instant of time', one instead ignores the arrow of time, replacing it with probability and 'statistics'?
I assume it's what physicists call a "stationary process" which means it's not changing in time.  Each photon changes in time, of course, but the statistics remain the same over time, which lets you do this.
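A toy illustration of what "stationary" buys you (my own sketch; the Gaussian process below is an arbitrary stand-in, not a model of the photon experiment): the individual samples keep changing, but statistics computed over an early window and a late window agree, which is what licenses pooling measurements taken at different times.

```python
import random
import statistics

random.seed(2)
# A stationary process: i.i.d. Gaussian samples. Each sample differs,
# but the statistics do not drift over time.
samples = [random.gauss(0.0, 1.0) for _ in range(100_000)]

# Split into early / late halves and compare the window statistics
early, late = samples[:50_000], samples[50_000:]
print(round(statistics.fmean(early), 2), round(statistics.fmean(late), 2))
print(round(statistics.stdev(early), 2), round(statistics.stdev(late), 2))
```

For a non-stationary process (say, a drifting mean), the two windows would disagree and pooling the data would smear the result instead of sharpening it.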


Quote
...
Then there should be a definition of what can be considered the limits of a good 'probability run' over an experiment. It feels almost as if some inverted HUP were being put in place here. I'm not saying that it can't work; after all, a chair is, to its particles, also 'constructed' from HUP, so?
It's probably just a word choice issue, but it's wrong to say anything is 'constructed' from the HUP here.  All particles have to obey the HUP, as Matthew pointed out.  Each particle in this experiment does, it's just that if you measure a lot of them, you might get more information than you can get by measuring one.  At least that's what I get from it.
 

Offline yor_on

  • Naked Science Forum GOD!
  • *******
  • Posts: 12001
  • Thanked: 4 times
  • (Ah, yes:) *a table is always good to hide under*
    • View Profile
Yeah JP, it's like 'something', as it decreases in 'size', as our particles do when becoming that chair, also comes into a sharper 'focus'. That is, if we define it as making simultaneous measurement of an object possible. And that must follow some 'principle', if we look at it this way. It's somewhat of a mystery to me.
 

Offline JP

  • Neilep Level Member
  • ******
  • Posts: 3366
  • Thanked: 2 times
    • View Profile
Quote
Yeah JP, it's like 'something', as it decreases in 'size', as our particles do when becoming that chair, also comes into a sharper 'focus'. That is, if we define it as making simultaneous measurement of an object possible. And that must follow some 'principle', if we look at it this way. It's somewhat of a mystery to me.

The HUP means uncertainty in position times uncertainty in momentum ≥ hbar/2.  That's a tiny, tiny number.  The chair appears to be in focus only because hbar/2 is so tiny we could never hope to see it on the scale of a chair.
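To put numbers on how tiny that is at chair scale, one can compute the minimum velocity spread Δv ≥ ħ/(2·m·Δx) implied by pinning an object down to a given Δx. A rough sketch (the 10 kg chair mass and the Δx are illustrative choices):

```python
hbar = 1.054_571_817e-34  # reduced Planck constant, J*s

def min_velocity_spread(mass_kg, position_spread_m):
    """Minimum velocity uncertainty from dx * dp >= hbar/2."""
    return hbar / (2 * mass_kg * position_spread_m)

# Pin each object down to within a tenth of a millimetre
dx = 1e-4
electron = min_velocity_spread(9.109e-31, dx)   # ~0.6 m/s: very noticeable
chair = min_velocity_spread(10.0, dx)           # ~5e-32 m/s: hopelessly tiny
print(electron, chair)
```

The same ħ/2 floor forces a visible velocity fuzz on the electron while leaving the chair some thirty orders of magnitude below anything measurable.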
 

Offline imatfaal

  • Neilep Level Member
  • ******
  • Posts: 2787
  • rouge moderator
    • View Profile
Quote
That is, if we define it as making simultaneous measurement of an object possible. And that must follow some 'principle', if we look at it this way.

This is the bit that I didn't like in the original articles, and I still don't agree with it.  In any process that attempts to make simultaneous measurements of the same particle, each measurement will be limited by the uncertainty parameters - and no multiplication of measurements/errors will lower that level of uncertainty.  The experiment sought the trajectory of a particle (which of course requires both the position and momentum); it measured the particle over time and used loads of particles (it used polarization to measure momentum, and you cannot get polarization from one particle) - so what it really got was a multitude of points, and a direction at each point, which you can join in an uber-dot-to-dot.
 

Offline yor_on

  • Naked Science Forum GOD!
  • *******
  • Posts: 12001
  • Thanked: 4 times
  • (Ah, yes:) *a table is always good to hide under*
    • View Profile
JP, what you are saying there seems to be that probability decreases with size :) (Alternatively increases, depending on from where you look at it.) Probably I'm misinterpreting you, but naively this is my point too. A question of 'size'. And so questioning whether we really describe our universe correctly?

And yeah Imatfaal, your reaction is sort of mine too. I find it hard to digest, but then again, there is a certain point to using 'statistics' for defining a probability?

And, just for fun and totally philosophically: when does a statistical significance become reality? Assume a reliability of up to a hundred percent, and define it from what we already have around us, in the form of sickness and cures for example, or the probability of something developing as its ancestors did. A nut becoming a tree?

I don't know, it's like an alternative approach to reality to me.
 

Offline JP

  • Neilep Level Member
  • ******
  • Posts: 3366
  • Thanked: 2 times
    • View Profile
Quote
JP, what you are saying there seems to be that probability decreases with size :) (Alternatively increases, depending on from where you look at it.) Probably I'm misinterpreting you, but naively this is my point too. A question of 'size'. And so questioning whether we really describe our universe correctly?

Nope.  But the same amount of uncertainty looks a lot bigger when our object is the size of an electron than when it's the size of a chair.
 

Offline Geezer

  • Neilep Level Member
  • ******
  • Posts: 8328
  • "Vive la résistance!"
    • View Profile
If I may make so bold (or even if I may not make so bold), I think what JP could be suggesting is that we should ignore our perceptions of reality and accept the observed realities of QM, no matter how uncomfortable that may make us feel.

Let's face it. All of our senses are amazingly good at integrating the effects of QM so that we can see the "big picture", but there is no magical point at which QM suddenly stops working. We have lots of useful models that help us approximate "reality". However, reality is not obliged to make our understanding of it easy.

   
 

Offline yor_on

  • Naked Science Forum GOD!
  • *******
  • Posts: 12001
  • Thanked: 4 times
  • (Ah, yes:) *a table is always good to hide under*
    • View Profile
Karl Popper had this to say about HUP.

"Yet in fact the Heisenberg formula for energy depends neither on wave mechanics nor on Heisenberg's matrix mechanics; nor do we need the commutation relations (which according to Hills are insufficient for the derivation of the formulae). It simply does not depend on the revolutionary new quantum mechanics of 1925-6, but follows directly from Planck's old quantum postulate of 1900:

E = hf.

From this we get immediately

(2) ΔE = h Δf.

By using the principle of harmonic resolving power,

(3) Δf ≈ 1/Δt,

we obtain from (2) and (3)

(4) ΔE ≈ h/Δt,

which leads at once to

(5) ΔE · Δt ≈ h;

that is to say, a form of Heisenberg's so-called indeterminacy formulae.

In precisely the same way we obtain the Heisenberg formula for position and momentum from Duane's principle (whose analogy to Planck's principle has recently been stressed by Alfred Landé). It may be written

(6) Δpi ≈ h/Δqi

According to Landé this may be interpreted as follows: a body (such as a grid or a crystal) endowed with the space-periodicity Δqi is entitled to change its momentum pi in multiples of Δpi ≈ h/Δqi.

From (6) we obtain at once

(7) Δpi · Δqi ≈ h,

which is another form of Heisenberg's indeterminacy formulae.

Considering that Planck's theory is a statistical theory, the Heisenberg formulae can be most naturally interpreted as statistical scatter relations, as I proposed more than thirty years ago. That is, they say nothing about the possible precision of measurements, nor anything about limits to our knowledge. But if they are scatter relations, they tell us something about the limits to the homogeneity of quantum-physical states, and therefore, though indirectly, about predictability.

For example, the formula Δpi · Δqi ≈ h (which can be obtained from Duane's principle just as ΔE · Δt ≈ h can be obtained from Planck's principle) tells us, simply, that if we determine the coordinate x of a system (say, an electron) then, upon repetition of the experiment, the momentum will scatter.

Now how can such an assertion be tested? By making a long series of experiments with a fixed shutter opening Δx and by measuring, in every single case, the momentum px. If these momenta scatter as predicted, then the formula has survived the test. But this shows that in order to test the scatter relations, we have actually measured, in every case, px with a precision far greater than Δpx; for otherwise we could not speak of Δpx as the scatter of px.

Experiments of the kind described are carried out every day in all physical laboratories. But they refute Heisenberg's indeterminacy interpretation, since measurements (though not the predictions based upon them) are more precise than this interpretation permits.

Heisenberg himself noted that such measurements are possible, but he said that it was 'a matter of personal belief' or personal taste' whether or not we attach any meaning to them; and ever since this remark they have been universally disregarded as meaningless. But they are not meaningless, for they have a definite function: they are tests of the very formulae in question; that is, of the indeterminacy formulae qua scatter relations.

There is, therefore, no reason whatever to accept either Heisenberg's or Bohr's subjectivist interpretation of quantum mechanics. Quantum mechanics is a statistical theory because the problems it tries to solve-spectral intensities, for example -are statistical problems. There is, therefore, no need here for any philosophical defence of its non-causal character."
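Popper's steps (2) to (5) above are pure arithmetic, so they can be sanity-checked in a couple of lines (my own sketch; the 1 ps time window is an arbitrary choice that drops out of the product):

```python
import math

h = 6.62607015e-34  # Planck's constant, J*s

dt = 1e-12          # any time window (here 1 ps)
df = 1.0 / dt       # principle of harmonic resolving power: df ~ 1/dt
dE = h * df         # Planck's postulate E = hf, so dE = h*df

# Steps (2)-(5): the product dE*dt collapses back to h for every choice of dt
print(math.isclose(dE * dt, h))
```

The point of the check is that the product is independent of the chosen window, which is all the indeterminacy formula asserts at this level.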

Remember that I'm not agreeing by presenting it. To me the Copenhagen definition can stand on its own, but then again, it all goes back to a feeling I have that there can be several definitions that may be able to describe reality. So I must be open to alternatives, right?

So far reality makes a fool out of me.
At least it gives me a headache ::))
 

Offline JP

  • Neilep Level Member
  • ******
  • Posts: 3366
  • Thanked: 2 times
    • View Profile
(3) Δf ≈ 1/Δt

I don't know it by the name "principle of harmonic resolving power," but this looks to me a lot like an application of the uncertainty principle already.  This one shows up in optics, too, where if you try to measure an arbitrarily short time, you can't also measure an arbitrarily precise frequency.

But at any rate, the uncertainty principle isn't "magic."  It's a consequence of the fact that things appear to behave like waves.  Even in purely classical optics, where quanta aren't being modeled, a beam of light can be focused down to an arbitrarily small point (up to the diffraction limit), but then, because of uncertainty relations, it spreads out in all directions.  If you instead try to make all the light go in one direction, the beam has to be of a certain minimum width.  This is because it's a wave, and beam width and focal spot size have to obey an uncertainty relationship.
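JP's point that this is a generic wave property can be illustrated numerically: a transform-limited Gaussian pulse always has the same duration-bandwidth product, so squeezing it in time spreads it in frequency. A sketch (my own; the grid sizes and pulse widths are arbitrary choices, and it needs NumPy):

```python
import numpy as np

def rms_widths(sigma_t, n=4096, t_max=50.0):
    """RMS duration and RMS bandwidth of a Gaussian pulse of width sigma_t."""
    t = np.linspace(-t_max, t_max, n)
    pulse = np.exp(-t**2 / (4 * sigma_t**2))   # amplitude; |pulse|^2 has std sigma_t
    spec = np.fft.fftshift(np.fft.fft(pulse))
    f = np.fft.fftshift(np.fft.fftfreq(n, d=t[1] - t[0]))

    def rms(x, amp):
        w = np.abs(amp) ** 2                   # intensity-weighted moments
        m = np.sum(x * w) / np.sum(w)
        return np.sqrt(np.sum((x - m) ** 2 * w) / np.sum(w))

    return rms(t, pulse), rms(f, spec)

for s in (0.5, 1.0, 2.0):
    dt_, df_ = rms_widths(s)
    print(round(dt_ * df_, 3))   # ~1/(4*pi) for every width: halve dt, double df
```

The product stays pinned at the Gaussian's minimum of 1/(4π), no matter how the pulse is stretched or compressed - the time-frequency analogue of the beam-width/spot-size trade-off.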
 

Offline yor_on

  • Naked Science Forum GOD!
  • *******
  • Posts: 12001
  • Thanked: 4 times
  • (Ah, yes:) *a table is always good to hide under*
    • View Profile
Depends on what you see as magic, maybe? In Victorian society I doubt they would have accepted that there comes a point where physics - light and matter - stops 'making sense'. Or maybe they would have referred to it as an 'act of God' when/if it did?

You can use HUP to define that place and time, but you can't use HUP to prove how HUP itself makes sense :) If you see how I think there? I guess that is what Popper felt when he introduced his idea of HUP: that there should be a way of making it 'make sense' to us, not just being a 'constant' of sorts defining a limit of observation.
==

Heh, you got stuck on that one too :)

Seems like he explains the "principle of harmonic resolving power" in a pdf called Quantum Mechanics without The Observer. I have a partial copy of it if you're interested, but you can find it in Google Docs; just search for "principle of harmonic resolving power" without his name first. As every other page is blank you can only read part of it, but the PDF seems to be 'out there' (on the internet), even though I failed to download it.

Still, this one feels familiar to me: On the Quantum Wave Function, by E.D. Cooper. It's similar to how I think about it too. (Found it as I searched for Popper, actually, and boy, was I pleased reading it :) Especially the description of how a magnetic field is always observer dependent. Locality and the observer are the same to me; they can be interchanged, and exist both in the theory of relativity and in QM.
==

It's strange, by the way, how one can know a thing to be true without reflecting on why it is true. As with this description of a magnetic field: I knew it, but only when I read this did I realize how simply you could describe it. It's a beautiful description of 'locality', as I read it :)
« Last Edit: 14/06/2011 05:28:33 by yor_on »
 

Offline JP

  • Neilep Level Member
  • ******
  • Posts: 3366
  • Thanked: 2 times
    • View Profile
It makes perfect sense if you consider that we live in a quantum world and that our idea that particles and waves are separate is an artificial construct of our senses.  :)
 

Offline yor_on

  • Naked Science Forum GOD!
  • *******
  • Posts: 12001
  • Thanked: 4 times
  • (Ah, yes:) *a table is always good to hide under*
    • View Profile
Wait and see :)
I'm sure there will be a way of connecting QM and relativity. It is most probably, as Castaneda expressed it, just a way of 'twisting ones head'.
 

Offline Geezer

  • Neilep Level Member
  • ******
  • Posts: 8328
  • "Vive la résistance!"
    • View Profile
Quote
It makes perfect sense if you consider that we live in a quantum world and that our idea that particles and waves are separate is an artificial construct of our senses.  :)

Yikes! I'm getting déjà vu all over again.

"All of our senses are amazingly good at integrating the effects of QM so that we can see the "big picture", but there is no magical point at which QM suddenly stops working. We have lots of useful models that help us approximate "reality". However, reality is not obliged to make our understanding of it easy."

 

Offline Mr. Data

  • Sr. Member
  • ****
  • Posts: 275
    • View Profile
Quote
It's about weak measurements all right, and it brings forward my question, but where Bohm and de Broglie expect the duality to coexist I'm still leaning towards the Copenhagen definition. In your universe, CPT, you expect things to 'exist', the same as we see it macroscopically.

Mine is indeterministic still. When I discuss HUP I believe I do it from Heisenberg's definition, where he says "It seems to be a general law of nature that we cannot determine position and velocity simultaneously with arbitrary accuracy". And "It has turned out that it is in principle impossible to know, to measure the position and velocity of a piece of matter with arbitrary accuracy."

So, when I discuss circumventing it here, I do it from another point of view.
==

"Although Heisenberg admits that we can consistently attribute values of momentum and position to an electron in the past, he sees little merit in such talk. He points out that these values can never be used as initial conditions in a prediction about the future behavior of the electron, or subjected to experimental verification. Whether or not we grant them physical reality is, as he puts it, a matter of personal taste. Heisenberg's own taste is, of course, to deny their physical reality. For example, he writes, "I believe that one can formulate the emergence of the classical ‘path’ of a particle pregnantly as follows: the ‘path’ comes into being only because we observe it" (Heisenberg, 1927, p. 185). Apparently, in his view, a measurement does not only serve to give meaning to a quantity, it creates a particular value for this quantity. This may be called the ‘measurement=creation’ principle.

It is an ontological (= The metaphysical study of the nature of being and existence) principle, for it states what is physically real.

This then leads to the following picture. First we measure the momentum of the electron very accurately. By ‘measurement= meaning’, this entails that the term "the momentum of the particle" is now well-defined. Moreover, by the ‘measurement=creation’ principle, we may say that this momentum is physically real. Next, the position is measured with inaccuracy δq. At this instant, the position of the particle becomes well-defined and, again, one can regard this as a physically real attribute of the particle. However, the momentum has now changed by an amount that is unpredictable by an order of magnitude | pf − pi | ∼ h/δq. The meaning and validity of this claim can be verified by a subsequent momentum measurement.

The question is then what status we shall assign to the momentum of the electron just before its final measurement. Is it real? According to Heisenberg it is not. Before the final measurement, the best we can attribute to the electron is some unsharp, or fuzzy momentum. These terms are meant here in an ontological sense, characterizing a real attribute of the electron."

The Uncertainty Principle, Stanford Encyclopedia of Philosophy.
==

("The ontology of de Broglie-Bohm theory consists of a configuration q(t) ∈ Q of the universe and a pilot wave ψ(q,t) ∈ ℂ. The configuration space Q can be chosen differently, as in classical mechanics and standard quantum mechanics.

Thus, the ontology of pilot wave theory contains both the trajectory q(t) ∈ Q we know from classical mechanics, and the wave function ψ(q,t) ∈ ℂ of quantum theory. So, at every moment of time there exists not only a wave function, but also a well-defined configuration of the whole universe.")

Well written. And a lot of talk on weak measurement theory - but I keep seeing very technical explanations of what a weak measurement is.

A weak measurement, for those who hate technicalities, is really a coupling between two systems. A weak measurement can be anything from the wave function collapse between two particles to an observer in the lab who attentively watches an atom; that too is a weak measurement on a system.

Had a lot of fun reading through these posts, though. Nicely worded by many posters.
 

Offline yor_on

  • Naked Science Forum GOD!
  • *******
  • Posts: 12001
  • Thanked: 4 times
  • (Ah, yes:) *a table is always good to hide under*
    • View Profile
Yeah, the most interesting thing (to me) isn't the forefront of physics, but rather the ideas we build that forefront on. There's only so much time, and we need to get it right before the measure runs out. QM today is rather incredible, as we test its limits and find ways to circumvent them.

One of the biggest questions to me is just how I should see a weak measurement, because that reflects on how you define the world. You could argue that before flipping a coin it has a very real existence, although you, from your defined intention, now might state that it is in a 'superposition' of states. That wouldn't be right, though; the superposition only comes to exist in the 'flipping' of it. And that's one simple definition of it, making it into some sort of 'predefined statistics'. But I don't know if that is what HUP speaks about? To me HUP is about a ground state of uncertainty, with the opposite being 'histories' (as weak measurements are). A symmetry of sorts again, isn't it?

You might want to argue that this is the way to define reality, using weak measurements, as they are 'statistically significant', meaning that from one experiment you now can draw conclusions valid for all other experiments done inside SpaceTime. But in some weird way I find that boring :) I want reality to fill me with wonder, not dreariness and statistics.

Also, it doesn't push the borders; it just gives a statistical significance to what we already knew without those 'statistics'. That doesn't mean it isn't one of the best tools we have to define a reality, but just as statistics can define populations, it fails to define the individual.
 
