I was just wondering if someone with a better understanding of physics could tell me how significant this finding is - whether they are "chipping away" at the uncertainty principle or not. http://www.scientificamerican.com/article/particle-measurement-sidesteps-the-uncertainty-principle/ The article seems to hedge a bit.
The notion of ‘uncertainty’ occurs in several different meanings in the physical literature. It may refer to a lack of knowledge of a quantity by an observer, or to the experimental inaccuracy with which a quantity is measured, or to some ambiguity in the definition of a quantity, or to a statistical spread in an ensemble of similarly prepared systems. Also, several different names are used for such uncertainties: inaccuracy, spread, imprecision, indefiniteness, indeterminateness, indeterminacy, latitude, etc. As we shall see, even Heisenberg and Bohr did not decide on a single terminology for quantum mechanical uncertainties. Forestalling a discussion about which name is the most appropriate one in quantum mechanics, we use the name ‘uncertainty principle’ simply because it is the most common one in the literature.
The article closes by saying "physicists seem to have found a way to get more data with less measurement..." I'm not sure I agree. It would seem to me that they are getting the same amount of information, but now related to more parameters. Because the uncertainty principle bounds the product of the position and momentum uncertainties, the two trade off against each other: the less precisely one parameter is measured, the more precision can be expected for the measurement of the other.
The uncertainty relations involve the uncertainties in the measurements of these variables. The "uncertainty"--sometimes called the "imprecision"--is related to the range of the results of repeated measurements taken for a given variable. For example, suppose you measure the length of a book with a meter stick. It turns out to be 23.6 cm, or 23 centimeters and 6 millimeters. But since the meter stick measures only to a maximum precision of 1 mm, another measurement of the book might yield 23.7 cm or 23.5 cm. In fact, if you perform the measurement many times, you will get a "bell curve" of measurements centered on an average value, say 23.6 cm. The spread of the bell curve, or the "standard deviation," will be about 1 mm on each side of the average. This means that the "uncertainty" or the precision of the measurement is plus or minus 1 mm. ... [tex]\Delta[/tex]q is the uncertainty or imprecision (standard deviation) of the position measurement. [tex]\Delta[/tex]p is the uncertainty of the momentum measurement in the q direction at the same time as the q measurement.
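The "bell curve of repeated measurements" idea above can be sketched in a few lines. This is a minimal illustration, and the readings below are made-up numbers chosen to match the book example, not real data:

```python
# Repeated readings of the same book with a ruler of 1 mm resolution.
# The values are hypothetical, chosen to mimic the example in the post.
import statistics

readings_cm = [23.6, 23.7, 23.5, 23.6, 23.6, 23.5, 23.7, 23.6]

mean = statistics.mean(readings_cm)      # center of the "bell curve"
spread = statistics.stdev(readings_cm)   # sample standard deviation = the "uncertainty"

print(f"average length: {mean:.2f} cm")
print(f"uncertainty (std dev): +/- {spread:.2f} cm")
```

The standard deviation here plays the role of the [tex]\Delta[/tex]q in the quoted passage: it characterizes the spread of outcomes, not the quality of any single reading.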
Nope. As I said, the definition that they're using for uncertainty is wrong. ...
Quote from: PmbPhy on 07/08/2014 04:31:51
Nope. As I said, the definition that they're using for uncertainty is wrong. ...
Pete, can you help me find the original article or something more detailed? That Scientific American page doesn't say much...
--lightarrow
The article closes by saying "physicists seem to have found a way to get more data with less measurement..." I'm not sure I agree. It would seem to me that they are getting the same amount of information, but now related to more parameters.
This is an example of hype getting ahead of the science. I believe there are two things going on here. The first is weak measurement, which often gets hyped as breaking the uncertainty principle. It doesn't. What happens is that one measures an observable, let's say position, of a particle very poorly. The uncertainty principle says that a sufficiently poor position measurement won't affect the particle's momentum much.
"We do not violate the uncertainty principle,” Howland says. “We just use it in a clever way.”
The filter provided a way of measuring a particle’s position without knowing exactly where it was—without collapsing its wavefunction.
Quote from: JP
This is an example of hype getting ahead of the science. I believe there are two things going on here. The first is weak measurement, which often gets hyped as breaking the uncertainty principle. It doesn't. What happens is that one measures an observable, let's say position, of a particle very poorly. The uncertainty principle says that a sufficiently poor position measurement won't affect the particle's momentum much.
I believe that’s a misunderstanding of the uncertainty principle. Uncertainty has nothing to do with the quality/accuracy of a measurement. You can determine the uncertainty without making any measurements whatsoever, and that will be the uncertainty for that observable for that system for that wave function. The uncertainty, being the standard deviation of the observable, is determined entirely by the wave function. Given the wave function one calculates both <x> and <x^2> and uses those to determine the standard deviation dx = sqrt(<x^2> - <x>^2). Likewise you can use the wave function to get dp = sqrt(<p^2> - <p>^2). Note that nowhere has a measurement been made. It can be shown that dp dx >= hbar/2.
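Pete's point--that dx and dp come straight from the wave function, with no measurement involved--can be checked numerically. A sketch in units where hbar = 1, using a Gaussian wave packet of arbitrary width (the grid size and sigma are choices made here for illustration):

```python
# Compute dx and dp directly from a wave function, no measurement involved.
# Units with hbar = 1; sigma is an arbitrary packet width.
import numpy as np

hbar = 1.0
sigma = 1.3
x = np.linspace(-20, 20, 4001)
dx = x[1] - x[0]

# Normalized Gaussian wave packet (real-valued, so <p> = 0).
psi = (1.0 / (np.pi * sigma**2))**0.25 * np.exp(-x**2 / (2 * sigma**2))

# <x> and <x^2> from the probability density |psi|^2.
prob = np.abs(psi)**2
mean_x = np.sum(x * prob) * dx
mean_x2 = np.sum(x**2 * prob) * dx
delta_x = np.sqrt(mean_x2 - mean_x**2)

# <p^2> = integral of psi * (-hbar^2 d^2/dx^2) psi, via finite differences.
d2psi = np.gradient(np.gradient(psi, dx), dx)
mean_p2 = np.sum(psi * (-hbar**2) * d2psi) * dx
delta_p = np.sqrt(mean_p2)  # <p> = 0 for a real wave function

print(delta_x * delta_p)  # a Gaussian saturates the bound hbar/2
```

A Gaussian is the minimum-uncertainty state, so the product comes out at hbar/2 exactly (up to discretization error); any other wave function gives a larger product.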
Quote
The filter provided a way of measuring a particle’s position without knowing exactly where it was—without collapsing its wavefunction.
In quantum mechanics I don't think that inferring where the photon is can be said to be a measurement of where it is.
But if you have two pinholes you can't tell which one it went through or when it went through either of them. If you can't tell when something was at a place, then you can't really be said to have measured its position.
Sure you can, but you haven't localized it with the same precision as if you used a single pinhole. Even if you don't like calling that a measurement, and I can understand why it might not be the best term for it, it's the standard terminology in the field.
Quote from: JP
Sure you can, but you haven't localized it with the same precision as if you used a single pinhole. Even if you don't like calling that a measurement, and I can understand why it might not be the best term for it, it's the standard terminology in the field.
A car leaves Boston, MA at 8:00 am and arrives in Haverhill, MA 45 minutes later. You're telling me that, given this information, you can state that you've measured the car's position? If so then I'd have to strongly disagree. Especially for a quantum particle, which can't even be said to have a position until it's been measured. For all you know it tunneled through the entire space and skipped the region between Boston and Haverhill altogether.
"measurement" and "observation" are just as poorly defined as "uncertainty."
I would consider any use of an operator on the wavefunction as an "observation." As long as the two (or more) operators don't commute, there is going to be uncertainty depending on which observation is made first, and on the accuracy of that measurement.
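The "operators don't commute" point is easy to check concretely. A minimal sketch using the Pauli spin matrices as the two observables--chosen here purely for illustration, since they are the simplest non-commuting pair:

```python
# Two observables whose order of application matters: Pauli sigma_x and sigma_z.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)   # sigma_x
sz = np.array([[1, 0], [0, -1]], dtype=complex)  # sigma_z

# The commutator [sx, sz] = sx sz - sz sx is nonzero,
# so measuring x-spin then z-spin differs from z-spin then x-spin.
commutator = sx @ sz - sz @ sx
print(commutator)  # [[0, -2], [2, 0]], i.e. -2i * sigma_y
```

A nonzero commutator is exactly what feeds the general (Robertson) uncertainty relation: the lower bound on the product of the two standard deviations is half the magnitude of the expectation value of the commutator.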
From the point of view of an experimenter, there are many different sources of uncertainty, ...
--Schrödinger's Cat is not about what people can and can't know--it's about superposition, and as far as I'm concerned, whatever "detects" the decay of the atom and releases the poison is what does the "observation" thereby collapsing the superposition,...
Schrödinger regarded this as patent nonsense, and I think that most physicists would agree with him. There is something absurd about the very idea of a macroscopic object being a linear combination of two palpably different states. An electron can be in a linear combination of two palpably different states, but a cat cannot be in a linear combination of alive and dead.
Standard deviation is kind of useless if we are talking about a single measurement ...
but that doesn't mean that the phenomena they describe are necessarily only observed in populations; they still apply on a particle-by-particle basis.
For example, if I know a priori what my wavefunction is, I can characterize the standard deviation of the set of possible measurements.
In this sense, it is reasonable to talk about the uncertainty in an observable of a single particle insofar as this describes the possible outcomes of experiments.
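One way to picture the last few posts: the wave function fixes the statistics of the *possible* outcomes, even though each individual particle yields a single definite result. A sketch (the Gaussian shape and the value of the spread are assumptions chosen for illustration, using numpy's random number generator):

```python
# Single-shot "position measurements" drawn from a Gaussian |psi|^2.
# For a Gaussian packet, |psi|^2 is a normal distribution whose
# standard deviation is the predicted position uncertainty.
import numpy as np

rng = np.random.default_rng(0)
sigma_x = 0.7  # the wave function's predicted spread, chosen arbitrarily

samples = rng.normal(loc=0.0, scale=sigma_x, size=100_000)

print(samples[0])     # one particle: a single definite number
print(samples.std())  # many particles: the sample spread approaches sigma_x
```

No single draw is "uncertain" in itself; the uncertainty shows up as the spread of the distribution the draws come from, which is fixed before any measurement happens.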
As for what describes a measurement, there's no strict line between a measurement and an interaction.
We have just as much right to claim that a 500 nanometer hole made a measurement as we do that a 1 meter hole made a measurement.