An essay in futility, too long to read :)


yor_on
« Reply #420 on: 30/10/2011 16:39:44 »
Heisenberg Uncertainty relation.



"Note that the indeterminacy of the microscopic world has little effect on macroscopic objects. This is due to the fact that wave function for large objects is extremely small compared to the size of the macroscopic world. Your personal wave function is much smaller than any currently measurable sizes. And the indeterminacy of the quantum world is not complete because it is possible to assign probabilities to the wave function.

But, as the Schrodinger's Cat paradox shows us, the probability rules of the microscopic world can leak into the macroscopic world. The paradox of Schrodinger's cat has provoked a great deal of debate among theoretical physicists and philosophers. Although some thinkers have argued that the cat actually does exist in two superposed states, most contend that superposition only occurs when a quantum system is isolated from the rest of its environment. Various explanations have been advanced to account for this paradox--including the idea that the cat, or simply the animal's physical environment (such as the photons in the box), can act as an observer.

The question is, at what point, or scale, do the probabilistic rules of the quantum realm give way to the deterministic laws that govern the macroscopic world? This question has been brought into vivid relief by the recent work where an NIST group confined a charged beryllium atom in a tiny electromagnetic cage and then cooled it with a laser to its lowest energy state. In this state the position of the atom and its "spin" (a quantum property that is only metaphorically analogous to spin in the ordinary sense) could be ascertained to within a very high degree of accuracy, limited by Heisenberg's uncertainty principle.

The workers then stimulated the atom with a laser just enough to change its wave function; according to the new wave function of the atom, it now had a 50 percent probability of being in a "spin-up" state in its initial position and an equal probability of being in a "spin-down" state in a position as much as 80 nanometers away, a vast distance indeed for the atomic realm. In effect, the atom was in two different places, as well as two different spin states, at the same time--an atomic analog of a cat both living and dead. The clinching evidence that the NIST researchers had achieved their goal came from their observation of an interference pattern; that phenomenon is a telltale sign that a single beryllium atom produced two distinct wave functions that interfered with each other.

The modern view of quantum mechanics states that Schrodinger's cat, or any macroscopic object, does not exist as superpositions of existence due to decoherence. A pristine wave function is coherent, i.e. undisturbed by observation. But Schrodinger's cat is not a pristine wave function; it is constantly interacting with other objects, such as air molecules in the box, or the box itself. Thus a macroscopic object becomes decoherent by many atomic interactions with its surrounding environment.

Decoherence explains why we do not routinely see quantum superpositions in the world around us. It is not because quantum mechanics intrinsically stops working for objects larger than some magic size. Instead, macroscopic objects such as cats and cards are almost impossible to keep isolated to the extent needed to prevent decoherence. Microscopic objects, in contrast, are more easily isolated from their surroundings so that they retain their quantum secrets and quantum behavior." From Uncertainty Principle:

Decoherence?

Where, and how, do I define that?
Through scales?

Number of atoms?

Bose-Einstein condensates?

If I take a system of atoms (gas) and chill them, what do I get? Is there a limit to that, or is it just a practical question about what we can do?
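To put rough numbers on the 'chill a gas' question: the usual rule of thumb is that quantum behaviour takes over once the thermal de Broglie wavelength of the atoms grows to about the mean distance between them, which is where a Bose-Einstein condensate starts to form. A minimal sketch in Python, assuming illustrative rubidium-87 trap values (the mass, density and temperatures are just example numbers, not anything from the quote above):

```python
import math

h  = 6.626e-34      # Planck constant, J*s
kB = 1.381e-23      # Boltzmann constant, J/K
m  = 87 * 1.66e-27  # mass of a rubidium-87 atom, kg (example species)
n  = 1e20           # number density, atoms per m^3 (typical trap value)

def thermal_de_broglie(T):
    """Thermal de Broglie wavelength: lambda = h / sqrt(2*pi*m*kB*T)."""
    return h / math.sqrt(2 * math.pi * m * kB * T)

spacing = n ** (-1.0 / 3.0)   # mean interparticle distance, ~0.2 micrometres

for T in (300.0, 1e-3, 1e-6, 1e-7):   # room temperature down to ~100 nK
    lam = thermal_de_broglie(T)
    print(f"T = {T:7.1e} K   lambda = {lam:.2e} m   lambda/spacing = {lam/spacing:.3f}")
# Only near the bottom of that range does lambda/spacing reach ~1, i.e. the
# atomic wave functions start to overlap and the gas goes quantum (condenses).
```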

I don't like the idea of 'virtual particles' that much any more. But I'm starting to wonder a lot about HUP.
« Last Edit: 30/10/2011 16:41:37 by yor_on »
« Reply #421 on: 30/10/2011 17:37:09 »
In what way does decoherence define the 'bridge' between what we see macroscopically versus microscopically? Does it say anything about temperatures? Do they matter at all? (yeah I know, a pun)

The Role of Decoherence in Quantum Mechanics.

One more thing: did Einstein demand a time symmetry (the symmetry of physical laws under time reversal)? That is, did he think that time was a symmetry that could be run both ways? What I'm wondering is whether he saw this as primarily a mathematical solution, or as something that already exists in our universe. There is a big difference between what we can prove by experiment and what we can prove by theory, as I see it. And symmetries are among the strangest, and at the same time the most obvious, things that I know of.

« Reply #422 on: 31/10/2011 06:58:40 »
So if you make some 'system' to study, we then need to define borders for it so we can state that nothing interferes with its wave function. And one way is to use temperature, as in a condensate, right?

What more does this lead to? Many-worlds theories are one answer, in which every possible outcome of an interaction is taken, although only one is 'visible' to you observing; Feynman's picture is another, where 'interference' kills off those other possible outcomes, leaving only one that is realizable. The latter avoids those other possibilities and so is easier to handle from the point of view of conservation laws. I expect the first to lean on the assumption that conservation laws can't hold inside a 'superposition', only in the realized outcomes.
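To make the 'interference kills off outcomes' point concrete, here is a toy two-path sum (my own illustrative sketch, not from either interpretation's literature): the complex amplitudes for the two ways of reaching a detector are added before squaring, so some outcomes get enhanced and others cancelled entirely, whereas classical probabilities would just add to 0.5 regardless of phase.

```python
import cmath, math

def detection_probability(phase_difference):
    """Probability of a detector click when a particle can arrive by two
    paths whose amplitudes differ by the given phase. Amplitudes are chosen
    so the probability stays between 0 and 1."""
    a1 = 0.5                                        # amplitude via path 1
    a2 = 0.5 * cmath.exp(1j * phase_difference)     # amplitude via path 2
    return abs(a1 + a2) ** 2                        # add first, square after

for phi in (0.0, math.pi / 2, math.pi):
    print(f"phase difference {phi:.2f} rad -> P = {detection_probability(phi):.2f}")
# 0.00 rad -> P = 1.00  (paths reinforce)
# 1.57 rad -> P = 0.50
# 3.14 rad -> P = 0.00  (this outcome is 'killed off' by interference)
```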
« Reply #423 on: 01/11/2011 05:47:55 »
So let's look at time symmetries again. Using my way of looking at it, radiation is a 'clock' giving you the same duration locally. That gives us a 'same' surface of durations. 'Time' as such is something more than that, though; we expect it to construct causality chains, don't we? The idea is that it does have a direction, and that by backtracking that direction we can draw conclusions, and build models, of what's going to happen in the future.

From entropy's side, it builds on everything seeming to have a direction from 'ordered' to 'unordered'. That things move from what we define as useful energy to 'unuseful' seems true enough. If you like, we could say that we have two states of 'evenness': one at the beginning, according to the Big Bang, and one at the 'end', according to entropy. That's also a definition where we expect a 'heat bath' for the cosmos, as I understand it, making 'heat', not 'energy' per se, the ultimate definition of a 'final stage'.
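A toy illustration of that 'ordered to unordered' direction, purely statistical and nothing deeper (a sketch of the standard textbook picture, with my own example numbers): particles free to hop between the two halves of a box overwhelmingly drift toward the even split, simply because that macrostate has vastly more microstates.

```python
import math, random

random.seed(1)
N = 100
side = [0] * N          # 0 = left half, 1 = right half; start fully 'ordered'

def entropy(n_left):
    """Boltzmann entropy S/kB = ln(number of microstates with n_left on the left)."""
    return math.log(math.comb(N, n_left))

for step in range(2001):
    if step % 500 == 0:
        n_left = side.count(0)
        print(f"step {step:4d}   particles on the left = {n_left:3d}   S/kB = {entropy(n_left):6.2f}")
    side[random.randrange(N)] = random.randrange(2)   # one particle hops at random
# The count drifts from 100 toward ~50 and stays there; running the 'film'
# backwards is not forbidden, just overwhelmingly unlikely.
```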

So is it true? That we can turn all 'interactions' around, and that they will then give us the 'same origin' we remember having when we 'started'? Nope, it doesn't seem to be true. CPT symmetry shows us that some interactions, K- and B-meson decays for example, are invariant under CPT but not under T (time) alone. It can also be shown with mesons that you can tell from their behaviour whether they are going 'forward' in time or 'backward'. The idea of all 'stuff' being able to be played 'backward' and 'forward' in 'time' is particularly expressively captured in so-called Feynman diagrams.

Some physicists, mostly those studying the quantum level as I get it, ignore this as an unfortunate exception to the rule. Then again, what is a 'history', and what are 'virtual particles'?

« Last Edit: 01/11/2011 05:56:57 by yor_on »
« Reply #424 on: 01/11/2011 08:21:59 »
From entropy's side all processes should be irreversible, in that they all cost some 'energy', mostly described as heat loss, as far as I've seen. So you can't really turn a process back to its original state. As I too often state, I don't know what 'energy' is, except something describing transformations. But somehow the idea of a universe where nothing gets lost must contain whatever disappears in a transformation. Which then points to 'heat', thermodynamically.

'Virtual particles' is a many-faceted idea that hurts my head. Over the years it seems to have gone from an idea to a 'fact', without any experimental evidence I know of. The idea of the Casimir effect proving it is, as far as I know, still disputed. I've seen the effect explained in terms of the matter involved, as well as in terms of 'virtual particles'.

"Virtual particles must not be considered real since they arise only in
a particular approach to high energy physics - perturbation theory
before renormalization - that does not even survive the modifications
needed to remove the infinities. Moreover, the virtual particle content
of a real state depends so much on the details of the computational
scheme (canonical or light front quantization, standard or
renormalization group enhances perturbation theory, etc.) that
calling virtual particles real would produce a very weird picture of
reality.

Whenever we observe a system we make a number of idealizations
that serve to identify the objects in reality with the
mathematical concepts we are using to describe them. Then we calculate
something, and at the end we retranslate it into reality. If our initial
initialization was good enough and our theory is good enough, the final
result will match reality well. Because of this idealization,
'real' real particles (moving in the universe) are slightly different
from 'mathematical' real particles (figuring in equations).


Modern quantum electrodynamics and other field theories are based on
the theory developed for modeling scattering events.
Scattering events take a very short time compared to the
lifetime of the objects involved before and after the event. Therefore,
we represent a prepared beam of particles hitting a target as a single
particle hitting another single particle, and whenever this in fact
happens, we observe end products, e.g. in a wire chamber.
Strictly speaking (i.e., in a fuller model of reality), we'd have to
use a multiparticle (statistical mechanics) setting, but this is never
done since it does not give better information and the added
complications are formidable.

As long as we prepare the particles long (compared to the scattering
time) before they scatter and observe them long enough afterwards,
they behave essentially as in and out states, respectively.
(They are not quite free, because of the electromagnetic self-field
they generate, this gives rise to the infrared problem in quantum
electrodynamics and can be corrected by using coherent states.)
The preparation and detection of the particles is outside this model,
since it would produce only minute corrections to the scattering event.
But to treat it would require to increase the system to include source
and detector, which makes the problem completely different.

Therefore at the level appropriate to a scattering event, the 'real'
real particles are modeled by 'mathematical' in/out states, which
therefore are also called 'real'. On the other hand, 'mathematical'
virtual particles have nothing to do with observations, hence have no
counterpart in reality; therefore they are called 'virtual'."

And

"Virtual particles are an artifact of perturbation theory that
give an intuitive (but if taken too far, misleading) interpretation
for Feynman diagrams. More precisely, a virtual photon, say,
is an internal photon line in one of the Feynman diagrams. But there
is nothing real associated with it. Detectable photons are never
virtual, but always real, 'dressed' photons.

Virtual particles, and the Feynman diagrams they appear in,
are just a visual tool of keeping track of the different terms
in a formal expansion of scattering amplitudes into multi-dimensional
integrals involving multiple propaators - the momenta of the virtual
particles represent the integration variables.
They have no meaning at all outside these integrals.
They get out of mathematical existence once one changes the
formula for computing a scattering amplitude.

Therefore virtual particles are essentially analogous to virtual
integers k obtained by computing
  log(1-x) = sum_k x^k/k
by expansion into a Taylor series. Since we can compute the
logarithm in many other ways, it is ridiculous to attach to
k any intrinsic meaning. But ...

... in QFT, we have no good ways to compute scattering amplitudes
without at least some form of expansion (unless we only use the
lowest order of some approximation method), which makes
virtual particles look a little more real. But the analogy
to the Taylor series shows that it's best not to look at them
that way."

By Arnold Neumaier, 'How real are virtual particles?'
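Just to see that closing analogy with numbers (a minimal sketch, with x = 0.5 as an arbitrary example): the partial sums of the Taylor series converge to log(1-x), but nothing about the final value depends on the individual terms, which is exactly the status Neumaier gives the virtual particles.

```python
import math

x = 0.5
exact = math.log(1.0 - x)            # the answer, computed without any series

partial = 0.0
for k in range(1, 21):
    partial -= x ** k / k            # log(1-x) = -sum_{k>=1} x^k / k
    if k in (1, 2, 5, 10, 20):
        print(f"{k:2d} terms: {partial:+.10f}   exact: {exact:+.10f}")
# The individual k's ('virtual integers') are bookkeeping for one particular
# way of getting the number; the number itself doesn't care about them.
```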
« Reply #425 on: 03/11/2011 04:56:09 »
So, I've been looking at Smolin's, Laurent Freidel's, Jerzy Kowalski-Glikman's and Giovanni Amelino-Camelia's ideas of a curved phase space. At first it seems very natural: the idea is that we have SpaceTime with its four dimensions, 'distances' 3 and 'time' 1. Then you combine that with a four-momentum vector space, and voila, we have ourselves an eight-dimensional universe, combining the best from QM (momentum, as in radiation hitting your retina) with Einstein's SpaceTime.

Time for some quotes.

"In special relativity, four-momentum is the generalization of the classical three-dimensional momentum to four-dimensional spacetime. Momentum is a vector in three dimensions; similarly four-momentum is a four-vector in spacetime."

"In the literature of relativity, space-time coordinates and the energy/momentum of a particle are often expressed in four-vector form. They are defined so that the length of a four-vector is invariant under a coordinate transformation. This invariance is associated with physical ideas.

The invariance of the space-time four-vector is associated with the fact that the speed of light is a constant. The invariance of the energy-momentum four-vector is associated with the fact that the rest mass of a particle is invariant under coordinate transformations."
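That second quoted claim, that the energy-momentum four-vector has an invariant 'length' fixed by the rest mass, is easy to check numerically. A small sketch, assuming an electron as the example particle and boosts along one axis (the speeds are my own arbitrary choices):

```python
import math

c = 299_792_458.0   # speed of light, m/s
m = 9.109e-31       # electron rest mass, kg (example particle)

def four_momentum(v):
    """Return (E/c, p_x) for the particle moving at speed v along x."""
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    return gamma * m * c, gamma * m * v

def boost(E_over_c, p, u):
    """Lorentz-boost (E/c, p_x) into a frame moving at speed u along x."""
    gamma = 1.0 / math.sqrt(1.0 - (u / c) ** 2)
    beta = u / c
    return gamma * (E_over_c - beta * p), gamma * (p - beta * E_over_c)

E_c, p = four_momentum(0.6 * c)            # particle at 0.6c in the lab frame
for u in (0.0, 0.3 * c, -0.8 * c):         # look at it from different frames
    E_c2, p2 = boost(E_c, p, u)
    print(f"frame u = {u/c:+.1f}c : (E/c)^2 - p^2 = {E_c2**2 - p2**2:.6e}")
print(f"(m*c)^2            = {(m * c) ** 2:.6e}   # same in every frame")
```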

So we have the invariant 'energy' belonging to the momentum's 'rest frame', like the bullet resting in the chamber before being fired, and then we have the dimensions that bullet exists in. So phase space could be seen as all possible values of its position and momentum variables. And that fits very well with SpaceTime.

Then we come to what is different with this momentum space. Smolin started to wonder what would happen to the Lorentz transformations we expect to steer SpaceTime, allowing us to define the universe conceptually as a 'whole' following the invariance of 'c' in, and from, all frames possible according to GR, if he treated this momentum space as 'curved'. They found that the Lorentz transformations lose their coherence, and so instead become extremely local. But when including all eight 'dimensions' you will still find the Lorentz transformations to work, as I understand this, that is.

But then we have 'time', or the arrow. In QM you don't treat the arrow the same way as we do macroscopically; at very small scales it becomes indeterministic (HUP) and 'time reversible', meaning that probability steps in, instead of the timely arrow we see. So to get it to work there will be a need to join a macroscopic arrow to QM, as I see it. That doesn't mean I don't like the idea; it's close to how I see SpaceTime and 'c', as a primarily local phenomenon, with radiation and gravity presenting us with the 'unified' SpaceTime we describe macroscopically.

« Last Edit: 03/11/2011 04:59:59 by yor_on »
« Reply #426 on: 03/11/2011 05:12:58 »
The question is:

What is 'time' and its 'arrow'? Why do we have it macroscopically, but find it replaced by probability at small scales? And what the he* is HUP? How can it restrict us from knowing all possible outcomes? HUP comes in at an earlier stage than the Planck scale, so you can't really tie it to 'c' taking one Planck length in one Planck time; that is rather where physics breaks down. But before that stage we find HUP.
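To put the two scales in the same picture (a rough numeric sketch; the electron confined to 1 nm is just my own example), the HUP bound is already large at nanometre scales, roughly 26 orders of magnitude above the Planck length where physics itself gives out:

```python
import math

hbar = 1.055e-34    # reduced Planck constant, J*s
G    = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8      # speed of light, m/s
m_e  = 9.109e-31    # electron mass, kg

l_planck = math.sqrt(hbar * G / c ** 3)
t_planck = math.sqrt(hbar * G / c ** 5)
print(f"Planck length = {l_planck:.2e} m,  Planck time = {t_planck:.2e} s")
print(f"l_P / t_P     = {l_planck / t_planck:.3e} m/s  (i.e. c: one Planck length per Planck time)")

dx = 1e-9                        # confine an electron to a 1 nm region (example)
dp = hbar / (2 * dx)             # minimum momentum spread from dx*dp >= hbar/2
print(f"dx = {dx:.0e} m  ->  dp >= {dp:.2e} kg*m/s   (dv >= {dp / m_e:.2e} m/s)")
```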
« Reply #427 on: 03/11/2011 05:23:40 »
The main reason why I won't exclude time and its arrow from QM is that all measurements are done within time's arrow. There is no way around this: everything you do has a time coordinate 'ticking away', moving you positionally inside SpaceTime even if you never 'move' at all. That is the way I see SpaceTime. The single definition at which 'time's arrow' loses its coherence, that I know of, is at the Planck scale, 'c' moving one Planck length in one Planck time.
« Last Edit: 03/11/2011 05:27:13 by yor_on »
« Reply #428 on: 03/11/2011 06:46:24 »
So let's use that. How about restricting 'time reversibility' to single particles only? That would mean the reason we don't see time go backwards macroscopically is that those particles interact and so disturb the 'time <- -> symmetry'. A particle here would then be at most an atom, no bigger.

I might have liked it more if I could define it at Planck size solely, but then again, if I use 'c' as my measure of an ideal 'clock' there can be no reversibility there, as that Planck length can be seen as 'frozen in time'. Well, as I see it.

But then we meet HUP again, don't we?
=

What I mean here, rereading myself, is that if we set up an 'ideal experiment', with single particles representing all objects, you might find a 'time reversibility', but only there. Still, it doesn't help that much, does it? As everything, at least classically, is defined as 'interactions' in a causality chain, involving just those atoms interacting. A very tricky one. But what about superpositions etc.? Do they fit a simple causality chain?

« Last Edit: 03/11/2011 07:05:29 by yor_on »
« Reply #429 on: 15/11/2011 23:15:50 »
Read this first: Interaction Free Measurements.

This one sort of hurts my head, but it's about the collapse of the wave function, as I think of it, about entanglement, and above all about 'photons'.

I like to think that it is what you have around you that defines you, sort of :) And so it should be with particles too: they get defined by what is 'around them'. But entanglement is weird. Either you belong to the school that sees it as FTL in some weird manner, or to the school where it is 'instantaneous'. And then you have those who define an entanglement as undefined even after measuring 'A', meaning that it is just as valid to state that your later measurement on 'B' sets that 'wave function', the idea involving a whole 'system', not only your first measurement. Add to that the question of whether you can, or cannot, impart 'information', which opens the can of worms of what 'information' should be defined as if you could, for example, impart 'energy' at 'A' that could then also be 'found' and lifted out at 'B'.

So, where does that wave function collapse? That one is very interesting.

I tried to look at it from a definition where the 'circumstances' define the outcome. In such a scenario I made some assumptions: that 'time's arrow' doesn't go two ways, it goes one, the one we experience. That should mean that, as we split 'the arrow' into smaller chunks, we come to 'instants' around the Planck scale. So maybe one could take any experiment and 'split' it into such chunks?

Or maybe just ignore time? If I think of it as 'instants', and they are defined at Planck size, then what defines an experiment must be what is 'closest' to it, as a vague idea? That those, in their turn, are influenced by what is closest to them can be ignored for this, I think. And it must include all 'interactions', making everything that 'interacts' into 'observers'; no 'consciousness' needed for that.

I'm not sure about it at all though, it's more of a feeling than anything I can prove.

« Reply #430 on: 15/11/2011 23:27:41 »
Also it has to do with how to define a distance. If Lorentz contractions exist and are real to the local observer, then 'distance' is a vague description from a 'global perspective / a whole SpaceTime'. In my definition a 'distance' is as real as you measure it to be, although it differs between observers. But it still tells us something: that there are no certain 'distances' except from a strictly local perspective.
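The 'as real as you measure it to be' point can be put in numbers with the standard contraction formula L = L0 / gamma (a minimal sketch; the one-metre rod and the speeds are just example values): every observer gets a perfectly definite distance, they just don't get the same one.

```python
import math

def contracted_length(proper_length, v_over_c):
    """Length of a rod of proper length L0, as measured by an observer
    moving at speed v relative to it: L = L0 / gamma."""
    gamma = 1.0 / math.sqrt(1.0 - v_over_c ** 2)
    return proper_length / gamma

L0 = 1.0   # one metre in the rod's own rest frame
for beta in (0.0, 0.5, 0.9, 0.99, 0.999):
    print(f"relative speed {beta:5.3f} c -> measured length = {contracted_length(L0, beta):.4f} m")
```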

So where does one frame of reference 'end' and the next one 'start'? Planck size, as an idealised definition (in my thoughts).

So, how many Planck sizes is an experiment, and what do they 'see'? What communicates interactions should be 'light', but 'light' will also be the 'local clock' for, and in, each 'instant'. I don't know how to put this together, but there should be some way.

Hopefully :)
« Reply #431 on: 16/11/2011 09:57:53 »
So where did that wave function collapse disappear to in the 'Interaction Free Measurement'? Can you argue that it exists if you can use statistics to define it? It's another circular argument to me. In this case (if it works) you will 'know' the final state without observing that last annihilation, and even though you can't 'prove it' without your final measurement, you can still do a million experiments and run statistics on them. Can I expect those statistics to define it otherwise?

So, what is a wave collapse, a superposition, and statistics? If I define it from statistics there can't be any doubt about the outcome in this one, can there?
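On the 'do a million experiments and run statistics' idea, here is a minimal Monte Carlo sketch of the Elitzur-Vaidman style setup the linked article is about. The detector names and the 50/25/25 split follow the standard textbook treatment of a balanced Mach-Zehnder interferometer with an absorber in one arm; the simulation itself is only my own illustration.

```python
import random

random.seed(0)
N = 1_000_000    # the 'million experiments'

def run_once(bomb_present):
    """One photon through a balanced Mach-Zehnder interferometer.
    Empty interferometer: interference sends every photon to the bright
    detector D1.  With an absorbing 'bomb' in one arm the interference is
    destroyed: 50% absorbed, 25% reach D1, 25% reach the dark detector D2."""
    if not bomb_present:
        return "D1"
    if random.random() < 0.5:       # photon took the blocked arm
        return "absorbed"
    return "D1" if random.random() < 0.5 else "D2"   # free arm, then 50/50

for bomb in (False, True):
    counts = {"D1": 0, "D2": 0, "absorbed": 0}
    for _ in range(N):
        counts[run_once(bomb)] += 1
    print(f"bomb present = {bomb}:  " +
          "  ".join(f"{name} = {count / N:.3f}" for name, count in counts.items()))
# D2 only ever clicks when the bomb is there, even though those photons never
# touched it -- the statistics over many runs is what 'defines' the outcome.
```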
« Reply #432 on: 26/11/2011 04:37:12 »
Just a question: how does QM treat the expansion?

If I assume that everything boils down to 'energy' and 'quanta', what happens as 'space' expands? The 'energy' gets redistributed, I presume, but if it is seen as 'quanta', alternatively 'bosons', indeterministic or not: where do they come from?

Or does QM allow for 'empty space'?

Would indeterminacy be an answer to that one?
Maybe?
===

There is one more thing linked to this one: the idea of a universe where nothing gets lost, only transformed. If I assume quanta, and then assume a 'size', I will need an explanation of why the universe can expand, and also of how it 'fills up the holes' created by the 'expansion'. If I assume 'fields' I don't need quanta of any size, but I will still need to see how they 'expand', and from where that 'expansion' borrows its 'energy'. This is assuming the idea of 'space' as some sort of dormant pool of 'energy', whether 'non-interacting' or at some balance from which we define levels both over and under it, as I understand the Higgs field to be?

For example, assuming Higgs bosons, do there come to be more of them as the expansion grows?
From where?

In a closed universe all redistribution of 'energy' should leave a mark somewhere, it seems to me.
==

That's actually one of the advantages of describing SpaceTime as a geometry, as I see it: I don't have to answer those questions. But if you put your trust in 'discrete bits / events' then it should have a relevance. Also, you will need to define it as background independent to get away from the purely geometric definition. If you don't, it seems to me as if you have just painted quanta on an empty space?
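For what it's worth, the standard general-relativistic bookkeeping doesn't track a conserved global 'energy of space' at all; it only tracks how each component's energy density scales with the expansion. A minimal sketch of those textbook scaling laws, in arbitrary units (my own illustration, not an answer to the deeper question above):

```python
def radiation_energy_density(rho0, a):
    """Photons: number diluted as 1/a^3, and each photon's energy E = h*c/lambda
    redshifted by a further 1/a as its wavelength is stretched."""
    return rho0 / a ** 4

def matter_energy_density(rho0, a):
    """Rest-mass energy: only the 1/a^3 dilution."""
    return rho0 / a ** 3

rho0 = 1.0                     # energy density today, arbitrary units
for a in (1.0, 2.0, 10.0):     # scale factor relative to today
    print(f"a = {a:4.1f}:  radiation = {radiation_energy_density(rho0, a):.4f},"
          f"  matter = {matter_energy_density(rho0, a):.4f}")
```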
« Last Edit: 26/11/2011 19:59:15 by yor_on »
« Reply #433 on: 27/11/2011 22:10:38 »
Here's a recent paper discussing quantum states, 'wave collapses' and their definitions.

Einstein, incompleteness, and the epistemic view of quantum states. 

To get another view, and perhaps make the paper a little easier to digest, you should first read Can the quantum state be interpreted statistically? by Matt Leifer.

I found it when I started to wonder what a 'wave collapse' really means. It's a first step towards seeing what the he* an entanglement might be, as I see it. To start from entanglement directly may seem faster, but I doubt it.
« Reply #434 on: 28/11/2011 02:05:41 »
How about this :)

Assume that for any given volume there can be only so many states existing (at any given point/instant, and always as defined relative to you and your subsequent measurement). Assume that there will always be some doubt about knowing all the data describing an experiment you do while you are inside this volume. Where, then, is all the data 'known', and from where is it accessible to you?

To me it seems that the volume could be said to 'know', but also that it won't be accessible to you while experimenting.

==

Had to add, not that it made it watertight :)
« Last Edit: 28/11/2011 02:21:00 by yor_on »
« Reply #435 on: 28/11/2011 03:11:57 »
Also assume that what we see is what exists, meaning that the hands you write with actually are there, and will be there for any observer, even though observers might not agree on the time or place they saw them (now, that came out weird :).

If that is true then a grain of sand always will 'exist' for all observers, no matter who observes it, or how.

That seems plausible, doesn't it?

Then we come to this impossibility of knowing all data. That's also chaos theory to me, where a small initial input may have great and unforeseen effects. But there is also a periodicity to chaos, as the Feigenbaum constants show us. All of this is macroscopic, not QM.
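The 'small initial input, great unforeseen effects' point, and the Feigenbaum periodicity, both show up in the logistic map, which is about the simplest demo there is (a sketch with my own example parameters; the period-doubling route into chaos is what the Feigenbaum constant ~4.669 governs):

```python
r = 3.9                          # well inside the chaotic regime of x -> r*x*(1-x)
x, y = 0.400000000, 0.400000001  # two starting points differing by one part in 10^9

for n in range(1, 61):
    x, y = r * x * (1 - x), r * y * (1 - y)
    if n % 10 == 0:
        print(f"step {n:2d}:  x = {x:.6f}  y = {y:.6f}  |x - y| = {abs(x - y):.2e}")
# After a few dozen steps the two runs are completely uncorrelated: the tiny
# initial difference has been amplified to order one (sensitive dependence).
```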
« Reply #436 on: 28/11/2011 18:36:19 »
"Assume that for any given volume there can be only so many states existing (at any given point/instant and always as defined relative you, and your subsequent measurement.)"

There are two ways, or three, to look at this.

1. The 'states' exist, just as that grain of sand does.
2. The 'states' are a probabilistic definition of possibilities.
3. The 'states' get defined by what surrounds them.

2 and 3 seem to me to be possible to join, if one likes. And actually you should be able to join 1 and 3 too, if we define it through the outcome?
=

What I was thinking with 1 and 3 is that if we define everything from where and how we find it to be in a measurement, then 1 is correct, or at least as correct as can be, in that nothing will exist without getting its definitions from what surrounds it. If that were possible, then the state you measure will always be there, at your measurement. That you would find another state measuring it at some other point, and in some other way, doesn't negate the fact that it always 'exists', even though not the same. Although it isn't the 'grain of sand' in the same way as it would be macroscopically, so the metaphor isn't that good.

But the point with 1 would be, ignoring the grain, that a 'state' would still be existent at all times, even though differently expressed. Not probability per se, although that should still be the tool of choice for defining that state without measuring.
« Last Edit: 28/11/2011 21:28:06 by yor_on »
« Reply #437 on: 28/11/2011 21:43:05 »
It's also a question of how 'identical' particles and photons are. The simplest way to disprove 3 should be to assume that they are 'indistinguishable' and then measure photons from the exact same source in the exact same circumstances. But that will open other cans of worms, like whether I can expect them to have the same spin originally too? If I define them as identical I would expect all properties to be the same. Also, HUP will step into it, making all definitions difficult, as there is a dependence on how you decide to set up the experiment. And then we have 'time': how identical can two objects be when separated in time?

There should be some way though.
« Reply #438 on: 28/11/2011 21:52:01 »
The thing is, if I follow Einstein's definitions of SpaceTime, then we have an arrow of time. That arrow combined with the other three 'dimensions' defines SpaceTime. So from that perspective nothing can be exactly the same, or 'identical', if separated by time. And it doesn't even help if I could assume all photons to have the same 'spin' as defined through the source. They will still not be exactly the same, as I understand it.

Because SpaceTime isn't about three dimensions and 'time', it's about all four entwined.
« Reply #439 on: 28/11/2011 22:02:49 »
And that takes us to the idea of 'locality': 'c' always being 'c' when measured locally, in an accelerated frame of reference or a uniformly moving one.

If that is correct then any measurement done will be your definition of 'reality'. That I can do the same experiment and get the same outcome will then depend on us sharing a same 'ground state' defined by locality. In that way you might want to call it a 'global (though always done locally) phenomenon'. But we also know that other observers, observing your experiment from other 'frames of reference', can get other definitions than your own. It won't stop them from confirming your experiment 'when repeated locally' though. So they are 'repeatable'.
