https://en.m.wikipedia.org/wiki/Perturbation_theory_(quantum_mechanics)

Quote:

"Time-dependent perturbation theory, developed by Paul Dirac, studies the effect of a time-dependent perturbation V(t) applied to a time-independent Hamiltonian H0.

Since the perturbed Hamiltonian is time-dependent, so are its energy levels and eigenstates. Thus, the goals of time-dependent perturbation theory are slightly different from time-independent perturbation theory. One is interested in the following quantities:

The time-dependent expectation value of some observable A, for a given initial state. The time-dependent amplitudes of those quantum states that are energy eigenkets (eigenvectors) in the unperturbed system.

The first quantity is important because it gives rise to the classical result of an A measurement performed on a macroscopic number of copies of the perturbed system. For example, we could take A to be the displacement in the x-direction of the electron in a hydrogen atom, in which case the expected value, when multiplied by an appropriate coefficient, gives the time-dependent dielectric polarization of a hydrogen gas. With an appropriate choice of perturbation (i.e. an oscillating electric potential), this allows one to calculate the AC permittivity of the gas.

The second quantity looks at the time-dependent probability of occupation for each eigenstate. This is particularly useful in laser physics, where one is interested in the populations of different atomic states in a gas when a time-dependent electric field is applied. These probabilities are also useful for calculating the "quantum broadening" of spectral lines (see line broadening) and particle decay in particle physics and nuclear physics."

Unquote:
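As a sanity check on the second of those quantities, here is a minimal numerical sketch of the standard first-order result for the transition amplitude, c_f(t) = -(i/hbar) times the integral of V_fi exp(i omega_fi t') dt'. It assumes a two-level system with a constant perturbation switched on at t = 0, with hbar = 1; all the numbers are my own illustrative choices, not anything from the quote:

```python
import numpy as np

# Illustrative (made-up) parameters for a two-level system, hbar = 1
hbar = 1.0
V_fi = 0.05      # matrix element <f|V|i> of a constant perturbation switched on at t = 0
omega_fi = 2.0   # transition frequency (E_f - E_i) / hbar
t = 3.0

# First-order amplitude: c_f(t) = -(i/hbar) * integral_0^t V_fi * exp(i*omega_fi*t') dt'
ts = np.linspace(0.0, t, 20001)
integrand = V_fi * np.exp(1j * omega_fi * ts)
dt = ts[1] - ts[0]
integral = dt * (integrand[0] / 2 + integrand[1:-1].sum() + integrand[-1] / 2)  # trapezoid rule
p_numeric = abs((-1j / hbar) * integral) ** 2

# Closed form of the same first-order transition probability
p_exact = (V_fi / hbar) ** 2 * (np.sin(omega_fi * t / 2) / (omega_fi / 2)) ** 2
```

The numerical integral and the closed form agree, which is all the sketch is meant to show: the "time-dependent amplitude" in the quote is a concrete, computable quantity.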

Everett's many-worlds interpretation treats these alternate realities as eigenvectors of an imaginary phase space, i.e. Schrödinger's box full of particles. When a quantum computer is factorising a 250-digit number, it does so by operating in a superposition of roughly 10^500 states. It is usual to think in terms of phase space, but Deutsch, building on Everett, deduces that the quantum computation is itself a real and physical thing, not just something imagined by mathematicians. The calculation involves, in this case, something like 10^500 real computers working together. Where are they? Deutsch makes the point forcefully:

Quote:

"To those who still cling to a single universe world view, I issue this challenge: explain how Shor's algorithm works... When Shor's algorithm has factorised a number, using 10^500 or so times the computational resources that can be seen to be present, where was the number factorised? There are only about 10^80 atoms in the entire visible universe, an utterly minuscule number compared to 10^500. So if the visible universe were the extent of physical reality, physical reality would not even remotely contain the resources required to factorise such a large number. Who did factorise it, then? How, and where, was the computation performed?"

Unquote:
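Deutsch's orders of magnitude are easy to check. A back-of-envelope sketch: a 250-digit number needs about 831 bits, and the textbook period-finding register in Shor's algorithm holds roughly 2n qubits for an n-bit input (the factor of 2 is a common textbook figure, not something from the quote), putting the register in a superposition over about 10^500 basis states:

```python
import math

digits = 250
n_bits = math.ceil(digits * math.log2(10))       # bits needed to hold a 250-digit number
register_qubits = 2 * n_bits                     # textbook size of Shor's period-finding register
states_log10 = register_qubits * math.log10(2)   # log10 of the number of basis states in superposition
atoms_log10 = 80                                 # ~10^80 atoms in the visible universe
shortfall = states_log10 - atoms_log10           # orders of magnitude by which the atoms fall short
```

The shortfall comes out to more than 400 orders of magnitude, which is the gap Deutsch is pointing at.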

My model answers this question, but first:

Quantum computing (in its NMR implementations) uses nuclear magnetic resonance, which is the same physics as MRI (imaging dropped the word "nuclear" because it scared patients), to record the results of the quantum calculations. This works because a) the computer is required to generate multiple copies of the same calculation, in case of error due to outside interference.

And b) because the results are recorded via NMR, which reads all the copies simultaneously as an ensemble average, so this does not collapse the wave function in the way that observing a single quantum calculation would.
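Point a) can be illustrated with a toy simulation. This is only a statistics sketch of how redundant copies suppress readout error, not a model of any real NMR hardware; the error rate and copy count are invented numbers:

```python
import random

random.seed(1)  # fixed seed so the sketch is reproducible

def noisy_run(true_bit, p_err=0.1):
    # One run of the "calculation": the result bit is flipped with probability p_err
    return true_bit ^ (random.random() < p_err)

def ensemble_readout(true_bit, copies=101, p_err=0.1):
    # Majority vote over many simultaneous copies of the same calculation
    ones = sum(noisy_run(true_bit, p_err) for _ in range(copies))
    return int(ones > copies // 2)

trials = 2000
wrong_single = sum(noisy_run(1) != 1 for _ in range(trials))          # single-copy error count
wrong_ensemble = sum(ensemble_readout(1) != 1 for _ in range(trials)) # ensemble error count
```

With a 10% per-run error rate, a single readout is wrong roughly 200 times in 2000 trials, while the 101-copy majority vote is essentially never wrong.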

The first published MRI image of a human was made in a superconducting 0.1 T magnet (I used to work for the inventor, and I met the patient). There were some systems that used the Earth's magnetic field as a polariser, but they were never more than curiosities. You need at least 0.2 T to get enough signal-to-noise ratio to produce a useful image before the patient dies of boredom.

All MRI systems use RF energy.

And they all use iterative 3D inverse-space reconstruction algorithms to produce the image, because that's the only way you can do it. Whilst the algorithms themselves are fun, proving that a new algorithm is indefinitely stable and uniquely convergent is a mathematical orgasm, way beyond a mere KISS.
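For standard Cartesian sampling, "inverse-space" (k-space) reconstruction reduces to an inverse Fourier transform; the iterative schemes come in for non-Cartesian or undersampled acquisitions. A minimal sketch of the Fourier step, using an invented rectangular phantom in place of real scanner data:

```python
import numpy as np

# Simple 2D "phantom" image standing in for the object being scanned
img = np.zeros((64, 64))
img[20:44, 24:40] = 1.0

# The scanner acquires the spatial-frequency (k-space) data of the object
kspace = np.fft.fft2(img)

# Reconstruction: inverse Fourier transform from k-space back to image space
recon = np.fft.ifft2(kspace).real
```

With full Cartesian k-space coverage the reconstruction recovers the phantom exactly; it is only when k-space is undersampled or non-uniformly sampled that the stability and convergence questions mentioned above arise.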

Quote:

"Perturbation theory is systematically used to generate root finding algorithms. Depending on the number of correction terms in the perturbation expansion and the number of Taylor expansion terms, different root finding formulas can be generated. The way of separating the resulting equations after the perturbation expansion alters the root-finding formulas also."

Unquote:
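The quoted idea can be made concrete: writing f(x + eps) = 0 and keeping the first-order term of the Taylor/perturbation expansion, f(x) + eps * f'(x) = 0, gives the correction eps = -f(x)/f'(x), which is exactly Newton's iteration; keeping second-order terms yields Halley-type formulas. A minimal sketch (the test function is my own choice):

```python
def newton(f, df, x, steps=8):
    # First-order perturbation expansion: f(x + eps) ~= f(x) + eps * f'(x) = 0,
    # so the correction term is eps = -f(x) / f'(x), applied repeatedly.
    for _ in range(steps):
        x = x - f(x) / df(x)
    return x

# Find sqrt(2) as the positive root of f(x) = x^2 - 2
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```

Changing how many expansion terms are kept, and how the resulting equations are separated, is what generates the different root-finding formulas the quote describes.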

So what we are seeing here is the use of perturbation theory both in the analysis of quantum computing and in the building of an NMR picture of the quantum results.

You may correct me, but I'm pretty certain that a time-dependent perturbation is required in both of these algorithms, as well as for calculating quantum probability.

I'm proposing that there has been a crossing of wires here. Probability suggests an outcome of a function, and Everett's proposal that each possible outcome exists in an alternative reality is misconceived. These possible outcomes are just that: possible outcomes based on input parameters.

It is the time-dependent perturbations of the calculations that are the true extent of a reality operating at a rate of time drastically slower than the standard second, via which all quantum calculations are made. Within this slower rate of time, relative to the standard second, there is room for many more events than it is possible to observe within the time frames of a standard second. This is why a single electron can occupy more than one position at the same time, or appear to have dematerialised and rematerialised at another position without apparently having travelled the space in between. The calculations of the probable outcome are valid, but if the time perturbations are replaced with the appropriate rate of time for the energy of the situation, the result is knowledge of both the position and the velocity of the electron, and probability becomes only a mathematical means of working out the most likely outcome of a future event, with probability trending towards the universal tendency for the course of least action (a concept you mentioned earlier).

So how and where does a quantum computer make its calculations? I am proposing that it makes them in the manner of ordinary computations, and that it does so in our universe only, at a much slower rate of time; our observation of this slower time is dependent on the observational time frame and proportional to the difference in rate of time between observation and observer.

Anyway, Alan, it would seem you are perhaps ignoring me, so if you no longer wish to take part in the discussion, just say so. I'll not bother you any further!