There is always a probability that sorts it out. Turn around and look at your chair; it's there now, right? And you expect it to be there tomorrow too.

That's probability for you: it seems there will always be some behaviors more probable than others. The question is how to find the parameters of that probability when looking at it from the quantum level.

Feynman defined it this way:

"Assume that a particle can travel between two points a and b by a - possibly infinite - number of different paths. Each one of these paths will have a certain probability associated with it. In quantum mechanical terms, these probabilities are encoded in the wavefunction that describes the particle, which assigns to each possible path a different probability amplitude; the square modulus of this amplitude gives the corresponding probability.

The crucial point is that these different amplitudes have a wavelike nature, and as they spread through space they interfere with each other, their respective wave patterns either reinforcing or canceling each other out at various points. And if you sum over all the amplitudes of all the different paths, i.e. you sum-over-histories, then the different amplitudes will reinforce or cancel each other in such a way that the only path that survives this interference process is the one that the particle actually follows.

Feynman's sum-over-histories approach is nowadays given the name of the (Feynman) path-integral formalism of quantum field theory, which is based upon the calculus of variations and the principle of least action. This is now the standard mathematical formalism used by most physicists working in quantum field theories, because it is extremely powerful, highly visual, and will often yield answers far more quickly and simply than other formalisms."
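The interference idea in the quote above can be made concrete with a toy calculation. This is a minimal sketch, not the full path-integral machinery: each path is reduced to a single phase, the amplitudes `exp(i * phase)` are summed, and only then is the modulus squared. The function name and the two-path setup are my own illustration, not anything from the quote.

```python
import cmath

def arrival_probability(phases):
    """Sum the complex amplitudes of all paths, then square the modulus.

    Each path contributes exp(i * phase); the probability of arrival is
    the squared modulus of the *summed* amplitude, not the sum of the
    individual probabilities. That difference is the interference.
    """
    total = sum(cmath.exp(1j * p) for p in phases)
    return abs(total) ** 2

# Two paths in phase: the amplitudes reinforce (constructive interference).
print(arrival_probability([0.0, 0.0]))       # 4.0, not 1 + 1 = 2

# Two paths half a cycle apart: the amplitudes cancel (destructive).
print(arrival_probability([0.0, cmath.pi]))  # ~0.0
```

With many paths, the ones whose phases vary rapidly between neighbors cancel out, and what survives the sum is the path of stationary phase, which is how the classical trajectory emerges from the sum-over-histories picture.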

That contrasts with the Copenhagen definition, in which the Heisenberg uncertainty principle (HUP) was seen as the principle defining the possibility/probability.

"An unobserved system, according to the Copenhagen interpretation of quantum theory, evolves in a deterministic way determined by a wave equation. An observed system changes in a random fashion, at the moment of observation, instantaneously, with the probability of any particular outcome given by the Born formula. This is known as the "collapse" or "reduction" of the wavefunction. The problems with this approach are:

(1) The collapse is an instantaneous process across an extended region ("non-local") which is non-relativistic.

(2) The idea of an observer having an effect on microphysics is repugnant to reductionism and smacks of a return to pre-scientific notions of vitalism. Copenhagenism is a return to the old vitalist notions that life is somehow different from other matter, operating by different laws from inanimate matter. The collapse is triggered by an observer, yet no definition of what an "observer" is, is available, in terms of an atomic-scale description, even in principle.

For these reasons the view has generally been adopted that the wavefunction associated with an object is not a real "thing", but merely represents our knowledge of the object. This approach was developed by Bohr and others, mainly at Copenhagen in the late 1920s. When we perform a measurement or observation of an object we acquire new information and so adjust the wavefunction, as we would boundary conditions in classical physics, to reflect this new information. This stance means that we can't answer questions about what's actually happening; all we can answer is what the probability of a particular result will be if we perform a measurement. This makes a lot of people very unhappy since it provides no model for the object."
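The Born formula mentioned in the quote is simple enough to show directly. A minimal sketch, with a made-up two-outcome state of my own choosing: the probability of each measurement outcome is the squared modulus of its amplitude in the normalized wavefunction.

```python
import math

# A made-up, unnormalized two-outcome state: amplitudes 1 and i.
state = [complex(1, 0), complex(0, 1)]

# Normalize so the squared moduli sum to 1, as the Born rule requires.
norm = math.sqrt(sum(abs(a) ** 2 for a in state))

# Born formula: probability of each outcome = |amplitude|^2.
probs = [abs(a / norm) ** 2 for a in state]

print(probs)  # each outcome close to 0.5, and the two sum to 1
```

Note that the phase of an amplitude (here, the `i` on the second outcome) drops out of each individual probability; it only matters when amplitudes are summed, which is exactly where the two pictures, collapse and sum-over-histories, meet.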

In the Copenhagen Interpretation, the wave and particle pictures of the atom are "complementary" to each other. That is, they are mutually exclusive, so in a final interaction there can be only one of them (as in Highlander :), but to describe the wave function before that final interaction you will need both.

==

Where the Copenhagen definition discusses the uncertainty, or 'superposition', of that wave function (being both wave and particle simultaneously, until an interaction defines it), Feynman used 'sum over paths' instead, treating it as if every path was taken by that wave function simultaneously. All of them are needed for the final outcome/interaction, since they act like waves do, in some cases reinforcing each other, in others quenching each other. The summation of all those amplitudes for that wave function leads you to the same answer as the older Copenhagen definition, but without involving 'uncertainty', as it is assumed that this will always give you the correct outcome.

But you might notice that instead of HUP's uncertainty you now have a wavefunction that 'must' take all paths for the solution to make sense. And choosing between two, ah, uncommon approaches to reality? I don't know; both seem to work and both describe 'reality'. Neither one is wrong. But Feynman's solution questions time in a different way than HUP does, it seems to me. I do like HUP myself, and Schrödinger, as they both describe a universe where we can't know it all. Feynman, by questioning time, presents his solution as more 'probable', as we then just have to redefine 'time's arrow' as something new, maybe some entropic quality. But as I trust that there actually is an 'arrow of time', the same for us all, as observed where we are? I don't know.