The probability postulate is given by a density matrix. The density matrix describes a situation more general than a state vector: a system that has been prepared, but we don't know exactly how. This is the first clue that randomness may not be something unique to nature, but rather a reflection of our ignorance of the processes involved.

If you know nothing about how a system was prepared, then the density matrix has all equal eigenvalues: it is proportional to the unit matrix, and the eigenvalues are 1/n, where n is the number of states. The density matrix is the analogue of a classical probability distribution.
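A quick numerical sketch of this minimal-knowledge state (using numpy; n = 3 is an arbitrary choice for illustration):

```python
import numpy as np

n = 3  # number of states (arbitrary choice for this sketch)
rho = np.eye(n) / n  # maximally mixed state: proportional to the unit matrix

eigenvalues = np.linalg.eigvalsh(rho)  # eigenvalues of a Hermitian matrix
print(eigenvalues)  # each eigenvalue equals 1/n
```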

**Tr**ρ = 1

This is a true statement from the postulates above: the density matrix is a Hermitian operator, so its eigenvalues λ_{i} (i = 1, 2, ..., n) are real, and the trace **Tr** is simply the sum of those eigenvalues, which must equal 1.
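A sketch of this trace property (the mixture weights and qubit states here are invented for illustration): any mixture of normalized states with probabilities summing to 1 gives **Tr**ρ = 1, equal to the sum of the real eigenvalues λ_{i}.

```python
import numpy as np

# Invented example: a 70/30 mixture of two normalized qubit states
psi1 = np.array([1.0, 0.0])
psi2 = np.array([1.0, 1.0]) / np.sqrt(2)
rho = 0.7 * np.outer(psi1, psi1.conj()) + 0.3 * np.outer(psi2, psi2.conj())

lam = np.linalg.eigvalsh(rho)  # real eigenvalues (rho is Hermitian)
print(np.trace(rho))  # ≈ 1
print(lam.sum())      # also ≈ 1: the trace is the sum of the eigenvalues
```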

Each eigenvalue λ_{i} has a corresponding eigenvector, and together these eigenvectors form an orthonormal basis. This is the minimal amount of information we can have about a system: we may know nothing about how it was prepared before measurement. There is, of course, a maximal-knowledge state as well; these states are called pure states.

A pure state is just the opposite of minimal knowledge: we know exactly how the system was prepared.

ρ = |ψ><ψ|

So what is the value here, with the density matrix? For a pure state, the density matrix is the projection onto that state. Acting on |ψ> itself, and using the fact that <ψ|ψ> = 1, we immediately see that ρ|ψ> = |ψ><ψ|ψ> = |ψ>, so |ψ> is an eigenvector of ρ with eigenvalue 1. Likewise, for any other vector φ orthogonal to ψ:

ρ|φ> = |ψ><ψ|φ> = 0
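A sketch of this projector property (ψ and φ here are just an illustrative orthonormal pair): ρ = |ψ><ψ| leaves |ψ> unchanged and annihilates anything orthogonal to it.

```python
import numpy as np

psi = np.array([1.0, 0.0])       # a normalized state
phi = np.array([0.0, 1.0])       # orthogonal to psi
rho = np.outer(psi, psi.conj())  # pure-state density matrix |psi><psi|

print(rho @ psi)  # returns psi: eigenvector with eigenvalue 1
print(rho @ phi)  # returns the zero vector: orthogonal states are annihilated
```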

Here, consider the equation:

M' = **Tr**ρM

The average (expectation value) M' of an observable M equals the trace of ρM.

**Tr**ρM = ∑_{i} <i|ρM|i>

which is the definition of the trace; note that we are summing over a complete basis of states |i>. For a pure state ρ = |ψ><ψ|, the above expression becomes:

M' = ∑_{i} <i|ψ><ψ|M|i>

We can now move the scalar <i|ψ> to the right-hand side of each term, so that the sum ∑_{i} |i><i| forms the unit operator (the identity matrix):

M' = ∑_{i} <ψ|M|i><i|ψ>

= <ψ|M|ψ>

That part simply disappears, because ∑_{i} |i><i| equals the identity, so we are left with: the average of M is the expectation value of M in the state ψ, which seems like quite an obvious result.
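A numerical check of this identity (the observable M and state ψ are invented for the example): the trace form **Tr**ρM agrees with <ψ|M|ψ> for a pure state.

```python
import numpy as np

psi = np.array([0.6, 0.8])               # a normalized pure state
rho = np.outer(psi, psi.conj())          # rho = |psi><psi|
M = np.array([[1.0, 2.0], [2.0, -1.0]])  # an arbitrary Hermitian observable

trace_form = np.trace(rho @ M)           # Tr(rho M)
bracket_form = psi.conj() @ M @ psi      # <psi|M|psi>
print(trace_form, bracket_form)          # the two agree
```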

Now take the calculation in the basis in which ρ is diagonal:

∑_{i} <i|ρM|i>

We expand the representation of our states by inserting a complete set of states |j>:

∑_{ij} <i|ρ|j> <j|M|i>

Since ρ is diagonal in this basis, <i|ρ|j> vanishes unless i = j, in which case it is simply λ_{j}; the double sum then collapses to terms of the form <j|M|j>, so we have finally:

= ∑_{j} λ_{j} <j|M|j>

which is the average of M weighted by the probability λ_{j} of finding the system in the j'th state.
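The diagonal-basis formula can be sketched numerically (the mixed state and observable below are invented for illustration): diagonalize ρ, then ∑_{j} λ_{j} <j|M|j> reproduces **Tr**ρM.

```python
import numpy as np

rho = np.array([[0.7, 0.1], [0.1, 0.3]])  # an invented valid density matrix
M = np.array([[2.0, 1.0], [1.0, 0.0]])    # an invented Hermitian observable

lam, vecs = np.linalg.eigh(rho)           # lam[j] = lambda_j, vecs[:, j] = |j>
diag_sum = sum(lam[j] * (vecs[:, j].conj() @ M @ vecs[:, j]) for j in range(2))
print(diag_sum, np.trace(rho @ M))        # both give the same average
```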

These are average values, probabilistic in nature. Let us take a quick look at entropy. You have:

S = -∑ P_{i} log P_{i}

If you have complete knowledge of the system then the entropy is zero. But if we know nothing about the make-up of our system, then P_{i} = 1/n for every state. Because of the minus sign, -log(1/n) is simply log n, so the entropy begins to look like:

S = ∑_{i} (1/n) log n

Summing over the n equal terms, we have:

S = log n

so entropy (the fundamental quantity for indeterministic systems) is given by the logarithm of the number of states n. But, as with all of the probability above, the apparent randomness of a system depended on the information the observer has about it. If this is true, we may have based indeterministic systems on a faulty premise: we assume a system is inherently random because we cannot ascertain its whole history, past and in many cases future. This should not imply randomness, but a lack of knowledge on our behalf. While many scientists will say quantum mechanics is genuinely random in systems like radiating atoms, the conjecture posed above suggests that this is not the case.
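The two limits above can be sketched numerically (n = 4 chosen arbitrarily): a pure state, where we have complete knowledge, gives zero entropy, while the minimal-knowledge distribution P_{i} = 1/n reaches the maximum, log n.

```python
import numpy as np

def entropy(p):
    """S = -sum p_i log p_i, with 0 log 0 taken as 0."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

n = 4
pure = np.array([1.0, 0.0, 0.0, 0.0])  # complete knowledge: one certain state
mixed = np.full(n, 1.0 / n)            # minimal knowledge: P_i = 1/n

print(entropy(pure))                   # zero entropy
print(entropy(mixed), np.log(n))       # both equal log n
```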

What are your thoughts on this matter?