I have been doing a little research, which is a bit of a contextual mess, but I wish to link some past geology with the ways of life of people who lived in England from the 1500s to the 1800s. In particular, I have been wondering about resources that so-called wise people of that time might have used to treat the sick. This brought my attention to the use of capital punishment in that period, including the execution of those convicted under the Witchcraft Act. Some sources say that Gallows Hill in Chester was used as an execution site from the mid-1500s to the 1800s, while others date its use from the 1600s.
As I understand it, Gallows Hill was used until the government moved executions inside prisons, where the condemned would be buried within the prison walls (until a later amendment by Parliament allowed prisoners' bodies to be sent home).
Concerning the pre-1900s, another source I read described prison conditions in Chester, and how the cellars in which the condemned were kept made it impossible for them either to lie down or to stand. The cellars sat a foot or so above ground level, and I thought this might be linked to another source that mentioned shallow soil with bedrock beneath it, through which workers could not drill.
1) I have no data on where those executed on Gallows Hill were laid to rest. How likely is it that, if not cremated, they were buried within the hill itself?
2) If the source about shallow soil in Chester is right, are all burial grounds there on hills?
3) Is there a way to tell, without testing or digging into the soil, whether a significant portion of land has bedrock close to ground level?
I think that herbal teas have a lot to offer in terms of medicinal research. Generally, though, I would advise caution over articles intended to hype interest in products, e.g., those derived from coconut, and over partially tested theories which suggest that one should consume a regular amount of something to cure or stave off illness.

One of the benefits of herbal teas that I found was not in the consumption, but in the external application of green tea and lemon, of a well-known brand. For at least three years, I had suffered from a painful skin ailment. I had tried lots of remedies to cure it; I even sought medical advice, but was ultimately told that there was no prescription, since the ailment was not the result of infection. I tried almost everything, from manuka honey to sea salt and bicarbonate of soda. But one day, I prepared myself some green tea infused with lemon and vitamin C. Instead of drinking it, I used it to bathe the affected area. The tea was very hot, though not boiling, and when it touched the area it began to sting while also easing the pain. After the first two days of this treatment there was noticeable change. In four days, the ailment was gone. That was around two years ago, and this unsightly and painful thing has never come back, so yay for green tea!
Could the pathogen Aspergillus fumigatus, a fungus commonly found on foods such as bread, be detoxified with a reagent in order to retain antigenicity? If so, could the resulting toxoid be used to preserve certain foods, without the need for other, harmful chemicals?
I should have said 'perhaps' FTL (due to a debate on hidden variables).
Synonymous, yes: I trawled briefly through quite a bit of data before I gathered those snippets. Given the tiny bit of knowledge I have on the subject of entanglement, I was led to believe that proof of an entangled pair can only exist as a thought experiment. This hypothesis is therefore based upon proofs gathered during such an experiment, where, before measurement, the entangled particles were allowed to share the same photon; how else, I figured, would I be able to imagine them as simultaneous? Because of the Uncertainty Principle, however, the act of measuring the pair could not be achieved simultaneously.
During the thought experiment, I found that whichever particle I decided to measure would require a photon different from the one used to imagine the entangled pair. If all proofs of entanglement can only be found through such an experiment, then perhaps my explanation can help solve the mystery of whether, and why, the effect of measuring either particle is instant: the change of photon was the measurement process itself, and I could not find a way to separate the two. The measurement process was the disentanglement of the pair! If both statements are true, then perhaps disentanglement occurs instantly, without the involvement of hidden variables.
I've read a few snippets on the web recently concerning quantum entanglement, and whether data passed from one particle to another travels instantly, faster than the speed of light.
I came across these questions because I had been going over some of my old material, and perhaps they helped me to work out, for my own hypothesis at least, how the instant and seemingly faster-than-light transfer of data comes about.
What if the measurement of either particle in an entangled state were synonymous with the separation of one from the other, synonymous with the disentanglement of A from B? What if, before measurement, both particles were only allowed to share the same state because each shared the same photon? Common sense then tells an observer that, for entangled particles positioned a great distance apart, the measurement process would require each particle to have its own photon. If this were to happen, the particles would no longer be in a coupled state. During the measurement process, one particle becomes disentangled from the original photon in a way that is synonymous with measurement. To me, this suggests that information may not be transferred in the way one would typically expect. For instance, it seems to avoid the problem of how data could be transferred faster than the speed of light: the separation of the particles, because one becomes entangled with a different photon, is a way of transferring the data that may not require hidden variables.
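Whatever picture one prefers, the standard statistical predictions can be checked numerically. Below is a minimal sketch (my own illustration, not anyone's established method) that samples joint measurements on a spin-singlet pair using the textbook rule P(opposite outcomes) = cos²((a−b)/2). It shows the two sides strongly anti-correlated, while A's local statistics stay fifty-fifty no matter what angle B chooses, which is the usual reason no faster-than-light data transfer is implied.

```python
import math
import random

def measure_singlet_pair(angle_a, angle_b, rng):
    """Sample one joint measurement on a spin-singlet pair.

    A's outcome is uniformly +/-1; B's outcome is opposite to A's
    with probability cos^2((a-b)/2), which reproduces the quantum
    correlation E(a, b) = -cos(a - b).
    """
    a = rng.choice([+1, -1])
    if rng.random() < math.cos((angle_a - angle_b) / 2) ** 2:
        b = -a
    else:
        b = a
    return a, b

rng = random.Random(0)
n = 200_000
angle_a, angle_b = 0.0, math.pi / 3  # arbitrary analyser settings

results = [measure_singlet_pair(angle_a, angle_b, rng) for _ in range(n)]
corr = sum(a * b for a, b in results) / n
marginal_a = sum(a for a, _ in results) / n

print(f"E(a,b) ~ {corr:.3f} (quantum prediction {-math.cos(angle_a - angle_b):.3f})")
print(f"<A> ~ {marginal_a:.3f} (A's average stays ~0 regardless of B's angle)")
```

The point of the sketch is the second printed line: however strong the correlation, nothing B does shifts A's local average, so the correlation alone carries no usable signal.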
It was not word-perfect, and I am looking for fault in this. I meant that any arbitrary measurement of a system is equivalent to having mimicked the photon, and that this is perhaps how I should go about the measurement process, to determine whether the act of measuring particle A would have an effect on particle B. I propose that it would, through a series of arbitrary measurements. Assume I know nothing about particle A or B; causal relations would then only become significant as I learn more about A, through a series of arbitrary measurements. Here, I hoped to find that the act of measurement on A would have an instantaneous effect on B, and that the effect would not occur if the measurement were not arbitrary, since a non-arbitrary measurement would only be correct within a certain average.
One cannot expect that a direct measurement of one particle in a coupled state would bring about changes in the other particle. But perhaps, in some ways, those changes can be achieved. Perhaps, in order for a measurement of particle A to have any effect on particle B, the measuring apparatus must mimic the photon. The reason is that each particle, A and B, has potential; not all of these potentials will be realized in a single time-frame. The photon, therefore, possesses the most potential: it has to be ready to serve the observer data about particle A's behaviour. So, rather than beginning with arbitrary measurements of a particle, one begins with a series of arbitrary measurements of the photon. Once arbitrary measurements of the photon have been made, the result should be a series of potentialities pertaining to the photon, and therefore to particles A and B. Throughout this measurement process, information acquired about particle A directly affects particle B. It is unlikely one would gain a classical measurement, though at this stage not impossible. The more arbitrary measurements are made on the photon, the closer one gets to a full description of both particles. That is to say, an observer at particle A, making a measurement, would have no effect at all on particle B, because of a restriction in potentialities.
Unless the machine had a mind of its own, the machine itself could be hacked. One would also need a clear definition of what hacking is, and of how it is best not used in the public interest. For example, I half-read something today about a hacking group which, unsatisfied with the way the authorities were dealing with things, is going to solve a murder mystery with its hacking skills. Such software may create a temporary loophole for third parties to block consent for the sharing of data. It is the definition which is the problem, I think; because if the software could not be manipulated, it would have a mind of its own.
A less arbitrary interpretation of hidden variables takes the hidden variables themselves less into account, in order to grasp the local aspect further. Suppose the actual entanglement is between particles and photons, and that the relation of a supposedly entangled pair of particles has to do with their individual negative or positive relation to the photon, rather than with each other. I mean: take an object and suppose that the only reason one is so far unable to verify the relation between the local side (the observable) and the non-local side (the other side of the object, at the other end of the universe) is that the light only allows us to see the local half. We are assuming that the object, which includes entangled photons, takes up the whole of space. The local half of the object has particles at a distance the length of the observable. These behave as entangled pairs at an infinite distance apart, but they do not occupy the whole of space and therefore differ from the previous interpretation. In this interpretation, the only real entangled pairs are a point on the local side and a point on the non-local side. Note, however, that these are not an entangled pair in the usual sense: one of the particles shares a positive relation to any single photon present at an angle, while the other particle shares a negative relation. For instance, if one observes a point at one angle, then the entanglement of the point and the photon are related by the angle at which the photon allows us to observe the point and the angle at which the point is seen. Concerning degeneracy, one could describe one entangled point at the cusp of observability as having a +1 relation to the photon, and the point on the other side of the object as having a -1 relation. All points may then be described as having a consequent positive or negative relation to the photon. That the object takes up the whole of space could then be confirmed via a measurement of such relations.
The difference between describing the entanglement of particles, rather than the entanglement of particle and photon, is that one could not then describe a local and non-local system, because photons would be classed as separate from the system, and therefore the system could not take up the whole of space.
Robert Frost is by far my favourite poet. I know a few of his works by heart and recite them during my trips to the woods. Personally, I would put him on a par with Albert Einstein and Emily Dickinson; a lot of my own science work was inspired by these three. As for the poem that begins "Two roads diverged in a yellow wood", I dare not hold it to a general concept. The pictures he paints in his lines are so beautiful and numerous that I fear a general concept would not do them justice. However, one of my favourite stanzas is: "And both that morning equally lay / In leaves no step had trodden black. / Oh, I kept the first for another day! / Yet knowing how way leads on to way, / I doubted if I should ever come back." Probably only in my mind, I interpret the equally pure, untrodden roads, and the doubt that he should ever come back, as Frost's realization of finite infinity: that he is forever somewhere, but always too far down the road of time to return. That doesn't really do it justice, either. Of course, there are hints that the untrodden road could be black, and the reader gets the best of both worlds. Too many concepts. Love the thread.
I don't think there is any clear evidence, but I think there are some strong analogies between the nature of the mind and physical reality. On the subject of the degeneracy of hidden variables: Einstein's theory of relativity, group theory and the Fourier transform are not disturbed by my interpretation. I would also point out that the true test of such a hypothesis is whether the system is able to replicate an unspecified observable, and again I see no flaws relating to this, nor to Einstein's theory of light, group theory or the Fourier transform.
In order to obtain more accurate results, the measuring apparatus should be as close to the oscillations as possible. The limit on how close the measuring apparatus can get is a limit on perception. The oscillations one cannot account for degenerate into the observable system.
I always entertain the idea of the stubborn push/pull of psychological and physical reality. Psychological reality is physical; and on the other hand, the data one extracts from the universe becomes one's individual reality, to the extent that it is a universe in its own right. I believe that if one does not entertain the idea of both forms, but rather takes reality to include mainly that which one cannot see, then the argument seems arbitrary; still, I have made my point. Reality, whether or not the individual believes it to be made up mainly of that which one cannot see, is a concept, and as such belongs to the perceptual domain.
When thinking of a simulated reality, one's first reaction might be to reject the idea as something of a fictional foundation. People believe many things on principle; these are the characteristics given to us by parents, then by society, as we move into some system of belief, be that Christianity, atheism or science. But to reject the idea that we live in a simulated world outright is much too simple. For reality may well be a computer translation, even if that reality never began until humans invented computers! The idea is not a new one. According to the roboticist Hans Moravec, the universe may be perceived as the existence of a simulation; this philosophical hypothesis was first published in 1998. But while the idea that we live in a simulated world is exciting, the subject has been a focus of scepticism. Those who are sceptical imagine some kind of sci-fi scene, but in truth the hypothesis is just a way of looking at a system that standard physics attempts to describe. When we play a video game, we know there's a difference between what's happening on-screen and what's happening off-screen. But while everything we perceive has been picked up by the senses, we may be fooling ourselves if we exclude the possibility of simulated reality on the basis of a single argument. It's possible to create computer simulations; and unless, while doing so, we have become separate from the universe, then it's possible for the universe to create computer simulations. The question is how many virtual dimensions we can recreate that allow us to interact without the psychological separation, that is, the separation from what we think reality is. When we watch television, we're able to suspend our disbelief. Temporarily, we are fooled into believing we are somehow in the picture, and the earliest examples are seen in books. Hans Moravec wrote, "Is the Mount Rushmore monument a rock formation or four presidents' faces?
Is a ventriloquist’s dummy a lump of wood, a human simulacrum, or a personality sharing some of the ventriloquist’s body and mind? Is a video game a box of silicon bits, an electronic circuit flipping its own switches, a computer following a long list of instructions, or a large three-dimensional world inhabited by the Mario Brothers and their mushroom adversaries?" But to recreate ourselves within a simulated world is not so simple as cancelling the observations that cause us to be psychologically separated, if, in the simulation, we are to be conscious observers. The simulated world would be subject to its own laws, and cancelling any such ability would leave us entirely within those laws, where conscious observers need not exist. It is by realizing the separation between various aspects of ‘reality’ that we become who we are; and so, in linking to any other set of laws, we would need a separate link for the ability to ‘see.’ Visual Symmetrics attempts to address the problem as it addresses time-travel: through the means of consciousness. Physical time-travel seems impractical so far as perception goes, because the time-traveller, in order to visit a point in time accurately, would a) need adequately altered perception, and b) need adequate biological alterations. But in time-travel, just as in moving to a simulated reality, we would be unwise to cancel any perception in an attempt to gain accurate results, since there would be no way to return to our normal environment. The exploration of the possibility of time-travel is a good way to highlight the problems of moving into a simulated reality. To say we could accurately time-travel using unaltered perception would be equal to saying we could move into a simulated environment and remain exactly as we are now.
The degeneracy-of-hidden-variables idea cannot logically be disproved via a measure of physical qualities, because the observables it describes do not propose a violation of the speed of light. Position and momenta are the fundamentals of everything; they would always allow for experimental matches or mismatches. A local hidden-variables theory, however, may find itself open to scrutiny if it offers no way to understand how systems in states of chaos may be ordered through the nature of light and the conservation of energy. There are limits to how one may describe an observable system, and one of those is Bell's theorem. A local/non-local theory of hidden variables is less likely to be intercepted by such an argument, because the identity of the system is left unchanged. An argument about determinism is not relevant then; one would get deterministic results half the time and indeterministic results the other half.
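For reference, the constraint Bell's theorem places on any local hidden-variables theory can be stated numerically via the CHSH inequality. The sketch below is my own illustration, assuming the textbook singlet-state correlation E(a, b) = -cos(a - b) and the standard CHSH analyser angles; it shows the quantum prediction exceeding the local bound of 2 and reaching the Tsirelson bound of 2√2.

```python
import math

def E(a, b):
    """Quantum correlation for the singlet state at analyser angles a, b."""
    return -math.cos(a - b)

# Standard CHSH angle choices (the textbook maximal-violation settings).
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination: any local hidden-variables theory gives |S| <= 2.
S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))

print(f"S = {S:.3f}; local bound is 2, Tsirelson bound is {2 * math.sqrt(2):.3f}")
```

Each of the four correlations is ±√2/2 at these angles, so the combination sums to 2√2 ≈ 2.828, which no assignment of pre-existing local values can reproduce.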
" There is no real difference between determinism and indeterminism if a system that is determined does not imply that the act of determining that system was done under the same rule. There is no real difference between determinism and indeterminism if neither it implies it was not order that prevented indeterminism in the first place. Most likely, the brain creates order from a preconception that there was indeed any chaos."
Those who believe quantum theory is complete will no doubt find evidence that a hidden-variables theory is not needed in Werner Heisenberg's matrix mechanics. However, one wishing to relate Albert Einstein's relativity theory to quantum theory will disagree. For some, the question is not merely whether, if one can describe reality by abstract means, an alternative picture is needed that a complete theory might account for; but rather how Einstein's theory of light can be neglected in favour of quantum phenomena.
Group theory teaches the importance of the ability to define systems. Those systems, however, can only be accounted for with a theory of light; but if such a system were to violate that theory, then where would the axioms be to suggest an identity for the system? E = mc² may account for the assumption that there are hidden forces at work. It accounts for how a system may be identified, and may account for why coupled particles at arbitrarily large distances cannot be comprehended, owing to the speed of light. On the other hand, one can combine Heisenberg's and Einstein's theories if one assumes that a system may accurately be described either way, but only if a proportional change in distance is equal to a proportional change in speed.
The identity of objects is created (remember, we have only the vastness of space) by infinitesimal interlockings that create more and more gravity.
Quantum entanglement: action at a distance.
Quantum entanglement is the phenomenon of a pair or group of particles which, at arbitrarily large distances, cannot be described independently of one another. Albert Einstein called this "spooky action at a distance." The application of group theory to the phenomenon of hidden variables may help to avoid contradiction with the theory of relativity. In the "degeneracy of hidden variables", the observer forces a degeneracy, while the hidden variables reinforce their purpose; this action preserves the speed of light. Problems concerning the light barrier arise when arbitrarily large distances are considered. Because the entangled pair make up the whole system, the identity of the object cannot be reinforced, and this gives the impression of "spooky action at a distance." Take the identity of molecules as an example: the fact that the system has identity highlights the acknowledgement of a system, having been scrutinized by an observer and stubbornly resisted by the hidden variables. When one considers why action at a distance occurs, the hidden variables cannot complete the task of reinforcing the identity of the system; no light passes between a coupled state at such large distances, because the identity of the system has not been acknowledged. That the particle properties are reflections of each other implies a statistical aggregate of a complete system, as may be observed in a vector model. To question action at a distance is perhaps equal to questioning whether the moon exists in three dimensions because we only get to see one side.