Naked Science Forum
Non Life Sciences => Physics, Astronomy & Cosmology => Topic started by: hamdani yusuf on 09/10/2019 04:03:39

Two samples of identical radioactive material are placed separately, one on the Earth's surface at sea level, while the other is in a satellite orbiting the Earth. Will they have different half-lives?

Two samples of identical radioactive material are placed separately, one on the Earth's surface at sea level, while the other is in a satellite orbiting the Earth. Will they have different half-lives?
Well no. If some element has a half-life of a million years, it will still be a million years anywhere.
But if I measure that million years using a clock that isn't co-located with the decaying element, the two will experience different time dilation, so the measured result can vary one way or the other.
Yes, time dilation affects everything. Below about 3200 km altitude, the orbital speed is high enough that time runs slower. The ISS is a case of this. Above that altitude, the speed is lower and the gravitational potential is higher, so time runs faster. GPS satellites are an example of this.
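For anyone who wants to check that "3200 km" figure, here is a quick sketch. It assumes the weak-field result that the velocity and potential effects for a circular orbit cancel the surface clock rate at an orbital radius of r = 1.5 R, i.e. an altitude of half an Earth radius (this result is derived later in the thread); the Earth radius is the standard mean value.

```python
# Quick check of the "3200 km" figure: in weak-field GR the orbital-speed
# and gravitational-potential effects for a circular orbit cancel the
# surface clock rate at r = 1.5 R, i.e. at an altitude of R/2.
R = 6371.0                  # mean Earth radius in km (standard value)
break_even_altitude = 1.5 * R - R
print(break_even_altitude)  # ~3186 km, i.e. the "3200 km" quoted above
```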

The normal values of gravity, magnetic field, temperature, etc. that we encounter or can generate on Earth have no effect on radioactivity, but the fields encountered close to magnetic stars and neutron stars certainly do.
It has been suggested that neutrino flux has some effect, but that is still unproven.

Two samples of identical radioactive material are placed separately, one on the Earth's surface at sea level, while the other is in a satellite orbiting the Earth. Will they have different half-lives?
Let's say the half-life of the sample, when measured on the Earth's surface, is 1 year. Let one sample orbit the Earth for 10 years and then be sent back to Earth. Will those samples have the same mass?

It is a brilliant experiment! It may be difficult to compare the masses with sufficient accuracy, but the sample activity (i.e. the decay constant multiplied by the number of remaining nuclei) may be adequately measurable.
I'd be interested in some numbers from any relativity guru on this forum: what duration of travel might produce a reasonably measurable time difference between two clocks? 10 years in Earth orbit is an entirely feasible experiment, 50 years maybe, 100 years may exceed the lifetime of rational thought and scientific enquiry on this planet.

Without relativity, we could not detect muons from space at the Earth's surface; the muon's decay time is just too short. It is the same for particles in accelerators: half-lives are longer from the perspective of the physicists in the lab. In orbit, relative to an observer on Earth, orbital speed lengthens the half-life while the higher gravitational potential shortens it.

I'd be interested in some numbers from any relativity guru on this forum: what duration of travel might produce a reasonably measurable time difference between two clocks?
It doesn't have to be a long trip at all for the difference to be measurable: https://en.wikipedia.org/wiki/Hafele%E2%80%93Keating_experiment

Let's say the half-life of the sample, when measured on the Earth's surface, is 1 year. Let one sample orbit the Earth for 10 years and then be sent back to Earth. Will those samples have the same mass?
It depends on the altitude of the orbit. A clock sitting on the surface of the Earth, as measured by a distant observer, shows a time dilation due to its position in Earth's gravitational potential. For an orbiting clock, both the potential and the orbital speed come into play.
According to our far-off observer, the higher orbiting clock runs faster than the lower surface clock if they consider gravity alone, but the orbital speed contributes an additional slowing. The lower the orbit, the smaller the difference in gravitational potential, but the higher the orbital speed.
So, for example, if the orbiting clock were able to orbit the Earth at the same height as the ground clock, there would be no tick-rate difference due to gravitational potential; any time dilation difference would be due purely to orbital velocity, and the orbiting clock would tick slower.
However, if you move the orbiting clock to some really high altitude (say, the Moon's orbital distance), the gravitational potential difference grows and the orbital speed goes down. Now the orbiting clock gains more from the potential difference than it loses to orbital speed, so it runs fast compared to the ground clock.
Somewhere between these two cases is an orbit where the two effects cancel each other out and the orbiting clock and ground clock tick at the same rate.
If R is the radius of the Earth, then the radius r of the orbit where this occurs is given by
r = 3R/2
Ergo, if the orbiting clock orbits at an altitude of half an Earth radius above the ground, it ticks at the same rate as the ground clock. Orbiting lower means it ticks slower than the ground clock; orbiting higher means it ticks faster.
So, for example, a GPS clock would be expected to run fast by ~38 microseconds per day. Over ten years, that would accumulate to a difference of ~0.13 sec. So an isotope with a half-life of one year would decay imperceptibly more in orbit than on the Earth. Since radioactive decay is statistical in nature, the difference may even be within the statistical variation (in other words, two samples side by side may vary in result by more than the difference expected from time dilation in this example, for any reasonable sample size).
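For anyone who wants to reproduce those numbers, here is a rough sketch. It assumes the weak-field approximation dτ/dt ≈ 1 - GM/(rc²) - v²/(2c²) with v² = GM/r for a circular orbit; the constants (GM, Earth radius, GPS orbital radius) are standard values, not taken from the posts.

```python
# Rough check of the numbers above.  Weak-field GR gives a clock-rate
# offset dtau/dt ≈ 1 - GM/(r c^2) - v^2/(2 c^2), and a circular orbit
# has v^2 = GM/r, so the two orbit terms combine into -1.5 GM/(r c^2).
GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
c  = 299792458.0      # speed of light, m/s
R  = 6.371e6          # mean Earth radius, m

def rate_vs_ground(r):
    """Fractional tick-rate of a circular-orbit clock at radius r,
    relative to a clock at rest on the ground (Earth's spin ignored)."""
    orbit  = -1.5 * GM / (r * c**2)   # potential + velocity, combined
    ground = -GM / (R * c**2)
    return orbit - ground

zero = rate_vs_ground(1.5 * R)                       # break-even orbit
gps_us_per_day = rate_vs_ground(2.656e7) * 86400 * 1e6
print(zero)                # ~0: same tick rate as the ground clock
print(gps_us_per_day)      # ~ +38.5: GPS clock gains ~38 us/day
```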

Let's say the half-life of the sample, when measured on the Earth's surface, is 1 year. Let one sample orbit the Earth for 10 years and then be sent back to Earth. Will those samples have the same mass?
The experiment has already been done with muons, which have a half-life of about 1.56 μs, enough time for light to travel about half a km. Yet due to time dilation, most of them reach the Earth's surface about 15 km away.

I'd be interested in some numbers from any relativity guru on this forum: what duration of travel might produce a reasonably measurable time difference between two clocks?
It doesn't have to be a long trip at all for the difference to be measurable: https://en.wikipedia.org/wiki/Hafele%E2%80%93Keating_experiment
So, for example, a GPS clock would be expected to run fast by ~38 microseconds per day. Over ten years, that would accumulate to a difference of ~0.13 sec. So an isotope with a half-life of one year would decay imperceptibly more in orbit than on the Earth. Since radioactive decay is statistical in nature, the difference may even be within the statistical variation (in other words, two samples side by side may vary in result by more than the difference expected from time dilation in this example, for any reasonable sample size).
How do we reconcile these two different results?

The experiment has already been done with muons, which have a half-life of about 1.56 μs, enough time for light to travel about half a km. Yet due to time dilation, most of them reach the Earth's surface about 15 km away.
How do we make sure that a muon detected at sea level was indeed formed at 15 km altitude, and not above or below it?

How do we reconcile these two different results?
What two results? The HK thing wasn't measuring radioactive decay. It used atomic clocks. It also did not involve anything in orbit.
The orbital test just involves speeds so low that using a radioactive sample as a clock lacks the precision needed to distinguish the moving sample from the base rate.
The experiment has already been done with muons, which have a half-life of about 1.56 μs, enough time for light to travel about half a km. Yet due to time dilation, most of them reach the Earth's surface about 15 km away.
How do we make sure that a muon detected at sea level was indeed formed at 15 km altitude, and not above or below it?
They're not all formed at exactly that altitude. Some are higher, some lower, but they don't form at any significant rate at lower altitudes, say 5 km. So you compare the flux on, say, a 2 km mountain with that at sea level. Without dilation, the sea-level flux should be about 5% of that measured on the mountain. But the rate measured near sea level was 73% of the higher-altitude rate, yielding a relativistic dilation factor of around 8.8.
The figures above were taken from the Frisch–Smith experiment (1963), performed on Mt. Washington and in Cambridge, MA.
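Those two percentages are easy to sanity-check with exponential decay. The inputs here are my own assumptions, not from the post: muons at roughly 0.995c, a rest-frame mean lifetime of 2.2 μs, and a drop of about 1907 m from the Mt. Washington detector to sea level.

```python
# Back-of-envelope check of the Frisch-Smith figures quoted above.
# Assumed inputs (mine, not from the post): muons at ~0.995c, rest-frame
# mean lifetime 2.2 us, ~1907 m drop from Mt. Washington to sea level.
import math

c    = 2.998e8             # speed of light, m/s
tau  = 2.2e-6              # muon mean lifetime in its rest frame, s
drop = 1907.0              # descent to sea level, m
t    = drop / (0.995 * c)  # lab-frame transit time, ~6.4 us

no_dilation   = math.exp(-t / tau)          # survival with no dilation
with_dilation = math.exp(-t / (8.8 * tau))  # survival with factor 8.8
print(no_dilation)     # ~0.05 -> the "about 5%" figure
print(with_dilation)   # ~0.72 -> close to the measured 73%
```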

How do we reconcile these two different results?
Do you mean when comparing the HK experiment with your thought experiment?
The HK experiment was done with equipment specifically designed to detect the difference. My analysis of your experiment was based on a sample with a half-life of 1 yr over ten years. The difference is that the 1 yr half-life would not produce a significant enough change in the sample, since the accumulated time difference of ~0.13 sec is very small compared to a 1 year half-life. A 0.13 second difference isn't going to result in that many fewer atoms decaying. It would be different if you used an isotope with a shorter half-life.
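Here is a sketch of why the 0.13 s offset disappears into counting noise; the numbers are my own, following Janus's argument. For a first-order estimate, the fractional change in surviving atoms from a small extra time dt is (ln 2 / T_half) · dt, and Poisson counting noise on N atoms is 1/√N.

```python
# Why the ~0.13 s offset vanishes into counting noise for a 1-year
# half-life (my own numbers, following the argument above).
import math

T_half = 365.25 * 86400.0   # 1 year in seconds
dt     = 0.13               # accumulated time-dilation offset, s

# Fractional difference in surviving atoms caused by the extra 0.13 s:
rel = math.log(2) / T_half * dt
print(rel)                  # ~2.9e-9

# Poisson noise on N counted atoms is 1/sqrt(N), so resolving a 2.9e-9
# effect needs of order (1/rel)^2 surviving atoms:
print((1.0 / rel) ** 2)     # ~1.2e17 atoms of signal-to-noise parity
```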

Accurately measuring a half-life is difficult.
This is what is regarded as high precision: they measure the half-life to 4 significant figures.
https://link.springer.com/article/10.1007/s1096700806373
How long, and how fast, would a sample need to be in orbit to change the half-life by that much?