I think I meant atmospheric refraction. The atmosphere refracts light around the Earth at sunrise and sunset (so the sun appears above the horizon when it actually isn't), but barely at all at midday. So if I were watching the sunrise, the sun would appear to travel a greater distance in less time, since the refraction offset is shrinking at the same time as the sun moves. I hope that is clearer. Sorry about the very nontechnical language.
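To put rough numbers on that refraction offset, here is a minimal Python sketch using Sæmundsson's standard approximation (my own addition, not from the posts above; it assumes ordinary temperature and pressure). It shows the offset is about half a degree right at the horizon, roughly the sun's own diameter, and shrinks rapidly as the sun climbs:

```python
import math

def refraction_arcmin(true_alt_deg):
    """Saemundsson's approximation to atmospheric refraction, in arcminutes,
    as a function of TRUE altitude (assumes ~1010 hPa and ~10 deg C)."""
    arg = true_alt_deg + 10.3 / (true_alt_deg + 5.11)
    return 1.02 / math.tan(math.radians(arg))

for alt in (0, 1, 2, 5, 10, 30, 60):
    r = refraction_arcmin(alt)
    print(f"true altitude {alt:2d} deg -> refraction {r:5.1f}'  "
          f"apparent altitude {alt + r / 60:.2f} deg")
```

By 10 degrees up the correction has already dropped to about 5 arcminutes, and at midday altitudes it is under one arcminute, which is why the effect only matters near sunrise and sunset.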
Can I suggest a "Flat Sky Theory"?
I have attached a basic diagram.
... For the sun or the moon, the angular velocity is constant (about 15° per hour for the sun), so to the extent that we apply this transformation, we judge them to be going faster when they are low in the sky than when they are overhead.
Humans perceive the sky as a flattened dome, with the zenith nearby and the horizon far away. It makes sense; birds flying overhead are closer than birds on the horizon. When the Moon is near the horizon, your brain, trained by watching birds (and clouds and airplanes), miscalculates the Moon's true distance and size.
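For anyone who wants to play with the "flat sky" idea, here is a minimal sketch that models the perceived sky as a flattened elliptical dome. The 4:1 horizon-to-zenith distance ratio is an assumption for illustration, not a measured value. A constant angular size (and constant angular velocity) then maps to a larger perceived size (and speed) near the horizon:

```python
import math

# Assumed shape of the perceived dome: horizon 4x farther than zenith.
HORIZON_DIST = 4.0  # perceived distance at 0 deg elevation (arbitrary units)
ZENITH_DIST = 1.0   # perceived distance at 90 deg elevation

def perceived_distance(elev_deg):
    """Distance from the observer to a flattened elliptical dome
    along a sight line at the given elevation angle."""
    t = math.radians(elev_deg)
    return 1.0 / math.sqrt((math.cos(t) / HORIZON_DIST) ** 2
                           + (math.sin(t) / ZENITH_DIST) ** 2)

MOON_DIAM_DEG = 0.52  # the Moon's true angular diameter, roughly constant

for elev in (0, 15, 30, 60, 90):
    d = perceived_distance(elev)
    size = math.radians(MOON_DIAM_DEG) * d  # small-angle perceived linear size
    print(f"elevation {elev:2d} deg -> perceived distance {d:.2f}, "
          f"perceived size {size:.3f}")
```

In this model the Moon at the horizon is judged about four times farther away, so its unchanged angular size reads as four times larger, and a constant angular drift reads as four times the linear speed there too.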