I think I meant atmospheric refraction. The atmosphere refracts light around the Earth during sunrise and sunset (hence the sun appears where it isn't), but hardly at all at midday. So if I were watching the sunrise, the sun would appear to travel a greater distance in less time, since the amount of bending is decreasing at the same time as it moves. I hope that is more clear. Sorry about the very non-technical language.
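To put a rough number on that: refraction near the horizon lifts the sun by about half a degree (roughly one solar diameter), but the effect falls off quickly with altitude. A minimal sketch using Bennett's empirical refraction formula (which assumes standard temperature and pressure; the function name is mine):

```python
import math

def bennett_refraction_arcmin(apparent_alt_deg):
    """Approximate atmospheric refraction in arcminutes for a given
    apparent altitude in degrees (Bennett's 1982 empirical formula,
    standard temperature and pressure assumed)."""
    return 1.0 / math.tan(math.radians(
        apparent_alt_deg + 7.31 / (apparent_alt_deg + 4.4)))

# At the horizon the refraction is roughly 34 arcminutes (about one
# solar diameter), so the sun is geometrically below the horizon when
# it appears to touch it. By 45 degrees altitude the refraction has
# shrunk to roughly 1 arcminute.
print(round(bennett_refraction_arcmin(0.0), 1))
print(round(bennett_refraction_arcmin(45.0), 2))
```

So the "extra" bending only matters in the last few degrees above the horizon, which is why the effect is tied to sunrise and sunset specifically.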
Can I suggest a "Flat Sky Theory".
I have attached a basic diagram.
... For the sun or the moon, the angular velocity is constant, so that at least partially we do this transformation, and judge them to be going faster when they are low in the sky than when they are overhead.
Humans perceive the sky as a flattened dome, with the zenith nearby and the horizon far away. It makes sense; birds flying overhead are closer than birds on the horizon. When the Moon is near the horizon, your brain, trained by watching birds (and clouds and airplanes), miscalculates the Moon's true distance and size.
RD is correct; I got it the wrong way around. But I think the point still stands. It may in fact appear to speed up at sunrise, as the refraction angle decreases, and slow down at sunset, as the angle increases.
No need to apologize. Could you explain in more detail, please? It has been a long time since I studied physics at school.