Science Questions

What happens to the signal if you fly past base stations quickly?

Sun, 3rd Apr 2011

From the show Keeping the Conversation Flowing


Mike, Colchester asked:

If you were to use a mobile phone on an aeroplane, what would happen if you were passing between base stations at 500mph?


Chris - It wouldn’t really matter from a radio point of view because radio is moving at the speed of light which is considerably faster than 500 miles an hour. It’s whether the network could actually keep up with the very rapid transition between the different cells.



Actually, on a GSM mobile phone network the system can't cope with your distance to the base station changing too quickly. I don't have the specification document to hand, but I believe the upper limit is around a couple of hundred km/h.
The transmissions between handset and base station happen in brief pulses, about 216 per second, and the system is arranged so that the handset's transmission arrives back at the base station at a fixed time offset from the base station's own transmission. The handset therefore has to advance its transmission timing according to its distance from the base station - and there's a maximum rate at which this "timing advance" parameter can be adjusted, which imposes a maximum speed (resolved along the handset-to-base-station direction).
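The timing-advance geometry above can be sanity-checked with a little arithmetic. The figures below (GSM bit period of 48/13 µs, timing advance quantised in whole bit periods over a 6-bit field) are standard GSM values; the maximum TA adjustment rate itself isn't in this sketch, since the original poster didn't have the spec to hand either.

```python
# Back-of-envelope check of the GSM timing-advance numbers above.
C = 299_792_458.0        # speed of light, m/s
BIT_PERIOD = 48 / 13e6   # GSM bit period in seconds (~3.69 us)

# One TA step compensates one bit period of ROUND-TRIP delay,
# i.e. half a bit period of one-way distance.
metres_per_ta_step = C * BIT_PERIOD / 2
print(f"one TA step ~= {metres_per_ta_step:.0f} m of range")   # ~554 m

# Maximum cell radius implied by the 6-bit TA field (steps 0..63)
max_radius_km = 63 * metres_per_ta_step / 1000
print(f"max GSM cell radius ~= {max_radius_km:.1f} km")        # ~35 km
```

So each unit of timing advance corresponds to roughly half a kilometre of range, which is why only the rate of change of that parameter, not the radio propagation itself, sets the speed limit.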


Ah ha - the GSM spec says

(I recall the GSM mobile handset frequency-locks its transmissions to the received signal from the base station, so any Doppler shift is transparently compensated for.)

As Chris says, the high speeds may create additional problems with coordinating base station (cell) handovers. techmind, Mon, 18th Apr 2011

That's very interesting Techmind. It never even crossed my mind that speed could be a factor, but it makes a lot of sense now.

It's interesting how we tend to think the speed of light is so fast that, for many purposes, it might be considered "instantaneous". At least, that's what I used to think until I tried to design a digital circuit. That's when I began to realize how painfully slow the speed of light really is.

(Before someone objects, I should point out that digital signals travel at about one third the speed of light in electronic equipment.)

Geezer, Tue, 19th Apr 2011

... which is still pretty #@!* fast!  Since I only deal with theoretical circuits, this is news to me.  What type of applications did you need to take into account the travel speed? jpetruccelli, Tue, 19th Apr 2011

... which is still pretty #@!* fast!  Since I only deal with theoretical circuits, this is news to me.  What type of applications did you need to take into account the travel speed?

It's surprising I know, but digital system engineers have to contend with it all the time. Suppose you want to build a synchronous system with a clock frequency of, say, 50 MHz, or a clock period of 20 ns. In a truly synchronous system you'd try to make every state change happen well within one clock period. That means all logic elements have to settle within that interval, so all the worst-case delays for a combination of gates, plus all the propagation delays through all the wiring interconnects for that circuit, have to total less than 20 ns.

The signals propagate at around 100,000,000 m/s - still pretty fast you'd think, until you realize they can only travel a lousy 100 mm (about 4 inches) in one nanosecond! So, a few traces on a PCB can easily chew up a sizable chunk of your precious 20 ns budget, and that's before you even get to worry about all the semiconductor delays.
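The budget arithmetic in the two paragraphs above can be written out directly. The 1e8 m/s signal speed (about c/3 on a typical PCB) comes from the post itself; the 300 mm example trace length is an illustrative assumption.

```python
# Rough PCB timing-budget arithmetic from the paragraphs above.
CLOCK_MHZ = 50
SIGNAL_SPEED = 1e8                      # m/s, assumed ~c/3 on PCB traces

period_ns = 1e3 / CLOCK_MHZ             # 20 ns clock period
mm_per_ns = SIGNAL_SPEED * 1e-9 * 1e3   # ~100 mm travelled per nanosecond
print(f"clock period: {period_ns:.0f} ns, trace speed: {mm_per_ns:.0f} mm/ns")

# An assumed 300 mm total routed path eats this fraction of the budget:
trace_mm = 300
delay_ns = trace_mm / mm_per_ns
print(f"{trace_mm} mm of trace ~= {delay_ns:.0f} ns "
      f"({100 * delay_ns / period_ns:.0f}% of the clock period)")
```

On those assumptions a mere 300 mm of routing costs 3 ns, some 15% of the 20 ns period, before any gate delays are counted.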

Naturally, the faster the clock rates are, the more acute the problem becomes. The speeds in ICs are now so great that, even though the trace lengths are very short in an IC, the engineers have to worry about signal delays measured in picoseconds.

If Graham is around, I'm sure he will be happy to give us a substantial earful on this subject 

EDIT: I don't think the following is really a legitimate argument, but just to be impish, I could suggest that the reason the famous Moore's Law is no longer valid is that the engineers can find lots of ways to make the circuits switch faster, but, try as they might, they have not had much success at increasing the speed of light.

Geezer, Tue, 19th Apr 2011

Most commercial wireless systems are designed to cope with relative speeds of up to 70 mph with an acceptable level of deterioration in quality. Some are only designed to cope with up to 50 km/h however; it depends on the application and what the acceptable Bit Error Rate is (if digital), or a more woolly measure of sound quality if analogue. However, all these specifications are based on the deterioration being due to "fading" rather than timing problems related to Doppler effects. It is to do with the way the radio copes with moving between areas where the received signals may be cancelled by interference to areas where the signal is OK (or even reinforced) by interference. There are computer models (called fading models) that emulate such effects, and wireless systems have to pass such tests (TU-50 is an example). The problems are related to the terrain as well as the movement, and terrain is usually irrelevant in a plane, for example.

The only areas I can think of where specific care is needed to take into account Doppler effects are related to communications with non-geostationary satellites, such as with GPS, or some specialised systems that are used to track objects (like shipping containers) using Low Earth Orbit satellites. GPS needs to adjust its calculations but not specifically the radio system, but I know the system using LEO satellites does have to adjust its radio, especially when the satellite is low to the horizon and has a maximum closing (or retiring) speed, which will be around 17,000 mph (= 4.7 miles per second). This is significant compared to lightspeed (186,000 miles per second), being 25 ppm. This is 25 kHz with a carrier of 1 GHz, and comfortably takes you into the next-but-one channel!! It has to be allowed for so that the Local Oscillator is suitably tuned depending on the satellite's speed, and is changed as the satellite passes.
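The LEO Doppler figures quoted above follow from one ratio. This sketch reproduces the arithmetic using the post's own numbers (17,000 mph closing speed, light at 186,000 miles per second, an illustrative 1 GHz carrier):

```python
# Doppler arithmetic for the LEO-satellite example above.
C_MILES_PER_S = 186_000.0    # speed of light, miles per second
SAT_SPEED_MPH = 17_000.0     # worst-case closing speed near the horizon
CARRIER_HZ = 1e9             # illustrative 1 GHz carrier

closing_miles_per_s = SAT_SPEED_MPH / 3600           # ~4.7 mi/s
shift_ppm = closing_miles_per_s / C_MILES_PER_S * 1e6  # ~25 ppm
shift_khz = shift_ppm * 1e-6 * CARRIER_HZ / 1e3        # ~25 kHz at 1 GHz
print(f"{closing_miles_per_s:.1f} mi/s -> {shift_ppm:.0f} ppm "
      f"-> {shift_khz:.0f} kHz at 1 GHz")
```

A 25 kHz shift is comparable to a whole channel spacing in many narrowband systems, which is why the local oscillator has to track the satellite's pass.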

These problems are not generally significant at aeroplane speeds, where even 680 mph would only be a 1 ppm error. Whilst this can be a problem in capturing very long data streams, it is generally allowed for anyway, because you cannot cheaply buy crystal-based reference frequency sources better than this. graham.d, Tue, 19th Apr 2011

To answer Geezer's challenge :-)...

Digital ICs have gate delays of the order of picoseconds, as you say. These all have to be modelled accurately if the design is to work. But timing of incoming and outgoing data from/to another independently timed circuit (via wireless or not) has to follow some sort of protocol, and will generally use an agreed reference frequency with tolerances that have to be coped with. This is always a lot slower than the capability of the chips, and is usually a bottleneck in data transport.

I hope that was succinct :-) graham.d, Tue, 19th Apr 2011

Doppler shift also matters. GSM will not work above 250 km/h; LTE has a limit of 350 km/h. Pavel, Mon, 15th Dec 2014
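For comparison, the raw Doppler shift at those quoted speed limits is easy to estimate. The carrier frequencies here (900 MHz for GSM, 2.6 GHz for LTE) are illustrative assumptions, not values from the thread, and as graham.d notes the real limits involve timing and fading as well as Doppler alone:

```python
# Raw Doppler shift at the quoted speed limits, for assumed carriers.
C = 299_792_458.0   # speed of light, m/s

for name, kmh, carrier_hz in [("GSM", 250, 900e6), ("LTE", 350, 2.6e9)]:
    v = kmh / 3.6                      # speed in m/s
    shift_hz = v / C * carrier_hz      # first-order Doppler shift
    print(f"{name}: {kmh} km/h -> {shift_hz:.0f} Hz shift")
```

The shifts come out in the hundreds of hertz, tiny in ppm terms, which supports the earlier point that it is timing advance and fading behaviour, more than the Doppler shift itself, that sets these limits.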
