The journey: Navigating to the moon

Hear from Norman Sears, an engineer who designed the system to land man on the Moon.
23 July 2019

Interview with 

Norman Sears, MIT Instrumentation Lab & DRAPER


"The Blue Marble" is a famous photograph of the Earth taken on December 7, 1972, by the crew of the Apollo 17 spacecraft en route to the Moon at a distance of about 29,000 kilometres (18,000 mi). It shows Africa, Antarctica, and the Arabian Peninsula.


The Apollo 11 launch was just the beginning of a journey that would rewrite the history books, taking around four days to reach its destination. Except for the 48 minutes of every lunar orbit when the spacecraft was behind the far side of the Moon, the Apollo 11 crew were in constant contact with mission control. The team kept in touch using radio waves, which travel at the speed of light. Even so, there was still over a second's delay as the signal made the roughly 350,000km trip to the Moon. Unfortunately, we can't travel at the speed of light, so when it comes to the journey itself, how do you devise a route for a trip that's never been made before? Luckily, Izzie Clarke spoke to Norm Sears, who had a pretty good idea…
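That "over a second's delay" is easy to check for yourself. Here's a quick back-of-the-envelope calculation using the distance quoted above (the real Earth-Moon distance varies between roughly 356,000 and 406,000 km):

```python
# Rough check of the one-way radio delay to the Moon, using the
# ~350,000 km figure mentioned above.
SPEED_OF_LIGHT_KM_S = 299_792  # speed of light in vacuum, km/s
distance_km = 350_000

delay_s = distance_km / SPEED_OF_LIGHT_KM_S
print(f"One-way delay: {delay_s:.2f} s")      # ~1.17 s
print(f"Round trip:    {2 * delay_s:.2f} s")  # ~2.33 s
```

So a question from mission control and its answer were separated by well over two seconds before anyone even started speaking.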

Norman - I'm Norman Sears. I'm a former member of the Instrumentation Laboratory at MIT, and I was fortunate enough to be involved in the Apollo program, which was very significant nationally, and even internationally.

I started out early in the program on the lunar phases of the mission - that would be the powered landing, the ascent from the lunar surface, and the rendezvous back with the command module. Those were fairly key phases of the mission. Eventually, as time went on, I evolved into covering most or all of the mission phases and the systems required during those phases.

Izzie - Norm was the technical director for the primary guidance, navigation and control system - the crucial equipment in place to get the spacecraft to its all-important destination.

Norman - That was determined before I even started. You had to pick a landing area that you were interested in going to. You then had to time the mission so that as you approached that landing site, the sun was at your back - behind you - casting shadows out to the front, and that determined when you would have to launch and when you would arrive at those different points.

Izzie - And I guess absolutely crucial information for the team like yourself behind the navigation system.

Norman - That's correct. The navigation system was one of the three functions. Navigation is telling you where you are and where you're going to be in the future. If you want to go somewhere else, you have to change that orbit, and the way you do it is to change the velocity, both in direction and size, to go somewhere else. That's called guidance. Then the control part of it is actually executing what the guidance has told you is required: that involves orienting the vehicle, turning the engines on or off to get the right change in velocity to take you to a new position. Those are your three functions of navigation, guidance and control.
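The division of labour Norm describes can be sketched as a simple loop. This is purely illustrative - the function names and numbers below are invented for the sketch, not taken from the actual Apollo software:

```python
# Schematic sketch of Norm's three functions: navigation estimates the
# state, guidance computes the velocity change needed, and control
# executes it by firing the engines. Illustrative only.

def navigate(measurements):
    """Navigation: where you are and where you're going to be."""
    position, velocity = measurements  # stand-in for a real state estimator
    return position, velocity

def guide(velocity, target_velocity):
    """Guidance: the change in velocity (delta-v), in direction and size."""
    return tuple(t - v for t, v in zip(target_velocity, velocity))

def control(delta_v, threshold=0.01):
    """Control: orient the vehicle and fire the engine until delivered."""
    burn_needed = any(abs(dv) > threshold for dv in delta_v)
    return "engine on" if burn_needed else "engine off"

pos, vel = navigate(((0.0, 0.0), (1.0, 0.0)))
dv = guide(vel, target_velocity=(1.5, 0.2))
print(dv, "->", control(dv))  # (0.5, 0.2) -> engine on
```

In the real system this loop ran continuously, with navigation updating the estimate as guidance and control changed the trajectory.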

Izzie - And those features relied on three essential inputs. First up there was the inertial platform, which used gyroscopes to measure and maintain the orientation of the craft, and accelerometers to measure its acceleration. Then there was the optical alignment: this system allowed the astronauts to sight stars and landmarks simultaneously, which helped the Apollo navigator to determine the position and speed of the spacecraft. And then there was a computer that had to solve the guidance, navigation and control functions accurately and quickly.

Norman - NASA at that time - to meet the goal President Kennedy had set out, to land on the Moon and return by 1970, which was very demanding - did not want to do any new development that they didn't actually have to do. In our particular case, of those three units - the inertial units and the optical units that aligned them - those technologies were fairly well known and no new development had to be made. The computer, however, was a brand new situation. It had to be capable of solving pretty complex space navigation problems for the guidance, navigation and control functions, and it had to do it in real time.

At that time, that was pushing the technology. Up to then, analogue computers had done most of that type of work, but they didn't have anywhere near the accuracy that would be required for the lunar mission. So for the laboratory itself, the new technology that had to be pursued and developed was the digital flight computer. That was our high-risk item, and most of our new development effort went into that system.

Izzie - A lot of testing and training followed its development to make sure it would take the crew to the Moon. And, four days after launch - on the 20th of July 1969 - Aldrin, Armstrong and Collins found themselves entering the Moon's orbit, and it was time for the team to split.

Norman - The command module and the lunar module remained attached from Earth orbit to lunar orbit. Once they were in lunar orbit, two of the crew members transferred to the lunar module, and the two craft were then separated in preparation for the descent to the Moon for landing.

Izzie - Those two being Buzz Aldrin and Neil Armstrong, leaving Michael Collins behind in the command module, Columbia. With his two colleagues gone, and radio contact with Earth abruptly cutting off at the instant he disappeared behind the Moon, he became the most remote human in our solar system. Armstrong and Aldrin, however, continued on their descent to the Moon in the lunar module, otherwise known as Eagle. And, whilst we know how the story ends, the landing didn't quite go to plan.

Norman - The landing itself went fairly straightforwardly, as it had been planned. The part that was unexpected was the computer alarms that occurred during the powered landing manoeuvre.

AUDIO from Apollo 11.

Norman - In terms of Apollo, though, the laboratory was the key operating support, in the sense that we had to be available during any mission phase. If something went wrong, the people who had made the designs and developed them were available for consultation if they were required. That turned out to be very crucial on Apollo 11, because we did have an unexpected event happen during the powered landing, where we had five computer alarms sound during the landing.

In the lunar module and command module, I believe, the astronauts had a switch or button. The computer checked that at every computer cycle. If it was detected, at any point, the program they were in would immediately stop and a new program would come up to take them back to the command module.

The computer alarms that happened, it turned out, were because the rendezvous radar was turned on. Even though that radar was not used for any real function, the computer tried to read it, didn't have enough time to read it, and that's what triggered the alarms. Fortunately, we had enough people in the right places to determine those were not crucial to the operation - they were really peripheral items that the computer couldn't handle - and they gave the decision to keep going, which was the right decision.
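The reason the landing could continue is that the guidance computer ran its jobs by priority: when the spurious radar readouts left it with more work than time, it shed the low-priority work and raised an alarm, while the high-priority landing jobs kept running. Here is a hedged sketch of that idea - the job names and numbers are invented for illustration, not taken from the actual flight software:

```python
# Sketch of priority scheduling with overload shedding. When there is
# more work than capacity in a cycle, low-priority jobs are dropped and
# an alarm is raised - the high-priority work still completes.
# Job names and priorities are illustrative only.
import heapq

def run_cycle(jobs, capacity):
    """Run up to `capacity` jobs, highest priority first.
    Returns (completed, alarm); alarm is True if work was shed."""
    queue = [(-priority, name) for name, priority in jobs]
    heapq.heapify(queue)  # min-heap on negated priority = max-heap
    completed = []
    while queue and capacity > 0:
        _, name = heapq.heappop(queue)
        completed.append(name)
        capacity -= 1
    return completed, bool(queue)  # leftover jobs -> overflow alarm

jobs = [("landing guidance", 30), ("display update", 20),
        ("radar readout", 5), ("radar readout", 5)]
done, alarm = run_cycle(jobs, capacity=2)
print(done, alarm)  # ['landing guidance', 'display update'] True
```

With enough capacity for every job, the same cycle completes without an alarm; the overload only ever costs the lowest-priority work.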

AUDIO from Apollo 11.

Norman - There were endless amounts of training missions through the simulators, and in a sense they were sort of repetitive. You'd hear the same things over and over, you'd look at the same downlink data to see if you were within the various limits and all. I was in the Mission Control Center during the landing, listening to some of this going on through the various flight controllers, and that was what I had heard many, many times before. However, there was something I'd never heard on any other simulation training session, and that was, very close to the ground, when Buzz Aldrin mentioned the words "picking up dust". I'd never heard that before.

AUDIO from Apollo 11.

Norman - At that point it dawned on you: this was no training exercise, and you were just talking about a few seconds left.

AUDIO from Apollo 11.

