How does a computer chip work?



Patrick Cassell

  • Guest
« on: 05/10/2009 08:30:03 »
Patrick Cassell  asked the Naked Scientists:
Dear Naked Scientists,

I am studying computer science and mathematics at Salt Lake Community College in Salt Lake City, Utah.

I want to understand how a computer channels the electrical current, after it leaves the power supply, in a way that lets it do meaningful work.

For instance, how does the computer turn a constant direct current into something intelligent enough to run a computer? It seems so complex. Could you make it more understandable?

Patrick Cassell,

West Jordan, UT

What do you think?


Offline graham.d

« Reply #1 on: 05/10/2009 09:31:43 »
The essence of the question is "how do logic gates work?". The basic operation rests on the invention of the transistor, which allows the flow of electric current between two points to be controlled by the voltage on a third point. This structure can be made to operate as a voltage-controlled switch. In most computer chips you can think of there being two types of switch: one that is turned on by a high voltage and off by a low voltage, and another that is turned on by a low voltage and off by a high voltage. (By high voltage I mean around 1.2 V in modern computer chips, and by low voltage 0 V.)

With a voltage supply of 1.2 V and 0 V and these two types of switch you can construct logic gates. If you have not done so yet, you will learn about logic gates on your course. Try searching the web for "logic gates", MOS transistors, etc. It is hard to explain it all without pictures, and there are many sites that will explain it better than I can on this forum.
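To make the idea concrete, here is a rough sketch in Python of the two switch types described above and how wiring them between the 1.2 V and 0 V rails yields gates. The names (`p_switch_on`, `n_switch_on`, `not_gate`, `nand_gate`) are illustrative, not from the post; a real chip does this with transistors, not code, but the logic is the same.

```python
# Two switch types: one conducts when its control voltage is high,
# the other when it is low. These correspond to the two switches
# described in the post.
HIGH, LOW = 1.2, 0.0  # supply rails in volts

def p_switch_on(control):
    """Switch that conducts when its control voltage is LOW."""
    return control == LOW

def n_switch_on(control):
    """Switch that conducts when its control voltage is HIGH."""
    return control == HIGH

def not_gate(a):
    # One switch of each type between the rails: exactly one conducts,
    # connecting the output to 1.2 V or to 0 V (an inverter).
    if p_switch_on(a):
        return HIGH   # output pulled up to the 1.2 V supply
    return LOW        # output pulled down to 0 V

def nand_gate(a, b):
    # Two "high-on" switches in series to ground: the output is pulled
    # low only when both inputs are high; otherwise a "low-on" switch
    # pulls it up to the supply.
    if n_switch_on(a) and n_switch_on(b):
        return LOW
    return HIGH

# Print the NAND truth table.
for a in (LOW, HIGH):
    for b in (LOW, HIGH):
        print(a, b, "->", nand_gate(a, b))
```

NAND is a useful first gate to model because, in principle, every other logic function can be built out of NAND gates alone.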


Offline Michael Peoples

« Reply #2 on: 11/10/2009 04:11:38 »
Graham certainly knows what he's talking about, but unless you're actually going to go into chip design, it's not as important as understanding the binary logic that is the basis of all computing.  You need to focus on truth tables and Boolean logic, then work "upward" from there.
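As a starting point for the truth tables suggested above, here is a small Python sketch (the function names are just illustrative) that prints a truth table for a few basic Boolean operations:

```python
from itertools import product

# Basic Boolean functions, written out explicitly.
def AND(a, b):
    return a and b

def OR(a, b):
    return a or b

def XOR(a, b):
    return a != b  # true when the inputs differ

# Print a truth table: every combination of two binary inputs.
print("a b | AND OR XOR")
for a, b in product([False, True], repeat=2):
    print(int(a), int(b), "|",
          int(AND(a, b)), int(OR(a, b)), int(XOR(a, b)))
```

Working through tables like this by hand, then checking them in code, is a quick way to build the intuition the rest of computing stacks on.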

If you are interested in how digital circuits work, I picked up a book at the library called "Digital Electronics Demystified" by Myke Predko.  I've only gotten through the first few sections, but it is very clearly presented and starts with basic logic theory.

I've worked with computers in various capacities for decades.  If you really want to learn computers, get one, decide on something you'd like to do (a program), and grind through it.  Experience is an excellent teacher.  What you need to develop is a sense of what computers do well and what they don't.