Graham certainly knows what he's talking about, but unless you're actually going into chip design, that level isn't as important as understanding the binary logic that is the basis of all computing. You need to focus on truth tables and boolean logic, then work "upward" from there.
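If it helps, here's a quick sketch of what I mean by truth tables, written in Python just because it's easy to try. It prints the table for AND, OR, and XOR over all input combinations (the column names are my own labels, nothing standard):

```python
# Print a truth table for the basic boolean operators.
# Columns: inputs A and B, then A AND B, A OR B, A XOR B.
print("A B | AND OR XOR")
for a in (False, True):
    for b in (False, True):
        print(int(a), int(b), "|", int(a and b), " ", int(a or b), " ", int(a ^ b))
```

Once those four rows make sense, gates, adders, and everything "upward" from there are just combinations of the same idea.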
If you're interested in how digital circuits work, I picked up a book at the library called "Digital Electronics Demystified" by Myke Predko. I've only gotten through the first few sections, but it's very clearly presented and starts with basic logic theory.
I've worked with computers in various capacities for decades. If you really want to learn computers, get one, decide on a program you'd like to write, then grind through it. Experience is an excellent teacher. What you have to develop is a sense of what computers do well and what they don't.