Chatty computer chips

Redesigning computer chips for better parallel processing.
16 July 2019

Interview with Alex Chadwick, Cambridge University


There has been a huge leap in computing power over the past few decades. And to tell us how this has been achieved, and why computers could be about to get even more powerful, Chris Smith spoke to Alex Chadwick from Cambridge University...

Alex - Yeah, so between 1970 and the early 2000s I think computers got about a thousand times faster which, just for context, means that if planes had improved at the same rate we'd now be able to fly from London to New York in 28 seconds.

Chris - What did it take to realise that?

Alex - Well, the biggest thing driving this was actually not really computer scientists. I think the most fundamental reason for the advance is that electronic components could be made smaller, faster and cheaper. It's really weird as a computer scientist: you could design a processor and then, two years later, build it with the latest components and it would be twice as fast, half the size and half the cost.

Chris - This is Moore's Law, isn't it? Where every so many months we see a doubling in power, and usually a drop in price as well! But where is this all going? Surely we're going to reach a point where we can't improve with present technology any more than we have already.

Alex - Yeah. It's really interesting: around the early 2000s things started to change a little bit. There are a number of factors, but the biggest one is probably that when you have all these tiny components getting faster and faster, and you're cramming them into a small space, they get really hot. The trouble was that if you just kept making them faster and faster they would start to melt. So we could no longer keep making the computers faster, and that's why computers haven't, numerically, got any faster since the early 2000s. What's happened instead is we've realised that, because the components keep halving in size, we can actually put two computers into the space we would previously have used for one. That's the concept behind what you may have heard called a dual core computer: essentially two of the old computers, each now called a core, stuck together. Although each of them is roughly the same speed as the previous computers, by working together they can achieve results faster.
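
For a rough sense of what "working together" means in software, here is a toy sketch in Go: two workers stand in for two cores, each summing half of an array. Neither is faster on its own; the gain comes purely from splitting the work between them. The code and numbers are purely illustrative, not anything from the interview.

    package main

    import (
        "fmt"
        "sync"
    )

    // Two workers stand in for two cores: each sums half of the data, then the
    // partial results are combined. Neither worker is faster than a single one;
    // the speed-up comes from splitting the work.
    func main() {
        data := make([]int, 1_000_000)
        for i := range data {
            data[i] = 1
        }

        half := len(data) / 2
        partial := make([]int, 2)

        var wg sync.WaitGroup
        for core := 0; core < 2; core++ {
            wg.Add(1)
            go func(core int) {
                defer wg.Done()
                for _, v := range data[core*half : (core+1)*half] {
                    partial[core] += v
                }
            }(core)
        }
        wg.Wait()

        fmt.Println("total:", partial[0]+partial[1]) // 1000000
    }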

Chris - I suppose, though, that in order to support having multiple cores - effectively multiple computers working together - you've got to have the right architecture inside the computer, so that you can feed the instructions in and have them divvied up among those cores in the right way to produce something useful.
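
Chris's "divvying up" is essentially a scheduling problem. One common software pattern that gives a flavour of it - though it is not the architecture discussed here - is a shared queue of tasks that a pool of workers pulls from, sketched here in Go:

    package main

    import (
        "fmt"
        "sync"
    )

    // A shared queue of tasks: work is fed onto a channel and a small pool of
    // workers - imagine one per core - pulls tasks off and processes them.
    func main() {
        tasks := make(chan int)
        results := make(chan int, 16)

        var wg sync.WaitGroup
        for w := 0; w < 4; w++ { // four workers, i.e. four "cores"
            wg.Add(1)
            go func() {
                defer wg.Done()
                for t := range tasks {
                    results <- t * t // stand-in for real work
                }
            }()
        }

        for i := 1; i <= 16; i++ { // sixteen pieces of work to divvy up
            tasks <- i
        }
        close(tasks)
        wg.Wait()
        close(results)

        sum := 0
        for r := range results {
            sum += r
        }
        fmt.Println("sum of squares 1..16:", sum) // 1496
    }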

Alex - Yes, and I think this is what has been changing in the last few years. The thing is, when people first started putting multiple computers together they more or less didn't think about that. They just took two existing computers, bolted them together, changed nothing, and expected that to do well. And for two it kind of works. The trouble is that nowadays we're sticking more and more cores together - I think you can probably buy 16…

Chris - Yeah, the server that's running the Naked Scientists website has got some crazy number, it's like 24 cores…

Alex - Yeah, yeah, yeah, exactly.

Chris - Just amazing to think you can do that.

Alex - These cores are really good individually at working on their own problems, and that's what they were designed for, so that makes sense. But when you stick them together, it's like having multiple people working in a team: they need to communicate and work together, and the more of them you have, the more communication you need. In a business with hundreds of employees there are going to be people in meetings all the time, constantly coordinating. It's the same with computers: as we get to 24 cores or more, we have to communicate between them.
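
To put the "meetings" problem in numbers: if every core may need to talk to every other core, the number of possible conversations grows roughly with the square of the core count. A few lines of Go make the trend visible (the core counts chosen are arbitrary):

    package main

    import "fmt"

    // If every core may need to talk to every other core, the number of possible
    // conversations grows roughly with the square of the core count - the
    // computing equivalent of ever more meetings.
    func main() {
        for _, cores := range []int{2, 4, 8, 16, 24, 64} {
            pairs := cores * (cores - 1) / 2
            fmt.Printf("%2d cores -> %4d possible pairwise conversations\n", cores, pairs)
        }
    }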

Chris - Because you're working on something which will hopefully mean that these cores are less antisocial - that they get on better and communicate better. So is that the linchpin?

Alex - We thought: what would we do differently if we were redesigning the core from scratch? Rather than just bolting existing ones together - let's throw that design out the window - what would we do today, knowing that the computers of the future will have many cores? And so we've designed a sort of sociable core, if you like: one that is capable of working with the others all the time. These cores are able to do computations at the same time as having a natter with their mates, constantly talking about what is happening in the calculations. Working together to solve the problem is fundamental to the design.
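
The interview doesn't go into how the sociable core is actually built, but the idea of overlapping calculation with conversation can be caricatured in ordinary software: in this hypothetical Go sketch each worker sends its progress out over a buffered channel and carries straight on computing, so talking and calculating happen at the same time rather than in turns.

    package main

    import (
        "fmt"
        "sync"
    )

    // Each worker keeps computing while its updates drift out over a buffered
    // channel, so "having a natter" overlaps with the calculation instead of
    // pausing it.
    func main() {
        const workers = 4
        const steps = 4
        updates := make(chan string, workers*steps) // buffered: sends never block

        var wg sync.WaitGroup
        for id := 0; id < workers; id++ {
            wg.Add(1)
            go func(id int) {
                defer wg.Done()
                sum := 0
                for step := 1; step <= steps; step++ {
                    sum += step * (id + 1) // stand-in for real computation
                    updates <- fmt.Sprintf("core %d after step %d: %d", id, step, sum)
                }
            }(id)
        }

        wg.Wait()
        close(updates)
        for u := range updates {
            fmt.Println(u)
        }
    }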

Chris - And how much faster will your architecture be?

Alex - It's very difficult to answer that question, because essentially the individual cores are worse as a trade-off for their…

Chris - There's an overhead: because they're more sociable they get distracted more often.

Alex - Exactly, yes, they're all having a natter. So I think it really depends on the problem. In the previous interview we heard about the heart simulation; in that example, you can imagine each core handling one cell of the heart and then having a natter with the next core about what's happening to that cell.
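
A toy version of that heart idea, again only a hypothetical Go sketch: each goroutine stands in for one core looking after one cell, and at every time step it needs its neighbours' latest values before it can update its own - that exchange is the constant natter.

    package main

    import (
        "fmt"
        "sync"
    )

    // Each goroutine stands in for one core looking after one "cell": at every
    // time step it reads its neighbours' latest values before updating its own.
    func main() {
        const cells = 8
        const steps = 5

        current := make([]float64, cells)
        current[0] = 1 // a little pulse starting in cell 0

        for step := 0; step < steps; step++ {
            next := make([]float64, cells)
            var wg sync.WaitGroup
            for c := 0; c < cells; c++ {
                wg.Add(1)
                go func(c int) {
                    defer wg.Done()
                    left := current[(c-1+cells)%cells]
                    right := current[(c+1)%cells]
                    next[c] = 0.5*current[c] + 0.25*(left+right) // simple diffusion
                }(c)
            }
            wg.Wait()
            current = next
        }

        fmt.Println(current)
    }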

Chris - So I guess it's going to be a question of writing software and systems that will exploit your design to make the most of it. Because if you take your system and just shove the present-day operating environment at it, it's not going to cope so well, but if you've got things written bespoke for it, it's going to do much better.

Alex - Exactly, and the programmers really need to think in a different way to use it. That's kind of what I'm personally researching.
