Goodbye Von Neumann? (27 Nov, 2019)

Have your say on today's Aardvark Daily column


Postby aardvark_admin » Wed Nov 27, 2019 5:11 am

This column is archived at:

Could we be nearing the end of the Von Neumann architecture as the model on which computer technology is based?

Are the new AMD Threadripper parts already at the limits of contemporary CPU design to the extent that we can't just keep adding cores and hiking clock speeds forever?

Might the next generation of computers be based on a totally different paradigm to those that have gone before?

Might AI running on hugely parallel neural networks be the way forwards?

If that's not the case, what can we do to mitigate the effects of those pesky laws of physics and the fact that some tasks just can't be parallelised enough to take advantage of hugely multi-cored CPUs?
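The limit alluded to here is usually formalised as Amdahl's law: if a fraction of a task is inherently serial, that fraction caps the speedup no matter how many cores you throw at it. A minimal Python sketch (the `amdahl_speedup` function and its numbers are illustrative, not from the column):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup when `parallel_fraction` of the work
    can be spread across `cores` cores (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even with 95% of the work parallelisable, 64 cores give only ~15x,
# and no number of cores can ever beat 20x (1 / 0.05).
for cores in (4, 16, 64):
    print(cores, round(amdahl_speedup(0.95, cores), 1))
```

That 5% serial remainder is why "just add more cores" stops paying off for many workloads, which is the crux of the question above.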
Site Admin
Posts: 4344
Joined: Wed May 07, 2014 2:10 pm

Re: Goodbye Von Neumann? (27 Nov, 2019)

Postby Hiro Protagonist » Wed Nov 27, 2019 9:15 am

I have seen the future, and it speaks qubits.
Hiro Protagonist
Posts: 121
Joined: Thu May 08, 2014 11:26 am

Re: Goodbye Von Neumann? (27 Nov, 2019)

Postby Malcolm » Wed Nov 27, 2019 11:39 am

These musings seem to come up from time to time. I recall that back when Intel announced they had moved to a 90nm process for their CPUs, there was a raft of articles saying they probably wouldn't be able to get any smaller and a new paradigm would be needed to move forward. Yet here we are with CPUs now built on a 7nm process, and there's even talk of 6nm. But once again people are asking: is this the end?
Malcolm
Posts: 470
Joined: Thu May 08, 2014 9:43 am

Re: Goodbye Von Neumann? (27 Nov, 2019)

Postby Necrotic Kingdom » Wed Nov 27, 2019 12:16 pm

With data centres being built in the Arctic to try to keep them cool, I think the future may lie in reducing power consumption.
Necrotic Kingdom
Posts: 154
Joined: Thu Aug 29, 2019 11:31 am

Re: Goodbye Von Neumann? (27 Nov, 2019)

Postby roygbiv » Wed Nov 27, 2019 1:37 pm

There's always room for improvement: make them smaller and cheaper, have them consume less energy (so less cooling is required), and parallelise CPUs better with faster interconnects. With the IoT generating petabytes of data that need crunching, I suppose you could then make them even faster.
Posts: 258
Joined: Wed May 21, 2014 9:28 pm
