

Nanoscale wires defy quantum predictions - Comments

Comment 1 by Steve Zara

I thought that reducing the size of components wasn't the problem. The problem is getting rid of heat. It's always been possible to run mass-produced processors faster than their certified speed, provided you are willing to put the effort into some innovative technology to stop the circuits frying. This heat problem is one reason why manufacturers have been putting so much effort into multi-core devices, able to run many processes at once. That allows increased processing power to be delivered without increasing clock speeds or component density. I'm fortunate in that the kind of software I develop can easily make use of multiple processors, but that isn't typical.

Until the heat production problem has been solved, further miniaturization isn't going to help much.

Sun, 08 Jan 2012 17:53:42 UTC | #906505

Comment 2 by JoxerTheMighty

Indeed, the rising trend today is to try to distribute the work across multiple processors, which don't even need to be of the same type (CPU, GPU, etc.). That assumes, of course, that the problem can be parallelized sufficiently, which isn't always so. Graphics programming, which is my hobby, lends itself very easily to parallelizing. IMO, more robust and safer concurrent programming languages and systems would be of greater benefit than further advances in component size. In many places in the software world, concurrent programming is still a bit of hell, what with the stupid stuff you have to deal with when sharing state (especially with threads). Mainstream languages don't really excel at this. We need a way to streamline this stuff better.

Interesting relevant tweet from Carmack: (http://twitter.com/ID_AA_Carmack)

"It is sort of depressing when it becomes clear that it is more effective to do crappy parallel work than good sequential work."
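
The easy case described above, work that shards cleanly across processors with no shared state, can be sketched in a few lines of Python (all names and numbers here are illustrative, nothing comes from a real codebase):

```python
# Summing squares over a range, split across worker processes.
# The chunks share no state, which is what makes this the "easy" case.
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum(n, workers=4):
    # Split [0, n) into one contiguous chunk per worker.
    step = n // workers
    chunks = [(w * step, (w + 1) * step if w < workers - 1 else n)
              for w in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    n = 100_000
    assert parallel_sum(n) == sum(i * i for i in range(n))
```

As soon as the chunks need to share mutable state, locks and ordering enter the picture, which is the "bit of hell" referred to above.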

Sun, 08 Jan 2012 18:22:19 UTC | #906513

Comment 3 by Galactor

Comment 1 by Steve Zara:

Until the heat production problem has been solved, further miniaturization isn't going to help much.

Heat generation is indeed an issue. The brain gets around it by using blood. IBM have for some time now been researching a 3D layered chip that uses water, piped through the layers in small tubes, to extract the heat.

There is a press release of this product here; it's dated June 2008 - three and a half years ago.

One of the exciting aspects of this product is its ability to reduce the average distance between components, thereby opening up a greater possibility of parallelism. The similarities to how the brain is set up are not difficult to see.

Sun, 08 Jan 2012 18:50:47 UTC | #906518

Comment 4 by Galactor

I wonder whether carbon nanotubes may lead to a dramatic reduction in the resistivity of the chip wiring. It's possible, I suppose. Less resistance, less heat production. Carbon nanotubes may be able to support a current density of 10^13 A/cm^2, compared with, say, copper at around 400 A/cm^2. No contest.
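
Taking the two figures above at face value, the gap is striking. (Note that the copper figure as quoted is far below the usual electromigration limits cited for copper interconnect, so treat both numbers as illustrative rather than authoritative.)

```python
# Ratio of the current densities quoted in the comment above.
nanotube_a_per_cm2 = 1e13   # carbon nanotube, as quoted
copper_a_per_cm2 = 400      # copper, as quoted (illustrative only)

ratio = nanotube_a_per_cm2 / copper_a_per_cm2
assert ratio == 2.5e10      # 25 billion times the quoted copper figure
```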

What strikes me with the advancement of Moore's law is how differing technologies may help it continue longer than was expected.

Sun, 08 Jan 2012 19:03:18 UTC | #906521

Comment 5 by robaylesbury

All this talk of overheating reminds me of the days when I overclocked my PCs to get a few more MHz of power out of them for gaming. My old AMD chips have never forgiven me.

Sun, 08 Jan 2012 19:06:38 UTC | #906523

Comment 6 by Quine

Most of the heat generated in the chips comes from the switching in the transistors, not the resistance in the conductors. It has traditionally been reduced by lowering the operating voltage, and many further techniques are being researched, such as adiabatic logic and low-power asynchronous logic. I am also interested in using simulated evolution, or genetic algorithms, to design circuits where maximum computation per microwatt per cubic micron is selected for.

Also remember that in the memory arrays the value comes mostly from the density of bits, only a few of which are being accessed at any given moment, so heat is less of a problem and smaller feature size is always sought after.

Sun, 08 Jan 2012 19:40:59 UTC | #906527

Comment 7 by huzonfurst

The last time I looked, Moore's Law had a doubling time of 18 months, not two years, which is a significant difference.

Sun, 08 Jan 2012 20:40:18 UTC | #906534

Comment 8 by Steve Zara

Comment 6 by Quine

Thank you. One of the delights of this site is feedback from experts :)

Sun, 08 Jan 2012 21:13:00 UTC | #906540

Comment 9 by God fearing Atheist

I read the comments under the Nature article:-

2012-01-07 08:19 AM


Tyrone Jackson said:

If they had bothered to have this paper refereed by people who actually know about metal-insulator transitions in semiconductors, they'd have bounced this paper, because when it comes to phosphorus-doped silicon, if doped enough to be "metallic" (degenerately doped), the mean P-to-P spacing has to be 3 nm or less. That mean P-to-P distance is also the carrier mean free path. The authors actually claim to have doped to a concentration three orders of magnitude higher than the Mott transition level, which means their mean P-to-P spacing has to be ~1 nm. Hence it's no surprise this "wire" did not see any effects of its dimensionality.

Steve Furber (prof at Manchester and designer of the ARM7), gave a talk where he said:-

1) Neurons are 100,000 to 1,000,000 times as power-efficient as high-frequency (GHz) transistor circuits.

2) Lower frequencies are more efficient than high frequencies. Hence massively parallel, low-frequency processors are the way to get more MIPS per watt.

3) Making transistors smaller will hit doping problems. A 20-atom * 20-atom * 20-atom cube has 8,000 atoms. At current doping concentrations that gives on the order of 5 dopant atoms. That means there is a real probability that a given cube will contain zero dopant atoms (think Poisson distribution), and hence the circuit the component is a part of won't function.
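
The Poisson point in (3) is easy to check: if the mean count of dopant atoms per cube is 5, the chance of an empty cube is e^-5. A quick sketch (the mean of 5 is the figure from the talk as reported above):

```python
import math

# P(k = 0) for a Poisson-distributed dopant count with mean lam.
lam = 5.0
p_zero = math.exp(-lam)
print(f"{p_zero:.4f}")   # 0.0067, i.e. roughly 1 cube in 150 has no dopant
```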

A few months ago Quine posted a link to another adiabatic logic paper, where I think the equations stated that power was proportional to the square of the transistor switching frequency. That would explain the observations above (if I interpreted the paper correctly).
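
For what it's worth, the standard textbook model does come out that way for adiabatic charging, in contrast to conventional CMOS, where dynamic power is linear in frequency. A sketch under those textbook assumptions (the device values are made up for illustration, and none of this is taken from the paper in question):

```python
import math

# Conventional CMOS: P ~ C * V^2 * f            (linear in f)
# Adiabatic charging: E per cycle ~ (R*C/T) * C * V^2, with ramp time T = 1/f,
# so P = E * f ~ R * C^2 * V^2 * f^2            (quadratic in f)
C, V, R = 1e-15, 1.0, 1e3   # illustrative capacitance, voltage, resistance

def p_conventional(f):
    return C * V**2 * f

def p_adiabatic(f):
    return R * C**2 * V**2 * f**2

# Halving the clock halves conventional power but quarters adiabatic power:
assert math.isclose(p_conventional(1e9) / p_conventional(5e8), 2.0)
assert math.isclose(p_adiabatic(1e9) / p_adiabatic(5e8), 4.0)
```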

Even mainstream computing is going multicore. As a few above have commented, we now need programming languages to address the problem. Steve Furber was most scathing about the ability of software to keep up. Sometimes I think of VHDL, which proves him wrong (but others might categorise that as hardware!)

Sun, 08 Jan 2012 23:05:36 UTC | #906555

Comment 10 by Rawhard Dickins

I think they just invented the world's smallest fuse!

Sun, 08 Jan 2012 23:12:45 UTC | #906556

Comment 11 by Starcrash

Comment 1 by Steve Zara:

I thought that reducing the size of components wasn't the problem. The problem is getting rid of heat.

It's not a problem if you want people to keep replacing their computers every few years. I swear I've replaced 3 laptops now due to overheating. The smaller that circuits are and the closer they get, the more of a problem it becomes (or the better it gets, from a computer retailer's POV).

This is a pretty cool discovery, though.

Mon, 09 Jan 2012 03:53:46 UTC | #906569

Comment 12 by Schrodinger's Cat

Comment 11 by Starcrash

I swear I've replaced 3 laptops now due to overheating. The smaller that circuits are and the closer they get, the more of a problem it becomes (or the better it gets, from a computer retailer's POV).

I wish I'd been the inventor of the industry that can sell you a can of air for £11 to sort out that problem. Hmm...could also start a business selling 1000 times recycled Thames water as 'pure water' for a posh price.

The biggest enemy of Moore's Law is good old fashioned dust. Maybe someone should invent a dust removing nanobot.

Mon, 09 Jan 2012 08:44:59 UTC | #906581

Comment 13 by Graxan

One of the main areas of development against the problem of heat production is circuits that run at lower voltages. This applies to all integrated circuits, including memory; most modern memory modules also come fitted with heatsinks. As an example, memory modules that used to run at 5V dropped to 3.3V and now run at 1.3V. The same has happened with GPUs and CPUs, with modern chips producing less heat while being more powerful. This has been a by-product of miniaturisation, with the current 45nm/32nm products needing lower voltages than previous chips, and a lowering of power consumption throughout.
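
Since dynamic switching power goes roughly as the square of the supply voltage, the voltage drops listed above compound quickly. A back-of-envelope sketch using the standard P ~ C * V^2 * f relation (the voltages come from the comment; everything else is illustrative):

```python
# How much the V^2 term alone shrinks as supply voltage drops.
def power_ratio(v_old, v_new):
    return (v_old / v_new) ** 2

print(round(power_ratio(5.0, 3.3), 1))   # 2.3  (5V -> 3.3V)
print(round(power_ratio(5.0, 1.3), 1))   # 14.8 (5V -> 1.3V)
```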

With reference to the above comment about the brain using blood to cool its 'circuitry': that isn't the only way the brain seems to have got around these problems. Remember that the brain isn't a single integrated circuit; it shares the load of different tasks between different areas, effectively using multiple cores, so maybe this is the natural way to go, the one processors also seem to be forced into. The brain also runs on tiny electrical signals, around 20 watts in total power consumption; if a current-tech silicon CPU with the same processing power as the brain were constructed, it would require 10 megawatts to operate. (According to Kwabena Boahen, a computer scientist at Stanford University.)

So - physics (and biology) offer a much higher ceiling to the limit of processing power than we currently find ourselves at.

Mon, 09 Jan 2012 09:49:45 UTC | #906588

Comment 14 by foundationist

Heat is only one of the problems that lead to the size limits in traditional electronics. Another is the size of the functional components. You can't build a traditional Si-based transistor from a couple of dozen atoms. But there is a lot of interesting research towards molecular electronics, employing organic molecules as transistors or diodes.

Mon, 09 Jan 2012 10:54:50 UTC | #906590

Comment 15 by IworshipRD

Won't be long before humans are filled to the brim with this type of nano technology.

Mon, 09 Jan 2012 14:02:10 UTC | #906614

Comment 16 by antcowan

Comment 11 by Starcrash:

It's not a problem if you want people to keep replacing their computers every few years. I swear I've replaced 3 laptops now due to overheating. The smaller that circuits are and the closer they get, the more of a problem it becomes (or the better it gets, from a computer retailer's POV).

This is a pretty cool discovery, though.

You need to clean the lint out of the cooling fan on your laptop every 6 months or so, which may require taking it apart to get access. My laptop is still going after 2 years.

Mon, 09 Jan 2012 17:01:35 UTC | #906678

Comment 17 by keyfeatures

comment 6 by Quine

Simple question from a simple lay-lass. Is non-clocked / non-oscillator-dependent asynchronous logic non-binary? I tried to pick my way through the link and some wiki stuff but couldn't work out whether I was confusing the system with the algorithm. Does my question even make sense?

Mon, 09 Jan 2012 17:46:54 UTC | #906684

Comment 18 by neverstopjamin

Not necessarily.

Mon, 09 Jan 2012 19:12:26 UTC | #906703

Comment 19 by Quine

keyfeatures: Simple question from a simple lay-lass. Is non-clocked / non-oscillator dependent asynchronous logic non-binary?

That is a good, but not simple, question. The short answer is that it is usually still implementing binary operations. Binary synchronous logic has been so widely used because it has distinct design and manufacturability advantages. For example, you can have all kinds of terrible noise in your circuits and still have synchronous logic work, as long as the noise has a chance to die down before the next clock edge. That limits how fast you can clock it, but you don't have to go to the kind of trouble you would have to if it were asynchronous and any noise glitch could keep propagating. That is just one of many problems, but I remember that one because about forty years ago I set out to build a one-hand computer keyboard using asynchronous SSI-TTL chips, and ran straight into the stability issues.

You could design non-binary circuits, and there have been projects to do so, especially in neuro-net or fuzzy logic applications. The simplest of those have a middle state of "unknown" between the "1" and "0." It gets more and more difficult to deal with multiple states represented by the voltage at a single point, especially when that voltage fades down (or up) as the signal propagates through a network.
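
The middle "unknown" state behaves like Kleene's strong three-valued logic. A toy sketch, using 0 for false, 0.5 for unknown and 1 for true (an encoding chosen here for convenience, not anything from a real design):

```python
# Kleene three-valued logic: min/max/complement implement AND/OR/NOT
# when the states are ordered false < unknown < true.
F, U, T = 0.0, 0.5, 1.0

def and3(a, b): return min(a, b)
def or3(a, b):  return max(a, b)
def not3(a):    return 1.0 - a

assert and3(F, U) == F   # false AND unknown is definitely false
assert or3(T, U) == T    # true OR unknown is definitely true
assert and3(T, U) == U   # true AND unknown stays unknown
assert not3(U) == U      # NOT unknown is still unknown
```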

These issues have been discussed in the circuit design books for over thirty years, and you can find all kinds of different approaches. As with most areas of design, folks will keep making incremental improvements to what they are already doing, until some hard obstacle makes them change. In this case, one of the obstacles we have been watching in the road ahead has been quantum effects of shrinking sizes. Another is the increasing heat density problem. There are others that have to do with manufacturing processes. At some point, we will need to do things in a different way, and at that point any number of techniques that have been researched in the past may flip over into the "worth the trouble" status that they did not have before.

Mon, 09 Jan 2012 20:51:44 UTC | #906724

Comment 20 by Red Dog

Comment 15 by IworshipRD :

Won't be long before humans are filled to the brim with this type of nano technology.

This isn't what people usually mean by nanotechnology. Yes, it's technology and it's at the nano level, but the term refers to technology that makes physical changes at the molecular level, as opposed to storing and manipulating data.

Mon, 09 Jan 2012 23:34:38 UTC | #906761

Comment 21 by keyfeatures

comment 19 by Quine

Many thanks for the answer. My reason for asking was not so much a fascination with circuit design per se (although this thread suggests perhaps I need to develop one) but rather how this might bear on my assumptions about 'reality'. I had been reasoning that the on/offs that drive the patterns around us must necessarily correspond to the tick-tocks of underlying oscillators, so evidence for alternative non-clocked / non-binary circuit models clearly challenges this. As usual my lazy, impatient brain would like to skip the detail and get to the answers the fastest way possible.

Can you suggest any reading / links that are a good intro to circuit design and in what ways this relates to the algorithm design that can run on them?

Tue, 10 Jan 2012 15:11:13 UTC | #906900

Comment 22 by crucialfictionofjesus

Please Sir, may I leave the room? My brain is full.

Wed, 08 Aug 2012 06:42:24 UTC | #950512