
Gordon Moore looks back, and forward, 40 years

Intel founder says his predictions were actually hit and miss

Forty years after he coined the most famous law in computing, Gordon Moore still has a few words of advice for the industry.

For software developers: Simplify! Your interfaces are getting worse. Nanotechnology? Don't believe the hype; silicon chips are here to stay. Artificial intelligence? Try again, folks! You're barking up the wrong tree.

Next week sees the 40th anniversary of Moore's celebrated prediction: that the number of transistors on integrated circuits would double roughly every two years.

Christened later as Moore's Law, his observation became something of a self-fulfilling prophecy for the industry, driving computer makers to keep pace with the expected rate of advancement.

Moore, now 76, was director of research and development at Fairchild Semiconductor when his paper was published in Electronics Magazine on 19 April 1965. Three years later he founded Intel with Robert Noyce, becoming its CEO in 1975 and chairman four years after that.

His law had little effect at first, he said. The first big impact he recalls came when Japanese manufacturers entered the memory chip business in the 1970s. For a while, the Japanese struggled to find their footing in a business where the technology seemed to advance in an unpredictable fashion.

"Once they saw the memory series developing – from 1KB, to 4KB, to 16KB – they had a method by which to plan where the industry would end up, and they were very successful at intersecting the trajectory and taking a leading position," he says.

Moore reread his paper about a year ago, he said, and was pleasantly surprised to find that it also foresaw the use of computers in the home, although by the time the first home computers appeared he had forgotten making that prediction. In fact, as CEO of Intel years later, he would dismiss home computing altogether.

"An engineer came to me with an idea about a home computer," he recalls. "I said, 'Gee, that's fine but what would you use it for?' He could only think of a housewife using it to keep recipes on. I didn't think that would be a very powerful application, so I didn't think Intel should pursue a personal computer at that time."

In general, the computing industry has done "a pretty good job" over the years, he says. But he singles out software interfaces – and by implication Microsoft, which has dominated PC software for decades – for particular criticism. By cramming ever more features into applications, software makers may actually be moving backward, not forward, he says.

"As people make improvements in the interface, the complexity seems to grow, and I think if anything we're losing ground a bit in general purpose computing," Moore says. "They want to offer so many new functions in applications, it's difficult to simplify everything at the same time."

Regarding nanotechnology, he is "a skeptic" and has little faith in it replacing silicon-based integrated circuits for mainstream use any time soon.

"There's a big difference between making one tiny transistor and connecting a billion of them together to do a useful function," he said. "That's something I think people often overlook."

Far from being outdated, the integrated circuit is spreading into new fields, such as gene chips for disease analysis, airbag sensors and "microfluidics," which he describes as a tiny chemistry lab on a chip. In a sense, he notes, silicon chips have become nanotechnology, since they include features smaller than 100 nanometers, a popular measure for what constitutes nanoscience.

Asked about artificial intelligence, he said computers as they are built today will not come close to replicating the human mind because they were designed from the outset to handle information in a different way. Scientists need to figure out more clearly how the mind works, and then build a computer from scratch to mimic it.

"I think computers are actually going in the wrong direction" when it comes to replicating human intelligence, he says.

Still, they may mimic parts of human intelligence, such as the ability to recognise language and distinguish, for example, whether a person is saying "two" or "too."

"I think when it recognises language that well, then you can start to have an intelligent conversation with your computer and that will change the way you use computers dramatically," he said. That level of intellect may be anything from 10 to 50 years away, he added.

He's excited about the future of computing, which he says will bring "mind-boggling" developments. "I sure wish I could be around in 40 years to see what happens," he added.

Asked to come up with a new law that might carry the industry forward for another 40 years, Moore declined. He acknowledged several times that he is no longer as close to modern computing as he once was.

"I think I'll rest on my laurels on this one," he said.

