These are interesting times for CPU makers. Gone are the days when a few hours’ laptop battery life was considered efficient and when the only computers people had in their homes were noisy, hot desktops. Now the pre-built desktop PC is all but a dead man walking: in 2013 the market collapsed, with desktop sales falling 9.8 percent globally. In emerging markets the story was even worse: a fall of 11.3 percent as users sought smaller, cheaper, less power-hungry devices.
In 2014 there was a brief boost as businesses replaced PCs when support for Windows XP ended, but in 2015 shipments fell again. According to analysts, the overall result will be only a "moderate decline", cushioned by rising sales of Windows tablets and hybrids - 2-in-1 laptop/tablets.
Overall, the result has been upheaval for the silicon industry’s main players. Less than a decade ago, Intel and AMD had the world at their feet. Intel’s distinctive audio logo rang out wherever laptops were sold, and AMD’s future looked considerably brighter thanks to its 2006 acquisition of graphics powerhouse ATI. Neither chip giant has quite kept up with the times, though. The tech landscape is fast-changing, and Intel and AMD's apparent slowness to switch focus to mobile computing has allowed other chip manufacturers – most notably ARM, but also the likes of VIA and Qualcomm – to dominate this huge new market.
Intel vs AMD: Why it matters
If you’re buying a traditional laptop or PC, AMD and Intel are your only choices for processors, but don’t make the mistake of thinking the PC’s slump in popularity means either company is sliding towards irrelevance. Both have ground to make up but in 2014 Intel’s total revenue was $55.8bn (around £36bn) and it was sitting on a cash pile of $5.67bn (around £3.7bn). Intel doesn't make all its money from PC and laptop processors, of course. It also produces graphics processors, wired and wireless network adaptors, server and workstation processors and components, plus set-top box parts. While you won't find an Intel processor in many smartphones or tablets, the firm does produce many SoCs for mobile devices.
AMD is the smaller of the two companies by some margin. For one thing, while Intel builds its own chips in over a dozen fabrication (fab) plants in the USA, Ireland, Israel and China, AMD sold off its last fab in 2009. Today, just like ARM, VIA, MediaTek and others, AMD designs its own chips but outsources the manufacturing. Producing microprocessors is formidably expensive, and AMD’s revenue pales in comparison to Intel’s: a mere $5.51bn (£3.5bn).
Intel vs AMD: History and breakthroughs
Both companies have a history of innovation. When Intel produced the 8080 processor in 1974, it laid the groundwork for the x86 processors that provided the foundations of desktop PCs for nearly 30 years. It’s an astute marketeer, too: its mid-2000s Centrino platform, consisting of a low-power processor, a wireless chip and a mobile chipset, took the market by storm with its reputation for desktop-class computing power and long battery life. Its shift from the x86 brand to “Pentium” (trademarking a series of numbers proved impossible) was a similar stroke of PR genius.
The ability of Intel’s marketing department to outspend and out-think others continues. The success of Intel’s Ultrabook trademark might be perilously tied to Microsoft’s stumbling efforts with Windows 8, but the company’s understanding that consumers need short, snappy brands rather than clock frequencies and other jargon endures.
AMD’s position as underdog is a consistent one. Market research firm Mercury Research reported that AMD hit a record 22 percent market share in 2006; now the company hovers around the 17 percent mark, thanks in part to its dominance of the console market: both the Xbox One and PlayStation 4 have custom 8-core AMD 'Jaguar' processors at their hearts.
Arguably, AMD’s most significant recent move was its acquisition of Graphics Processing Unit (GPU) manufacturer ATI in 2006. The $5.6bn transaction (about £3bn) saw AMD join Intel in being able to deliver integrated graphics chips - that is, GPUs that live on the same chip as the CPU. The result is less graphical horsepower, but vastly reduced power draw and heat output. Forget fire-breathing discrete graphics cards (last year's Radeon R9 280X drew around 250W at its peak and needed two cooling fans) – AMD understood that the future of silicon lay in reducing power consumption and size as much as in increasing computational power. These days, people don't need more power: they want better battery life from portable devices.
AMD vs Intel: Challenges
On the face of it, both AMD and Intel were well-placed to answer the needs of users as the sales of mobile devices exploded. The desktop PC market was in steady decline, laptop sales were on the rise, and the mobile phone was begging for reinvention. Intel already had an incredibly strong reputation with its laptop Centrino platform, and while AMD’s Turion competitor was a distant second, the race was on to win a market that knew mobility was the future of computing.
Intel started strongly. Remember the netbook? Before the netbook, spending less than £500 on a laptop would net you something slow and bulky with limited battery life. The first netbooks – the likes of the Asus Eee PC 701, released in the UK in 2007 – cost under £200, weighed under a kilo and, while unlikely to be seen at many LAN gaming parties, offered enough processing power to run basic work applications and – critically – applications that ran in web browsers. The processor at its heart? An ultra-low voltage version of the humble Celeron.
The netbook was a critical and commercial success, and Intel capitalised with its Atom processors. This was Intel silicon at its cheapest: bought in batches of a thousand, the earliest Atom CPUs were reputed to cost manufacturers under $30, and for a few years the netbook ruled. Consumers wanted small, cheap computers and Intel, with its wealth of experience in mobile processors, was perfectly placed to answer the call.
The problem arrived in tablet form. “We don't know how to make a $500 computer that's not a piece of junk,” said Steve Jobs in 2008. “Netbooks aren’t better than anything,” he added at the 2010 launch of the first-generation iPad. Apple’s chief operating officer Tim Cook agreed, describing netbooks as “not a good consumer experience”, and thus the iPad came to be.
The issue for Intel and AMD was not that they failed to anticipate consumers’ preference for mobile devices. The problem was the form factor: the iPad sold 300,000 units on the first day of its availability in 2010. By sticking with traditional laptop and netbook form factors, running traditional desktop operating systems built around traditional x86 hardware, Intel and AMD had backed the wrong horse. In fact, Intel, Microsoft and HP had tried to make tablets a success years before the iPad, but the combination of Windows (an OS designed for the keyboard and mouse), short battery life and chunky, heavy hardware meant no-one wanted to use them.
The problem for Intel and AMD wasn’t that the iPad – and following tablets from the likes of Sony, Samsung and others – didn’t need processors. It was that they needed a new type. And the kingdom of the SoC (system on a chip) – in which a computer’s entire functions are embedded on a single chip – was already ruled by British processor giant ARM.
ARM’s processors are a completely different architecture from the traditional chips favoured by Intel and AMD. ARM’s Reduced Instruction Set Computing (RISC) processors are physically simpler than x86 processors, which means they cost less and draw less power. As the iPad – and the stampede of tablets which followed – took off, it seemed AMD and Intel had missed the boat. Fast forward to 2015 and the netbook is dead, slain by high-quality tablets that perform well, offer long battery life, and cost much less than a standard laptop.
Intel vs AMD: New form factors
Even Microsoft, long-time ally of x86 hardware, piled on the misery for Intel and AMD. Windows RT, released in late 2012, was the first version of Windows that would run on ARM-powered devices, theoretically giving Microsoft access to low-cost tablets and – potentially – freezing Intel out even more. However, the Windows RT platform flopped: in 2013 Microsoft had to take a $900 million write-down on its unsold Windows RT devices, and the company’s chief financial officer Amy Hood understated things spectacularly when she said “we know we have to do better, particularly on mobile devices.”
While we were impressed with the Surface Pro 3, it's the best of a relatively bad bunch of so-called "two-in-one" devices which supposedly offer the best of both worlds: one minute a full Windows laptop, the next a tablet. The problem is that Windows 8's touch interface wasn't that good, and few developers made apps for it. Now, Microsoft's immediate future hangs on the success of Windows 10.
Intel isn’t hanging its hopes on Microsoft, though. At CES 2015, it unveiled Curie, a button-sized module for wearable devices. This uses the Quark SE SoC, which can be powered by a coin battery. Despite its relatively slow start in the world of tablet, wearable and ultra-portable computing, Intel still has plenty left in the tank.
Change focus to gaming – worth around £1.72bn to the British economy according to the Association for UK Interactive Entertainment – and there’s an entirely different story to be told. Intel does deal with graphics processing, of course, but its expertise lies in integrated graphics. Integrated graphics are ideal for small laptops: an integrated graphics processor doesn’t add much to the price of a laptop, doesn’t draw too much power and – contrary to popular opinion – does offer enough 3D processing oomph for the odd game.
For anyone looking to play the latest releases at detail settings that put the latest consoles to shame, though, discrete graphics cards have always been the answer, and it’s here that AMD has a significant edge. AMD’s current crop of graphics cards runs the gamut from low-profile, passively cooled models up to its latest R9 390X, which retails at around £400 for the card alone. Discrete graphics aren’t the only gaming arena AMD’s strong in, either. As well as having its chips in both the Xbox One and PlayStation 4, it also supplies the GPU in Nintendo’s Wii U. It might not have much to shout about in developing platforms such as tablets or hybrids, but gamers have plenty to thank it for.
Intel vs AMD: Which should you buy?
If you’re building a desktop PC, the choice between AMD and Intel is as real as ever. The choice is as complicated as ever, too: visit any well-known online retailer and you’ll be faced with a choice of over 600 CPUs. If you’re driven by budget, AMD has a strong command of the lower price-points, but opting for AMD doesn’t mean excluding yourself from high-end computing: the company’s top-end FX processors put up a tough challenge to Intel’s flagship Core i7 CPUs.
Intel is dominant, though, and across mid-range and high-end processors there’s an enormous amount of choice. For powerful, everyday computing the Core i5 continues to serve well: you can pick one up from around £150. True power users – those editing video, rendering 3D animations, or simply aiming for the top of the SETI@home leaderboard – can opt for Intel’s Core i7 chips.