These are interesting times for CPU makers. Gone are the days when a few hours' laptop battery life was considered efficient and when the only computers people had in their homes were noisy, hot desktop PCs.
A decade ago, Intel and AMD had the world at their feet. Intel’s distinctive audio logo rang out wherever laptops were sold and AMD’s future was bright thanks to its 2006 acquisition of graphics powerhouse ATI. These chip giants haven't quite kept up with the times, though.
The tech landscape is fast-changing, and Intel and AMD's apparent slowness to switch focus to mobile computing has allowed other chip manufacturers – most notably ARM but also the likes of VIA and Qualcomm – to dominate this huge new market.
Although things looked somewhat bleak a few years ago, gaming PCs have enjoyed something of a resurgence, the choice of laptops is broader than ever, and it's now the tablet market which is seeing a decline.
What are Intel and AMD up to now?
At the start of 2018, both companies have strong ranges, particularly for desktop PCs and the so-called HEDT segment - high-end desktop.
The two firms are even collaborating on a mobile processor, with AMD supplying the graphics chip for Intel's latest Core i7 with Vega M.
Intel doesn't really compete in the gaming graphics arena, but AMD is - as ever - battling Nvidia. Its current Vega platform has brought it (almost) level with Nvidia, but Nvidia is now poised to launch a new range of consumer graphics cards based on its latest Volta architecture.
Why does Intel vs AMD matter?
If you’re buying a traditional laptop or PC, AMD and Intel are your only choices for processors, but don’t make the mistake of thinking the PC’s slump in popularity means either company is sliding towards irrelevance. Intel doesn't make all its money from PC and laptop processors, of course.
It also produces graphics processors, wired and wireless network adaptors, server and workstation processors and components, plus set-top box parts. You'll even find Intel chips in many smartphones: certain models of the iPhone X have an Intel modem.
AMD is the smaller of the two companies by some margin. For one thing, while Intel builds its own chips in over a dozen fabrication (fab) plants in the USA, Ireland, Israel and China, AMD sold off its last fab in 2009. Today, just like ARM, VIA, MediaTek and others, AMD designs its own chips but outsources the manufacturing. Producing microprocessors is formidably expensive.
History and breakthroughs
Both companies have a history of innovation. When Intel produced the 8080 processor in 1974, it laid the groundwork for the x86 processors which provided the foundations for desktop PCs for nearly 30 years.
It’s an astute marketeer, too: its mid-2000s Centrino platform, consisting of a low-power processor, a wireless chip and a mobile chipset, took the market by storm with its reputation for desktop-class computing power and long battery life. Its shift from the x86 brand to “Pentium” (trademarking a series of numbers proved impossible) was a similar stroke of PR genius.
The ability of Intel’s marketing department to outspend and out-think others continues. The success of Intel’s Ultrabook trademark might be perilously tied to Microsoft’s stumbling efforts with Windows 8, but the company’s understanding that consumers need short, snappy brands rather than clock frequencies and other jargon endures.
AMD’s position as underdog is a consistent one. Market research firm Mercury Research reported AMD hit a record 22 percent share of the market in 2006; now the company hovers around the 17 percent mark, thanks in part to its dominance of the console market: both the Xbox One and PlayStation 4 have custom 8-core AMD 'Jaguar' processors at their hearts.
Arguably, AMD’s largest recent innovation was its acquisition of Graphics Processing Unit (GPU) manufacturer ATI in 2006. The $5.6bn transaction (about £3bn) saw AMD join Intel in being able to deliver integrated graphics chips - that is, GPUs that live on the same chip as the CPU.
The result is less graphical horsepower, but vastly reduced power draw and heat output. Forget fire-breathing, discrete graphics cards – AMD understood that the future of silicon lay in reducing power consumption and size as much as in increasing computational power. These days, most people don't want faster performance: they want better battery life from portable devices.
What went wrong?
On the face of it, both AMD and Intel were well-placed to answer the needs of users as the sales of mobile devices exploded. The desktop PC market was in steady decline, laptop sales were on the rise, and the mobile phone was begging for reinvention.
Intel already had an incredibly strong reputation with its laptop Centrino platform, and while AMD’s Turion competitor was a distant second, the race was on to win a market that knew mobility was the future of computing.
Intel started strongly. Remember the netbook? Before the netbook, spending less than £500 on a laptop would net you something slow and bulky with limited battery life. The first netbooks – the likes of the Asus Eee PC 701, released in the UK in 2007 – cost under £200, weighed under a kilo and, while unlikely to be seen at many LAN gaming parties, offered enough processing power to run basic work applications and – critically – applications that ran in web browsers.
The processor at its heart? An ultra-low voltage version of the humble Celeron.
The netbook was a critical and commercial success, and Intel capitalised with its Atom processors. This was Intel silicon at its cheapest: bought in batches of a thousand, the earliest Atom CPUs were reputed to cost manufacturers under $30, and for a few years the netbook ruled. Consumers wanted small, cheap computers and Intel, with its wealth of experience in mobile processors, was perfectly placed to answer the call.
The problem arrived in tablet form. “We don't know how to make a $500 computer that's not a piece of junk,” said Steve Jobs in 2008. “Netbooks aren’t better than anything,” he added at the 2010 launch of the first-generation iPad. Apple’s chief operating officer Tim Cook agreed, describing netbooks as “not a good consumer experience”, and thus the iPad came to be.
The issue for Intel and AMD was not that they failed to anticipate consumers’ preference for mobile devices. The problem was the form factor: the iPad sold 300,000 units on the first day of its availability in 2010. In sticking with traditional laptop and netbook form factors, running traditional desktop operating systems built around traditional x86 hardware, Intel and AMD had backed the wrong horse.
In fact, Intel, Microsoft and HP had tried to make tablets a success years before the iPad, but the combination of Windows (an OS designed for the keyboard and mouse), short battery life and chunky, heavy hardware meant virtually no-one wanted to use them.
The problem for Intel and AMD wasn’t that the iPad – and following tablets from the likes of Sony, Samsung and others – didn’t need processors. It was that they needed a new type. And the kingdom of the SoC (system on a chip) – in which a computer’s entire functions are embedded on a single chip – was already ruled by British processor giant ARM.
ARM’s processors are a completely different architecture from the traditional chips favoured by Intel and AMD. ARM’s Reduced Instruction Set Computing (RISC) processors are physically simpler than x86 processors, which means they cost less and draw less power. As the iPad – and the stampede of tablets which followed – took off, it seemed AMD and Intel had missed a significant boat.
Fast forward to 2015 and the netbook was dead, slain by high-quality tablets that perform well, offer long battery life, and cost much less than a standard laptop.
What happened next?
Even Microsoft, long-time ally of x86 hardware, piled on the misery for Intel and AMD. Windows RT, released in late 2012, was the first version of Windows that would run on ARM-powered devices, theoretically giving Microsoft access to low-cost tablets and – potentially – freezing Intel out even more.
However, the Windows RT platform flopped: in 2013 Microsoft had to take a $900 million write-down on its unsold Windows RT devices, and the company’s chief financial officer Amy Hood understated things spectacularly when she said “we know we have to do better, particularly on mobile devices.”
Now it's 2018 and there's a new generation of Windows 10 laptops, running on Qualcomm processors, such as the Asus NovaGo.
Intel isn’t pinning its hopes on Microsoft, of course. It is shifting focus to new technology, such as wearables, and is also dabbling in drones, having produced the Aero Compute Board and combined it with its RealSense cameras.
Despite its relatively slow start in the world of tablet, wearable and ultra-portable computing, Intel still has plenty left in the tank.
Gaming is the new battle
Gaming is worth around £2bn per year to the British economy – and here it's AMD that holds the stronger position. Intel does produce 3D graphics chips, of course, but its expertise lies in integrated graphics.
Integrated graphics are ideal for laptops: an integrated graphics processor doesn’t add much to the price of a laptop, doesn’t draw too much power and – contrary to popular opinion – does offer enough 3D processing oomph for the odd game.
For anyone looking to play the latest releases at detail settings that put the latest consoles to shame, though, discrete graphics cards have always been the answer, and it’s here that AMD has a significant edge.
AMD’s current crop of graphics cards runs the gamut from low-profile, passively-cooled cards up to its latest RX Vega 64, which costs around £500. Discrete graphics aren’t the only gaming arena AMD’s strong in, either.
As well as having its chips in both the Xbox One and PlayStation 4, it also supplies the GPU in Nintendo’s Wii U. It might not have much to shout about in developing platforms such as tablets or hybrids, but gamers have plenty to thank it for.
Should you buy an Intel or AMD CPU?
If you’re building a desktop PC, the choice between AMD and Intel is as real as ever. The choice is as complicated as ever, too: visit any well-known online retailer and you’ll be faced with a choice of hundreds of CPUs. If you’re driven by budget, AMD has a strong command of the lower price-points, but opting for AMD doesn’t mean you exclude yourself from high-end computing: the Ryzen processors put up a tough challenge to Intel’s CPUs, as does Threadripper.
However, Ryzen hasn't been a slam dunk for AMD. Games weren't initially optimised for Ryzen, and Intel processors are still the best choice if you want the best performance in any application regardless of cost.
Also, for upgraders currently running an Intel CPU, the upheaval of a new motherboard, chipset and socket is quite a barrier to switching to AMD. Intel is likely to remain dominant and across mid-range and high-end processors there’s an enormous amount of choice. For powerful, everyday computing the Core i5 continues to serve well (with the current range-topper being the six-core i5-8600K).
Ryzen 5 mounts a similar challenge here, though, also with six cores for the same or less money. It's here that AMD could win out, especially as most people are better off with a mid-range CPU and spending what they've saved on a better graphics card.
The vast majority of games still don't take full advantage of multi-core processors, especially those with more than four cores, but with the latest mid-range chips you're effectively getting those two extra cores for free and future games will use them.