Remember when CPU clock speeds were the driving force behind new computers? Going from a 500 MHz to a 1 GHz and then a 2 GHz machine meant noticeable improvements. Then chip vendors started adding more cores. But for the style of computing consumers use today, it’s not about the CPU anymore.
It’s all about graphics processors. Thanks to today’s visually intensive style of computing, a good GPU can improve the user experience far more than a fast CPU can. That’s why Intel is pushing graphics chips such as Larrabee, while AMD is set to unveil integrated chipsets that combine CPUs with GPUs, the result of its acquisition of ATI in 2006.
All PDF documents now run through the graphics processor, they told me, as do Google Earth and multiple other web applications. The same goes for PowerPoint slides, Word and other parts of Microsoft Office, starting with Office 2007. On Macs, the visual interface to the file system is handled through the GPU, which makes flipping through thousands of photos and movies much easier. On the consumer side, the rise of such graphical interfaces helps people visually navigate through ever-increasing amounts of information.
GPUs are good for applications that require a processor to crunch a lot of data in parallel; they’re not good for step-by-step processes that require decision-making at each step.
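To make that distinction concrete, here is a minimal sketch in Python. It is purely illustrative (the article doesn’t include code): the vectorized NumPy operation stands in for the kind of uniform, data-parallel arithmetic a GPU's many lanes handle well, while the loop below it shows step-by-step work where each iteration branches on the previous result and so cannot be spread across parallel lanes.

```python
import numpy as np

# Data-parallel work: the same arithmetic applied independently to every
# element. This is the shape of workload a GPU excels at; NumPy's
# vectorized operation stands in here for thousands of GPU lanes
# running simultaneously.
pixels = np.arange(1_000_000, dtype=np.float32)
brightened = pixels * 1.2 + 10.0  # one uniform operation over all elements

# Step-by-step work: each iteration makes a decision based on the result
# of the previous one, so the steps cannot run in parallel. This is the
# kind of task that stays on the CPU.
value = 0
for _ in range(1000):
    if value % 2 == 0:
        value = value * 3 + 1
    else:
        value = value // 2
```

The key difference is the dependency chain: every element of `pixels` can be processed at the same time, but each update to `value` must wait for the one before it.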
As the large content vendors and even carriers try to deliver media content in multiple formats to televisions, personal computers and mobile phones over IP networks, they’ll either have to pay more to store those multiple versions or pay for real-time transcoding, either in the data center or on the network. The increasing delivery of visual media over IP networks and the growing amount of electronic data stored in corporate databases both represent an opportunity for GPUs, one that could move the chips out of their graphics niche.