There’s absolutely no doubt that competition between chip makers is steadily increasing, not only for PC processors but for mobile and other special-purpose processors as well. The big five worth mentioning are Intel, AMD, NVIDIA, Qualcomm, and Apple. All of these companies have different takes on how to evolve their processors, which will make it interesting to see whose strategy lets them rise to the top. My opinion, however, is simple: choose more cores over better cores.
Before I get into the nitty-gritty, please note the wording of the article title. This article is about why I believe more cores will beat better cores, which implies that this will ring true (again, in my opinion) sometime in the future. For now, with applications still predominantly single-threaded and unaware of multiple cores, better cores are the winners.
Over the years we’ve seen our processors grow from single-core lightweights all the way to eight-core monsters (or 16-core if you include servers). Obviously, having multiple cores is beneficial and lets the system work on more data at the same time than it could with a single core. But a new question arises: is there a point where it’s more beneficial to stop adding cores and just make the existing ones better? Will having 12 cores instead of 8 make much of a difference? We may feel that 4, 6, or 8 cores reaches the “good enough” plateau as far as core count goes, but we could do a lot better.
Why More Cores Will Be Better
Of course, having more cores and better cores is the best solution, but what if you have to choose? If I were the one choosing, I’d go with more cores. Why? The inspiration for my answer lies in how GPUs work.
GPUs are packed with cores. In fact, some of the latest cards have 2,048 cores to brag about. They have that ridiculous number of cores because it lets them work on huge amounts of data at the same time. With more cores, more data can be crunched in parallel. Yes, GPU cores are only good at one type of work (which is why we still need CPUs rather than running everything on GPUs), but the same concept can be applied to CPUs as well.
With more cores, more data can be crunched by the CPU, and you get a speedy system that zips through anything you throw at it, provided the software is written to take advantage of all your cores. In short, many good cores will eventually beat a few great cores.
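To make that idea a bit more concrete, here’s a minimal Go sketch (not from any particular program, and the data and numbers are made up purely for illustration) of what “aware of all your cores” means in practice: the work is split into one chunk per core the machine reports, so adding cores lets more of the data get crunched at the same time, whereas the same job written as a single loop would only ever use one core.

```go
package main

import (
	"fmt"
	"runtime"
	"sync"
)

func main() {
	// Build some dummy data to crunch.
	data := make([]int, 10000000)
	for i := range data {
		data[i] = i % 7
	}

	// Split the work into one chunk per core the machine reports.
	cores := runtime.NumCPU()
	chunk := (len(data) + cores - 1) / cores
	partial := make([]int, cores)

	var wg sync.WaitGroup
	for c := 0; c < cores; c++ {
		wg.Add(1)
		go func(c int) {
			defer wg.Done()
			start := c * chunk
			end := start + chunk
			if end > len(data) {
				end = len(data)
			}
			// Each goroutine sums only its own slice of the data,
			// so the chunks are crunched in parallel on separate cores.
			for _, v := range data[start:end] {
				partial[c] += v
			}
		}(c)
	}
	wg.Wait()

	// Combine the per-core results.
	total := 0
	for _, p := range partial {
		total += p
	}
	fmt.Println("cores used:", cores, "sum:", total)
}
```

The point isn’t this particular snippet, it’s the pattern: software only benefits from extra cores when it is deliberately structured to hand each core its own piece of the work.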
The Current Plans of Big Chip Makers
Intel currently seems to be holding at a 4-core limit (6 for its extreme series of products) while making continued refinements to its cores. NVIDIA, on the other hand, keeps increasing its core counts. So does Qualcomm with its Snapdragon processors, although somewhat more slowly, since it also makes custom adjustments to the stock ARM designs. Even Apple is gaining cores with its iPhone/iPad processors, but at a very slow rate.
AMD is also trying to make its cores better, but its roadmaps have shown that it is still adding cores and wants to churn out a 10-core processor for consumers. AMD already has a 16-core behemoth for servers. And yes, those aren’t exactly full cores, but that’s how they’re marketed and that’s what I’ll call them. There’s been a lot of controversy about the whole module approach, which you can read about in commentary here (pro) and here (against).
Which strategy is the best? Right now, who knows? Maybe you have an opinion?
What will really happen in the end is something we can only find out with patience. However, as more software becomes capable of spreading its work across numerous cores, the advantage will eventually shift to the processors that, as a whole package, can output the most work. Until then, we’ll just have to be happy with whatever currently works best.
What’s your opinion, more cores or better cores? When do you think we will finally know which choice is better? Any other thoughts? Let us know in the comments!