What’s Inside Your Computer: The Story Of Every Component You Need To Know
Whether you’re buying a new computer or building your own, you’re going to be subjected to a lot of acronyms and random numbers. It can be hard to cut through the cruft and get to the meaningful information. This article is here to help.
I’m going to dive into every major component inside a modern computer. I’ll explain what it does, its history, the important specs you need to understand and who the major players are.
You’ll learn what you need to consider when you’re buying one — whether as part of a computer, or as a separate component.
So without further ado, let’s get started.
A (Very) Brief History of CPUs
You’ll often see people describe the Central Processing Unit (CPU) as the brain of a computer. They’re wrong; the CPU isn’t the computer’s brain — it is the computer in the most literal sense of the word. It is the component that does the computing.
Every command you send to your computer — whether it’s a key press, a mouse click or a complicated command line instruction — is converted into binary and sent to the CPU to be dealt with. The CPU performs a series of simple mathematical operations that, when done billions of times per second, can produce staggeringly complicated results. The CPU then issues its own commands to the operating system, which may be as simple as “add the letter K at the cursor” or “select the file the mouse is hovering over”, or as complex as “compute the next digits of Pi”.
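As a toy illustration of that conversion (a sketch only — real keyboard input travels through scan codes and drivers before reaching software), here is how a single letter can be represented as the kind of binary pattern a CPU ultimately works with:

```python
# A key press ultimately reaches the CPU as a pattern of bits.
# Here we show the ASCII encoding of the letter "K" as one such pattern.
key = "K"
code = ord(key)             # ASCII/Unicode code point: 75
bits = format(code, "08b")  # the same value written as 8 binary digits

print(code)  # 75
print(bits)  # 01001011
```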
While the development of the CPU has roots that go back to the abacus — a device first used more than a thousand years BCE — the dawn of modern personal computing starts with the 1978 release of one of the first commercially available 16-bit chips: the Intel 8086 microprocessor. The 8086’s successor, the 8088, was selected for use in the first IBM PC. The 8086’s legacy is still felt today: any command written for an 8086 has an equivalent on any modern Intel chip and can still — in theory — be run.
On a modern CPU, there are billions of transistors: tiny silicon circuits capable of switching or amplifying an electrical signal. These form the basis of everything the CPU does. Through the work of thousands of scientists and engineers, this network of microscopic electronics gives rise to the operating system and web browser you are using to view this post. The power of a CPU is roughly dependent on the number of transistors in its circuit.
Moore’s Law, which has held roughly true since the 1970s, was formulated by Gordon E. Moore, one of Intel’s cofounders. It states that the number of transistors per square inch of circuit space will double every two years. This is why the CPU in your computer today is vastly more powerful than an original Intel 8086.
Regardless of that difference in power — and it is a huge difference — there is a clear line from the 8086 through the various Pentium chips to the Core i Series that Intel sells today. The 8086 was the chip that led to the computer as we know it.
CPU Size: The Vital Stats
Laptop manufacturers don’t advertise their wares by telling you how many transistors are on the CPU. Instead, they talk about clock speed, how many cores it has and what model of CPU it is. There are also a couple of less discussed technical specs that matter. It used to be simple to compare CPUs: bigger numbers meant better performance. That’s no longer the case. Now you have to consider a few different things.
The most common CPU specification is clock speed, typically given in gigahertz. It’s simply a measure of how many cycles a CPU runs through per second. All else being equal, bigger is better. The problem is that all else is rarely equal.
The biggest CPU development in the past decade has been the proliferation of affordable multi-core CPUs. A multi-core CPU has multiple processors on a single chip. A dual-core has two processors, a quad-core has four and so on. It makes intuitive sense that more cores equals more power and that is true for some tasks; for others it isn’t.
The advantage of a multi-core CPU is that it allows tasks to be done in parallel. If the task you’re doing on your computer is something like video encoding, which can easily be parallelised, the more cores the better. Each core can work on its own frames, and the results are combined at the end. A quad-core won’t be four times faster than a single-core CPU because nothing with microprocessors is ever as simple as it would seem, but it will be significantly faster. However, parallelising tasks introduces a lot of extra work for software developers. Tasks that are harder for developers to parallelise — like the computations underlying computer games — often don’t see much benefit from multi-core CPUs.
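That split-and-combine idea can be sketched with Python’s standard concurrent.futures module. The `encode_frame` function below is a hypothetical stand-in for real per-frame work; threads are used for simplicity, though a genuinely CPU-bound encoder would typically use a process pool instead:

```python
from concurrent.futures import ThreadPoolExecutor

def encode_frame(frame):
    # Stand-in for the real per-frame encoding work.
    return f"encoded-{frame}"

frames = list(range(8))

# Serial: one worker handles every frame in order.
serial = [encode_frame(f) for f in frames]

# Parallel: frames are farmed out to a pool of workers and the
# results are combined at the end, preserving the original order.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(encode_frame, frames))

assert parallel == serial  # same result, potentially in far less time
```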
Depending on what you’re trying to do, a $300 dual-core processor can be as fast, if not faster, than a $500 quad-core. If you’re buying a computer, think carefully about what you’re using it for before you spend a few hundred dollars on extra cores that you will never benefit from.
While model names are just a label given by the manufacturer, they can reveal a lot about the extra features a CPU comes with. For example, a huge part of the difference between Intel’s mid- and high-end CPUs is the cache size. The cache is memory on the CPU where it can store instructions. The CPU can pull instructions from cache far faster than it can from anywhere else so the bigger the cache, the better.
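The same principle shows up in software as caching or memoisation: keep recent results in a small, fast store so that repeated requests skip the slow path. A rough analogy in Python (an analogy only — not how a hardware CPU cache actually works):

```python
from functools import lru_cache

slow_fetches = 0

@lru_cache(maxsize=128)  # the "cache": small, fast, close at hand
def fetch(address):
    # Stand-in for a slow trip out to main memory.
    global slow_fetches
    slow_fetches += 1
    return address * 2

fetch(10)  # miss: pays the full cost
fetch(10)  # hit: answered straight from the cache
fetch(10)  # hit again
print(slow_fetches)  # 1 -- the slow path only ran once
```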
Intel isn’t the only company producing CPUs, though it is the largest. Advanced Micro Devices — better known as AMD — and VIA Technologies also produce x86 CPUs. In the early 2000s, AMD’s chips were actually superior to Intel’s; however, that changed with the Core i series.
For other devices like smartphones, the CPU is normally integrated with some of the other components on a single chip. Qualcomm, Texas Instruments and Samsung are some of the many large manufacturers of system-on-a-chip devices.
CPUs At A Glance
The CPU is the bit of the computer that does the actual computing. While it used to be easy to pick the best CPU — go for the one with the biggest numbers! — the rise of multi-core processing has changed that. In general, the higher the clock speed, the faster a CPU is, and the more easily a task can be parallelised, the greater the advantage of multi-core CPUs. Even when two CPUs have very similar clock speeds and the same number of cores, there are other factors at play. Cache size is one of the most important and is often the differentiating factor between mid- and high-end CPUs. Again, bigger is better.
Let Me Introduce You To My Motherboard
If you’re building your own computer, the motherboard will be one of the most important components you’ll choose. If you’re buying one, it won’t even be listed on the spec sheet. The motherboard is the printed circuit board (PCB) that connects all the other components together. It also has a lot of the additional ports and connectors — like USB, I/O ports and HDMI in many cases — that are common to every computer.
Before the microprocessor, the idea that a computer would fit on a single PCB was laughable. They were just too big, with too many different parts. With the microprocessor, it became possible for an entire computer to be housed inside a small case, with all the components connected using a single PCB. The modern motherboard logically evolved out of these early PCBs.
Yo Motherboard So Much Spec
Motherboards don’t have a major direct effect on performance. They are the linkage that lets the other components do the work. However, they do determine what components you can include in your computer, and therefore indirectly affect its performance.
Motherboards come in a number of different sizes with cases to match. Most are based on the ATX standard. The smallest motherboard commonly available is the 170 mm x 170 mm mini-ITX and the largest is the 356 mm x 425 mm Workstation ATX. There are various sizes in between.
The larger the motherboard, the more ports it will have. If you are trying to build an extremely powerful computer, you will need more ports to connect multiple video cards, terabytes of storage and countless sticks of RAM. If you are just building a home theatre PC, you can get away with a far smaller motherboard and far fewer additional components.
Most motherboards have a number of standard internal ports. There’s always a CPU socket, RAM slots and ports for connecting cables to storage drives. All but the smallest motherboards have Peripheral Component Interconnect Express (PCIe) slots.
PCIe slots come in a few variations that allow you to connect different peripherals. Video cards, wireless cards and any other internal expansion normally connects to a PCIe slot. There are different sizes of PCIe slots that offer a different number of connections to the CPU. The larger the slot, the more information the peripheral can send and receive per second.
The four sizes are x1, x4, x8 and x16. The number represents the number of connections, or lanes. Powerful video cards will need a PCIe x16 slot while a wireless card will only need an x4 or even an x1 slot.
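To put rough numbers on those lanes: per-lane bandwidth depends on the PCIe generation — approximately 250 MB/s for PCIe 1.x, 500 MB/s for 2.0 and about 985 MB/s for 3.0 — and a slot’s total bandwidth scales with its lane count. A quick back-of-the-envelope sketch (the figures are approximate):

```python
# Approximate usable bandwidth per PCIe lane, in MB/s, by generation.
MB_PER_LANE = {"1.x": 250, "2.0": 500, "3.0": 985}

def slot_bandwidth(generation, lanes):
    """Theoretical peak throughput of a PCIe slot, in MB/s."""
    return MB_PER_LANE[generation] * lanes

print(slot_bandwidth("3.0", 16))  # 15760 -- a video card in an x16 slot
print(slot_bandwidth("2.0", 1))   # 500 -- a wireless card in an x1 slot
```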
Motherboards also provide external ports. USB, audio and video I/O, Ethernet and various other connections are all standard.
If you’re buying a motherboard, you’ll need to select one based on its compatibility with the CPU you want to use, how big you want your computer to be and how much expandability you need it to have. Different motherboards support different CPUs. For example, an Intel CPU won’t work on a motherboard that supports AMD CPUs. Between size and expandability there’s normally a balance to be found. For example, if you plan on using two video cards in parallel, you will need a minimum of two PCIe x16 slots, and that decision instantly eliminates almost any motherboard smaller than a standard ATX board.
If you’re buying a fully-built computer, all the features of the motherboard will be listed in the computer’s overall spec.
The major consumer motherboard manufacturers are ASUS and Gigabyte Technology. Both make motherboards for Intel and AMD CPUs in a variety of sizes with different port combinations. If you need something for a powerful gaming PC or an HTPC, either company will be able to provide it. Major manufacturers of fully built computers often make their own motherboards to connect their components.
Motherboards At A Glance
If you’re building a computer, the motherboard matters. If you’re buying one, you won’t even know it exists. It is the PCB that links all your computer’s components to the CPU. There are different sizes available with different internal and external ports. A CPU socket, RAM slots and storage connections are all standard. PCIe slots come on all but the smallest boards. Choosing a motherboard involves selecting one that works with the CPU you want to use and has enough ports for all the other components you want to add.
Random and Confusing: An Introduction to Computer Memory
Random Access Memory (RAM) — often just referred to as memory — is where the CPU stores the things it’s operating on, or likely to be operating on soon. This is different to storage, like hard drives, where data is kept indefinitely.
The difference between memory and storage is mainly down to how data is accessed. On a physical hard drive, the speed at which data can be retrieved depends on where it is kept. Disks can only spin so fast and the reader arm has to move to different points. With RAM, all data can be read equally quickly no matter where it is actually stored. The other important difference is that RAM is volatile: data is only stored while there is power running through it. This is a limitation that hard drives don’t have.
RAM’s speed is what makes it so important. It can be 100,000 times quicker for the CPU to access data held in RAM compared to retrieving it from a hard drive. When you are using an application, whatever you are working on is copied from the hard drive to RAM when you open it. Every time you or the application does something, the CPU pulls the information it needs about the file from the copy in RAM rather than the copy on the hard drive. When you save the file, it is copied back to the hard drive. This is why you lose unsaved work when your computer crashes — RAM can’t store information without a current passing through it.
If you run out of space in RAM, your computer slows down dramatically. The CPU has to fetch information from the much slower hard drives rather than from memory. Insufficient RAM is one of the main causes of computer slowdown.
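To put an order of magnitude on that slowdown, compare typical access times (illustrative figures only — real numbers vary widely by hardware): RAM answers in roughly 100 nanoseconds, while a hard drive seek takes around 10 milliseconds:

```python
# Illustrative, order-of-magnitude access times, in seconds.
RAM_ACCESS = 100e-9  # ~100 nanoseconds
HDD_SEEK = 10e-3     # ~10 milliseconds

ratio = HDD_SEEK / RAM_ACCESS
print(f"An HDD seek is ~{ratio:,.0f}x slower than a RAM access")  # ~100,000x
```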
No RAMbling: What The Stats Mean
RAM can be one of the most confusing components. Most listings on Amazon look like someone dropped a calculator in a bowl of alphabetti-spaghetti. It’s not as bad as it seems.
First, there’s RAM size which is measured in gigabytes. It is exactly what it looks like: a measure of how much stuff can be held in RAM. There’s always a gigabyte or two of RAM required for the operating system but anything extra is free to be used by any application that needs it. The more RAM, the better, although you are never likely to need the maximum your operating system can support. For the past few years, 8 GB of RAM has been the acceptable baseline. Most users won’t need more. If you do a lot of multimedia editing or gaming, 16 GB or 32 GB isn’t out of the question.
In the past decade, there’ve been three generations of RAM: DDR, DDR2 and DDR3. At the time of writing, DDR3 is the current generation but DDR4 is coming along in the next few years. DDR stands for double data rate. Each generation has doubled the rate of data transfer of the previous one. Unless you have an old computer that you need to replace the RAM in, you shouldn’t even look at anything that isn’t DDR3 (or if you’re reading this in 5 years’ time, DDR4).
Next, there is transfer speed. This is how fast the CPU can pull data from RAM. It’s typically labelled in MHz (strictly, megatransfers per second) and limited by the motherboard. DDR3 RAM will normally have a speed of between 1066 and 2400 MHz. This represents the total transfer rate, not the actual memory clock speed. The RAM’s memory clock is normally between 133 MHz and 300 MHz; the apparent speed is far higher because DDR3 performs multiple data transfers — eight, in fact — per memory clock cycle. As with the CPU, faster is better, but there are other considerations.
Finally there’s the CL value, which is a measure of the RAM’s latency. It represents the number of clock cycles it takes to return requested data. The lower the CL number, the faster data is returned. With DDR3, it’s generally between 6 and 16 clock cycles. CL values are typically correlated with transfer speed: the higher the transfer speed, the higher the latency. This makes it a trade-off between overall RAM speed and RAM latency.
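These numbers can be tied together with a little arithmetic — a sketch based on how DDR3 is commonly specified, assuming CL is counted in I/O bus cycles and the bus runs at half the transfer rate:

```python
def ddr3_transfer_rate(memory_clock_mhz):
    # DDR3 transfers data at 8x the core memory clock:
    # a 4x I/O bus multiplier times the 2x double data rate.
    return memory_clock_mhz * 4 * 2

def absolute_latency_ns(transfer_rate, cl):
    # One I/O bus cycle lasts 2000 / transfer_rate nanoseconds,
    # and the CL value counts those cycles.
    return cl * 2000 / transfer_rate

print(ddr3_transfer_rate(200))        # 1600 -- sold as "DDR3-1600"
print(absolute_latency_ns(1600, 9))   # 11.25 ns
print(absolute_latency_ns(2133, 11))  # ~10.3 ns: faster RAM, higher CL,
                                      # yet similar real-world latency
```

This is why the trade-off roughly washes out: the higher CL of faster RAM is measured in shorter cycles, so the absolute wait changes far less than either headline number suggests.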
There is a difference between the largest manufacturers of RAM and the most popular consumer facing companies. Samsung is the largest manufacturer but most of their output is bought by other manufacturers rather than regular consumers. Corsair, Kingston and Crucial are the largest consumer brands of RAM. There are also smaller manufacturers who make RAM especially for gaming like G.SKILL.
RAM At A Glance
RAM is where the CPU stores everything it is likely to work with soon. Files and applications are copied from storage to memory so they can be accessed quickly. Not enough RAM is one of the most common causes of computer slowdown. Choosing RAM is easier than choosing a CPU. First, you need at least 8 gigabytes, more if you’re doing RAM intensive work. Which RAM you choose matters a little less. The faster the RAM, the higher its latency tends to be, so these two values roughly trade off. If you are building your own computer, see what RAM is recommended for how you plan on using it. If you are buying from a major computer manufacturer like Apple or Dell, their RAM will almost certainly be perfectly adequate.
Spinning Over Storage
Hard disk drives (HDDs), and more recently solid state drives (SSDs), are the other side of the memory-storage system. They are the primary method of storing large volumes of digital data.
HDDs use a spinning magnetic disk to store binary data. An arm hovers over the disk and reads the polarity of the magnetic field. Changes in it correspond to binary ones, no changes to binary zeros. The first HDDs were developed by IBM in the 1950s. They were a cheaper replacement for earlier and slower forms of storage such as tapes. Early HDDs were massive: the housing of the IBM 350 RAMAC was the size of two refrigerators. It had a whopping 3.75 MB capacity.
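That read scheme — a change in polarity is a one, no change is a zero — can be sketched as a tiny decoder. This is a simplification; real drives layer far more elaborate encodings on top of the same idea:

```python
def decode_flux(polarities):
    """Turn a sequence of magnetic polarity readings (+1/-1) into bits:
    a change from the previous reading is a 1, no change is a 0."""
    bits = []
    previous = polarities[0]
    for p in polarities[1:]:
        bits.append(1 if p != previous else 0)
        previous = p
    return bits

print(decode_flux([+1, +1, -1, -1, -1, +1]))  # [0, 1, 0, 0, 1]
```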
Since then things have changed dramatically. The highest capacity HDDs available today can hold eight terabytes of data and fit inside any 3.5″ drive bay. SSDs have also started to become more prominent.
The first modern SSDs began to arrive in the early 1990s. There’d been solid state technologies before that, but they’d been closer to RAM than storage. Unlike RAM, SSDs hold data even when they don’t have a current running through them. SSDs use an integrated circuit to store data rather than a magnetic disk, and they’re significantly faster than HDDs because of it. The flip-side is that they are far more expensive and have lower capacities. Until the mid-2000s, they were only used in super high-end computers because regular users couldn’t afford the premium cost for what is a reasonable, but not exceptional, speed boost.
SSDs also have a number of other small advantages. They use less power and, because they don’t have moving parts, run silently without vibration. They also can’t have their data wiped by a large magnet. These qualities are what make them so suitable for phones and other mobile devices.
As the costs came down and the capacities went up, more and more manufacturers used them in their devices which further drove innovation and price decreases. For example, from 2007 on Apple have been the world’s largest purchaser of SSDs. Almost every device they make now comes with an SSD as standard.
Although they are becoming more common as the main storage device in high-end laptops, SSDs still haven’t replaced HDDs as the primary storage medium for most computers. Even though you can get one with a decent capacity for under $100, high-capacity SSDs are an order of magnitude more expensive than comparable HDDs. People who build their own computers often use both: a small SSD for the operating system and a large HDD for file storage.
It’s even possible to get hybrid drives. These are HDDs that have a small SSD built in. The most frequently accessed files on the HDD get moved to the SSD so that they can benefit from the faster read speed.
Storage (Stat) Wars
For storage, the main stat that matters is capacity. Like with memory, it’s measured in gigabytes (GB) but larger drives will be measured in terabytes (TB). The bigger the drive is, the more it can hold.
HDDs also have spin speed. Most drives spin at either 5400 or 7200 revolutions per minute. The faster a drive spins, the faster data can be read from it — high performance drives can spin at up to 15,000 RPM. 7200 RPM drives generally cost a small premium over slower drives of the same capacity.
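Spin speed translates directly into rotational latency: on average, the sector you want is half a revolution away from the read head, so the average wait is half the time of one full rotation. A quick sketch:

```python
def avg_rotational_latency_ms(rpm):
    # One rotation takes 60/rpm seconds; on average the desired
    # sector is half a revolution away. Result in milliseconds.
    return 0.5 * (60 / rpm) * 1000

print(round(avg_rotational_latency_ms(5400), 2))   # 5.56 ms
print(round(avg_rotational_latency_ms(7200), 2))   # 4.17 ms
print(round(avg_rotational_latency_ms(15000), 2))  # 2.0 ms
```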
The majority of HDDs are produced by just three companies: Seagate, Western Digital and Toshiba. Between the three of them, they have acquired almost every other manufacturer. Even big name brands like Samsung have sold their hard drive divisions to one of the three.
The big manufacturers of SSDs are mainly the same, with the addition of SanDisk, who have been making SD cards for portable devices for years, and the consumer RAM manufacturers Crucial and Corsair.
Storage At A Glance
HDDs and SSDs are the main method of storing digital data. HDDs are used for capacity and SSDs for performance. It’s possible to combine both in one computer so as to maximise the benefits and minimise the weaknesses of each. Get an SSD if its limited capacity won’t be an issue for you; if you need high capacity, the decision is made for you unless you can afford the hefty premium of a large SSD.
First Look At GPUs
Graphics Processing Units (GPUs) are specialised microprocessors. While a CPU may have four cores, a high-end GPU will have thousands. They were originally developed to output a graphical user interface (GUI) to a display — they’re designed to be extremely efficient at manipulating polygons — but can now be used to do a lot more because of their parallel design.
GPUs come in two main types: integrated graphics and PCIe video cards. Integrated graphics, like the Intel HD Graphics line, are embedded in the CPU. Video cards, on the other hand, tend to have a far larger GPU, with its own cooling and RAM, mounted on a PCIe card.
Arcade systems used early precursors of GPUs in the 1970s. Before GUIs became common in computers, CPUs were well up to the task of controlling the display. When all there was on the screen was thirty words and a flashing cursor, there was no need for a separate microprocessor. As computer interfaces evolved and got more complex in the 1980s, it became more efficient to offload graphics to a specialised processor.
GPUs were especially important for tasks that involved rendering 3D objects. The first 3D add-on video cards emerged in the 1990s and were the forerunners of modern GPUs. They revolutionised what was possible with computers and created the digital effects and modern PC gaming industry.
In the past decade, there has been a push from GPU manufacturers for software developers to use their devices as a more general purpose processor. The parallel architecture of GPUs makes them far more efficient than CPUs at certain tasks. Cracking passwords and mining bitcoin are two of the many things GPUs can do more efficiently than CPUs. By using the GPU to accelerate the most intensive work in any given program, the CPU can handle everything else and the entire system runs faster. More and more professional applications like Apple’s Final Cut Pro are beginning to support GPU acceleration.
Looking Sharp: GPU Specs
The most common GPU specs are the amount and kind of graphics RAM (GRAM) it has and — if you’re buying a GPU separately — the PCIe port it connects to. RAM is just as important for a GPU as it is for a CPU. Integrated graphics use the system RAM but dedicated GPUs come with their own. There are also different generations of GRAM. The current one is GDDR5 but you can still find some GDDR4 video cards around. Most workloads aren’t especially demanding on a GPU, though. Unless you’re using your computer for playing the newest games or video editing, you’re unlikely to stress even a mid-range GPU. There’s no need to go overboard and spend thousands of dollars on a video card that you won’t benefit from. Even Intel’s integrated graphics can output at 1080p without flinching.
The situation with PCIe ports is similar. The current generation is PCIe 3.0 and it’s twice as fast as its predecessor, PCIe 2.1. If you’re building your own computer, you should get a PCIe 3.0 card and a compatible motherboard. If you’re buying a pre-assembled computer, you won’t know what PCIe slot is being used.
NVIDIA and AMD are the major discrete GPU producers while Intel is the leading integrated graphics manufacturer. NVIDIA and AMD sell their graphics chips to other manufacturers like ASUS or Gigabyte who mount them on graphics cards for sale to consumers.
GPUs At A Glance
GPUs are specialised microprocessors with a parallel architecture. Originally designed just for outputting a GUI to a display, they are now used to accelerate other computations. GPUs can either be integrated with a CPU or mounted on a PCIe card. High-end GPUs far outstrip most users’ needs. The majority of people can get by with integrated graphics or a mid-range video card.
That’s Not All Folks
This article has only touched on the major computer components. There are all sorts of auxiliary parts like power supply units, fans, water cooling systems, wireless cards and TV tuners that I haven’t mentioned.
Some of them, like the power supplies, are vital while others, like wireless cards, add extra functions that are nice but not essential. However, I haven’t skipped any common component that contributes to the computing — the actual number crunching that results in this web page being open on a screen in front of you.
Whether you are buying or building your own computer, I hope this article was useful.