Virtually every television available today supports high-definition (HD) video. But there's still a bit of jargon to wade through when it comes to display technology. In particular, you might be confused about the differences between the terms HD Ready, Full HD, and Ultra HD.

Let's take a look at the distinction between HD Ready and Full HD, how they compare to Ultra HD, why these terms are used, and what they mean in practical use.

HD Ready vs. Full HD

In the most basic terms, HD Ready TVs (and set-top boxes) are capable of displaying 720p video, which is 1280x720 pixels. Full HD TVs and boxes can show 1080p video, which is 1920x1080 pixels. The HD Ready standard came about in Europe around 2005, so that people could be sure they were buying TVs that actually supported HD.

However, it's not quite this simple. Depending on where you live, the definition of HD Ready is slightly different. Specifically, the US and Europe define it differently.

HD Ready Large Logo

In the US, HD Ready for a TV means that the display can show 720p images. In most cases, this also indicates that the TV has a built-in digital tuner, which is necessary to receive digital TV broadcasts (which have largely replaced analog signals). The same HD Ready logo also appears on many projectors, computer monitors, and other devices, which don't have a tuner.

In Europe, the HD Ready logo doesn't say anything about a digital tuner. The display must support 720p to earn the logo, but that's all the sticker indicates.

Other logos and stickers were used in the past but aren't as common now. HD Ready 1080p means the TV can display 1080p video without distortion, while HD TV 1080p means the 1080p-capable TV also has a digital tuner.

Full HD 1080 Logo

Worldwide, the golden Full HD 1080p logo denotes that the display can show 1080p images. It says nothing about a digital tuner, though in the US, most Full HD TVs have one.

What Is High-Definition? 720 vs. 1080 Explained

Logo aside, what is the actual difference in the quality?

TVs show video as a series of horizontal lines; resolution is simply the number of pixels that make up a display, both horizontally and vertically. The shorthand numbers used for resolution (720p and 1080p) represent how many of these lines your TV can display at one time; in other words, the vertical pixel count.

1920x1080 resolution (1080p) means that there are 1920 pixels horizontally and 1080 pixels vertically. 720p resolution is 1280x720 pixels. Having a higher resolution results in a sharper image, because there's more information on the screen at once.
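To put concrete numbers on that, here's a quick back-of-the-envelope comparison (a minimal Python sketch, just for illustration):

```python
# Total pixel counts for the two common HD resolutions
hd_ready_pixels = 1280 * 720   # 921,600 pixels (720p)
full_hd_pixels = 1920 * 1080   # 2,073,600 pixels (1080p)

print(f"720p:  {hd_ready_pixels:,} pixels")
print(f"1080p: {full_hd_pixels:,} pixels")
print(f"1080p has {full_hd_pixels / hd_ready_pixels:.2f}x the pixels of 720p")
```

That works out to 2.25 times as many pixels in a Full HD frame as in an HD Ready one.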

SD vs HD Resolutions Compared
Image Credit: Raskoolish/Wikimedia Commons

As you can probably tell from the discussion above, "HD" isn't a well-defined term. Technically, high definition just means anything better than standard definition. In the US, standard definition is 480i (640x480px). In many other parts of the world, standard definition is 576i (768x576px).

Read about the differences between NTSC and PAL for more about the history of these resolutions.

Interlaced vs. Progressive Displays

In addition to the resolution, it's also important to know the scanning type of the display. There's a difference between 1080p and 1080i; they don't use the same technology to display video.

The p in a display type stands for progressive scan, while the i stands for interlaced scan. In progressive scan, the video displays all lines in a given frame (one image of the video) at the same time.

In interlaced scan, each frame is divided into two fields. One field contains all the even-numbered lines, while the other has all the odd-numbered lines. These two fields alternate quickly enough that the human eye perceives continuous motion.
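To make the idea concrete, here's a tiny Python sketch of how one frame splits into two fields (purely illustrative; real video works on rows of pixels, not text labels):

```python
# A frame is a stack of scan lines; interlacing splits it into two fields.
frame = [f"line {n}" for n in range(1, 9)]  # a tiny 8-line "frame"

odd_field = frame[0::2]   # lines 1, 3, 5, 7 (the odd-numbered lines)
even_field = frame[1::2]  # lines 2, 4, 6, 8 (the even-numbered lines)

# An interlaced display alternates between the two fields,
# so each refresh only draws half of the frame's lines.
print(odd_field)   # ['line 1', 'line 3', 'line 5', 'line 7']
print(even_field)  # ['line 2', 'line 4', 'line 6', 'line 8']
```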

Interlaced video conserves bandwidth, and was thus used in older analog TV broadcasting. While efficient, it's also more susceptible to distortion, especially in fast-moving video. In the US, most TV broadcasts today are either 1080i or 720p, with the latter preferred for sports because fast motion looks smoother in progressive video.

A 1080p ("Full HD") TV can display progressive scan HD signals from video game consoles, Netflix streaming, and similar. These TVs can also show interlaced signals, but since the process of deinterlacing isn't perfect, you can sometimes spot imperfections.

An HD Ready TV might mention that it can display 1080i video, but this isn't quite the same as "Full HD," as we've seen.

Where Will You See the HD Ready and Full HD Logos?

You'll typically see the HD Ready or Full HD logo on TVs, but they show up on other similar gadgets too. These include projectors and monitors, as well as set-top boxes.

HD Ready Projector

Remember that video will play at the lowest resolution supported by any device in the chain. For example, if your TV is Full HD (1080p) but your set-top box is only HD Ready (720p), your TV will show 720p video. Likewise, a PlayStation 4 capable of outputting 1080p can't show 1080p video on a 720p TV.
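Put another way, the resolution you actually see is the minimum supported by any link in the chain. A simple illustration (the devices and numbers are just hypothetical examples):

```python
# The weakest link in the chain determines what you actually see.
chain = {
    "set-top box": 720,  # only HD Ready
    "TV": 1080,          # Full HD panel
}

effective = min(chain.values())
print(f"Effective resolution: {effective}p")  # Effective resolution: 720p
```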

Some TVs will attempt to upscale the video, but this is a workaround that doesn't always result in better-quality images.
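Upscaling can only stretch the pixels it already has; it can't recover detail that was never there. The simplest approach (nearest-neighbour scaling) just repeats each pixel, as in this toy Python sketch; real TVs use more sophisticated scalers, but the principle is the same:

```python
# Nearest-neighbour upscaling: every source pixel is simply repeated.
# The picture gets larger, but no new detail is created.
def upscale_2x(rows):
    out = []
    for row in rows:
        stretched = [pixel for pixel in row for _ in range(2)]
        out.append(stretched)
        out.append(list(stretched))  # duplicate the stretched row vertically
    return out

print(upscale_2x([[1, 2], [3, 4]]))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```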

Are "HD Ready" and "Full HD" Relevant Today?

We've explained all this so you understand the distinction between these terms, which are mostly used for marketing. But today, you don't really need to worry about the "HD Ready" or similar tags on most devices.

720p resolution has become the default minimum for nearly every display device. If you're buying a TV, monitor, projector, or anything like that, it will almost certainly support 720p video at least. Unless it's extremely cheap, chances are that it supports 1080p as well; the Full HD tag lets you know for sure.

But when considering a purchase, you should go beyond these stickers and check the actual product details of a display before you buy it. Online, look in the specifications for a field titled Resolution or similar, which should have a value like 720p or 1920x1080. When in a store, look at the device's box or ask an employee for more details.

In general, unless you're looking to spend as little money as possible, we don't recommend buying any display that's under 1080p. While 720p is still referred to as "HD," 1080p is the HD standard in most people's minds. It's used for Netflix streaming, Blu-ray discs, game consoles, and similar.

What About 4K and Ultra HD?

After HD became the baseline, new technology brought us even better display options. 4K TVs, monitors, and other displays are now affordable for most people. In most cases, you can treat "4K" and "Ultra HD" as interchangeable.

As a result, you may see stickers labeled Ultra HD or 4K Ultra HD on TVs, monitors, and projectors now. Like "HD," the "4K" moniker is not an exact standard. It refers to any resolution that has around 4,000 pixels horizontally, but the exact count differs between TV and cinematography usage.

Read more: How 4K TV Resolution Compares to 8K, 2K, UHD, 1440p, and 1080p

4K TVs are typically 3840x2160px, which is exactly four times the number of pixels in a 1080p display. In addition to the 4K or Ultra HD name, this resolution is sometimes called 2160p, in line with lower-resolution naming conventions.
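You can check that "exactly four times" claim with quick arithmetic:

```python
ultra_hd = 3840 * 2160   # 8,294,400 pixels
full_hd = 1920 * 1080    # 2,073,600 pixels

print(ultra_hd / full_hd)  # 4.0, four Full HD frames' worth of pixels per UHD frame
```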

See our comparison of 4K and Ultra HD for more info. At even higher resolutions, there's also 8K Ultra HD or Full Ultra HD, which is 7680x4320px. However, 8K is rarely seen in actual use so far, and widespread adoption will take some time.

Other Measures of TV Quality

Now you understand the differences between HD Ready and Full HD, and how these compare to Ultra HD. In a lot of ways, these terms are outdated since 1080p and 4K TVs are readily available and affordable now. Either way, you shouldn't buy a TV without checking the specific product details; don't go off these marketing stickers alone.

Remember that resolution is only one factor in a TV's overall quality. You should also consider viewing angles, features, HDR support, and similar when buying a new display.

Image Credit: semisatch/Depositphotos, Rubenlodi/Wikimedia Commons