Before the days of mobile devices and laptops, our entertainment needs were mostly filled by one source, the television.
The TV was arguably the most innovative consumer technology until the computing age, and to this day it remains a powerhouse in the entertainment realm.
But how did we get here, what’s next, and how much do you know about the technology that makes the tube so popular?
Let’s dig in and discover what’s what in terms of TV technology.
History of Television Technology
Perhaps the most impressive part of television history was the fact that the technology was not invented by a single inventor but through collaborative effort, shared technology and individuals who sought to push the tech to its limits. We’re going to discuss a lot of the technology found in television history, as well as current technology that you’re probably using in your home today.
But, before we get too far ahead of ourselves, it’s important to know what got us here. Let’s have a quick history lesson.
At the end of the 19th century and early into the 20th, there were two very divided groups of television pioneers. On one side, you had early inventors attempting to build a mechanical television system based on the Nipkow disc, a spinning-disc design patented by German university student Paul Nipkow. On the other, inventors favored an electronic television system using cathode ray tube technology.
Mechanical Televisions & Electronic Televisions
Mechanical televisions used a spinning disc (the Nipkow disc) with a spiral pattern of holes. Each hole scanned a line in an image, which (in theory) allowed image transmission over wire and onto a screen. This technology dates back to 1884, and while Nipkow was granted a patent for it, he never built a working prototype. By the turn of the century the patent had expired, and others began using the technology to create the first television pictures.
While mechanical televisions could never be considered a success, the science and the technology behind Nipkow’s creation led to a television discovery we’re still using to this day, known as the television scanning principle. This principle describes the process of capturing and reproducing an image one small portion (one line) at a time, moving line by line until the whole picture is complete. The rate at which this whole process repeats is what we now call the “refresh rate”. Needless to say, electronic television ultimately won the battle.
Cathode Ray Tube (CRT) Technology
Electronic television technology made use of the cathode ray tube, or CRT, in which the “cathode” is a heated filament inside a glass vacuum tube. The “ray” is a stream of electrons that causes the phosphor-coated screen to glow where it strikes, producing images.
RCA, Franklin Roosevelt and the Birth of American TV Culture
The first working prototype saw the light of day in 1927, when Philo Farnsworth showcased CRT technology by displaying an image consisting of 60 horizontal lines. The image? A dollar sign.
In 1929, Russian inventor Vladimir Zworykin improved upon existing CRT technology and demonstrated the first television system with the features we’ve come to expect from a CRT – or “tube” television. The patent for this technology was later acquired by RCA, and turned into the first consumer television sets. These consumer models were rather niche items and not available to the general public until 1933.
In 1939, RCA television sales exploded after President Franklin Roosevelt delivered a televised speech at the opening ceremony of the 1939 New York World’s Fair. This set in motion a series of events that would see television sets begin to make their way into households across America. The speech, while an impressive use of technology at the time, was recorded rather than broadcast live nationally. The first live national broadcast took place in 1951, when President Harry Truman’s speech at the Japanese Peace Treaty Conference in San Francisco was transmitted to local broadcast stations using AT&T’s transcontinental cable technology.
Fun fact: Television was actually invented before sliced bread.
The First Color TV
Until 1953, households that owned a TV were limited to black and white pictures. Color technology was actually available in the early 1940s, but due to the ban on the production of television sets and radio equipment (for consumers) by the War Production Board from 1942 to 1945, opportunities for further testing and development were halted. This production ban was due to both supply issues as demand for metal alloys and electronic parts soared during war time, and a lack of available production assistance due to a bulk of the workforce serving in the war.
Although inventors such as Jan Szczepanik had been working on color television technology predating the first working black and white prototype, the first practical applications came when CBS and NBC began experimental color field tests in 1940. Both networks were successful in recording programs in color, but due to the ban on television production and the inability to display color pictures on existing black and white sets, development for consumers was ultimately put on hold until 1953, when the first consumer color television sets saw widespread release.
The first national broadcast in color occurred in 1954, when NBC broadcast the Tournament of Roses Parade on New Year’s Day. Due to the high price of the television sets, as well as a lack of color programming (itself due to high production costs), color television was mostly a non-starter until 1965. That year, major broadcasters reached an agreement that over half of all prime-time broadcasts would be in color, and the first all-color broadcasts occurred just one year later. By 1972, all television programming was broadcast in color.
Fun fact: The first remote control was released in 1956 by the Zenith Electronics Corporation (then known as the Zenith Radio Corporation) and called “Lazy Bones”.
Additional Projection Television Technologies
While CRT technology dominated the television market mostly unchallenged for decades, additional television technologies started to emerge in the latter half of the twentieth century.
The two technologies that follow started their lives as projectors (featuring a projection unit and a separate screen), and both made their way into all-in-one units during their heyday. Both are still around, but the paths they have taken are quite different. LCD projectors are on their way out, but the tech still exists in computer monitors and television sets. DLP, on the other hand, had a rather successful (although short) run in the TV market, but the technology seems to have found a home in cinema and home projectors instead.
DLP televisions are no longer made, and LCDs are still around, but the technology is changing.
The LCD (liquid crystal display) projector took a step in a different direction from the traditional CRT console. Instead of relying on an all-in-one unit, the projector needs a surface to project a picture onto, typically a wall or a pull-down black, white, or grey screen.
The projector displays images by sending light through a prism or a series of filters into three separate polysilicon panels, each responsible for one color in the RGB (red, green, blue) spectrum of the video signal. As the light passes through, each panel opens or closes its crystals to form a specific set of colors and shades on your backdrop.
The LCD projector mostly died out in the late 90s and early 2000s as it was replaced by newer and more efficient DLP (digital light processing) technology.
To produce an image on a screen, DLP projectors (and televisions) rely on a white lamp that shines bright light through a color wheel and onto a DLP chip. The color wheel is in constant rotation and features three colors: red, green, and blue. Creating a specific color is achieved by synchronizing the timing of the light and color wheel in order to project that color (as a pixel) onto the screen. The wheel and light create color, while the digital micromirror device (the DLP chip) creates shades of grey depending on how it is positioned.
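The timing idea above can be sketched in a few lines of code. This is a toy model, not a real projector spec: the segment layout, rotation speed, and function names are all illustrative assumptions.

```python
# Toy model of DLP color-wheel timing: at any instant, the wheel segment
# currently in front of the lamp determines which color can reach the screen.

WHEEL_SEGMENTS = ["red", "green", "blue"]  # one equal segment per color (assumed)
ROTATIONS_PER_SECOND = 120                 # hypothetical wheel speed

def active_color(t_seconds):
    """Return the color segment in front of the lamp at time t."""
    # Fraction of one full rotation completed at time t (0.0 to 1.0)
    phase = (t_seconds * ROTATIONS_PER_SECOND) % 1.0
    return WHEEL_SEGMENTS[int(phase * len(WHEEL_SEGMENTS))]

def mirror_on(t_seconds, wanted_colors={"red", "green"}):
    """A micromirror reflects light to the screen only while a wanted
    color segment is passing; e.g. red + green over time mixes to yellow."""
    return active_color(t_seconds) in wanted_colors
```

Because the wheel spins far faster than the eye can follow, the rapid red/green/blue flashes blend into a single perceived color per pixel.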
DLP televisions use the same basic technology, only mirroring the display as they project from the rear (making it appear backwards without mirroring the image) rather than the front.
The DLP television market started to fizzle in the latter part of the 2000s (pre-2010), but DLP projectors still account for most of the front-projection units sold.
These units currently dominate the cinema market due to their incredible ability to reproduce color.
Current three-chip DLP projectors are capable of producing an estimated 35 million colors. The human eye can only detect about 16 million of these.
Recently Deceased Television Technologies
Unlike the LCD projection model we talked about earlier, the typical LCD screen is a self-contained, direct-view unit that uses similar technology: instead of projecting the image onto a separate surface, a backlight shines through the liquid crystal panel directly toward the viewer. Apart from that, the technology is essentially the same.
LCD screens using a CCFL backlight, while still available, are all but dead. Beyond being outclassed by newer technology, LCD had some significant problems. One of the most notable is the expense of producing larger (40-inch and above) models. In addition, the picture quality diminishes when viewed at an angle, and there are significant problems with response time when refreshing images, which leads to motion blur or lag when reproducing fast-moving images. This makes these TVs a rather bad choice for gaming or sports.
Plasma televisions sort of revolutionized the TV market for a time. Offering extremely wide viewing angles, relatively low prices, and the ability to produce amazing contrast ratios, plasma TVs were on top of the world for about a decade before additional technologies came along and started to steal market share.
Plasma TVs work by trapping noble gases (among others) in tiny cells sandwiched between two layers of glass. Applying high-voltage electricity to a cell excites the gas into a plasma state, which emits ultraviolet light that strikes phosphors to produce colored light. By applying varying levels of energy to each cell, the display controls the colored light that makes up the pixels on the front of your screen.
While once popular, plasma wasn’t free from issues. The most notable of these is the power requirements which led to real problems with heat production, efficiency and a shorter lifespan than other technologies.
Liquid Crystal on Silicon, or LCOS, TVs received their death certificate in 2013.
The technology was a rather complicated one, and never really became all that popular with consumers. LCOS displays use a beam of bright white light passed through a condenser lens and a filter. From there, it is split into three beams with each beam passing through another filter in order to turn the beams of light into either red, green, or blue colors. These newly colored beams come into contact with one of three LCOS micro-devices (one for each color) and then pass through a prism which directs the light to a projection lens that magnifies and projects it onto your screen.
While LCOS technology had some real advantages, such as creating blacker blacks than DLP or LCD, it ultimately failed due to a lot of the same weaknesses that plagued LCD TVs, such as motion blur and a comparatively narrow viewing angle. In addition, LCOS suffered from light output issues that diminished the brightness of the screen, leading many consumers to complain about dull color and low contrast.
What’s Current and/or Next?
Hold on to your hats, as this might get slightly confusing. The LED television is actually an LCD screen. That is, fundamentally an LED TV uses the same technology as a typical LCD screen with the only major difference being in the way it’s backlit. While a typical LCD screen uses a cold cathode fluorescent light (CCFL) in order to produce bright and vivid color, the LED (or LED-backlit LCD display) uses light emitting diodes (LEDs) to provide the backlight.
The benefit of the technology switch is mainly in power consumption (LED backlighting is 20 to 30 percent more efficient than CCFL), although gains in dynamic contrast, viewing angle, production cost, and color range offer additional bonuses.
Organic light-emitting diode (OLED) technology sandwiches thin layers of organic material, a conductive layer and an emissive layer, between two electrodes: the anode and the cathode. When power is applied, electrons flow from the cathode into the emissive layer, while the anode pulls electrons out of the conductive layer, leaving positively charged “holes”. Where the electrons and holes meet in the emissive layer, they recombine and release their energy as visible light.
Currently, LED and OLED TVs are displacing previous technologies such as LCD (CCFL) and plasma. In fact, 2014 essentially saw the death of the plasma TV: not a single major manufacturer added a plasma display to its 2015 lineup. LCDs with the CCFL backlight are also dead in the water.
OLEDs use far less power than plasma or LCD models, making them a safer bet in a consumer switch that’s geared toward more efficient electronics.
Now, OLEDs aren’t perfect. While the technology continues to improve, there are still doubts that the display will last as long as an LCD or even a typical LED television. Apart from that, the organic compound used within an OLED screen is quite susceptible to water damage, more so than any other television tech currently on the market.
Everything You’ve Ever Wanted to Know About Resolution
From standard-definition 480i, to enhanced definition (480p and 576p), high definition (720p, 1080i and 1080p) and now 4K (2160p), resolution has no doubt come a long way. But how’d we get there, and what do these numbers actually mean?
Interlacing Versus Progressive Scan
TV resolution is measured using an “i” for interlaced or a “p” for progressive (we took a look at this, and other TV jargon, previously). The standard definition television (NTSC) resolution is 480i, while 4K, for example, is 2160p. But what’s the difference?
Interlacing takes advantage of the fact that our eyes can’t pick up information as fast as it’s displayed. If you think of a television screen as a series of lines numbered 1 through 100 (an arbitrary number), interlaced technology splits the lines into evens and odds. First the television produces an image on the even-numbered lines, and then 1/60th of a second later it produces an image on the odd-numbered lines. Due to the speed at which this happens, the viewer typically has no idea it’s even going on.
Progressive scan technology draws every line of each frame in order, top to bottom, in a single pass. This is the current standard that modern televisions use.
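The difference between the two scan orders can be sketched as follows. This is a simplified illustration (the function names and the even-field-first ordering, matching the description above, are assumptions, and lines are numbered from zero here):

```python
# Sketch of interlaced vs. progressive scanning over a frame of scan lines.

def interlaced_fields(num_lines):
    """Interlacing draws one half-frame (a 'field') at a time:
    first the even-numbered lines, then the odd-numbered lines."""
    even_field = list(range(0, num_lines, 2))
    odd_field = list(range(1, num_lines, 2))
    return even_field, odd_field

def progressive_frame(num_lines):
    """Progressive scan draws every line of the frame in one pass."""
    return list(range(num_lines))

even, odd = interlaced_fields(10)
# even field: [0, 2, 4, 6, 8]; odd field: [1, 3, 5, 7, 9]
full = progressive_frame(10)
# full frame: [0, 1, 2, ..., 9] in a single pass
```

Each interlaced field carries only half the lines, which is why interlaced video can show combing artifacts on fast motion while progressive video does not.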
You’ve seen the numbers, but what do they mean? For example, what information goes into creating the numbers, such as 720p and 1080p, we see on our televisions?
This is actually quite simple. Televisions are measured by both width and height to determine the total resolution. For example, a 1080p television actually measures 1920 x 1080. The first number is the horizontal measurement, or width, while the second is the vertical measurement, or height. Each unit in these numbers equates to a single pixel on the screen. So, in this case, a 1920 x 1080 display features 1,920 pixels from left to right and 1,080 pixels from top to bottom. The height measurement is the one the “p” is added to if it’s a progressive scan television (which all newer TVs are), which is why 1920 x 1080 is called 1080p.
As an additional example, let’s look at the newer 4K standard. 4K TVs feature a resolution of 3,840 x 2,160, which makes them 2160p.
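The arithmetic behind those labels is easy to check for yourself. A quick sketch (the dictionary and label names here are just for illustration):

```python
# Total pixel counts implied by common resolution labels.
# The label (720p, 1080p, 2160p) comes from the vertical line count.

RESOLUTIONS = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "4K (2160p)": (3840, 2160),
}

for label, (width, height) in RESOLUTIONS.items():
    print(f"{label}: {width} x {height} = {width * height:,} pixels")
# 1080p works out to 2,073,600 pixels; 4K doubles both dimensions,
# so it has exactly four times as many pixels.
```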
Exploring Television Features
Okay, so we’ve explored some TV history, some of the core technology (as well as some obsolete technology) and we’ve summed up all you need to know about resolution. Now it’s time to dive into the features found on modern televisions so that you can separate the must-have features from the gimmicks that you can just as easily pass on.
Curved screens are everywhere. You can’t walk into a big box electronics retailer without seeing one of these models front-and-center just enticing you with its beautiful picture. Thing is, it’s mostly a gimmick — well, depending on who you ask.
According to Dr. Raymond Soneira of DisplayMate — a display diagnostic and calibration company — there are some benefits to the curved screen. He says:
“This is very important for a display technology that produces excellent dark image content and perfect blacks, because you don’t want that spoiled by ambient light reflected off the screen.”
The short version of Dr. Soneira’s argument is that a curved television reduces glare by limiting the angles at which reflections are produced. He goes on to say that the curved screen provides a better viewing angle by countering “foreshortening”, an effect caused by sitting to one side of the television, which makes the side closest to you appear slightly larger than the opposite (farthest) side.
Several prominent review sites, such as CNET, have come to the conclusion that Dr. Soneira’s arguments don’t hold much water. The reduction in glare and reflections is real, but the curved screen actually magnifies the reflections it does pick up, making it basically a wash.
For now, it’s strictly a marketing gimmick designed to squeeze extra dollars out of consumers seeking bleeding-edge electronics, and is a feature you should pass on.
There’s no denying that 4K resolution is beautiful. But is it for you?
Well, it’s not that simple. While 4K is beautiful, there really isn’t all that much content available for it. Some YouTube and Vimeo videos, some planned Netflix content, and the upcoming release of 4K Blu-ray is really about all you can expect as far as content that actually takes advantage of your increased resolution.
HDTV cable and satellite sources are going to be in 1080p for the foreseeable future. There are real concerns with internet speeds and bandwidth limitations for streaming video, and outside of that all you’re really left with is 4K Blu-ray.
Is it worth it? I don’t know. If you’re looking to future-proof your home theater, it’s probably not a bad decision to go 4K. For the rest of us? It’s really not important to rush out and buy a television with 4K resolution. Prices are dropping, 1080p is going to be around for another half decade or more, and there really isn’t all that much that makes it worthwhile to spend the extra cash at the register.
Me? I’d wait.
3D was a pretty hot technology in the recent past. Futuristic-looking glasses, while rather awful looking, provided some pretty cool effects if you could find the right content to use them with. That’s the thing, though: there really wasn’t (and isn’t) much true 3D content available aside from a few Blu-rays and some streaming movies here and there.
Ultimately the fad started to fizzle, and then we saw a bit of a resurgence when 3DTVs started simulating a 3D picture on normal broadcasts, streaming movies, and physical discs, and some without requiring those hideous glasses. It’s not all that impressive.
3DTV is largely a fad, and we’re beginning to see the manufacturers recognize that consumers just aren’t all that interested. Save the money and purchase a bigger TV instead. Better yet, if you have a friend with a 3DTV, ask them how often they view content in 3D. I’m willing to bet the answer is “never.”
While most new TVs include 3D, it’s not something that’s worth buying a new television for.
Hear me out on this one. Smart TV, with its apps, widgets and features is undeniably cool. Picking up your TV remote and switching from ESPN to Netflix, to Angry Birds, and then to Facebook is definitely convenient, but at this point in time it’s really not needed.
If you’re purchasing a new television (meaning, not used), the choice is really made for you. Smart TV dominates the market, so the only decision you’re really left with is which interface you prefer. However, if the decision is whether to upgrade your existing TV which – while not “smart” – has a great picture and features that you’re happy with, it’s certainly not worth it to upgrade just for smart functionality.
Roku, Amazon Fire TV, Apple TV or even a Blu-ray player with built-in apps are all better options than most Smart TVs, and all can be had for less than $100. Not to mention, Smart TVs are becoming a bit of a security risk.
120Hz, 240Hz, 600Hz, and the like are mostly subjective numbers. In the true sense of the technology, a faster refresh rate is always better, but the problem with most of these markings is that there’s no real standardization process. For example, a 120Hz refresh rate on a high-end TV could actually be remarkably better than a 240Hz refresh rate on a gimmicky lower-end TV.
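Whatever the marketing label, a genuine refresh rate translates directly into the time the panel has to redraw the screen. A quick sketch of that conversion (the function name is my own):

```python
# A refresh rate in hertz is simply redraws per second,
# so the time budget for each redraw is its reciprocal.

def frame_time_ms(refresh_hz):
    """Milliseconds available per screen redraw at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 240):
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per redraw")
# 60 Hz -> 16.67 ms, 120 Hz -> 8.33 ms, 240 Hz -> 4.17 ms
```

The catch, as noted above, is that a marketing figure like “600Hz” often doesn’t describe a true panel refresh at all, so the math only tells you something when the number is honest.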
In addition, almost all major television manufacturers (LG, Samsung, Sony, etc.) have their own marketing terms, such as Clear Motion Rate, TruMotion, and SPS. None of these numbers is comparable across brands, and no one of these technologies is clearly better than another.
So, what do you do? Ignore the hype and use your eyes.
Contrast ratio claims are similarly inconsistent at best, and an outright lie at worst. Currently, there isn’t a single standardized way to measure contrast ratio, and every manufacturer is essentially inventing the process as they go. Much like refresh rate, a TV that touts a 1,000,000:1 contrast ratio might still look vastly inferior to one with a “lesser” contrast ratio of 500,000:1.
LCD manufacturers attempted to combat the dreaded viewing angle problem by attempting to quantify the angle in which their televisions were viewable. It’s mostly crap.
While LCD (non-LED LCD) TVs are on their way out the door, this marketing gimmick still comes into play for some TVs. The idea of quantifying what sort of viewing angle a display has is all but impossible without taking the TV into your own home and factoring in differences in light, programming, and positioning of the TV itself. Don’t trust the viewing angle claims.
Input and Output
This is a feature of a television that can’t be ignored. While there is no correct answer as far as how many inputs or outputs a device should have, it’s important to note the type of inputs (HDMI, USB, etc.) and outputs that you require to hook your new TV up to your existing – or new – home theater equipment.
Networking and Wi-Fi
If you do find yourself purchasing a new television, one feature you shouldn’t overlook is connectivity. While all Smart TVs have built-in Wi-Fi, modern sets also feature a number of cool connectivity options. My Samsung, for example, has an “Anynet” feature that lets me effortlessly connect the television to my media server and stream content over my household network to any connected TV. I use this so often that I’m not sure how I’d live without it at this point.
Keep It Simple
There are a million and one additional features – some real, some hype – but none of them really matter. Choosing a television is much simpler than the salesman would have you believe. Ultimately the best way to choose a TV is to look for the features you want, mostly ignore the specs, and use your eyes to determine which picture looks the best to you.
It’s really that simple.
What kind of TV is in your living room/family room/theater room? Which feature would be most important to you if you were going to purchase a new TV tomorrow? Let me know in the comments below!
Image credits: A young boy watching television via Shutterstock, Telefunken 1936, Cathode ray Tube, SMPTE Color Bars, Trinitron via Wikimedia Commons, LCD Projector, LCD TV with CCFL, LCOS, Interlacing Demo, Resolution Chart, Samsung Curved TV by Karlis Dambrans, #94 Something you don’t like by schvin, Samsung SMART TV 1 by Vernon Chan via Flickr