The Internet as we know it is almost completely unrecognizable from its Department of Defense-funded origins in the late 1960s, when, as part of an ARPA project, UCLA and the Stanford Research Institute were connected by the first packet-switched network link, over copper lines rated at a whopping 50 kilobits per second.
Fast forward to today, and an estimated 2.93 billion people are active Internet users, a figure that has tripled over the past decade. Every second, Google handles 40,000 search queries, Facebook processes over 40,000 likes and comments, almost 2 hours of video are uploaded to YouTube, and 2,000 hours of video are watched. All in a single second.
A spider web of cables, satellites, data centers, switches, and routing systems has connected the world in ways not thought possible just a few years ago, and one of the fundamental enabling technologies has been the fiber optic cable.
Tripping the Light Fantastic
Fiber optics are used in medicine, high-end audio, Christmas decorations, and cool ceilings, but their biggest impact has been in bringing high-bandwidth Internet to the world.
Optical fibers can be thought of as light hoses and are amazingly simple in their operation: shine light in one end, and it emerges, roughly as powerful, at the other. Some of the energy is simply absorbed as heat as the light passes through the material, and more is lost to scattering caused by tight bends in the fiber or imperfections in its surface.
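That absorption and scattering is usually quoted as a loss in decibels per kilometer. As a rough sketch (the loss figure of 0.2 dB/km is an assumed typical value for modern single-mode fiber at 1550 nm, not a number from this article), here is how little of the light actually disappears over long runs:

```python
# Hedged sketch: fraction of optical power surviving a fiber span,
# assuming a typical attenuation of ~0.2 dB/km (single-mode, 1550 nm).
def power_remaining(length_km, loss_db_per_km=0.2):
    """Fraction of input optical power that survives the span."""
    loss_db = loss_db_per_km * length_km
    return 10 ** (-loss_db / 10)

print(f"After 50 km:  {power_remaining(50):.1%} of the light remains")
print(f"After 100 km: {power_remaining(100):.1%} of the light remains")
```

Even at 1% remaining after 100 km, the signal is still detectable, which is why undersea amplifiers can be spaced so far apart.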
Another impairment comes from individual pulses of light getting 'blurred' in time, as slightly different frequencies 'bounce' through the fiber along slightly different paths. Over the distance between your amplifier and your PlayStation 4, that won't make any difference at all, but over the many kilometers that these cables regularly run in single, solid spans, this 'blurring' can completely obliterate a signal, smearing each pulse into its previous and next neighbors and making the signal unintelligible.
This means that to actually use the fiber, you have to slow right down and leave big gaps between pulses to be sure there's no interference. This inter-symbol interference is controlled in two ways: first, use very specific and stable frequencies (laser colors) for transmission; second, use extremely narrow fibers, so the different 'bouncing' paths make no real difference.
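To get a feel for why those gaps matter, here is a back-of-the-envelope sketch of how pulse 'blurring' (modal dispersion, the path-difference effect described above) caps the bit rate of a wide, step-index multimode fiber. The refractive indices are assumed textbook values, not figures from this article:

```python
# Hedged sketch: how 'bouncing path' spread limits bit rate in a
# step-index multimode fiber. Indices are assumed textbook values.
C = 299_792_458  # speed of light in vacuum, m/s

def modal_spread_s(length_m, n_core=1.48, n_clad=1.46):
    """Arrival-time spread between the straight ray and the most-bounced ray."""
    return (n_core * length_m / C) * (n_core / n_clad - 1)

def max_bit_rate(length_m):
    """Crude limit: leave a full pulse-spread gap between symbols."""
    return 1 / (2 * modal_spread_s(length_m))

for km in (0.1, 1, 10):
    print(f"{km:>5} km: ~{max_bit_rate(km * 1000) / 1e6:,.1f} Mb/s")
```

Under these assumptions a 10 km multimode run is limited to well under 1 Mb/s, which is exactly why long-haul links use narrow single-mode fiber, where this path spread essentially vanishes.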
The combination of absorption, scattering, and inter-symbol interference puts certain pressures on fiber design: fibers need to be as thin as possible without breaking, they can't be bent too sharply, and they have to be very, very clear. Today's fiber manufacturing processes are exceptional in this regard; it's estimated that if the ocean were as clear as optical fiber, you'd be able to see right to the bottom of the Mariana Trench, around 36,000 feet below sea level.
These cables, just a few micrometers thick, are also heavily protected, with many layers of metal and plastic latticing around them, especially at the very delicate end terminations. Most fiber systems in use today operate on a 'multiple medium' approach: rather than trying to pump oodles of data through a single thread, it's simpler and often cheaper to light up another thread and double your overall rate.
The Web of the Web
Once they’re encased in several inches of protection and dumped (carefully) on the seabed, these bundles of over 250 individual threads can today send over 10 terabits per second from Dublin to New York with less than 100 ms of delay. To put that in perspective, you could send the first-three-season box set of Game of Thrones, consisting of 15 Blu-ray discs, across the Atlantic in literally the blink of an eye (~300 ms per blink). And that’s just in the wet stuff; optical fibers criss-cross the world along highways and in dedicated tunnels, providing the sturdy backbone of the modern Internet. But it wasn’t always this way.
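Before we rewind, the box-set arithmetic above does hold up, assuming 25 GB single-layer Blu-ray discs (a disc capacity this article doesn't actually state) and the quoted 10 Tb/s link:

```python
# Hedged sanity check of the 'box set in a blink' claim.
# Assumes 25 GB single-layer discs; the 10 Tb/s rate is the article's figure.
DISCS = 15
DISC_GB = 25       # assumed single-layer capacity
LINK_TBPS = 10     # article's transatlantic figure

total_bits = DISCS * DISC_GB * 8e9
seconds = total_bits / (LINK_TBPS * 1e12)
print(f"~{seconds * 1000:.0f} ms to push the box set across the Atlantic")
```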
During the tail end of the '90s, hundreds of optical networking companies sprang up, betting that things could only get better, and invested billions of dollars in establishing top-rate worldwide webs of fiber that would comfortably hold all the Internet's growth for years to come; just in time for it to be more or less useless.
In mid-2000, the dot-com bubble imploded, and by 2002, $5 trillion had been wiped off the stock market. Major telecoms players either collapsed outright, such as Nortel Networks, Global Crossing, and WorldCom, or were swallowed up at bargain prices, such as Lucent. But their legacy is a monumental labyrinth of 'dark fiber' that remained unused, unsellable, and unnecessary.
But even without the massive traffic these networks were built to handle, the Internet grew and expanded. The Web became heavier and more interactive. Video, previously only a pipe dream, became a regular sight on the web. Network gaming took off with the launch of the now seminal Counter-Strike in 1999 and the massively multiplayer online role-playing game (MMORPG) World of Warcraft in 2004, pitting players across the world against each other.
These applications didn’t just require speed; they also demanded low latency. That is, the time between a player performing an action and receiving a reaction from the online world had to be as short as possible, or immersion was completely lost.
Around the same time, Internet giants such as Google began buying up swathes of this dark fiber network to provide private high-speed connections between their worldwide data centers, supporting their search and advertising businesses and their burgeoning video distribution network, YouTube, which they had acquired in 2006.
Fast forward to today: with Netflix taking up approximately a third of US broadband traffic, the growth of 4K video, massive distribution networks such as Steam and Spotify, and the explosive growth in 'big data' collected by retailers and other firms and sent around the world between networks of data centers, the backbone of the Internet is finally fulfilling the dreams of those early fiber prospectors.
So what’s next for this network, and what is it going to mean for us?
Faster Than the Speed of Love
These threads of glass or plastic, no thicker than a human hair, are capable of massive data transmission rates: the current record is held by NTT at 111 Gb/s over a practical distance (2 km, or 1.25 miles).
And one of the most amazing things about the way optical fibers work is that, to a certain extent, once the fiber is in the ground you can keep upgrading the endpoints to make the line faster and faster as technology improves; that's why dark fiber networks put in the ground nearly two decades ago are still valuable commodities, with plenty of room for improvement.
Let's say the telcos manage to put full-rate 100 gigabit over each thread of the backbone with advances in transceivers, and start supplying end users with full-rate 10 gigabit fiber-to-the-premises within the next 5 years. This growth would be roughly in line with the order-of-magnitude change in the UK over the past 5 years, from less than 10 Mbps to over 100 Mbps.
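That "order of magnitude every five years" pattern implies a surprisingly steep annual growth rate; a quick check (the five-year doubling period is the article's figure, the per-year breakdown is just arithmetic):

```python
# Quick check on the growth rate implied by '10x every five years'
# (the pattern behind both the UK jump and the hoped-for 10 Gb/s FTTP).
implied_annual_growth = 10 ** (1 / 5) - 1
print(f"10x per five years = ~{implied_annual_growth:.0%} per year")
```

In other words, sustaining the trend means typical connections getting nearly 60% faster every single year.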
So, what are we going to do with these big fat pipes? The easy answer is "More, Better, Faster": the idea of 'downloading' or 'buffering' will hopefully go out the window. We'll consume more media in general, high-definition video conferencing will be more reliable, and video game latency will be minimized to the point where immersive feedback is the norm rather than a rare alignment-of-the-planets event. Combined with advances such as the Oculus Rift, gaming is probably the area that will see the most immediate benefit.
The Fiber ‘Experience’
What’s more interesting is the knock-on impact for other industries, the potential opportunities for new business models, and how this predicted future can shine a light on some of the weirder mergers and acquisitions of the past few years.
In my opinion, Facebook wants the ad revenue from the burgeoning online 3D experience field, much as Google monopolized online video with YouTube. Their big difficulty will be keeping the experience open enough for developers to use the platform without aggravating end users. It'll be interesting to watch how they walk that line, but with Facebook okaying an Oculus port of Minecraft earlier this year, we could be seeing a resurgence of development around a single, relatively open device not seen since the explosion of app development around the iOS platform.
Will Facebook allow developers to interact with the Oculus? Will Facebook create a Sony-style "Lounge" where you can hang out with your friends in a virtual environment, rather than on the Facebook home page, between applications? Will Facebook enable content creators to deliver high-quality 3D games, shows, and environments for users to explore while controlling the 'digital billboards', the same way EA enabled in-game ads in the Burnout franchise?
Speaking of games, will we even need game consoles any more? If you can beam down 4K video with a latency of less than a few dozen milliseconds, why not just cloudify the gaming process onto dedicated hardware? Maybe OnLive's time has finally come? This will probably be true of many experiences, where it is simply cheaper to maintain thousands if not millions of personal virtual machines, accessible from anywhere in the world through thin clients such as phones, tablets, and Chromebooks.
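Whether cloud gaming works comes down to physics as much as bandwidth: light in glass travels at roughly c divided by the fiber's refractive index, so distance sets a hard floor on round-trip time no matter how fat the pipe is. A minimal sketch, assuming a typical group index of about 1.47 and illustrative distances:

```python
# Hedged sketch: the physical floor on round-trip latency over fiber,
# ignoring routing, queuing, and serialization delays entirely.
C_KM_S = 299_792.458   # speed of light in vacuum, km/s
FIBER_INDEX = 1.47     # assumed typical group index of silica fiber

def fiber_rtt_ms(distance_km):
    """Best-case round-trip time over a straight fiber run."""
    one_way_s = distance_km * FIBER_INDEX / C_KM_S
    return 2 * one_way_s * 1000

for km in (100, 1000, 5500):   # ~5,500 km is roughly Dublin to New York
    print(f"{km:>5} km: {fiber_rtt_ms(km):5.1f} ms round trip")
```

Within a continent the floor sits comfortably under the 'few dozen milliseconds' budget; across an ocean it eats most of it, which is why cloud gaming hinges on data centers being close to players.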
But what about all those gigabytes of 4K video we’ll be downloading and streaming on-demand? Why would we need broadcast networks anymore? This issue will be particularly difficult in North America, where cable providers operate as de-facto monopolies in many areas, often requiring expensive and arguably pointless channel bundle subscriptions to get the best Internet connections.
The big players are already complaining about the likes of Netflix 'freeloading' on the services they 'invested' in. Will the cable companies do what they've done for the past 10 years and delay the rollout of faster networks to residential areas? Or will they be forced into action by the growth of Google Fiber, Google's gigabit service that currently operates in three US cities? Will monopolies be crippled by desperate towns going it alone, as in Atlanta and Germany?
Beyond residential applications, what impact will this growth of data have on businesses and governments? Dr Jonathan Milliken, currently working with an unnamed high-performance networking startup, and previously of Queen's University Belfast's Electronics, Communications and Information Technology Institute, where he published the first device-to-device Wi-Fi virus, said this on the subject of security in the gigabit age:
“Gigabit Internet is the next stage in enabling the provision of almost all services via an online method. It works perfectly in tandem with cloud computing, virtualisation and Big Data. However, from a security perspective it makes the age-old problems yet more prevalent. As new services and new capabilities are rolled out, they are rarely built with security in mind. The higher the connectivity, the higher the security risk.”
So will this new age of ultra-high-speed connectivity and availability of data provide us with the immersive, virtual reality experiences we've been promised for decades? Or will data-hungry corporations and nation states continue to swap data about those experiences in order to customize services to our desires before we realize it, or simply slurp up our personal data for 'monitoring' reasons?
Whatever Happens, the Insatiable Thirst Continues
Whatever happens, it's clear that our collective addiction to data is not going to slow down. Throughout the history of technology, many have said "no one will ever need more than X", and every single time, advances not only in technology but in culture have proved them very, very wrong.
However, there's a clear dark side to all this growth: we will be more reliant than ever on the Internet as a fundamental part of our lives. And as nations around the globe continue to depend on this wide-open resource, we, as its users, have to be aware of the risks we take with our personal information every day.
With gigabit Internet already here, and 10 gigabit expected within a few years, it's clear that we'll be getting more 'stuff', at higher quality and lower delay, making the world a smaller place than ever, with volumetric 3D experiences tapping into the power of distant computers. The technology is nearly ready to go. The question is: do our policy makers and local loop providers want to maintain your free and open access to the Internet?
These are just a few thoughts about how this growth phase of the Internet is going to pan out. I’d love to hear yours in the comments!