After trying to love Bing, flirting with the enhanced privacy of DuckDuckGo, and (shudder) even trying to see if Yahoo was any better than the last time I used it (circa 2004), I still can't find a better option for online search than the reigning champ, Google.

The gap is narrowing, but Google is still king thanks to the relevance of its results, the speed at which sites are indexed and searches are executed, and its constant push to get even better.

A half dozen (or more) unsuccessful attempts over the years to find a better alternative led me to try to decipher what makes Google better than its competitors in this space.

Here's what I found.

A Short History of Search Engine Tech

First, it's probably wise to look at some of the technologies that drive search engines, and at the companies that got us here.

The Creation of Archie (1990)

[Image: The Archie search engine]

Archie is widely considered the first search engine. Although seemingly primitive now, Archie indexed publicly accessible FTP sites so that users could find specific files. This was before the World Wide Web, so you had to access the Archie servers using a local client or through Telnet, a protocol that provides a text-based terminal session on a remote computer.

Archie spurred innovation in similar technologies such as Veronica, Jughead, VLib, and Gopher. These were the pioneers of the search industry, even though more advanced technologies quickly replaced them.

World Wide Web Wanderer (1993)

Created in 1993, the World Wide Web Wanderer was the first bot built to visit websites and automatically insert them into a database, which came to be known as the Wandex. It started as a way to track the web's growth by counting active servers worldwide. Instead, it became the blueprint for modern web spiders, which crawl the web in order to index its contents.
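To make the idea concrete, here's a minimal Python sketch of the crawl-and-index loop the Wanderer pioneered and modern spiders still follow. The breadth-first strategy, the page cap, and the in-memory index are illustrative assumptions on my part, not details of how Wandex actually worked.

```python
# A toy breadth-first web crawler: fetch a page, record it, queue its links.
# Illustrative only -- real spiders add politeness delays, robots.txt checks,
# far more robust parsing, and persistent storage.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects href targets from anchor tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    index = {}              # url -> raw HTML (a real engine would tokenize)
    queue = deque([seed_url])
    seen = {seed_url}
    while queue and len(index) < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except (OSError, ValueError):
            continue        # unreachable or non-HTTP links are skipped
        index[url] = html
        parser = LinkParser()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return index
```

Counting the entries in the returned index against a list of known servers is essentially the growth measurement the Wanderer set out to make.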

Excite (1993)

[Image: The modern-day Excite homepage]

Excite was a project by six undergraduate students at Stanford University that used statistical analysis of relationships in common language to improve the relevance of its index. Excite, then known as Architext, also employed journalists to write brief website reviews, improving usability compared to more primitive link-based search engines and directories.

The Search Engine Boom (1994)

1994 was probably the single biggest year in the history of search engines. Within a 12-month span, we saw the launch of Infoseek, AltaVista, WebCrawler, Yahoo! and Lycos.

  • Infoseek: Allowed users to submit their webpages in real time and indexed them automatically based on user-defined categories.
  • AltaVista: The first search engine to offer unlimited bandwidth, accept natural-language queries, and support search modifiers to refine results.
  • WebCrawler: The first crawler to index entire pages rather than just titles, or titles and descriptions.
  • Lycos: The first search engine to sort results by relevance. Lycos also consistently had the largest database of stored sites, and at its peak (circa 1996) it had crawled and indexed over 60 million webpages.

Search engines during this period (and for several years after) made heavy use of meta tags: title, description, and keyword elements added to a website by the webmaster. These tags, along with keyword density within a page's text, were the primary ranking factors that helped spiders identify and index content.

This also opened the door to web spam, as early SEO practitioners quickly learned to stuff keywords into tags and content in an unnatural manner to game the rankings.
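As a rough illustration, here's a toy Python version of the keyword-density signal those early engines leaned on, and of how stuffing inflates it. The tokenizer and the bare occurrence ratio are simplifications of mine; real engines combined density with meta tags and other signals.

```python
# Keyword density: occurrences of a term as a share of all words on a page.
# Early engines rewarded high density, which is exactly what spammers abused.
import re

def keyword_density(text, keyword):
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

page = "Cheap flights! Cheap flights to anywhere. Book cheap flights now."
print(f"{keyword_density(page, 'cheap'):.0%}")  # 30% -- classic keyword stuffing
```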

Ask/Ask Jeeves (1997)

ask-homepage

Ask, with its Ask Jeeves service, used human editors to match popular search queries to listings of both partner and paid-inclusion sites. The service did use automation in the form of a ranking system called DirectHit, which attempted to rank sites by popularity, but this proved too easy to manipulate and was retired a few years later.

MSN/Bing (1998/2009)

MSN launched its own search engine to piggyback on the success of other Microsoft products and segue into online search. It relied on Overture, LookSmart, and other paid-inclusion directories to provide search results until Google proved the success of its backlink model.

Google's proof of concept led MSN to adopt a similar, though not nearly as good, strategy in 2004.

In 2009, MSN rebranded as Bing and improved the user interface with inline search (suggestions that appear as you type), content that appears directly on the search results page (rather than requiring a click through to one of the results), and a host of new features designed to make search easier and more accessible.

How Google Became the Best (And Why It Still Is)

By the late '90s, search engines had become nearly unusable due to the poor relevance of search results and the prevalence of web spam. Before Google's launch in 1998, and for half a decade after, competitors had no real answer to the mounting issues caused by web spam and search engine manipulation, a practice that ultimately became known as SEO (search engine optimization).

Enter Google.

[Image: The Google logo]

Google started in 1996 as a project, known as BackRub, by Larry Page and Sergey Brin. The aim was to revolutionize search by using links as the primary ranking metric to improve relevance. The theory, rooted in academic citation analysis, held that each site linking to another was essentially expressing confidence in the content within its pages. These links served as votes, and the anchor text within each link indicated what the votes were being cast for, giving the engine a way to index sites by what others said about them.
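The links-as-votes idea can be captured in a few lines of code. Below is a minimal power-iteration sketch of PageRank in Python; the tiny link graph, the damping factor of 0.85, and the fixed iteration count are illustrative assumptions on my part, and the production algorithm is of course far richer.

```python
# Each page's rank is split among the pages it links to ("votes"), with a
# damping factor modeling a surfer who occasionally jumps to a random page.
def pagerank(graph, damping=0.85, iterations=50):
    pages = list(graph)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1 - damping) / len(pages) for page in pages}
        for page, outlinks in graph.items():
            targets = outlinks or pages   # dangling pages vote for everyone
            for target in targets:
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# "c" collects the most votes, so it ranks highest.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```

Anchor text then tells the engine what each vote is about: a link reading "best seafood in town" is a vote for that page as a seafood result.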

The technology was exciting enough that Brin and Page attempted to sell it to competitors in the search world for a reported $750,000. Luckily for them, no one was interested. Back at the drawing board, Page and Brin continued to improve the technology, and by 2000 Google was the largest search engine in the world.

How Google Stays on Top

Adaptability might be the best way to describe how Google remains the most-used search engine in the world. As marketers and search engine specialists learn to manipulate the system, Google changes it. When new technologies - such as social media - start to play an active role in our conversations, Google finds a way to integrate them into the algorithm to improve relevancy. When search habits change, Google changes with them.

Google is constantly watching, analyzing, and evolving. By utilizing the massive amounts of data collected from everyone using its products, Google is able to consistently deliver a better, more relevant, and more user-friendly product than any of its competitors.

What the Future of Search Looks Like

[Image: A magnifying glass over search results]

While Google is on top and there don't seem to be any competitors on the horizon capable of unseating them, things change quickly on the web. Here is what Google - and others - are doing to improve the relevancy of our search results both now, and in the near future.

Understanding User Intent & Semantics

Search engines, including Google, used to deliver results based on how well queries matched up with webpages. Now, Bing, Google, and Yahoo! are all attempting to improve their results by understanding the intent behind each query, an approach known as semantic search.

For example, results for a search like "best seafood" used to depend on which website was deemed most authoritative for those keywords. Now, search engines place a greater emphasis on understanding what you are looking for specifically, so they can deliver better-targeted results. In this case, the user is most likely looking for the best seafood restaurants nearby, or perhaps for recipes. Understanding that across trillions of possible search queries is where things get complicated.
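One toy way to picture the difference: instead of matching the literal words against pages, the engine first guesses a category of intent and ranks accordingly. The hint lists and categories below are entirely invented for illustration; real semantic search relies on vastly more sophisticated models.

```python
# Naive intent detection: score each intent by how many of its hint words
# appear in the query, then fall back to a generic bucket on no match.
INTENT_HINTS = {
    "local": {"best", "near", "nearby", "open"},        # wants nearby places
    "recipe": {"recipe", "cook", "make", "homemade"},   # wants instructions
}

def detect_intent(query):
    terms = set(query.lower().split())
    scores = {intent: len(terms & hints) for intent, hints in INTENT_HINTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "informational"

print(detect_intent("best seafood"))            # local -> rank restaurants
print(detect_intent("seafood chowder recipe"))  # recipe -> rank recipe pages
```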

Greater Emphasis on Location

The seafood example above is one of the primary reasons location is becoming so important. You probably aren't looking for seafood restaurants in San Francisco if you live in Singapore (but if you are, just add "in San Francisco" to your query). Location-based search is changing everything, as all major search engines now utilize your location to deliver customized results.

Improved Mobile Experience

[Image: Firefox Mobile]

In 2015, for the first time, mobile traffic exceeded laptop and desktop traffic, and the trend continues its rapid upward movement. In fact, most web traffic originating from smartphones is search-related. Given this shift, it's important for search engines to deliver a polished experience to mobile users.

Google is doing just that with a new mobile algorithm that demotes sites that don't display properly on mobile devices. This algorithmic change applies only to mobile searches, but it should enrich the experience for mobile users and lead the way for further mobile innovation.

Yahoo! is another major search engine with a focus on mobile. CEO Marissa Mayer (a former Google employee) has shifted Yahoo!'s focus to prioritize mobile as its primary means of regaining relevance.

Knowledge Graph

Bing and Google are both heavily invested in utilizing their massive databases of information to provide quick bites of information for specific keywords and phrases. To improve the user experience, both are attempting to eliminate the additional steps a user must take to find information, relying instead on knowledge graph technology.

For example, queries about movies, celebrities, or locations now feature information about the subject at the top of the page. For quick searches, this can save you the extra steps of digging deeper through the results or visiting multiple sites to find the information.
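Conceptually, a knowledge panel is just an entity lookup that short-circuits the usual ranked list. Here's a bare-bones sketch; the entities, fields, and values are made up for illustration, and real knowledge graphs store billions of interlinked facts.

```python
# Answer entity queries straight from a fact store; fall back to ordinary
# ranked results when the query doesn't name a known entity.
FACTS = {
    "the matrix": {"type": "film", "released": 1999, "director": "The Wachowskis"},
    "paris": {"type": "city", "country": "France", "population": "2.1 million"},
}

def knowledge_panel(query):
    entity = FACTS.get(query.lower().strip())
    if entity is None:
        return None  # no panel; show the usual ten blue links
    return ", ".join(f"{key}: {value}" for key, value in entity.items())

print(knowledge_panel("The Matrix"))  # type: film, released: 1999, ...
```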

Instant answers like these have big implications for mobile, as well as for voice search (such as Siri or Google Now), since most of those searches are quick attempts to retrieve information rather than exhaustive research.

[Image: A Google Knowledge Graph panel]

Finding New Ways to Improve Algorithms

Search engines are constantly at war with one another, as well as with the unknowns that lurk just around the corner. To be the best, each must deliver highly relevant results, and deliver them quickly. As such, the algorithms behind all major search engines are constantly changing in an attempt to stay ahead of the curve and to find new metrics that may lead to better results.

One of the most exciting examples of this might be the Knowledge-Based Trust score, based on a patent held by Google. While Google has outright denied any intent to use the patent in its search algorithm in the near future, the implications are huge. If you've ever been fed bad information by an online search, this scoring system could change that by ranking sites according to how well the facts on their pages match the entries in Google's Knowledge Vault (a store of facts pulled from indexed content).
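The core idea can be sketched simply: extract the checkable claims from a page and score the page by the share that agree with the fact store. The fact format and the plain agreement ratio below are my own assumptions drawn from the concept, not from the patent itself.

```python
# Score a site by how many of its extracted (subject, attribute) claims
# match a trusted fact store -- the intuition behind Knowledge-Based Trust.
KNOWLEDGE_VAULT = {
    ("earth", "shape"): "oblate spheroid",
    ("water", "boiling point in C"): "100",
    ("australia", "capital"): "Canberra",
}

def trust_score(extracted_claims):
    checkable = [(fact, value) for fact, value in extracted_claims
                 if fact in KNOWLEDGE_VAULT]
    if not checkable:
        return None  # nothing verifiable, so no score
    correct = sum(1 for fact, value in checkable
                  if KNOWLEDGE_VAULT[fact] == value)
    return correct / len(checkable)

site_claims = [(("australia", "capital"), "Sydney"),
               (("earth", "shape"), "oblate spheroid")]
print(trust_score(site_claims))  # 0.5 -- only one of two claims checks out
```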

Ultimately, trying to predict what search engine technology will look like beyond the next year or two is an exercise in futility. That said, it should continue to bring us one step closer to an age where we're met with the answer to any question almost instantly.

Is there a better search engine than Google? What makes it better? What search-related technologies or changes are you most excited to see in the near future? Sound off in the comments below. I'd love to hear your take!

Image credit: Search engine via Shutterstock, Firefox Mobile by Johan Larsson via Flickr, Website code optimization via Shutterstock