No one knew in 1998--the year Google first appeared as a search engine--that it would go from serving a mere 147 million users to a potential 3,739 million. Nor, in a world where the now-defunct browser Netscape was suing Microsoft, could anyone envision that a nobody named Google would eventually leave Microsoft trailing behind as the preferred search engine of more than 70% of internet users.
Yet, from those small beginnings a giant has emerged. And nowhere is the giant's presence felt more than in internet marketing. It's been setting the rules of engagement, dancing with marketing managers since December of 2000.
The Google Toolbar
In 2000, the predominant internet browser was Internet Explorer (IE). Because it played nicely with IE, the Google Toolbar made it easy to search the internet from any page. This was great for internet marketing because Google's PageRank meant savvy web marketers could implement SEO strategies that helped users find the pages they managed. The 'Google Dance' had officially begun.
While the term is now outdated because Google no longer rebuilds its rankings on a monthly basis, there's a sense in which marketing managers must still adapt their moves to Google. It continues to attract the most users and leads marketers through every engagement: Google choreographs the connection between user and website.
Google’s purpose has always been to deliver a better user experience than its competitors. Updates are part of ensuring they continue doing that.
The response, of course, has always been mixed. Consider the first minor change in Google’s algorithm in September 2002. Tons of discussion ensued as to what Google was doing. Little has changed since then.
The best quote from 2002 comes from one senior member of the WebmasterWorld.com forum. “How on earth can they justify dropping sites that were ranked in the top 10 and are now page 20 and NOTHING at all has changed on the sites from the last month? The biggest thing is they move the toilet mid-stream without a hint they are going to do it...(change the rules) Googles a joke... tired of their games.. off to support ANY other search engine... enough of this every month change the rules nonsense... goodbye Google... Good riddance.”
It’s a sentiment many have expressed, yet Google has continued to dominate. Why? Possibly because users have appreciated Google’s efforts from the start to ensure its search results serve their best interests. It’s also possible other search engines just didn’t do as good a job.
February 2003 saw the first named update—Boston. Algorithm changes and major index refreshes rolled out with this one. However, it wasn’t until Cassandra rolled out in April that Google began targeting certain SEO practices—massive linking between co-owned domains, hidden text and hidden links—today considered black-hat SEO and a sure way to damage a website’s rankings long-term.
By the end of the year, marketing managers could no longer predict when an update was coming, because Google switched in July from monthly index updates to daily updates. The term ‘Everflux’ replaced ‘Google Dance.’
The final update in November 2003—the Florida update—did more to establish the importance of quality in SEO strategy than anything yet. Keyword stuffing could no longer trick Google into sending users to off-target content. On-page content had to align with the keywords. We’ll cover more on how Google has targeted deception later.
‘Everflux’ was a positive change because data was more up-to-date. However, it also impacted Google’s performance. So in September Google introduced a supplemental index, where documents with lower ratings went—to be searched only if Google didn’t find a match in its primary index. Did that create an uproar! (Marketers still doubted Google’s claims that the supplemental index’s filter wasn’t a penalty system when the index was updated in 2006.)
By late 2005, growth forced Google to undergo a major infrastructure overhaul. Big Daddy took over three months to complete. One major change was how Google handled URL canonicalization, redirects and DNS.
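Canonicalization means recognizing that several URL variants—different capitalization, a `www.` prefix, a trailing slash, a default port—all point to the same page, so ranking signals aren't split across duplicates. Here is a minimal sketch of the idea in Python; the function name and normalization rules are illustrative, not Google's actual logic:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url: str) -> str:
    """Normalize a URL so common variants map to one canonical form."""
    parts = urlsplit(url.strip())
    scheme = parts.scheme or "http"
    host = parts.netloc.lower()
    # Treat 'www.example.com' and 'example.com' as the same site.
    if host.startswith("www."):
        host = host[4:]
    # Drop the default port for plain HTTP.
    if scheme == "http" and host.endswith(":80"):
        host = host[:-3]
    # Drop trailing slashes on the path, keeping a bare "/" for the root.
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((scheme, host, path, parts.query, ""))

print(canonicalize("HTTP://WWW.Example.com:80/page/"))  # http://example.com/page
```

In practice, site owners signal the canonical variant themselves (for example via permanent redirects), rather than relying on the search engine to guess.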
By 2009, the company needed another massive infrastructure change. It introduced the Caffeine plan in August 2009, though the final results weren’t felt until June 2010. Caffeine sped up crawling, expanded the index, and integrated indexation and ranking in near real time. As a result, Google claimed its index was 50% fresher than before.
Again in April 2012, Google made 52 changes. Included was a 15% increase in its base index. These changes were concurrent with a major algorithm update, Penguin.
Targeting Deception
Eliminating deception became an official Google goal in November 2003 with the release of the Florida update. Austin wrapped up what Florida missed in January 2004: identifying meta tag stuffing, weighing anchor text relevance, detecting link ‘neighborhoods’ and cracking down again on invisible text, all in the name of improving page relevance.
Brandy brought in Latent Semantic Indexing (LSI) so keyword analysis could extend to synonyms. This meant Google could start evaluating search intent. Those who paid attention to LSI began developing SEO strategies that went beyond a limited set of keywords, and SEO experts began promoting the concept of long-tail keywords. However, high keyword densities remained a dominant strategy.
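Keyword density—the share of a page's words that match a target term—was the metric those older strategies chased. A rough sketch of the calculation (single-word keywords only, an illustrative simplification):

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` matching `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

sample = "Cheap shoes! Buy cheap shoes online. Cheap shoes shipped fast."
print(keyword_density(sample, "cheap"))  # 0.3 — 3 of 10 words
```

A density that high reads as stuffing to a modern algorithm; LSI-era ranking looks past the raw count to synonyms and surrounding context.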
Along with ongoing updates to demote the value of low-quality links, link farms and paid links, 2005 was the year of nofollow. This reduced the number of spammy links, which had been deceiving Google into sending users to irrelevant content. Thus ended the effectiveness of a strategy some SEO experts still propagate—using blog comments to tweak SERPs. It stopped working over 10 years ago!
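The mechanism behind nofollow is an attribute on the link itself: `rel="nofollow"` tells crawlers not to pass ranking credit through that link, which is why blog platforms applying it to comment links killed the tactic. A small sketch that audits a page's links using Python's standard-library parser (the class name is illustrative):

```python
from html.parser import HTMLParser

class NofollowAudit(HTMLParser):
    """Collect links from a page, split into followed vs. nofollow."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollowed if "nofollow" in rel else self.followed).append(href)

page = '<a href="/about">About</a> <a rel="nofollow" href="http://spam.example">x</a>'
audit = NofollowAudit()
audit.feed(page)
print(audit.followed, audit.nofollowed)  # ['/about'] ['http://spam.example']
```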
Since then, Google has continued to roll out updates to eliminate practices it believes rob users of its search engine of the quality experience they deserve.
- Jagger demoted the value of low-quality links, link farms and paid links in 2005.
- May Day introduced changes which reduced the effectiveness of long-tail keywords in 2010.
- Negative Reviews demoted sites that leveraged negative reviews (and the links they generated) to gain ranking.
- Overstock.com Penalty punished Overstock and J.C. Penney for link schemes uncovered by the Wall Street Journal in 2011.
- Attribution updated Google’s ability to sort out who content should be attributed to, so scrapers weren’t rewarded for stealing content.
- DMCA ‘Pirate’ began penalizing sites when they were reported for violating copyright law.
- Penguin, the ‘over-optimization penalty,’ tweaked spam measuring factors again in 2012, crunching down on keyword stuffing, etc. even more aggressively than before.
Targeting a Positive User Experience
In addition to its efforts to stop trickery of the system, Google has implemented some major changes over the years. For example, a major upgrade took place in May 2007: by integrating traditional search results with News, Video, Images and Local, each search rendered more options for users to explore—and in the format they preferred. Real-time Search in December 2009 made the concept even more user-friendly by integrating social media feeds as well as Google News.
Real-time results also brought the freshness of content to the fore. Since its launch, any question as to whether Google’s algorithm rates newer content as better has been answered with a definitive “Yes.”
Between its search tweaks, Google introduced ‘Suggest’ in 2008. Now when a user began typing search words, a dropdown menu of potential queries appeared. It remains a wonderful experience for poor typists and powers Google Instant, a feature that begins retrieving results as soon as Google anticipates what the user may be looking for.
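At its core, Suggest is prefix matching against a store of popular queries—the real system also ranks candidates by popularity, freshness and personalization, which this deliberately simple sketch omits:

```python
from bisect import bisect_left

def suggest(prefix: str, queries: list, limit: int = 3) -> list:
    """Return up to `limit` stored queries that start with `prefix`."""
    queries = sorted(q.lower() for q in queries)
    prefix = prefix.lower()
    # Binary-search to the first candidate, then scan while the prefix holds.
    i = bisect_left(queries, prefix)
    out = []
    while i < len(queries) and queries[i].startswith(prefix) and len(out) < limit:
        out.append(queries[i])
        i += 1
    return out

popular = ["google dance", "google maps", "google toolbar", "gmail login"]
print(suggest("goo", popular))  # ['google dance', 'google maps', 'google toolbar']
```

Production autocomplete systems typically precompute a trie or sorted index rather than re-sorting per keystroke; the point here is only the prefix-match idea.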
These features are just part of a positive user experience. Arriving at relevant search results is another. That’s what the Panda/Farmer updates began addressing in 2011. Over several months, websites across the world were hit for thin content and high ad-to-content ratios. Article directories, which had been a major tool for building links, were hit especially hard—never to recover. (When was the last time you saw an article from eZineArticles.com appear in search?) By the end of 2012, Panda had been ‘refreshed’ 23 times.
One other positive thing happened for users in 2012. Google introduced the Knowledge Graph (KG). Now a box with information about people, places and things appeared with search results whenever Google had information in its knowledge base. July 2013 saw a 50% expansion of KG entries.
Google’s “In The News” box first appeared in October 2014. Because its results weren’t limited to official ‘news’ channels, the name was changed to ‘Top Stories’ in late 2016.
Possibly one of the most significant updates for users was the Mobilegeddon announcement of April 22, 2015. It was the first step toward Google’s 2016 announcement that it is moving toward a mobile-first indexing system. Both updates are designed to give mobile users the best possible search experience. That motivation also seems to be behind Google’s warning that it will start demoting websites that use intrusive interstitials (popups) on mobile.
Targeting a positive user experience remains a moving target, which explains why Google’s recalled some of the updates it’s rolled out. It promoted Authorship Markup at SMX Advanced 2011, which allowed photographers and writers to tag their original works. By the end of 2013, authorship markup started disappearing. In June 2014, Google dropped all authorship photos from its SERPs. By the end of August, Authorship no longer existed.
Targeting a Customized Experience
Another breakthrough in user experience came in 2005. In June, Google introduced personalized search based on tracking a user’s search history. It has used search history ever since to choose which organic search results a user is most likely to show interest in. Search history is also the foundation of its AdWords program.
One of 2005’s introductions that would go on to play a major role in Google’s success was Google Local/Maps. This empowered businesses to use Maps to emphasize their local presence. It also created local audience segmentation. From then on, local options have been prioritized over non-local ones when available, though desktop search isn’t without flaws: cities that share names across countries or states, such as Vancouver or Portland, may still need modifiers to guarantee local results.
However, on many mobile devices this isn’t a problem. By using location information, search delivers information for the immediate area surrounding the user.
Targeting a Safe Experience
In August 2014, Google announced sites with HTTPS/SSL would see a boost in ranking. It was great for web hosts, as businesses now had to host their entire website under the HTTPS protocol, not just their shopping carts. It’s been a good move for users as well: non-secured sites still pose a risk when users submit forms, because the data can be captured and potentially combined with data from other unencrypted submissions.
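The risk described above is concrete: a form that submits over plain HTTP sends its fields in cleartext, even if the page hosting it loads over HTTPS. A sketch of an audit for this (the class name is illustrative), again using Python's standard-library parser:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlsplit

class InsecureFormCheck(HTMLParser):
    """Flag forms whose submission target is not HTTPS."""
    def __init__(self, page_url: str):
        super().__init__()
        self.page_url = page_url
        self.insecure_actions = []

    def handle_starttag(self, tag, attrs):
        if tag != "form":
            return
        # A missing action submits back to the page itself.
        action = dict(attrs).get("action") or self.page_url
        target = urljoin(self.page_url, action)
        if urlsplit(target).scheme != "https":
            self.insecure_actions.append(target)

page = '<form action="http://example.com/login"></form><form action="/search"></form>'
check = InsecureFormCheck("https://example.com/")
check.feed(page)
print(check.insecure_actions)  # ['http://example.com/login']
```

Note that the relative `/search` action resolves against the HTTPS page and passes, while the absolute `http://` action is flagged.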
As a marketing manager, you can expect Google to continue to evolve to match the market. While it’s one of the forces shaping the market, it’s also a tool you can use to forge tremendous opportunities for your business—especially if your goal is also to give users the best possible experience.