Google is the most prominent search engine today, staying well ahead of competitors such as Bing and Yahoo Search. This is why most digital marketing campaigns, especially those for search engine optimization (SEO), focus their content and web design efforts on this particular search engine.
One of the reasons for Google’s popularity is its overarching goal of providing an excellent user experience through high-quality results for whatever subject users want to learn more about. The company continues to develop the system by rolling out updates that address technical issues or loopholes that shady entities can abuse to deceive consumers.
An algorithm is a step-by-step method used to solve a problem in computer operations and data processing. Algorithms are designed to manipulate data in a number of ways to help a person or program arrive at a solution. Internet search engines like Google update their algorithms regularly to improve search functions for website users.
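To make the idea concrete, here is a toy step-by-step algorithm in Python — binary search, chosen purely as an illustration and unrelated to Google’s own systems. It shows how a fixed sequence of steps turns an input into an answer:

```python
def binary_search(items, target):
    """Step-by-step method: repeatedly halve the search range
    until the target is found or the range is empty."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid          # found: return the index
        elif items[mid] < target:
            low = mid + 1       # discard the lower half
        else:
            high = mid - 1      # discard the upper half
    return -1                   # not found

# The same input always leads, step by step, to the same result:
print(binary_search([2, 5, 8, 12, 16], 12))  # prints 3
```

Every algorithm — from this five-line search to Google’s ranking systems — follows the same principle: a defined procedure that transforms input data into a result.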
A Google algorithm update is a change in which the parameters and data of the search algorithm are modified to provide better search results for users and to adjust the rankings of websites. These updates are designed to penalize websites that don’t conform to Google’s webmaster guidelines and reward those that do. Google continues to update its algorithm regularly to provide a better overall experience for website visitors and owners.
Here are the updates that transformed Google into the powerhouse it is today, beginning with the latest:
Another broad core algorithm update came in March 2019, which some analysts say again targeted the health and medical niches. However, Google insists that neither the previous nor the current update focused on a particular industry.
The team acknowledged a broad core algorithm update after several sites reported a massive impact on their rankings. This update was believed to have targeted health and wellness websites. Other niches affected by the change were beauty and personal care, arts and entertainment, electronics, and food.
Google implemented the mobile page speed update in July 2018 which saw this element become a ranking factor for mobile results. There was no evidence of significant mobile ranking shifts as this update only affected the slowest sites. In the same month, Chrome version 68 began warning users who were visiting sites without a security certificate.
After trying out 300-character snippets in November 2017, Google retracted the change in May 2018 and rolled most results back to the previous limit. The next month, the team launched a new SERP feature that moved videos from organic-style results into a video carousel.
Google confirmed a significant core update in March, though the company did not adopt the community’s name for it, Brackets. It led to volatility spikes early in the month that continued for about two weeks. Later that month, the search engine announced the rollout of mobile-first indexing, with site migrations happening gradually.
Before launching the snippet length increase in November 2017, Google had been testing the change for quite some time. When it took effect, the meta description limit roughly doubled from 155 to 300 characters. Then, in December, the SEO community suspected an unconfirmed Maccabees update because of ranking volatility. However, the search engine maintained that only several small updates had occurred.
Google pushed its emphasis on website security further by warning users who entered information into unencrypted forms. It wasn’t an algorithm update, but it demonstrated the company’s commitment to protecting users’ data and may have affected how credible a website appeared.
The search engine launched a penalty against aggressive interstitials and pop-ups that harm the mobile user experience. Google warned webmasters five months in advance so they had time to make the necessary modifications to their sites before the update rolled out. Meanwhile, Fred arrived in March and affected the rankings of several websites, but the update was never officially confirmed.
Google AdWords saw a noticeable change with the removal of right-column ads and the launch of top blocks displaying four ads. The change affected organic click-through rates (CTRs), since unpaid results were pushed further down the page. Penguin 4.0 came in two phases, one in September and the next in October. The updates devalued bad links instead of penalizing sites.
Google admitted to a core algorithm change affecting quality signals. Dubbed Phantom 2, the May update led several websites to report massive ranking losses. Panda 4.2, another data refresh, came in July. In October, the search engine discussed RankBrain, its machine-learning-based engine, and how it influenced rankings.
At the start of 2015, Google emailed webmasters about mobile usability issues on their websites, leading industry analysts to anticipate an algorithm update for this factor. True enough, the search engine rolled out the change in April of that year.
Pirate 2.0 updated the DMCA penalty originally launched in August 2012 to crack down on websites with copyright violations. The new version, which rolled out in October 2014, targeted specific sites to combat software and digital media piracy. In the same month, Google implemented Penguin 3.0, which appeared to be a data-only update. By December, it was announced that Penguin would shift to continuous updates; hence the term Penguin Everflux.
Google announced that it would prioritize secure sites, or those with security certificates, and confirmed that adding encryption would give a slight rankings boost. The search engine also removed authorship entirely from search results and reduced the number of images shown in SERPs. Meanwhile, Panda 4.1 arrived in September, targeting affiliate marketing, keyword stuffing, and security warnings.
The search engine rolled out a significant update called Pigeon in July 2014. Google enhanced its algorithm for local searches, such as queries related to events or businesses near users, aiming to deliver more relevant local results. The update expanded to the UK, Canada, and Australia in December.
Panda got another update with version 4.0 in May 2014, devaluing sites with thin content and affecting a whopping 7.5 percent of all queries. Payday Loan 3.0 came a month after 2.0 and penalized websites with low-quality external links and over-optimization.
Google updated its page layout algorithm, which was commonly known as Top Heavy when it launched back in January 2012. It targeted websites with too many ads and too little content. Payday Loan 2.0, on the other hand, was rolled out in May 2014 and cracked down on sites that engaged in link buying and other black hat tactics.
Hummingbird is a core algorithm update that allows for more semantic search and makes the most of the Knowledge Graph, the SERP-integrated display launched in May 2012 that provides supplemental information on users’ queries. Meanwhile, Penguin 2.1, also known as Penguin #5, affected one percent of searches but was considered a relatively minor data update rather than an algorithm modification.
In August, Google announced that it was prioritizing websites with high-quality, in-depth content, since as much as 10 percent of users want to learn about a broad topic. The team recommended that webmasters add authorship markup and create compelling posts to improve their SEO.
The Payday Loan update targeted niches which have spammy queries such as payday loans and porn. The update rolled out over two months. It went after illegal link schemes which impacted 0.3 percent of searches.
Penguin 2.0 brought significant improvements to the algorithm’s quality controls as Google continued to crack down on web spam with more restrictions. Several websites saw tremendous traffic loss caused by lower visibility. Phantom wasn’t officially announced, but many reported lower visitor volume in the first week of May.
Panda #24 rolled out on January 22, 2013, affecting 1.2 percent of queries. #25 came two months later, and Matt Cutts announced that it would be the final update before Panda was integrated into the core algorithm.
Panda #20 was implemented on September 27, 2012, and overlapped with the EMD update. It officially impacted 2.4 percent of queries. #21 came in November and is a bit smaller than the previous since it only affected 1.1 percent of searches. #22 was data-only and ran on November 21 while #23 came in December and was deemed as a refresh with a higher impact than Panda #21 and #22.
Penguin #3 was a minor update on October 5, 2012, affecting less than one percent of queries. Meanwhile, Page Layout #2, which rolled out on October 9, was a slight revision of the first version, which focused on cracking down on sites with too many ads and too little content.
The Exact Match Domain, or EMD, update on September 27, 2012, led to massive devaluations and affected 0.6 percent of query volume. Its primary objective was to target low-quality sites that ranked well merely because their domain names exactly matched a query. It didn’t impact quality websites that happened to have keywords in their domain names.
Also known as the Pirate update, the Digital Millennium Copyright Act (DMCA) update cracked down on websites with copyright violations. The law criminalizes the unauthorized reproduction and distribution of copyrighted works, and a DMCA takedown notice informs an individual or organization that they’re illegally hosting or linking to copyrighted material. The June/July 86-pack of updates consisted of Panda data and algorithm refreshes as well as modifications to site clustering.
From June to September, Google rolled out Panda updates with versions 3.7, 3.8, 3.9, 3.9.1, and 3.9.2. These were data updates and had minor impact on the community; only affecting less than one percent of the results.
Called Penguin #2 by some, the 1.1 version was, as Cutts announced on May 26, 2012, more of a data refresh and affected less than one percent of English searches. The same month brought a 39-pack update including improvements to Penguin, enhanced link-scheme detection, modifications to title/snippet rewriting, and changes for Google News.
Google launched the Knowledge Graph on May 16, 2012, which is a SERP-integrated display that offers supplemental data related to users’ queries. A month before, the search engine rolled out 52 minor changes for April including ones that were tied to Penguin.
Google continued to take measures ensuring the quality of sites that are ranked through the Penguin update on April 24, 2012. It decreased ranking for sites that violated the search engine’s existing guidelines and enhanced vigilance against spam factors such as keyword stuffing.
March had a 50-pack update, officially announced on April 3, 2012. Two of the purposes of this set of changes were to identify quality content and improve link text analysis. Later that month, Google admitted to a bug that mistakenly treated some websites as parked domains, leading to a drop in rankings for those sites. Panda 3.5 and 3.6 rolled out on April 19 and 27, respectively, with minimal impact on SEO.
Panda 3.3 rolled out on February 27, 2012, three days after the update’s first anniversary. It was described as a data refresh rather than new or changed ranking signals. Version 3.4 launched in March and influenced 1.6 percent of search results. Meanwhile, Venice came as part of February’s 40-pack update and appeared to prioritize local results for queries.
Also known as the first Page Layout algorithm update, SEO analysts called it Top Heavy because of the emphasis on ads above the fold or the items that a user sees at first glance. There should be a balance between advertisements and content especially in the top part of a page.
January 2012 saw the launch of the Panda 3.2 update, but there wasn’t evidence that the algorithm changed. The search engine also went for more personalization by pushing Google+ social data and user profiles into the results with a toggle button to turn off the feature.
After November’s 10-pack of updates, Google released another 10 in December, and the company officially announced that users could anticipate these posts every month. On January 30th, minor changes were launched, including better detection of landing page quality as well as an assessment of sitelinks and snippet relevancy.
Matt Cutts announced the 10-pack of updates in a blog post on November 14, 2011. These algorithm updates were minor, but they marked a shift in how the search engine announces changes to its system. While there was no official 3.0 version, industry analysts deemed the November 18, 2011, update Panda 3.1. It made only minor changes and influenced less than one percent of searches.
The Freshness update came in November 2011 when the search engine started prioritizing more recent content in SERPs. This algorithm worked with the infrastructure changes that Caffeine entailed to provide more relevant results to users.
On October 18, 2011, Google announced that it would encrypt search queries to tighten security. However, the change deprived sites of organic keyword referral data, making SEO more difficult.
The Panda “Flux” was a series of minor update bursts that occurred on October 3, 5, and 13. Some webmasters called them Panda 3.0 and 3.1.
Panda 2.5 came on October 5, 2011, but the specific details were unclear with some websites getting significant gains while others suffered losses in rankings.
Google introduced expanded sitelinks on August 16, 2011, helping users navigate a website quickly from the search results by displaying key sections they can click on directly. With this, even when people type a general query about a brand, they get quick access to different parts of that company’s domain.
In September, the search engine rolled out pagination elements that allowed websites to divide a long post into different pages while informing Google that said webpages are from a single article.
Panda 2.0 and 2.1 launched in April and May 2011, respectively. These updates integrated new signals into the ranking algorithm including data about websites that users blocked through the results page or when using the Chrome browser.
The search engine continued to update the sites and data impacted by Panda separately from the main index. The 2.2 version aimed to improve scraper detection, Panda 2.3 consisted of minor enhancements to the system, and 2.4 was when the update became available internationally.
Google introduced the +1 button, similar to Facebook’s thumbs-up Like feature. It showed up directly next to result links and affected the SERPs of a user’s social circle for both organic and paid results.
In February, the search engine conducted a significant algorithm update that had an impact on about 12 percent of search results. Panda brought to the table a new way of evaluating websites. It laid down the groundwork of the battle against spam, content farms, scrapers, and sites with a high ad-to-content ratio.
On January 28, 2011, Google rolled out an update that improved how content attribution is sorted out, in order to stop scrapers. This came as a response to high-profile spam cases and is believed to be a forerunner of the Panda update.
By December, Google and Bing announced that social media platforms like Facebook and Twitter help a website’s ranking. In the same month, the search engine also began cracking down on sites that rose up the ranks because of negative reviews.
Google Instant, an expansion of the search engine’s Suggest feature built on the same technology, launched on September 8, 2010. It displayed search results as the user typed the query. SEO experts at the time anticipated a significant impact on digital marketing, but it only made a ripple. Meanwhile, Instant Previews, which debuted in November, allowed users to check how a website looked by hovering over the links on the results page.
A month after May Day, Google rolled out Caffeine. The team provided a preview of the update back in August 2009, which gave users a glimpse of a massive infrastructure change focusing on real-time crawling, index expansion, and ranking integration. It led to a boost in the search engine’s raw speed as well as a 50 percent fresher index.
In May 2010, Google made changes to its algorithm that affected long-tail keywords, the longer and more specific search phrases of typically three or more words. The update foreshadowed Panda by penalizing sites that provide low-quality content.
Google implemented real-time search, which incorporated data from Twitter feeds, Google News, and other freshly indexed pages as they were posted. The results page automatically showed users content from these sources, and the stream could be paused once a user had enough information for their query.
In September 2009, Google implemented the Vince update which appeared to prioritize big brands. Cutts deemed it a minor change, but webmasters believed that it had long-term implications. The search engine clarified that the enhancement was more about generic queries and factoring trust. Meanwhile, earlier in February, the company partnered with Microsoft and Yahoo once more to support a new link element that helped bots pinpoint canonical versions of pages without disturbing the experience of human visitors.
Dewey was the result of a large-scale shuffle that occurred late March 2008 and early April. The consequences for the update were unclear, but there were reports that the search engine showed different results depending on the server that sent them. Some also believed that the company was pushing its properties like Google Books, although this has never been proven.
In June, the Buffy update launched. It wasn’t a major one, just an accumulation of smaller changes to the algorithm. Still, it caused anxiety among webmasters whose websites had been ranking well for a particular keyword and then suddenly dropped in the results pages.
Universal search paved the way for Google’s expansion with news, books, videos, and images being integrated into search results. It had a new navigational interface which aimed to improve user experience. May 2007 also marked the end of the old 10-listing SERP format.
Throughout 2006, Google continued to update its supplemental index, which had rolled out in 2003. While the search engine reassured SEO experts that inclusion didn’t affect their sites’ rankings, most webmasters saw their pages’ placement in this secondary index as a death knell for that domain’s rankings.
Rolled out starting December 2005, the Big Daddy update did not directly affect rankings and search results. It focused on improving uniform resource locator (URL) canonicalization, the practice of choosing the best URL among several similar choices. The update also changed how Google handled site redirects, indexing, and crawling.
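As a rough illustration of what URL canonicalization involves, here is a minimal Python sketch with hypothetical normalization rules (lowercasing the host, dropping default ports, fragments, and trailing slashes). Google’s actual logic is far more involved and not public; this only shows the general idea of collapsing several similar URLs into one preferred form:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Pick one canonical form for a URL (simplified,
    hypothetical rules -- not Google's actual logic)."""
    parts = urlsplit(url)
    host = parts.netloc.lower()           # hostnames are case-insensitive
    if parts.scheme == "http" and host.endswith(":80"):
        host = host[:-3]                  # drop the default HTTP port
    path = parts.path.rstrip("/") or "/"  # drop trailing slashes
    # Rebuild the URL without the fragment (the part after '#')
    return urlunsplit((parts.scheme, host, path, parts.query, ""))

# All three variants collapse to one canonical URL:
variants = [
    "http://Example.com:80/page/",
    "http://example.com/page#section",
    "http://example.com/page",
]
print({canonicalize(u) for u in variants})  # a single-entry set
```

The design choice mirrors the problem Big Daddy addressed: when many URL spellings point at the same document, the engine should index and rank one representative rather than splitting signals across duplicates.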
The next official update did not arrive until the last quarter of 2005. Jagger targeted low-quality links such as reciprocal links, paid links, and those from link farms. The massive set of changes was rolled out over three months, from September to November.
While Gilligan isn’t considered an official update by the search engine, the community did witness significant changes in the rankings. Matt Cutts, then a Google engineer, admitted that the company had updated its index with new pages, backlinks, and PageRank data, which may have led to the shake-up in results.
Search became more personal as Google rolled out the personalized search update. Before this, users had to customize their settings and profiles to get the information they were looking for. With this feature, the search engine used data from a user’s history to provide better results.
Bourbon came with a Google spokesperson announcing that the search engine would implement “3.5 changes in search quality.” The update cracked down on duplicate content, even within the same site. Duplicates hurt SEO because the relevance of a single piece of content is split between two pages, and Google always tries to locate the original source and show it to users first.
A month after the nofollow attribute rolled out, Google launched Allegra, which caused a stir in the webmaster community, though nothing like the uproar over the Florida update. It continued to penalize websites with suspicious inbound links, which led some sites to rank well after the update while others lost their place in the top results.
In January 2005, Google collaborated with Yahoo and Microsoft to continue the battle against web spam by enhancing how their systems identify spam links. Together they introduced the nofollow attribute, a value for a link’s rel attribute that signals to search bots that the hyperlink shouldn’t affect the ranking of the link’s target in the index. It’s essential for protecting your site’s reputation, especially when dealing with paid links.
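To illustrate how a crawler might honor the attribute, here is a minimal Python sketch using the standard library’s html.parser. The URLs are placeholders, and treating nofollow links as simply “skipped” is a simplification for illustration, not a description of Google’s crawler:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects hrefs, separating rel="nofollow" links the way a
    ranking crawler might (simplified sketch)."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        rel = (attrs.get("rel") or "").lower().split()
        (self.nofollowed if "nofollow" in rel else self.followed).append(href)

html = '''
<a href="https://example.com/editorial">trusted link</a>
<a href="https://example.com/paid" rel="nofollow">paid link</a>
'''
parser = LinkExtractor()
parser.feed(html)
print(parser.followed)    # ['https://example.com/editorial']
print(parser.nofollowed)  # ['https://example.com/paid']
```

The markup itself is the whole mechanism: adding rel="nofollow" to an anchor tag is all a webmaster needs to do to tell bots not to pass ranking credit through that link.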
For this update, Google applied an indexing technique called latent semantic indexing (LSI). It became a foundation for today’s indexing methods by weighing the thematic relevance of pages rather than focusing on keyword density. Brandy also analyzed link neighborhoods and rated the significance of anchor text on each page. The rollout took several days, beginning on February 17, 2004, and ending on February 20.
Austin served to supplement the Florida update, tying up loose ends the latter wasn’t able to address. This patch cracked down on more spam techniques, such as abusing metadata or hiding text through a font size of 0 or white text on a white background.
This prominent update paved the way for the rebirth of SEO. With Florida, Google cracked down on web spam: black hat practices such as keyword stuffing and cloaking were rendered useless and became punishable by suspension from the search engine. Several sites lost rankings during this time, which infuriated many business owners.
Indexing is the practice of collecting, parsing, and storing data so that a search engine can quickly retrieve the documents relevant to a query. Google launched the supplemental index with the intention of indexing more documents without degrading performance.
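A classic data structure behind fast lookups of this kind is the inverted index, which maps each term to the documents that contain it. Here is a minimal Python sketch with made-up documents; it illustrates the general technique, not Google’s implementation:

```python
from collections import defaultdict

def build_inverted_index(docs):
    """Map each word to the set of document IDs containing it,
    so query-time lookups avoid scanning every document."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

# Hypothetical mini-corpus keyed by document ID
docs = {
    1: "google search algorithm update",
    2: "panda algorithm targets thin content",
    3: "search quality guidelines",
}
index = build_inverted_index(docs)
print(sorted(index["algorithm"]))  # [1, 2]
print(sorted(index["search"]))     # [1, 3]
```

Answering a query then becomes a matter of intersecting a few small sets instead of rereading every stored page, which is why engines can afford to keep both a main and a supplemental index.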
Google Dance finally reached its end with the Fritz update. Webmasters breathed a collective sigh of relief as they no longer experienced the anxiety-inducing effects of monthly algorithm adjustments. This is because the search engine has opted to update its search index daily instead of once a month. This significantly reduced the fluctuation in rankings as a result.
Esmeralda was more of an adjustment to Google’s search infrastructure than an addition to its algorithm. It signified the end of monthly updates for the search engine and marked a move toward greater continuity. This is when Google Everflux, the term describing how positions in SERPs continually shift in short bursts, was introduced.
Dominic changed the way Google measured backlinks, the links from other websites that mention and point to your pages. It launched just a month after Cassandra. Link building through link farms, sets of web pages created solely to link to a target page, became difficult.
In April 2003, Google rolled out the Cassandra update which was an effort to eliminate the practice of link spamming especially between co-owned websites. Hidden content was also targeted. Sites that manipulated bots through links and texts in their source codes that weren’t visible to users suffered massive ranking losses.
Boston was Google’s first official update. It got its name from the location of the “Search Engine Strategies” conference where it was announced. A significant consequence of this update was Google Dance, which pertains to the monthly update rolled out by the search engine that results in ranking shifts. Before Boston, there was a major reshuffling in September 2002 which may have foreshadowed the upcoming influx of changes that the search engine was planning.
The Google Toolbar is a free browser plug-in that enables users to search for keywords within a page even if the website doesn’t have its own search function. Its Find feature lets you locate search terms in a blog post easily by highlighting the words for you. When it rolled out, the toolbar was initially available only for Microsoft Internet Explorer.