Online Search Becomes More Semantic Thanks to Google Algorithm Updates

In Google's early days, its search platform was far less capable than it is today. Back then, search engines, Google included, were little more than keyword-matching and link-counting mechanisms. Ranking high in search results could be accomplished essentially through two tactics:

  • Stuffing your keyword phrase into your website or webpage as many times as possible.
  • Getting as many inbound links as you possibly could, irrespective of their quality.

Google Updates

Early SEOs who gamed the system, gaining high rankings while adding little value for actual searchers, found that the fun would not last for long. Every time Google found a weak spot in its ability to deliver high-quality, relevant search results, it made the fixes necessary to address it. The major changes to its search algorithm began with the "Boston" update in 2003, and from then on Google named its major algorithm updates and rolled them out regularly.

PageRank – The First and Best-Known Algorithm Used by Google

Unlike other search engines, Google's aim was to provide relevant, fast and simple search results to users. As a start, Google developed the PageRank algorithm, which is used to rank websites in its search engine results pages (SERPs). The name 'PageRank' comes from Larry Page, one of Google's co-founders. The algorithm is a way of determining the importance of webpages in search results. According to its founders, PageRank works by counting the number and quality of links to a webpage to produce a rough estimate of how important the website is.

Google Page Rank

The PageRank algorithm relies on the uniquely democratic nature of the internet by using its vast link structure as an indicator of the value of an individual webpage. In essence, Google interprets a link from page X to page Y as a vote, by page X, for page Y. But Google looks at considerably more than the sheer volume of votes, or links, a webpage receives. For example, Google also analyzes the webpage that casts the vote. Votes cast by webpages that are themselves "significant" weigh more heavily and help to make other webpages "significant". By using these and other parameters, Google forms its view of webpages' relative importance.
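To make the "links as votes" idea concrete, here is a minimal sketch of the classic PageRank iteration over a toy link graph. The graph, the page names and the damping factor of 0.85 (the value cited in the original PageRank paper) are illustrative assumptions, not Google's actual implementation.

```python
# Minimal PageRank sketch over a toy link graph (illustrative only).
links = {
    "X": ["Y"],          # page X "votes" for page Y
    "Y": ["X", "Z"],
    "Z": ["Y"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}        # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)          # a page splits its vote
            for target in outlinks:
                new_rank[target] += damping * share     # votes from important
        rank = new_rank                                 # pages weigh more
    return rank

print(pagerank(links))   # pages with more (and better) inbound links score higher
```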

Of course, important webpages mean nothing to you if they do not match your search query. So Google combines PageRank with complex text-matching strategies to find webpages that are both important and relevant to your search. The search engine goes far beyond the number of times a term appears on a page and examines all aspects of the webpage's content, including the content of pages linking to it, to determine whether it is a good match for your search query.

Importance of Google Algorithms in Online Search

According to estimates from the internet research firm Netcraft, there are more than one billion active websites on the web, and the task of sorting through all of these sites to find helpful, relevant information is enormous. That is why search engines use sets of mathematical instructions, better known as algorithms, that tell their servers how to complete assigned tasks. When you search on Google, its algorithm does the work for you: it seeks out webpages that contain the keywords you searched for, then assigns a rank to each page based on several factors, including how many times the keywords appear on the page.

A high ranking in Google's SERPs means that, at least in theory, the best links relating to your search query are the first ones Google lists. For webmasters, being listed prominently on Google can result in a big boost in site traffic and visibility. In 2007, the popular search engine overtook Microsoft as the most visited website on the web, and with that much traffic, a good spot on a Google search results page can mean a huge boost in the number of website visitors.

Google's keyword search functionality works in much the same way as Bing, Yahoo and other search engines. Search engines use automated programs called crawlers or spiders that move around the web from link to link, building up an index of pages and the keywords they contain. Google references this index when a user enters a search phrase and lists the webpages that contain the same keywords as the user's query. Google's web crawlers also have more advanced functions, such as being able to tell the difference between webpages with actual content and redirect pages, i.e. pages that exist only to send traffic to a different webpage.
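As a rough illustration of the crawl-and-index idea described above, the sketch below builds a tiny inverted index from a handful of already-fetched pages and looks up a query against it. The URLs and page contents are made up for the example; a real crawler would fetch pages over HTTP and discover new URLs by following their links.

```python
from collections import defaultdict

# Toy "crawled" pages (hypothetical URLs and text).
pages = {
    "http://example.com/a": "google algorithm updates improve search relevance",
    "http://example.com/b": "seo tips for better search rankings",
    "http://example.com/c": "this page only redirects traffic elsewhere",
}

# Build an inverted index: keyword -> set of URLs containing it.
index = defaultdict(set)
for url, text in pages.items():
    for word in text.lower().split():
        index[word].add(url)

def search(query):
    """Return URLs that contain every keyword of the query."""
    results = set(pages)
    for word in query.lower().split():
        results &= index.get(word, set())
    return results

print(search("search rankings"))   # -> {'http://example.com/b'}
```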

Google’s Efforts to Suppress Manipulative Schemes on its SERPs

After Google's founding in 1998, its search platform attracted a loyal following among the growing number of internet users, thanks in part to its simple and 'iconic' design. Off-page factors such as PageRank and hyperlink analysis were considered alongside on-page aspects such as headings, meta tags, keyword frequency, links and site structure, which enabled Google to avoid the kind of manipulation and illicit practices seen in search engines that considered only on-page factors for their rankings.

Although Google's PageRank algorithm was more difficult to manipulate, many webmasters had already developed link-building techniques and tools to influence the Inktomi search engine (later acquired by Yahoo), and these strategies proved effective in manipulating PageRank as well. Many websites focused on exchanging, purchasing and selling links, often on a massive scale. Some of these schemes, known as link farms, involved the creation of thousands of websites for the sole purpose of link spamming.

As internet search grew in popularity, search engines including Google incorporated a wide range of undisclosed factors into their rankings during the 2004-2007 period. The leading search engines, Google, Yahoo and Bing, do not reveal the algorithms they use to rank webpages. In 2005, Google began personalizing search results for each user, crafting results for logged-in users based on their history of previous searches.

Black-Hat SEO Techniques or Spamdexing

As Google began introducing PageRank and other algorithms to its search network, webmasters looking for 'shortcut' ways to earn better rankings turned to spamdexing, popularly known as black-hat SEO. These are illicit techniques used to deliberately manipulate search engine indexes. They involve a number of methods, such as repeating irrelevant phrases, to manipulate the prominence or relevance of indexed resources in a manner inconsistent with the purpose of the indexing system. Spamdexing can be considered a part of SEO, though there are many legitimate, 'white-hat' search engine optimization techniques that improve the quality and presentation of a website's content and serve content that is useful to intended users.

Google, Bing and Yahoo use a variety of algorithms to determine page relevancy, including checks for whether the search query appears in the URL or body text of a webpage. These search engines also check for instances of black-hat SEO and will remove pages that use such techniques from their indexes.

Search engine operators can also quickly block the results listings of entire websites that use black-hat SEO, perhaps alerted by user complaints of false matches. According to industry experts, spamdexing emerged in the mid-1990s, when search engines were far less sophisticated. These days, websites that use unethical methods to rank higher in SERPs are categorized as spammy; they face penalty actions from search engines and are stripped of their rankings in search results. Black-hat SEO is broadly classified into two main types, plus a few other forms:

  • Content Spam – For example, keyword stuffing, article spinning, hidden text, meta-tag stuffing, machine translation, etc.
  • Link Spam – Link farms, hidden links, spam blogs, link-building software, cookie stuffing, guest blog spam, page hijacking, etc.
  • Other forms of Spamdexing – URL redirection, mirror websites and cloaking

5 Major Google Algorithm Updates that Changed the Whole SEO Landscape

To set clear search standards for websites and, at the same time, battle black-hat SEO, search engines put forward webmaster guidelines. Following these guidelines helps Google and other search engines find, index and rank websites in their respective SERPs. These days, it is a must for a website to follow these guidelines in order to be prominently displayed in search engine results. Google's own Webmaster Guidelines are given under three headings: quality guidelines, technical guidelines, and design and content guidelines. Among these, the quality guidelines outline some of the illicit SEO techniques that may lead to a website being removed entirely from the Google index or otherwise impacted by an algorithmic or manual spam action.

From its first PageRank algorithm to the present day, Google has rolled out several major and minor algorithm updates to improve search relevancy, loading speed and user experience, and to filter spammy webpages out of its SERPs. Here we focus only on the five major Google algorithm updates that visibly changed the whole online search and SEO landscape.

Google Panda Update

To filter out websites with spammy or thin content, Google introduced the infamous Panda algorithm, with the first rollout in February 2011. According to the Google Webmaster Blog, the update aims to reduce the search visibility of "low-quality" or "thin" websites and return high-quality websites near the top of Google's search results. After the initial rollout, there was a surge in the rankings of social networking and news websites, and a drop in rankings for websites containing large amounts of ad-heavy content.

Google Panda Update

This change reportedly impacted the rankings of almost 12% of all search results. Soon after the Panda rollout, many websites, including Google's webmaster forum, became filled with complaints about scrapers and copyright infringers getting better rankings than websites with original content. At one point, the internet search giant openly asked for data points to help it detect scrapers better. After the original rollout in February 2011, there were several further updates, and the impact went worldwide in April 2011.

Google Panda can be considered a filter that prevents low-quality websites and webpages from ranking well in the SERPs. According to the Google Panda patent, filed in September 2012, the algorithm creates a ratio between a website's inbound links and its reference queries, i.e. search queries for the website's brand. That ratio is used to create a sitewide modification factor, which in turn is used to create a modification factor for a webpage based on a search query. If the webpage fails to satisfy a certain threshold, the modification factor is applied and the webpage ranks lower in the SERP.
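The following sketch is one illustrative reading of the ratio-and-threshold logic described above. The function names, the threshold value and the way the factor is applied are assumptions made for the example, not the formulas in the actual patent.

```python
# Illustrative reading of the Panda-style ratio and threshold (assumed numbers).

def sitewide_modification_factor(inbound_links, brand_queries):
    """Ratio of 'brand' searches to inbound links (assumed formula).

    A site with many inbound links but few people actually searching
    for its brand ends up with a low factor.
    """
    if inbound_links == 0:
        return 1.0
    return brand_queries / inbound_links

def adjusted_score(base_score, factor, threshold=0.5):
    # If the page's factor fails the (hypothetical) threshold, demote it.
    if factor < threshold:
        return base_score * factor
    return base_score

factor = sitewide_modification_factor(inbound_links=10_000, brand_queries=500)
print(adjusted_score(base_score=1.0, factor=factor))   # 0.05 -> ranks lower
```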

For the first two years, Panda updates were rolled out about once a month, but in March 2013 Google stated that future updates would be integrated into the core algorithm and would therefore be continuous and less noticeable. In September 2014, Google released Panda 4.1 as a "slow rollout" spread over the entire week.

Google Penguin Update

If the Google Panda algorithm is aimed at sites with spammy or low-quality content, Google's Penguin algorithm, first announced in April 2012, is aimed at websites with low-quality links. According to Google, the update aims to reduce the search visibility of websites that do not comply with the Google Webmaster Guidelines and use black-hat SEO tactics to artificially increase a webpage's ranking by manipulating the number of links pointing to it. Such link-building practices are commonly described as link schemes.

Google Penguin Update

After the original Penguin rollout, Google estimated that it affected about 3.1 percent of English search queries and about 3 percent of queries in other languages such as Arabic, German and Chinese. Google also said that the update affected a bigger percentage of queries in "highly spammed" languages.

In May 2012, Google rolled out Penguin 1.1, which, according to Matt Cutts, then head of Google's webspam team, was supposed to affect less than one-tenth of a percent of English search queries. The guiding principle of the update was to penalize sites using manipulative techniques to gain high rankings. Google specifically mentioned that doorway pages, which are built only to attract search traffic, violate the Google Webmaster Guidelines and would be severely affected by the algorithm updates.

Earlier, in January 2012, Google had rolled out the Top Heavy update, also known as the Page Layout Algorithm update, which targeted websites with too much ad content, and too little actual content, above the fold. In October 2012, the third Penguin refresh was released, affecting 0.3 percent of search queries, and in May 2013 Penguin 2.0 followed, affecting 2.3 percent of queries. In October 2014, Google released Penguin 3.0, which according to Google was an algorithm "refresh" with no new signals added.

Google Payday Loan Update

Google rolled out the first Payday Loan algorithm update in June 2013. It affected approximately 0.3 percent of US search queries and more than 4 percent of Turkish queries, where spammed search queries were found to be more prevalent. Google Payday Loan is actually a set of algorithm updates and data refreshes initiated to identify and penalize webpages and sites that use black-hat SEO. Google launched the Payday algorithm to filter out spammy or thin websites that use manipulative tactics to boost their rankings for heavily trafficked, heavily spammed search queries such as "Viagra", "casinos", "payday loans" and various pornographic terms.

After the original rollout in June 2013, the Payday Loan 2.0 update was released in May 2014, followed several weeks later by Payday Loan 3.0. The 2.0 update focused on targeting spammy sites, while the 3.0 update focused on spammy search queries. Although the Payday Loan algorithm shares some similarities with other updates such as Hummingbird, Penguin and Panda, each of these updates has its own improvement focus within the Google search engine.

Google Hummingbird Algorithm Update

Google rolled out the Hummingbird algorithm update in August 2013 and announced it around the company's 15th anniversary. It was the first major overhaul of Google's search algorithm since the 2010 "Caffeine" update, and even Caffeine was limited primarily to improving how information was indexed rather than how it was sorted.

Hummingbird Update

Unlike the Penguin and Panda updates, which impose penalty actions on spammy websites, the Hummingbird update focuses on conversational search, leveraging natural language processing, semantic search and more to improve the way search queries are interpreted. Rather than treating each word in the query individually, Hummingbird considers each word as well as how the words together make up the whole of the query; the entire sentence, conversation or meaning is taken into account rather than particular words. The goal is that webpages matching the meaning of a query do better, rather than pages matching just a few words. Hummingbird can be seen as an extension of the Google Knowledge Graph, and the update is aimed at making interaction more 'human'.
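To illustrate the difference between word-by-word matching and whole-query matching, here is a toy comparison: exact keyword overlap versus a crude "meaning" match using hand-made synonym groups. Real semantic search uses far richer signals (entities, context, the Knowledge Graph); the synonym table and scoring here are purely illustrative stand-ins.

```python
# Toy contrast between keyword matching and whole-query "meaning" matching.
SYNONYMS = {
    "buy": {"buy", "purchase"},
    "place": {"place", "store", "shop"},
    "near": {"near", "nearby", "close", "closest"},
}

def normalize(word):
    for canonical, group in SYNONYMS.items():
        if word in group:
            return canonical
    return word

def keyword_score(query, page_text):
    """Word-by-word matching: fraction of query words found verbatim."""
    page_words = set(page_text.lower().split())
    query_words = query.lower().split()
    return sum(w in page_words for w in query_words) / len(query_words)

def semantic_score(query, page_text):
    """Whole-query matching: compare normalized 'meanings', not raw words."""
    q = {normalize(w) for w in query.lower().split()}
    p = {normalize(w) for w in page_text.lower().split()}
    return len(q & p) / len(q)

query = "closest place to buy an iphone"
page = "purchase an iphone at a store near my home"

print(keyword_score(query, page))    # ~0.33 -- only 'an' and 'iphone' match exactly
print(semantic_score(query, page))   # ~0.83 -- the meanings largely overlap
```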

According to Google, Hummingbird places greater importance on webpage content, making search results more relevant and ensuring that users are delivered to the most appropriate page of a site rather than to a homepage or top-level page. SEO changed little with the addition of Hummingbird, though the top-ranking results are increasingly those that provide natural content that reads conversationally. While keywords within the search query continue to be significant, Hummingbird gives more weight to long-tail queries, effectively favoring the optimization of content rather than just keywords. Because Hummingbird also supports voice search, webmasters now have to tune for queries asked naturally, i.e. targeting key phrases that begin with "why", "who", "how" and "where", which will prove beneficial for SEO.

Google Pigeon Algorithm Update

To provide a better local search experience, Google rolled out the Pigeon update in July 2014, aiming to increase the ranking of local listings in local search results. According to many SEO experts, the changes from the rollout were clearly visible, especially in Google Maps. For now, the Pigeon update targets US English local search queries and is expected to expand shortly to other languages and locations. The update provides results based on the user's location and the listings available in local directories.

The main task of the Pigeon update is to give preference to local search results, which is useful both for users and for local businesses. On the day of release, there was a mixed response from the SEO industry: some complained that their rankings had dropped, while others reported improvements. As SEO experts understand it, the Pigeon algorithm treats location and distance as key aspects of its ranking strategy, and local directory listings are given preference in search results. According to Google, the update modifies the local listings shown in search results, with local directory websites receiving preference, all with the aim of improving the quality of local search.
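As a rough illustration of how location and distance can feed into local ranking, the sketch below sorts a handful of made-up business listings by a blend of distance from the searcher and listing quality. The haversine distance is standard geometry; the listings, weights and scoring formula are assumptions for the example, not Google's actual local ranking.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Made-up local listings: (name, latitude, longitude, rating out of 5).
listings = [
    ("Downtown Pizza", 40.7130, -74.0060, 4.5),
    ("Uptown Pizza",   40.7800, -73.9660, 4.8),
    ("Suburb Pizza",   40.6500, -73.9500, 3.9),
]

def local_score(user_lat, user_lon, listing, distance_weight=0.7):
    """Blend proximity and listing quality (weights are illustrative)."""
    name, lat, lon, rating = listing
    distance = haversine_km(user_lat, user_lon, lat, lon)
    proximity = 1.0 / (1.0 + distance)          # closer -> higher score
    quality = rating / 5.0
    return distance_weight * proximity + (1 - distance_weight) * quality

user = (40.7128, -74.0060)                       # searcher's location
ranked = sorted(listings, key=lambda l: local_score(*user, l), reverse=True)
for name, *_ in ranked:
    print(name)                                  # nearby, well-rated listings first
```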

How to Recover from Google Penalty Action

As mentioned earlier, algorithm updates, especially Penguin and Panda, filter out sites and pages that are not compliant with Google's Webmaster Guidelines, stripping them of their rankings or imposing penalty actions on them. If you suddenly see a traffic drop, you will have to find out what caused it. There are two main kinds of penalty: manual actions and algorithmic actions. Both adversely affect your search presence, and recovering from such a penalty takes time. You need to find the root cause of the issue and rectify the mistakes. Common reasons for a penalty include spammy content and bad-quality links pointing to your site. To recover, you need to get rid of the bad-quality links pointing to your site and/or replace spammy content with original, relevant content.
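As a toy illustration of the first step of such a link clean-up, the sketch below flags backlinks that come from domains on a hand-maintained blacklist so they can be reviewed and removed. The backlink list, the blacklist and the helper names are all hypothetical; in practice you would export your backlinks from a tool such as Google Search Console.

```python
from urllib.parse import urlparse

# Hypothetical backlink export (in practice, exported from a backlink tool).
backlinks = [
    "http://spammy-link-farm.example/page1",
    "http://respected-blog.example/review",
    "http://cheap-links.example/buy-links",
]

# Hand-maintained list of domains you consider low quality (assumed).
blacklist = {"spammy-link-farm.example", "cheap-links.example"}

def flag_bad_links(backlinks, blacklist):
    """Split backlinks into ones to keep and ones to try to get removed."""
    keep, remove = [], []
    for url in backlinks:
        domain = urlparse(url).netloc
        (remove if domain in blacklist else keep).append(url)
    return keep, remove

keep, remove = flag_bad_links(backlinks, blacklist)
print("Review and request removal of:", remove)
print("Looks fine:", keep)
```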

If you are not able to handle the recovery process yourself, consult experts in this field; they have the right tools and techniques to recover a site from a manual or algorithmic penalty. Keep in mind that a full recovery takes time and you cannot expect instant results, so the better approach is to avoid a penalty altogether by keeping your site search-friendly and fully compliant with the Google Webmaster Guidelines.