For over 20 years, Google has made constant changes to its algorithm. We want to give a detailed explanation of the updates that made Google the trillion-dollar company it is today.
Google's success comes from analysis, customer feedback, and updates that improve its technological infrastructure. At the same time, waves of spam, DMCA abuse, and other manipulation attempts have tried to degrade Google's search quality.
Still, Google remained strong and continued to iterate on its software. This allowed it to stay ahead of competitors and continue to resonate with its users. Are you ready to learn more? Then let's get started!
Google’s main purpose is to provide the best web searching experience to its users. Google accomplishes this by creating a fast, easy, and simple way to find the most relevant and important information available.
Google’s innovative technology usage has won them multiple awards:
PC Magazine: Technical Excellence Award
The Net: Best Search Engine
Webby & People's Voice Award: Best Technological Advancement
A growing number of companies, such as Salesforce, Cisco, Yahoo, and The Washington Post, use Google's search technology to power search on their own websites. As of 2022, Google is worth over 1 trillion dollars, all built around one search box.
Usually, a Google update takes about two weeks to finish rolling out. Historically, Google has released core updates every few months, which amounts to around 2-3 core updates a year. These constant updates show how focused Google has been on providing the best search experience possible.
The December 2019 core update was one of the largest updates Google has released. The September 2019 update had not resonated well with webmasters and SEO specialists, who said it had less impact than previous updates.
When Google makes an update, your site's SEO rankings can improve or suffer. Knowing when Google rolls out updates lets us adapt to its algorithm and stay aware of site changes as they happen.
Every time Google releases a new core update, you should be looking at the rankings and analytics of your page. This will help you see what changes occurred and create data-driven decisions to help your marketing, web, and analytics team.
For example, let's say that your website receives a lower number on its monthly rankings. You notice in Google Analytics that the number of your page views has gone down. This could mean that your website needs to have more engaging blog posts or videos to rank higher in Google's algorithm.
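A check like the one above is easy to automate. Here is a minimal sketch that compares average daily pageviews before and after a rollout date; the function name, the sample numbers, and the -10% alert threshold are our own illustrative choices, not anything Google prescribes.

```python
# Hypothetical sketch: compare average daily pageviews on either side of a
# core update rollout. The data below is made up for illustration.

def pageview_change(before, after):
    """Return the percentage change in average daily pageviews."""
    avg_before = sum(before) / len(before)
    avg_after = sum(after) / len(after)
    return (avg_after - avg_before) / avg_before * 100

# e.g. seven days of pageviews on each side of the rollout
before = [1200, 1150, 1300, 1250, 1180, 1220, 1270]
after = [980, 1010, 950, 990, 1000, 970, 1020]

change = pageview_change(before, after)
if change < -10:  # arbitrary alert threshold
    print(f"Pageviews dropped {abs(change):.1f}% -- review content and rankings")
```

In practice you would feed this real exports from your analytics tool instead of hard-coded lists.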
The May core update is live.
The algo weather tools showed some increase on this day, but no update was announced. Large changes were reported for many affiliate and E-Commerce sites. Google did not confirm this was related to previous product reviews updates, but it does seem that pages with product reviews were affected. This update was interesting enough for us to do a full writeup on: The May 16-18 unannounced Google update.
There was SERP turbulence on this day but no official announcement of an update.
SERP turbulence. It is worth noting that this weekend was a Bank holiday in the UK and other parts of Europe, which can cause changes as well.
Google's knowledge graph appears to have updated on this day. Many people reported losing their knowledge panel following this update.
The product reviews update that started on March 23 finished rolling out on this day.
Google released their third product reviews update. Here is their announcement. Marie discusses this update in episode 229 of the SNYCU newsletter. This update finished rolling out on April 11, 2022.
Significant shifts seen in the search results. Some wondered if these changes could be connected to the end of the rollout of the Page Experience update (Feb 22-Mar 3). We do not think this is the case. See the SER article on this possible update.
Significant shifts seen in the search results. Many site owners suspected losses seen were due to the rollout of the Page Experience update (Feb 22-Mar 3). However, we could not find a connection here. It is worth noting that February 24 was the day on which Russia invaded Ukraine. Many SERPs changed as a result of the war. Also, many people changed their search patterns which could affect your traffic. SER article on this possible update.
Google announced that the page experience update began rolling out on desktop. The page experience algorithm encompasses a number of explicit ranking factors including interstitials, use of https, mobile friendliness, and most importantly – Core Web Vitals. As of March 3rd, the update has finished rolling out.
Not an officially announced update, but we saw several sites with increases or decreases on this date. Barry Schwartz wrote that while the Semrush sensor was high, there was not as much chatter in the search community as expected.
Google has flagged another anomaly on its performance report graph. Starting February 1st, performance metrics suffered a logging issue for Search, Discover, and Google News. Google noted that the error was confined to reporting and did not affect search rankings.
Google is reporting a logging issue for "Search data for images" in GSC from January 24-27. Search Engine Land shares that you might be seeing a drop in clicks and impressions in your performance report but Google says this does not reflect an actual traffic drop and has no impact on actual rankings.
On December 17th, Google’s team verified to many surprised and confused webmasters that there was a product review update that had been completed.
Alan Kent shared the news in a simple tweet, approximately 20 days after it was first mentioned as a sweeping update.
The Google Product Reviews update was more extensive than almost any other update targeted at product review content in over 8 months. For affiliate marketers, product review site owners, and others, it led to massive ranking changes on some of their highest-traffic pages and top keywords, essentially a great rearrangement of the SERPs based on new criteria for judging the quality of a product review's content.
Google's constant push to purge low-quality content and reward high-quality content has had a massive impact on product review site owners. Google is encouraging site owners to break the mold of the standard product review and put more effort into creating the best individual reviews, rather than rewarding sheer volume of reviews.
Differentiation of your product reviews from the competition will be one of the most important factors in ensuring Google’s updated algorithm ranks your content appropriately based on its relevance.
This can include things such as but not limited to:
- Including unique photos taken directly of the product
- Audio, video, and other evidence of the product being used in action or to help assess the quality and features of products being reviewed
- Greater length and more in-depth info as well as comparisons to other products
Google opted to release a new update meant to revitalize the design of how top stories are presented to its users to make it easier for them to find relevant stories related to their searches more quickly.
The alteration matches an update that was initiated on mobile search versions. Its primary focus is to make it easier for high-trust sources to have their stories featured at the top of the SERPs.
Google’s algorithm is getting another massive update. It’s a follow up to April 2021’s update as part of an initiative to increase quality of product reviews to be more useful for users based on their feedback. Sites that rely heavily on succinct buying guides to rank for specific terms or products are likely to see a negative impact on their rankings because of the update.
Google’s team has revealed that it is placing greater importance on deeper, more detailed research when estimating the value of product review content, so that it can improve the user experience when people search for products they’re interested in.
Higher priority is placed on product reviews created by hobbyists and subject-matter specialists, to differentiate them from generalists chasing quick, low-hanging-fruit terms to siphon traffic from more “legitimate” sources.
Their guidance has changed to reflect more accurately the opinions of what users deem as more valuable during their search process for assessing and comparing options for products that solve a problem within their niche.
Sites that already take steps to provide more detail, in the form of video, audio, and other examples of hands-on product knowledge, help both the algorithm and users better assess their overall quality in comparison to competitors.
This also includes providing multiple choices for how to purchase it instead of relying on one source for affiliation such as Amazon to purchase.
If possible, include a variety of media on each of your blog posts and other places where you host your content. The written content that your brand or site releases should add as much detail as possible with multiple options for purchasing from different vendors.
The biggest targets for this release were spam sites and other sites that rely on short pieces of content to rank without fully informing the user of everything they need to make a wise purchase. The more complete your content, whether video or text, the better off your rankings will be.
Google’s newest core update is one of its most complex. While every new update sends SEO specialists, site owners, and digital marketers rushing to panic, this broad update makes it especially hard to pin down what Google is working on improving.
Like most core updates, SERP positions can rise or fall, and the timing had many e-commerce and other site operators in an uproar: it rolled out just before and during one of the biggest sales-driving holiday seasons, in one of the hardest years on record.
Without much clarity on what exactly changed, Google remains consistent that as long as content is optimized and high quality, rankings will stabilize in the long term, even if there are some short-term changes.
Google released a new spam update that led to ranking changes for some sites. Unlike other updates, there was no specification of what exactly was being targeted, whether links or content. The most up-to-date guidance was that site operators should focus on following Google's best-practice recommendations for optimizing for search.
This means, diversified, high quality, well researched content that informs users. It also means high quality links, proper interlinking of pages, site speed optimization, and ensuring the user experience is as good as possible.
A variety of ranking tools reported massive changes in Google's results despite no announced update, which makes this one primarily speculative. While SERP changes did occur, nothing was announced from Google's side, so the exact cause is hard to ascertain.
Beginning in mid-August, many search engine specialists became aware of a consistent issue with Google mass-changing titles for pages as part of a “quality improvement” system that had some less-than-stellar results.
Title names for web pages were completely rewritten or heavily altered based on what someone was searching for, as opposed to staying static as intended. Google said the point was to make it easier for people to find what they were looking for, even if the title wasn't an exact match.
Google analyzed pages and made alterations because so many pages either over-optimized their title tags or under-used them, relying on generic word choices to describe what was on the page. HTML and tagging play a large role in ensuring the sites you operate are properly optimized and recognized by Google’s systems.
This update caused several sites' ranking positions to change depending on their link-sourcing practices and backlink profiles. It primarily targeted spammy links across multiple languages, so that sites buying irrelevant or bulk links from poor sources would no longer benefit from them.
If your site experienced ranking changes, it’s best to audit the links going to and from your site and disavow any low-quality links that could be tanking your ranking. A clean backlink profile is an essential component of Google’s search algorithm, and as content and link specifications change in the coming years, organization and constant management are key.
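Disavowing is done by uploading a plain-text file to Google's disavow links tool, with one URL or `domain:` entry per line and `#` comment lines. Here is a small sketch that assembles such a file from an audit; the helper name and example domains are made up.

```python
# Sketch: build a disavow file in the plain-text format Google's disavow
# links tool accepts (one URL or "domain:" entry per line, "#" comments).
# The domains below are illustrative placeholders.

def build_disavow_file(spam_domains, spam_urls):
    lines = ["# Disavow file generated after link audit"]
    lines += [f"domain:{d}" for d in sorted(spam_domains)]
    lines += sorted(spam_urls)
    return "\n".join(lines)

content = build_disavow_file(
    {"spammy-links.example", "pbn-network.example"},
    ["http://bad-directory.example/page1"],
)
print(content)
```

Only disavow links you are confident are harmful; the tool tells Google to ignore them when assessing your site.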
Google rolled out the Product Reviews update as a response to the spammy product reviews on online marketplaces. The Product Reviews Update rewarded sites that had organic reviews. Organic reviews are made from users who tried the product first hand and have a review to back up their experience.
There are times when your e-commerce site has a change in review quality. You have to sift through your customer product reviews and eliminate the ones that are spam, fake, etc.
Ask yourself the following questions. Do your reviews:
Explain what makes a product stand out from the competition?
Discuss the benefits and disadvantages of a product based on prior research?
Identify important decision-making factors and how your product is performing in that area?
Explain how the product physically looks or works and provides unique insights that the manufacturer does not provide?
Cover why the product will be useful in certain situations and use cases?
Describe why your product has improved from previous versions?
Describe how the product was designed and how it benefits its users?
Remember, quality > quantity when it comes to reviews. Google will rank your business higher when it sees genuine, organic reviews. Survey your previous customers and ask them for an honest review of your product. That way, you receive better feedback on your business while increasing your visibility in Google's rankings.
Google announced its Spam updates from June 23rd to June 28th.
Google's post is ambiguous about exactly what changed in these updates. Still, we can see that many of Google's anti-spam measures have been strengthened, and Google directed readers to Google Search Console.
Google Search Console is a tool that helps webmasters gauge the quality and performance of their websites. Unlike Google Analytics, GSC focuses only on traffic that comes from organic search, not from other sources such as ads or site referrals.
While you can’t use Google Search Console to make direct changes to your site, you can use the tool to send your pages to Google’s index.
For SEO professionals, Google Search Console helps with adjusting your web strategy. Its data can help your business find new opportunities, learn how traffic is reaching your website, and boost web performance.
Learning the quality of web traffic is an important analytical skill. Google Search Console made it easier for websites to make changes in their site behavior.
This improved Google’s algorithm by creating areas where page activity can be tracked and measured. It forced webmasters to improve their website to become higher on Google’s rankings.
Google announced its Experience Update, which was targeted towards the user browsing experience. Sites that had a better UX/UI design would be rewarded with higher search results ranking from Google.
To give more accurate insights, Google unveiled the Page Experience report. It evaluates a website's page experience based on the following criteria.
Your website needs to be mobile-friendly. This ensures the site is easy to view on iPhone, Android, and other mobile devices, and the website should show no mobile usability errors in Google Search Console.
If you see a red sign on your chart, changes to a page on your site are causing the error. You have to press the "Validate Fix" button to ensure that the page is working again.
Your pages need HTTPS enabled to receive a "Good page experience" status from Google. If your site has a high ratio of non-HTTPS URLs, it will receive a "Failing" mark. To fix this, reduce the number of plain-HTTP URLs by migrating them to HTTPS.
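A first pass at spotting this problem can be done from a URL list (say, your sitemap). Here is a minimal sketch, with made-up URLs, that computes what fraction of pages are served over HTTPS:

```python
# Sketch: estimate what share of a site's URLs are served over HTTPS.
# The URL list would normally come from a sitemap or crawl; this one is
# made up for illustration.
from urllib.parse import urlparse

def https_ratio(urls):
    if not urls:
        return 0.0
    secure = sum(1 for u in urls if urlparse(u).scheme == "https")
    return secure / len(urls)

urls = [
    "https://example.com/",
    "https://example.com/blog",
    "http://example.com/legacy-page",   # still plain HTTP -- migrate this
]
print(f"{https_ratio(urls):.0%} of URLs use HTTPS")
```

Any URL that shows up with an `http` scheme is a migration candidate.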
The Core Web Vitals measure the loading speed, interactivity, and visual stability of a page for users. The report provides the following ratings:
Each page will receive a rating based on how well it performs for users. Note: some of the data can take a day to appear in the Core Web Vitals report. To receive a good rating, ensure your pages load quickly, respond promptly to input, and avoid layout shifts.
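The rating logic itself is simple to express in code. This sketch uses Google's published Good / Needs Improvement / Poor thresholds for the three Core Web Vitals; the function and table names are ours.

```python
# Sketch: classify Core Web Vitals values using Google's published
# thresholds (LCP in seconds, FID in milliseconds, CLS unitless).

THRESHOLDS = {
    # metric: (good_max, needs_improvement_max)
    "LCP": (2.5, 4.0),    # Largest Contentful Paint, seconds
    "FID": (100, 300),    # First Input Delay, milliseconds
    "CLS": (0.1, 0.25),   # Cumulative Layout Shift, score
}

def rate(metric, value):
    good_max, ni_max = THRESHOLDS[metric]
    if value <= good_max:
        return "Good"
    if value <= ni_max:
        return "Needs Improvement"
    return "Poor"

print(rate("LCP", 2.1))   # Good
print(rate("FID", 180))   # Needs Improvement
print(rate("CLS", 0.4))   # Poor
```

In the real report, the rating comes from field data (CrUX) at the 75th percentile of page loads, not a single measurement.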
A URL needs data in the Core Web Vitals report to be evaluated; URLs without Core Web Vitals data cannot receive a page experience assessment.
Missing data usually means the property is new in Search Console or the CrUX dataset doesn't have useful information for it. Here are some reasons why you might be missing data.
Not Enough Data: You don't have a sufficient amount of Core Web Vitals data for the website URL.
Not Enough Recent Data: There isn't enough recent CrUX data to compute Core Web Vitals information.
HTTPS Not Enough Data: The Search Console cannot find information on the HTTPS URLs for the website.
An HTTPS page is important for ranking under Google's latest updates. To verify yours in Search Console:
Make a Domain Property: Your website will need a verified domain property.
URL-Prefix Property: You can then create the URL-prefix property.
Google has offered advice for sites negatively impacted by a previous core update. There are no specific actions you need to take to recover; instead, Google provides a list of questions to ask yourself if your site was affected by a major core update.
After the updates, Google said there would be some recovery. But the largest changes you’ll see on your rankings will happen on the next core update.
In 2021, Google released two core updates. Google said that the update would take at least 1-2 weeks to complete. In reality, the June update was made in 10 days. On average, Google core updates take 14 - 30 days for a successful rollout.
This was a general update, but it did result in fluctuations in search rankings. If you notice this happening to your page, wait a few days for the rollout to settle before reacting.
Whenever Google rolls out an update, you should check your analytics system. The January Core update was Google’s general update for its US and global search engines. Since it was a sweeping update to search results, the update did not provide anything specific to help webmasters improve their page.
Google’s Core updates always show volatility on the web page’s analytics afterward. This is Google’s way of rebalancing its search engine and determining where your page ranks against the competition.
It's important to pay attention to how your page ranks within the following weeks of a Google update.
Don’t panic. Most algorithm updates cause a change in the SERP rankings throughout all organizations.
When analyzing your website for 2020, ask the following questions:
Does the website have well-targeted keywords?
Does the site have worthy information and backlinks to its sources?
Is the website content relevant and informational to the user?
Google recommends that websites follow E-A-T (expertise, authoritativeness, trustworthiness). Basically, creating quality content that solves your customers' problems is a great way to increase your site's Google search rankings.
Google Glitch was an indexing issue that occurred in Google’s search algorithm.
What Can We Learn From Google Glitch?
Webmasters need to pay attention to their web analytics. It will tell you when your traffic has increased or is less than it was the previous week. Check your analytics to see if there is any drop-off in clicks and conversions. Doing so will ensure that you have the most detailed consumer information when they visit your site.
Add an annotation to your analytics noting that organic search results during the glitch should be discounted. Fortunately, Google fixed the glitch within 48 hours of its appearance. The lesson: be prepared to protect your website when search engine errors occur.
Previously, Google Analytics was called Universal Analytics. While Universal analytics helped SEO professionals and analysts know what’s occurring on their site, it was still limited. You could only use Google Universal Analytics to track website activity.
In 2020, Google understood that most of its users have mobile phones. That’s when Google released Google Analytics 4.
Smart Insights: GA4 allowed for machine learning reports to be made out of user data. It spots important trends for your data and you can create custom insights based on your business needs. For instance, you can configure GA4’s insights settings to alert you when there is a sudden drop in web traffic.
Google Ads works with GA4 to provide better Ads for your target audience. This allows you to refine your ads campaign and target your ads to reach more conversions.
Google Analytics 4 tracks data from multiple identity sources. This allows you to get a clear idea of how your customers are interacting with your page and what sources brought them to your site.
The contribution analysis feature scans data for anomalies and looks for the user segments that contribute to them. For instance, if you notice an increase in conversions, the automated feature detects the anomaly and surfaces the responsible user segments.
GA4 can create predictions with the help of machine learning. This allows you to automatically predict possibilities like user churn probability to better prioritize audiences and identify issues.
Previously, Universal Analytics used Goals to track user activity. This was replaced by Custom Events in GA4. Custom events can be created to track user behaviors, such as link clicks and shopping-cart purchases.
This is a powerful feature because it's highly customizable, and you can track what leads to a conversion. For Google Ads campaigns, you'll need custom events to ensure that their behavior can lead to more conversions and revenue.
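To give a feel for what a custom event looks like, here is a sketch that builds the JSON body GA4's Measurement Protocol expects for a server-side event. We only construct the payload; actually sending it would be an HTTP POST to the `/mp/collect` endpoint with your `measurement_id` and `api_secret`, and the client ID and parameter values below are placeholders.

```python
# Sketch: build a GA4 Measurement Protocol payload for a custom event.
# We only construct the JSON body here, not send it; values are illustrative.
import json

def build_event_payload(client_id, event_name, params):
    payload = {
        "client_id": client_id,          # anonymous device/browser id
        "events": [{"name": event_name, "params": params}],
    }
    return json.dumps(payload)

body = build_event_payload(
    client_id="555.1234567890",          # placeholder value
    event_name="add_to_cart",
    params={"currency": "USD", "value": 29.99, "item_id": "SKU_123"},
)
print(body)
```

On a website you would more commonly fire the same event from the page with the gtag.js snippet; the payload shape above is the server-side equivalent.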
GA4 allows users to make life cycle reports on their consumer activity. This allows them to see their entire journey through their web page. This is more detailed than Acquisitions as it allows you to see how your customers are interacting click-by-click.
With these updates, getting analytic reports has been easier than ever. You can create funnel reports, cohort analysis, and create custom dashboards based on your information. Through the help of Google Analytics 4, marketing teams have the information needed to aid their users.
Google Analytics 4 brought machine learning analytics to Google’s algorithm. It allowed for analysis teams to create a property and receive detailed information on their audience.
The difference between Google Analytics 4 and Universal Analytics lies in data reporting. Google Analytics 4 can receive data from both websites and mobile applications, an advantage over Universal Analytics, which only lets you track data from a website.
Universal Analytics has a decent interface, but it feels slower than GA4. Google Analytics 4 has an interface that's easier to read and lets you create your own analytic insights. Interestingly enough, Google allows UA properties to be migrated to GA4. Thanks to this, webmasters get better information on their visitors.
Google created a site diversity update to address situations where one website occupies more than two organic listings. Basically, the update aims to diversify search results.
Searchers and SEOs complained that Google showed too many listings from the same site: a single query could return around 4-5 results from the same domain.
Google responded to these complaints by building a diversity change into its search engine results, allowing a wider range of pages to appear.
What about sub-domains? For this update, sub-domains are generally treated as part of the main domain, though Google may treat them separately when relevant. Keep track of both to understand how Google ranks them.
During this time, Google was seeking ways to innovate natural language processing.
BERT is a language model specialized for NLP tasks. It can handle many tasks with a single model, which helps Google process results faster. Adding BERT to Google's algorithm was one of Google's biggest steps toward machine-learning-driven search; BERT achieved state-of-the-art results on 11 common NLP tasks with added speed and efficiency.
BERT is a bidirectional system, meaning it looks at the words both before and after a given word. It was pre-trained on Wikipedia text to learn the context behind words, giving the system a better understanding of language.
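To see why bidirectionality matters, here is a conceptual sketch (not BERT itself): a left-to-right model can only condition on the words before a target, while a bidirectional model conditions on both sides.

```python
# Conceptual illustration (not BERT itself): a left-to-right model only sees
# tokens before the target word, while a bidirectional model like BERT
# conditions on tokens from both sides when interpreting it.

def left_context(tokens, i):
    return tokens[:i]

def bidirectional_context(tokens, i):
    return tokens[:i] + tokens[i + 1:]

tokens = "the bank raised interest rates".split()
i = tokens.index("bank")

print(left_context(tokens, i))           # ['the'] -- little to go on
print(bidirectional_context(tokens, i))  # the right side disambiguates "bank"
```

With only the left context, "bank" could mean a riverbank or a financial institution; the words after it settle the question, which is exactly the signal a bidirectional model can use.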
BERT is not good at understanding what things are not. In scenarios where BERT hasn’t seen negation context or examples, the system will have a hard time understanding that.
Unlike previous updates, webmasters cannot optimize their sites for BERT. To improve your website after this update, you have to focus on the quality of your web pages.
The Brackets update was a general update to Google's algorithm. Google regularly releases updates like this to improve its search results. This one prioritized the quality of websites' rich snippets.
There is a quality threshold for your website’s rich snippets. This means that low-quality snippets will result in lower page rankings. For instance, this update affected affiliate websites as some of them did not have the content or internal backing needed to suit Google’s algorithm.
Previously, Google made an update that increased the meta description characters to 300 characters. However, they found that change was causing a reduction in the quality of the descriptions.
Writing web descriptions is important for SEO professionals, as it helps your pages become more visible to users. Longer meta descriptions were often duplicated or auto-generated from existing page text, and Google won't display these snippets well unless you edit the descriptions and remove the duplicate information.
Unless something changes, you can still experiment with longer meta description tags, but roughly 155-160 characters is the safe bet for a meta description. Some snippets will have to be trimmed, but the benefit of a well-written snippet far outweighs the cost.
That’s not to say you should pad your meta descriptions to reach the character limit. Make your snippets useful and engaging so your site can receive new user clicks.
If you’re limiting your meta descriptions, make sure that they’re relevant to your blog post topic. That way, Google can index the description and add slight increases to your page ranking.
Don't panic! To fix this, simply rewrite the longer descriptions on important pages, especially where the cut-offs lead to bad results.
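Finding the over-long descriptions is the tedious part, and it is easy to script. This sketch flags descriptions likely to be truncated; the ~155-character cutoff is a guideline reflecting Google's rollback of the 300-character experiment, not a hard API constant, and the URLs are placeholders.

```python
# Sketch: flag meta descriptions likely to be truncated in the SERPs.
# MAX_LEN is a guideline, not an official limit.

MAX_LEN = 155

def audit_descriptions(pages):
    """Return the URLs whose meta description exceeds MAX_LEN characters."""
    return [url for url, desc in pages.items() if len(desc) > MAX_LEN]

pages = {
    "/pricing": "Simple, transparent pricing for teams of any size.",
    "/blog/guide": "x" * 240,  # stand-in for an over-long description
}
print(audit_descriptions(pages))  # ['/blog/guide']
```

Run this over a crawl export and rewrite whatever it flags, starting with your highest-traffic pages.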
After this update, webmasters have taken further steps to ensure their site's meta description would favor them in Google's ranking.
On June 14, 2018, Google announced a video carousel for desktop SERPs. Before this update, they were only for mobile SERPs; this update was made to ensure that Google's desktop version was as compliant as its mobile version.
It didn’t take long to realize that Google increased the number of video carousels in their search engine.
When you search for a topic in Google's search engine, do you see a Rolodex of YouTube videos? That's the video carousel update at work. Google made this update to ensure that video content could rank higher on their algorithm.
For video content producers, the update was a chance to grow in Google’s rankings. Make content that’s original, engaging, and appealing to your audience. That way Google can reward you by placing your content on the top of their search engine results.
On June 20, 2017, Google announced its new job-matching platform, Google Jobs. The platform was made to help employers and job seekers by matching candidates with openings in the relevant industry.
Google Jobs has provided multiple professional opportunities for people in tech and non-tech-related fields.
First, it gives job postings prominent placement within search results. For instance, your postings can appear in Google's job search feature along with your ratings, reviews, logo, and job details.
Second, Google Jobs' search engine surfaces more motivated applicants. Job seekers can filter openings by criteria such as job title or location, so you're more likely to get applicants who are interested in your company.
Third, Google Jobs has increased the chances of conversions and discovery. Conversions are important for any business. It tells you if your users are actively engaging with your business. And with discovery, you can see who is viewing your company.
Google Jobs is an update needed for the professional industry. It has helped with giving new professionals the chance to find employment while giving companies the ability to get qualified employees for their job positions.
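To be eligible for the job search feature, a posting needs schema.org JobPosting structured data on the page. Here is a hedged sketch of such an object built in Python; all the values are placeholders, and Google's structured-data documentation lists the full set of required and recommended properties.

```python
# Sketch: a schema.org JobPosting structured-data object, the JSON-LD
# markup Google's job search feature reads. All values are placeholders.
import json

posting = {
    "@context": "https://schema.org/",
    "@type": "JobPosting",
    "title": "Backend Engineer",
    "datePosted": "2017-06-20",
    "description": "Build and scale our API services.",
    "hiringOrganization": {
        "@type": "Organization",
        "name": "Example Corp",
    },
    "jobLocation": {
        "@type": "Place",
        "address": {"@type": "PostalAddress", "addressLocality": "Austin"},
    },
}

# This JSON goes inside a <script type="application/ld+json"> tag on the page.
print(json.dumps(posting, indent=2))
```

Once the markup validates, the posting becomes eligible for the enhanced job listing treatment described above.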
Google released this update to help with protecting its users. Google Chrome would show security warnings in two situations: When users enter their data on an HTTP page or when an HTTP page is in Incognito mode.
Credit cards and passwords are not the only forms of sensitive data. Any data entered on a non-secure page can trigger a "Not secure" warning in the browser's address bar.
Through this security warning update, Google started to make its way towards the HTTPS level of security. This ensured that the user's data was protected, and they could utilize Google's search engine without losing valuable information.
After testing longer snippets for over 2 years, Google finally released a new update that increased the text length of a snippet. Snippets are the meta-description-based summaries that show a brief overview of your product or blog.
Google created a new Meta Description limit. Previously, the meta description was maxed at 155 characters. Now you can create a meta description of up to 300 characters.
Google announced its second version of the “mobile-friendly” update. The goal of this update was to give Google a boost in its mobile algorithm. At the time, users were browsing Google through their phones, so Google needed to update its algorithm to suit their needs.
Google's mobile algorithm works on a page-by-page signal. This means every page on a website is evaluated for mobile-friendliness individually, and it takes time for Google to look at each page and place it within its rankings.
If you have a mobile website, optimize it by having a good UX/UI design, and keep your pages simple. Mobile web pages are smaller than desktop pages, so you’ll have to place your content in a manner that reaches your target mobile audience.
You might notice a few changes in your website’s traffic and ranking. Google is using this update to see if your webpage can reach your niche audience.
If your website is not mobile-friendly, or you want to check, use Google's mobile-friendly tool. This will help you determine if your site can be used on mobile phones and indicate what changes need to be made.
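One of the basics the mobile-friendly test checks is the responsive viewport meta tag. Here is a quick stdlib-only sketch for detecting it in a page's HTML; the class name and sample markup are our own.

```python
# Sketch: check an HTML page for the responsive viewport meta tag, one of
# the basics of mobile-friendliness. Uses only the stdlib HTML parser.
from html.parser import HTMLParser

class ViewportChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.has_viewport = False

    def handle_starttag(self, tag, attrs):
        # Looking for <meta name="viewport" ...>
        if tag == "meta" and dict(attrs).get("name") == "viewport":
            self.has_viewport = True

html = ('<html><head><meta name="viewport" '
        'content="width=device-width, initial-scale=1"></head></html>')
checker = ViewportChecker()
checker.feed(html)
print("mobile viewport tag present:", checker.has_viewport)
```

A missing viewport tag is only one signal; tap-target size, font size, and content width also factor into mobile usability.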
During this update, Google announced that Penguin would remain a part of their algorithm. Penguin is a search filter that captures websites that are spamming but are undetected by Google's main systems. Penguin would find spammy websites and penalize them by reducing their page authority. Even if sites fixed their page, the deranking still occurred.
In this update, Penguin became more page-specific. In previous updates, Penguin 3.0 applied to the entire website, which could result in lower rankings that took years to fix.
Historically, Penguin's data for all affected websites refreshed at the same time. Once a webmaster fixed their website, it could take a long time for Google's algorithm to account for the change.
To combat spam, Penguin 4.0 devalued spam by adjusting the ranking based on the spam signals. This helps a website reduce spam while not having its entire site penalized.
Penguin’s updates are real-time: After this update, Penguin’s data was updated faster.
After this update, Google stated that it would not announce any more Penguin updates. That makes sense, since Penguin now runs continuously as part of Google's core algorithm.
During this update, Google stated that it wanted to give more search results to its mobile users. As more people started to use mobile phones, Google needed to adapt its search technology to keep its users. Here are the two ways the "Mobilegeddon" update helped users find new mobile content:
Previously, mobile users would often see a desktop-formatted page in Google's search results. This made the site hard to read, driving users away. With this update, Google began favoring pages that fit the resolution of the user's device.
Google created this update to help promote app content. Apps could be indexed in Google’s rankings, giving them more visibility.
This update brought Google's search algorithm into the mobile app space. Webmasters could now create web pages targeted at their mobile users and, by doing so, see an increase in page activity and rankings.
Rank Brain is Google's machine learning intelligence system. It works to help Google make better search predictions and process results faster. While it is not a new ranking system, it is a part of Google's search algorithm.
Rank Brain is a new addition to the Google Hummingbird update. Hummingbird is Google's entire search algorithm, while Rank Brain is the algorithm used to create better search predictions.
Signals are certain aspects Google uses to determine page authority. For example, it will read text on a webpage; that's a signal.
If there is a video on the webpage, Google will count that as another signal. Each time a signal is triggered, it can contribute to the page's ranking. Having a mobile-friendly website acts as yet another signal.
The answer varies. Google has stated that it tracks "hundreds" of signals, and there may be over 10,000 sub-signals. That gives your site plenty of opportunities to rank if it follows Google's guidelines.
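As a rough sketch of the idea (the signal names and weights below are invented for illustration and are not Google's actual signals), a page's score can be modeled as a weighted sum of whichever signals fire:

```python
# Toy model of additive ranking signals.
# Signal names and weights are hypothetical, for illustration only.
SIGNAL_WEIGHTS = {
    "has_text": 1.0,         # page contains readable text
    "has_video": 0.5,        # page embeds a video
    "mobile_friendly": 0.8,  # page renders well on mobile
}

def page_score(triggered_signals):
    """Sum the weights of every signal the page triggers."""
    return sum(SIGNAL_WEIGHTS.get(s, 0.0) for s in triggered_signals)

print(page_score(["has_text", "mobile_friendly"]))  # 1.8
```

A real system would be far more complex, but the principle is the same: each signal a page triggers nudges its ranking.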
Google’s Pigeon update is a new algorithm designed to create more accurate local search results that are tied to traditional SEO ranking signals. These changes appear in Google Web search and Google Maps search results.
Local search activity increased after the Pigeon launch. The update had a few glitches: spammy listings with exact-match keywords tricked the algorithm into ranking higher in Google's search results. By August 1, 2014, Google had fixed the issues that occurred during the Pigeon launch.
The Pigeon update created improved local ranking system integration. In addition, Google updated how it treated its consumer-based directories such as Home Advisor, Yelp, and more.
Before Pigeon was launched, Yelp claimed Google was outranking them in reviews. After the Pigeon update, Yelp and other directories would see better customer reviews being ranked higher in the search results.
The update decreased Google’s local pack number. Previously, the local pack would include 7-10 businesses in the local search. This would give businesses the ability to rank up and reach Page 1 of the SERP.
One year after Pigeon’s launch, the local pack was reduced to 3. Google wanted Pigeon to focus on having the local search resemble traditional search methods.
Google changed the way businesses ranked and appeared on their search engine. Pigeon was made to provide useful and relevant results for users in a local area by favoring areas that were within their proximity.
Basically, the Pigeon update gave users local search results in fewer clicks. But this came at the expense of website traffic. This update makes it easier for customers to view your brand or business when they are nearby.
Security is a top priority for Google. Beyond their own technology, Google wants to help make internet searching safer. That’s when they unveiled the HTTPS update.
During this update, Google made the HTTPS security protocol a ranking system. That meant that websites with HTTPS enabled would receive a small boost in rankings.
Here are some ways you can prepare your website for this update:
Define what type of web certificate you need: single-domain, multi-domain, or wildcard.
Don’t use robots.txt to block the HTTPS website from crawling.
Use certificates with 2048-bit keys.
Use relative URLs for resources on the same domain.
Use protocol-relative URLs for resources on other domains.
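The last two tips can be sketched as follows (example.com and cdn.example.net are placeholder domains):

```html
<!-- Relative URL for a resource on the same domain -->
<img src="/images/logo.png" alt="Company logo">

<!-- Protocol-relative URL for a resource on a different domain:
     the browser matches http or https to the current page -->
<script src="//cdn.example.net/library.js"></script>
```

Writing URLs this way means nothing breaks when your pages move from http:// to https://.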
During this update, popular sites like "The Pirate Bay" dropped in Google's search rankings. Google created this update to reduce traffic to these sites and, in turn, the amount of pirated software being downloaded.
These downloads hurt the music and film industries, since most of the downloaded material was movies and music that those industries owned. Both industries aggressively pushed Google to make this update.
While Google has been ranking sites for DMCA compliance since 2012, they announced more demotion measures on October 21, 2014.
Torrent sites were affected by this update; as the news site TorrentFreak reported, sites of that nature lost visibility and traffic. The downranking affects sites with a high number of DMCA takedown requests.
When Google users searched for words like "movie," "music," and "download," these websites appeared lower in Google's rankings.
For example, top torrent sites vanished from searches for the movie "The Social Network." Websites like IMDb appeared in the top results instead, as they have more authority and fewer DMCA takedown requests.
Search data and traffic rankings show that Google's downranking changes severely impact piracy sites. In the short term these changes shift traffic toward smaller unauthorized websites, but in the long term they reduce torrent downloads.
During this time, Pigeon had only been released in the US. As Google's user base continued to grow, the update needed to be reflected internationally. On December 22, 2014, Pigeon expanded to Canada, Australia, and the UK.
This update was great for Google on a global scale. It allowed countries outside the US to rank organically, so more web pages could rank in Google's algorithm, and it opened the door for international business and advertising campaigns.
Google released the Hummingbird update on August 20, 2013, aiming for speed and precision. The update was an entire revamp of Google’s algorithm, not a simple tweak.
The Hummingbird update specializes in the Semantic Search. Semantic Search focuses on improving search results through analyzing the user’s intent and a subject’s contextual reference. Essentially, it focuses on what the user means and delivers relevant search results based on that information.
For example, if you search for the term "weather," Google will predict that you're searching for the weather in your location. While Google might not know exactly what you're looking for, it provides a wide range of results to help you find what you need.
The semantic web is an internet where machines perform the heavy lifting in creating search results. It gathers web information based on the user’s query and makes predictive search results based on their information.
Optimizing your website for the Hummingbird update is simple. Here’s how to get started:
Diversify the Content Length: With websites, you need a mixture of short and long blog posts. This lets your reader engage with your site without becoming overwhelmed or bored. There is no perfect post length, just enough to give your audience relevant information.
Create Visual Content: Visual content is more likely to bring traffic to your site. You can add videos, infographics, and elements like graphs and charts that will add spice to your web content.
Topic Appropriate Language: Google will reduce your site rankings if your blog posts are irrelevant to your target niche. Creating content with the correct terminology tells Google that your site has value and authority.
Schema Microdata: While Google stated that implementing schema microdata wouldn't increase rankings, it does have an effect on the indexing process long term. It can be difficult at first, but doing so will make your site visible to engine crawlers and secure SERPs.
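A minimal structured-data snippet using schema.org vocabulary might look like this (the headline, name, and date are placeholders; JSON-LD is one of the formats Google accepts alongside microdata):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "A Beginner's Guide to Ladybugs",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2018-01-15"
}
</script>
```

Markup like this tells crawlers explicitly what kind of content the page holds, which supports richer search listings over time.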
With the Semantic search results, Google has started to improve its search predictions. As more data was gathered from the search results, Google started to use Artificial Intelligence to give users live search insights faster.
Google would also pursue an innovative idea called "Google Glass," a set of glasses powered by the user's voice. While some critics disliked the appearance of the glasses, there is a chance they will be accepted by the masses once the price point drops.
After the Hummingbird update, Google started to make use of the "Internet of Things." The Semantic Web helps users find their search queries with minimal effort. The Internet of Things was then created to aid the technology's reaction towards those results. Thus, giving the user complete control over their searching and technological experience with Google.
Google announced this update to improve their search engine’s ability to track relationships and people. They started this transformation with Social Search and added three new features as a result.
Personal Results: This feature enables you to find information about yourself, such as Google+ posts and photos. Only you are allowed access to your personal results, which is a great update for users who value privacy.
Search Profiles: This shows you which people you might want to follow based on shared interests.
People and Pages: Helps users find people with similar interests and allows them to follow those people with a few clicks.
With these results, Google made its algorithm more social media-friendly. Thus, giving influencers and content creators the ability to rank well on their algorithm.
Google released Venice on February 27, 2012. The update improved local results and rankings for local search queries.
Before the Google Venice update, you had to track your website’s listings via Google Places. While this was good for getting local search results, there would be errors where the search results would add listings that were not specified by the search terms.
This issue was fixed in the Venice update, which allowed your search results to include local listings based on your IP address or physical location.
This update helped users find local businesses within their proximity. This is a useful feature when you're planning for an event or traveling. This helped SEO because it gave local businesses more traffic. That way, your local business can have more visibility and customers after the Venice update.
Both updates had a significant impact on how businesses were ranked by Google’s algorithm. They gave smaller websites more coverage in Google’s search rankings and encouraged higher-quality websites.
As a result, SEO for businesses became more competitive as now local businesses could rank highly on Google’s algorithm.
Google Penguin was released to reduce spam on their algorithm. The Google Penguin update penalizes pages that use too many advertisements.
If there are duplicate pages or pieces of content, your website will rank lower in Google's algorithm. However, if you've noticed an increase in traffic, that is due to Google Penguin adding more search visibility to your website.
After the Google Penguin update, give your site a quick audit. Look at your pages and see if there is any duplicate content (i.e., similar blog pages).
Webspam occurs via keyword stuffing, which can lower your site’s ranking in Google’s algorithm.
Check both inbound and outbound links coming from your website. Google will penalize your page if you don’t have accurate links, so don’t overcrowd your page with unnecessary links.
Throughout the Penguin update, Google drew a line between two kinds of SEO tactics. White Hat pages employ good optimization: organic blog posts, quality content, and faster-running websites.
White Hat SEO employs techniques on how to improve the creative aspects of the website. The result of a good White Hat SEO campaign is increased page awareness and better user retention.
Black Hat techniques are considered spammy and only increase web rankings in the short term. These techniques include link schemes and keyword stuffing to inflate rank.
The goal of the Google Penguin update is to reward pages that offer a better user experience. Google wants the organic and "good guys" to rank higher in their engine.
Google’s Penguin update is live for every language in its algorithm. For comparison, the Panda update changed approximately 12% of search queries, while Penguin affects about 3.1% of English search queries. Thus, even if your website targets audiences outside the U.S., making minor adjustments to your pages will improve your rankings in Google’s algorithm.
On May 16, 2012, Google announced that their knowledge graph project was now available in international languages (French, Japanese, Russian, Italian). Before this update, the Knowledge Graph was only available for English users.
The Google Knowledge Graph is Google’s way of connecting people, places, and facts. This creates better-targeted search results, making it useful for those wanting more accuracy in Google Search.
For example, go to Google Search and type “famous athletes” into the search box. You’ll notice a picture carousel that shows “Professional Athletes” ranging from soccer, football, and basketball to even golf!
You can take advantage of the Knowledge Graph feature by getting more search traffic to your website. Start by having an idea of how you can engage with your target audience through the Knowledge Graph panel.
The Knowledge Graph is a set of Google's data. It shows information based on a variety of data sources to give the most accurate search results. Knowledge Graph helps you answer questions such as "How tall is the Empire State Building?"
Google receives the data from public sources and licensed information such as stock prices, weather changes, and sports scores. In addition, Google receives its information from content owners, including those who request changes to their knowledge panels.
If you run a search on Google, you'll notice that it bolds the query keywords that appear in a result's meta description. Google does this because it measures the relevance of each page to the user's search query.
Knowledge Graph Optimization is a practice where webmasters help Google determine what their page is about. Let's say you're writing a blog post about "Bugs." Your success depends on how well you help the search spider interpret that word correctly.
Were you referring to "insects," "bacterial or viral infections that cause illness," or a "glitch in a computer system"? Through Knowledge Graph Optimization, Google can determine what you mean based on your content, keywords, meta description, and title.
Based on HubSpot’s report, over 76% of your marketing effort should go toward helping users find what they want quickly. This includes buttons, layouts, calls-to-action, content substance, and KGO (Knowledge Graph Optimization).
The Appearance (10%): Appearance takes into account how a website is designed. Cluttered and slow interfaces rank lower on Google's page, so take the time to simplify your layout.
Website Experience (9%): 9% of the users stay on a website due to its UX/UI experience.
Easy to Find Important Components (76%): Make sure your website is simple for your users to get their needs met.
For instance, let’s say your blog is about ladybugs. That’s when an LSI (Latent Semantic Indexing) tool comes in: it surfaces related terms, such as the insect's classification and life cycle, so your topic isn't confused with bed bugs or disease-causing bugs.
LSI maps common words to their synonyms, giving Google an accurate representation of what you mean. Google knows that "purchase" and "buy" are interchangeable; however, phrases like "buy the idea" and "buy a product" are not. Through this, you can tell Google exactly which word you want to rank for and rule out any synonyms you're not referring to.
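As a simple sketch of how synonym folding might work (this toy mapping is invented for illustration and is nothing like Google's actual LSI system):

```python
# Toy synonym normalization: fold interchangeable words into one canonical term.
# The mapping below is hypothetical, for illustration only.
SYNONYMS = {"purchase": "buy", "acquire": "buy"}

def normalize(words):
    """Replace each word with its canonical form, if one exists."""
    return [SYNONYMS.get(w.lower(), w.lower()) for w in words]

print(normalize(["Purchase", "the", "product"]))  # ['buy', 'the', 'product']
```

Folding synonyms this way lets a search engine match a query for "buy" against a page that only says "purchase."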
In 2013, Google's Knowledge Graph started to include statistical graphs on the search page results. Basically, the system works by predicting the next question. It will add related statistics to your graphs. This contributes to artificial intelligence because it gives you high-quality insights from the presented statistical data.
For instance, if you want to know more about customers living in the Philippines, Google might show you stats for China and India.
Amit Singhal (Google's Search VP) stated that this update was just the beginning. Google continued to invest its resources in making its search engine smarter. While a multitude of problems still needed fixing, Google was folding these improvements into a new kind of search experience.
Remember, as technological systems evolve, psychosocial studies show that human behavior doesn’t change at the macro level. Humans’ basic needs stay the same, regardless of what new technology is created tomorrow.
This algorithmic update wants bloggers and content writers to focus on the audience instead of the system.
The Pirate Update is Google’s response to websites suspected of copyright infringement. Users would create copyright removal notices on sites that were posting unoriginal content. Sites that have a high number of copyright removal notices would appear lower on Google’s search engine result.
While the update was aimed at copyright prevention, Google did not remove the site unless they had a copyright removal notice from the owner. Thus, Google used this update to protect the quality of their search engine and remove copyright-based content from their algorithm.
On June 28, 2011, Google announced the release of Google+.
Google+ is Google's social media platform focused on sharing. Google+ gives people Circles and the ability to express themselves within those private groups. Basically, Google+ lets you share your art and ideas only with the circles you choose.
Sparks: Sparks is a feature that allows users to share information about topics with their group. Think of it as a way to "Spark" a conversation. With this feature, Google's search engine will allow the user to add their interests in a conversation.
Hangouts: Google Hangouts is a group feature where Google+ users could hold group video or text conversations. It was a predecessor of the communication apps that use group video chat to strike up conversations with others.
+Messenger: Coordinating with your family and friends can be difficult. With Google+, the messenger feature connects you with them within seconds. It opens up an instant messenger group, allowing you to speak to your Circles in a group conversation format.
+Circles: Think of the circles as a way to speak with groups. Google Circles allowed users to communicate the same way you'd see in social media platforms like Facebook Instant Messenger. Google+ provided more privacy; Facebook provided ease of access.
Google+ was still a great idea as a platform, but it failed because it did not address what customers actually needed.
There were already social media apps like Facebook and Twitter. Users felt that Google+ was redundant because they already had their social media needs met by other platforms. There was no interest behind it, making Google+ obsolete over time.
Google+’s Circle system was not user-friendly; the platform’s UI was too complex for users who just wanted to start a conversation. Facebook’s group messaging was far more intuitive at the time. Since Google+ had no features that distinguished it from Facebook, it could not keep up with its competition.
In fact, a data leak exposed over 52.5 million Google+ users’ data. A bug that had existed for roughly three years could access private information (age, email address, occupation).
Even in the aftermath of the Google+ era, Google had created frameworks that many modern communication apps (Telegram, Discord, Facebook Messenger) would build on. Google's search capability was infused into these platforms' patterns, making Google+ a grandfather of communication apps.
This was a time in Google’s algorithmic history where they took a risk with social media. While Google discontinued Google+ on April 2, 2019, it still had an impact on the future of social media algorithms.
Search queries were in need of more privacy during this time. Users felt that their data could be compromised by hackers and phishing networks. On October 18, 2011, Google announced Query Encryption as a solution to this problem.
Google has worked hard to increase search security through SSL (Secure Sockets Layer). Google adopted SSL for its search traffic and encouraged the tech industry to raise its web security standards.
Basically, it encrypts your search query so that only you know what you’re searching for. Simply, Google’s query encryption improves your personalized search results because they are only viewed by you. This also protects your searches from spam marketers and hacks, resulting in less spam email and phishing.
Websites that receive clicks from search results will know that you came from Google, but they won't receive information about your individual search query.
You can receive a list of the 1,000 top search queries that sent traffic to your website via Google Webmaster Tools, which gives you detailed statistics on your user traffic. If you click an ad in a campaign, the browser sends the relevant query to the ad network so advertisers can measure the effectiveness of their campaigns and improve the ads they send you.
SSL Encryption has been Google’s way of keeping our search data safe. Thus, it is why Google continued to use this update as a framework for future net security innovations.
Google created the Freshness update to add more recent search engine results. As the company continues to grow, they need to tune their algorithm to ensure the best user experience. With the freshness update, over 35% of the search queries were impacted as a result.
Frequent Updates: The Freshness update improved results for queries whose answers change often but aren't tied to a recurring event.
Recent Events: For recent events, Google's algorithm helps you find information easily. When you search for the latest events, such as "The Grammy Awards," Google shows them in your results, giving you more recent search results faster.
Regularly Recurring Events: Events such as “Sports Scores” occur multiple times throughout the year. Google used this update to show the results faster.
While the update was helpful, a 35% change did not equal a 35% improvement; there is no straightforward way to measure search engine quality numerically. If anything, the update gave a boost to Google’s existing search quality.
Google had lost access to Twitter data at the time and was unable to index Twitter's fast-paced content. The Freshness update was partly a ranking change to compensate, since Google could no longer track that social media activity directly.
Overall, the Freshness update added more relevancy to Google's algorithm. It divided search results based on their time-sensitivity, leading to a "fresher" look for Google's results.
During this update, Google unveiled 10 new changes to their algorithm. Each of the updates was based on Snippets & Page titles, Autocomplete & Translation changes, and ranking changes. Here are some of the updates that occurred:
In this update, Google became able to pick up snippet text from the main content of a web page. This is more accurate than previous versions, where snippet text could be pulled from menus and headers.
This update improved the accuracy of Google's auto-complete suggestions. In this update, Google fixed the autocomplete suggestions for Russian search queries. In addition, it had cross-language information retrieval in languages such as Serbian, Hindi, Albanian, etc. Thus, making the update better suited to complete the auto requests from a global perspective.
Four of Google’s algorithm changes were related to ranking. They produced better rankings for original websites. Google also retired an image-search ranking signal, since a single image can be referenced by multiple documents on the web.
Google changed how they handle search results when users enter a query within a specific date range. This ensures that users get the top search results on what they specify.
As a result, these algorithm changes helped Google refine its search engine. As its user base increases, Google made these updates to help high-quality pages receive better search results.
The Panda/Farmer Update is Google’s response to poor or thin content. In 2011, poor content would rank highly, and Google stopped this to preserve their quality and reward organic content creation.
Google Panda works by penalizing the following:
Pages with general information
Poor content-to-ad ratio
Content that has limited to no information
That’s why your site’s content needs to be optimized and original. Google Panda has penalized older websites, so it’s best to adapt to its ever-changing algorithm.
Google Panda’s Penalty will lead to a reduction in website traffic. Here are some ways to fix it:
Update or eliminate pages so they give value to the reader
Check your site for duplicate pages
Make sure your site doesn’t show excessive ads
When Google refreshes the algorithm, you’ll notice an improvement on your site.
As you continue to build a website that suits Google’s algorithm, create organic content so that Panda will not penalize your site later. Here’s how you can create Panda-friendly web content:
Google Webmaster Tools: This tool helps with measuring your site’s user experience and ensures that it’s working properly.
Look at your ad ratio and make sure users aren't constantly bothered by multiple ads; keep ads sparse relative to your content.
Check your site for duplicate content.
Each page on the site needs to fulfill a user's purpose (i.e., a blog post should help users further understand a product).
After 2011, Google Panda was used to place high-quality content higher in the search rankings. Always monitor your user traffic to see for any potential drops. When trying to grow your website or brand, focus on the user experience and quality.
Refined Page Detection: Google changed how it rewarded good content in its algorithm. The change affected 35% of searches and assessed the freshness of a website's content.
Rich Snippets: Google announced this feature which allowed users to see details in their software applications (user reviews, cost) in the search results. This extends the rich snippet’s application, so they will be used more often.
Page Titles Ranking: Google looks at multiple signals when evaluating a page title, such as the anchor text of links pointing to the page.
Language Translation: For users browsing in other languages (Spanish, French, Albanian, Norwegian, Hindi, etc.), Google translates the page title into the user's language and displays the translation above the original text.
If you're a webmaster, don't go wild thinking about reaching your Iceland audience and tuning the anchor text. Remember that this update is a small change Google makes on its algorithm annually.
In 2010, Google announced the Caffeine Search Engine Update. Caffeine gives 50% faster results for web searches than their previous installments. Whether it's a blog post or forum post, you'll find the links to the relevant content faster.
To explain, when you send a Google search, you’re not searching through the live version of the web page. Instead, you’re searching through Google’s index. This helps you pinpoint the data and information you need for accurate website content.
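To illustrate the difference between searching live pages and searching an index, here is a toy inverted index (a highly simplified sketch; the pages and URLs are made up, and Google's real index is vastly more sophisticated):

```python
# Toy inverted index: maps each word to the set of pages containing it,
# so queries scan a small lookup table instead of every page's full text.
from collections import defaultdict

pages = {
    "example.com/coffee-guide": "coffee brewing guide for beginners",
    "example.com/fresh-beans": "fresh coffee beans roasted daily",
}

index = defaultdict(set)
for url, text in pages.items():
    for word in text.split():
        index[word].add(url)

def search(query):
    """Return pages containing every word in the query."""
    words = query.lower().split()
    if not words:
        return []
    results = set(index[words[0]])
    for w in words[1:]:
        results &= index[w]
    return sorted(results)

print(search("coffee"))        # both pages match
print(search("fresh coffee"))  # only the beans page matches
```

Because lookups hit the precomputed index rather than the pages themselves, results come back quickly even as the collection grows, which is exactly the property Caffeine was built to preserve at web scale.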
Why did they create a new indexing system? It was a response to the evolution of web content. Web content is growing in size and numbers, but with new content mediums such as videos, real-time updates, news, and images, webpages are becoming more complex.
Also, the user has higher search expectations. Searchers want up-to-date and relevant content, while publishers expect to have their page found as soon as they hit "Publish." Overall, Caffeine was designed to meet increasing user expectations while still creating an accurate search indexing process.
Before Caffeine, Google had a multi-layered indexing system, but the main layer could take weeks to update. Refreshing a layer meant re-analyzing entire web pages, which created a delay between what was on a webpage and what was available to the user. Thus, slower search results.
The Caffeine update helps with indexing web pages on a larger scale. In fact, Caffeine takes up nearly 100 million gigabytes of storage in one database. When Google finds new pages, it automatically adds them to the Google Search Index.
With the Caffeine update, we can see how Google has improved its algorithm. Not only is it a more robust foundation that helps Google create better search information, but it is made to scale with the increasing growth of information. Caffeine's system ensures that you get the relevant search results you need when typing in Google's search bar.
The Google Vince Update allowed for large brands to rank higher in Google’s search results. While this might have been seen as a minor update, it affected the entire SEO industry.
The update improved Google's search quality and explained why branding is a huge component of its search algorithm. The update benefitted large companies and government sites because they have higher quality information and more page authority than other websites.
Despite the dominance of large brands at the time, the Vince update pushed smaller brands to create more engaging content.
Google always values originality in its search rankings. If you’re planning on making your business rank, you have to treat it like a brand. Give important information on your business and create content that helps you stand apart.
For larger brands, the update was a huge plus. It rewarded large brands for their originality by placing them on the top of Google’s search rankings.
Through this update, the SEO industry has learned that branding is the way to ensure their websites survive in the long term.
During this time, Google's search results were slowing down, due to the growing number of users on its search engine and the strain on its databases.
The Real-Time Search Results update allows users to receive live news reports within the results. For users, this update delivers on-time news, helping them navigate sites more freely and with more accurate information.
To use it, go to the “Latest” results in the search options to find a list of tweets, news, and blogs as they appear on Google’s search engine.
After the launch, Google created "hot topics" for Google Trends. This feature showed the most common topics users were searching for. Together with the interface changes, it made real-time content easier to discover.
Google unveiled Universal Search on May 1, 2007. The update allowed Google to search across images, videos, news, and books. Overall, it led to more diversity in Google's search engine, giving it more media to index and rank.
Vertical search is when your search query is narrowed down to one topic area. You can limit results to newer sites or to sites with relevant information. This search method allows Google to narrow down your results and give you better recommendations.
Before vertical search, horizontal search was used to look through a wide range of material: the entire spectrum of topics is searched alongside your query.
Use horizontal search when you're browsing for general recipe recommendations. But if you're looking for a specific recipe with particular ingredients, go for a vertical search.
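The distinction can be sketched with a toy index in Python. The documents, categories, and function names below are hypothetical illustrations, not Google's actual implementation:

```python
# Toy index: each document is tagged with one topic area (its vertical).
DOCS = [
    {"title": "Easy pasta recipe", "category": "recipes"},
    {"title": "Pasta history in Italy", "category": "history"},
    {"title": "Chicken curry recipe", "category": "recipes"},
]

def horizontal_search(query):
    """Search across the entire spectrum of topics."""
    return [d for d in DOCS if query.lower() in d["title"].lower()]

def vertical_search(query, category):
    """Search only within a single topic area."""
    return [d for d in DOCS
            if d["category"] == category and query.lower() in d["title"].lower()]

print(len(horizontal_search("pasta")))            # 2: recipe and history pages
print(len(vertical_search("pasta", "recipes")))   # 1: only the recipe page
```

A horizontal query for "pasta" returns every matching page; restricting the same query to the "recipes" vertical narrows it to a single result.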
Google Book Search is a great example of Google's vertical search algorithm. It allows the user to see the matching pages of previously scanned books. It allows users to go to the direct page number of a book and find important information within its chapters.
Google Book Search helps you find content in the books, but only if you know how to use that search engine fully.
Universal Search aims to fix this fragmentation through blending: it pulls results from a range of vertical search engines into a single results page.
The Universal Search update also changed how video content was indexed. Instead of treating the content as part of the page index, Google would run the query against Google Video.
In some searches, image matches will appear. The images do not take the place of the ten organic listings; instead, websites with high relevance are still shown in the top page results. Make sure you use high-quality images to help your page appear in Google's search results.
On June 1, 2007, Google rolled out their Google Suggest Update. When typing on Google.com or Google’s Search Box, Google Suggest will send you recommendations based on the subject.
Suppose you type "microphone" into Google's search engine. Google will offer a list of microphone refinements based on "gaming" or "voice conference." Even if you type only a portion of a word, like "mic," Google will suggest refinements like "professional microphones." These refinements not only help you get your search results faster but also help you find the results you're looking for.
Specifically, Google needs to determine what you've typed, so partial search queries are sent to its algorithm. For about 98% of search requests, Google does not track the data and simply returns the suggestions. For the other 2%, randomly selected, Google logs data such as the IP address. This data collection supports Google's monitoring and service improvement.
All of this data retention is a balance between trust and usefulness, and innovation and security both play a role in keeping users on the service. Overall, Google Suggest was used to improve the speed and accuracy of Google's search.
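The core idea, matching popular queries against what the user has typed so far, can be sketched in a few lines of Python. The query list below is a hypothetical stand-in for Google's real query logs, and this simple prefix match does not capture Suggest's smarter refinements (such as expanding "mic" to "professional microphones"):

```python
# Hypothetical list of popular queries, ordered by popularity.
POPULAR_QUERIES = [
    "microphone for gaming",
    "microphone for voice conference",
    "professional microphones",
    "microwave recipes",
]

def suggest(partial, limit=3):
    """Return up to `limit` popular queries starting with the typed prefix."""
    prefix = partial.lower().strip()
    return [q for q in POPULAR_QUERIES if q.startswith(prefix)][:limit]

print(suggest("microphone"))  # both "microphone for ..." queries
```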
Between May and June 2005, Google released the Bourbon Update.
The update improved Google's SERP spam filtering and updated how PageRank and backlinks were evaluated, making it easier for original websites to rank within Google's algorithm.
Inferior Links: Links gained through link exchanges or link buying are ranked lower in the search engine. Backlinks that lead to spam will dramatically decrease your page ranking after this update.
Links Without Relevance: When creating your website, if it links to another page without any sign of relevance, Google will reduce your site ranking.
Duplicate Content: If the content is similar on multiple pages, Google would decrease the page rank.
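The duplicate-content check above can be illustrated by fingerprinting normalized page text and comparing digests. This is a toy sketch of the general technique, not Bourbon's actual filter:

```python
import hashlib

def content_fingerprint(text):
    """Normalize case and whitespace, then hash the page text."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

page_a = "Welcome to our   store!"
page_b = "welcome to our store!"
page_c = "Totally different content."

# Pages a and b are near-identical, so they share a fingerprint;
# a duplicate-content filter could demote one of them.
print(content_fingerprint(page_a) == content_fingerprint(page_b))  # True
print(content_fingerprint(page_a) == content_fingerprint(page_c))  # False
```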
SERP fluctuations occurred as Google rolled Bourbon out across all of its servers. Webmasters reported a few issues as well: as changes appeared in Google AdSense, speculation about affected websites increased.
On June 1, 2005, Google released a Sitemaps program that lets webmasters tell Google which pages they want in its index. Google created the XML Sitemaps program in hopes that it would gather pages better than previous crawling methods.
So how does this work?
Webmasters create XML files that contain the URLs they want crawled, host the sitemap on their own server, and point Google to it.
Google offers a Sitemap Generator tool to assist you in the process. Google created this update to index publicly available information for better search results. Users benefit from a fresher index and increased coverage.
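A minimal sitemap file of the kind the program consumes can be generated with Python's standard library. The URLs here are placeholders:

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap listing the URLs to be crawled."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap(["https://example.com/", "https://example.com/about"])
print(sitemap)
```

The webmaster would save this output as sitemap.xml at the site root and submit its location to Google.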
In October 2005, Google announced the creation of Google Local.
Google Local gives users relevant information through detailed driving directions and local search results. This makes it easier for users to get better results when looking up a "restaurant" in their city.
With relevant local information and mapping data, Google Local can index your website. It also has features such as keyboard shortcuts, draggable maps, and more.
Google Local combined a mapping product with a comprehensive local search. As the product evolved, Google continued to add mapping functionality and local search capabilities.
At the time of the Jagger update, links varied widely in quality: some sites that ranked highly had low-quality inbound and outbound links. Google released the Jagger update to eliminate manipulative link building.
Content: The Jagger update penalized websites that duplicated content across multiple domains. During the third installment, webmasters noticed issues with how canonicals were managed: websites that mixed index.html URLs with their root URLs started to show duplication issues.
Backlink Spam: Google took a new approach to how it understood the links pointing to a website. Jagger negatively affected websites that paid for sitewide links or obtained links from link farms.
Hidden Text, Cloaking, Redirects: In 2005, cloaking was a method where spammers served one piece of content to search engines for ranking purposes while showing a different page to users. Google punishes web pages with negative rankings if they use CSS to hide text.
Jagger’s update would pave the way for future SEO best practices and algorithm updates. In many ways, the update placed an end to paid links and unnatural link building. Plus, Jagger’s update was the precursor for the Penguin algorithm update.
Before Google Analytics, Google acquired the web statistics program Urchin. By November 2005, it had unveiled Google Analytics:
Google Analytics is a data-collection and analysis software that helps sales and marketing teams on their campaigns. At the time, Google wanted to create free software for data analysis. With Google Analytics, users could simply set up their website as property and receive insights on user interaction.
Google Analytics has helped with the proper tracking and reporting on data. It allows users to create dashboards and reports of user information, giving them the ability to make data-driven decisions.
For example, let's say Google Analytics alerts you to a spike in pageview traffic. That tells the webmaster that the site is performing well.
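A pageview spike alert like the one just described can be approximated by comparing the latest count against a recent average. This is a hypothetical sketch, not how Google Analytics actually computes its alerts:

```python
def is_pageview_spike(history, latest, threshold=2.0):
    """Flag a spike when the latest count exceeds `threshold` times the recent average."""
    average = sum(history) / len(history)
    return latest > threshold * average

daily_pageviews = [100, 120, 95, 110]   # hypothetical recent daily traffic
print(is_pageview_spike(daily_pageviews, 400))  # True: well above 2x the average
print(is_pageview_spike(daily_pageviews, 150))  # False: within the normal range
```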
After the launch of Google Analytics, demand for web analytics grew. It evolved from a niche requirement into an important part of online business. As its popularity grew, Google improved the technology with Universal Analytics (November 2012) and Google Analytics 4 (November 2020).
Overall, Google Analytics gave Google more data on web activity and user behavior, which informed further algorithm changes and helped users improve the content of their websites.
The Big Daddy Update allowed for better and more accurate search results: you'll find what you're looking for on the first page of the results. While searches feel quick today, this wasn't always the case.
If the algorithm is working properly, the SERPs will show the most relevant websites for any given query.
For people running multiple websites, this was great news. Even if it meant adjustments, most website owners and SEO professionals favored an algorithm that preferred legitimate, high-quality websites.
Once the Big Daddy Update was in effect, Google announced the supplemental index. If your pages ended up in the supplemental index, they were far less likely to appear in search results, so they received fewer views than you would want.
Since the Big Daddy update development, there have been multiple updates to improve Google's algorithm. The later algorithms took Big Daddy's concept of building a user-friendly and concise website structure.
Big Daddy was one of the earliest updates in Google's algorithm. Since then, lots of changes have been made. SEO professionals learned a valuable lesson through this update.
First, you have to organize your website in a clear and concise manner. This keeps your users engaged and makes it easier for Google to place your site in its search index.
Next, website managers have to control the quality of content that's published on a webpage. Generate content that's suited for your target user, and Google will increase your page ranking as a result.
This update, known as Brandy, is the successor to the Google "Austin" update. It introduced five changes to Google's algorithm. With the Florida-era data centers offline, some believed that Google had preserved its older SERPs over the previous months. Google did create these data centers after Yahoo! dropped Google's search results.
Here are the 5 Updates:
Index Size Increased: Google has a web crawler named Googlebot! In this update, the crawler's index size increased, allowing pages to be tracked faster and more accurately.
Latent Semantic Indexing: At the time, this technology was new to Google. LSI is used to match your page's relevant keywords: LSI algorithms search for links and words that are relevant to your topic.
Anchor Texts and Links: Links were always an essential component of Google Search rankings, but now this update rewarded users who had high-quality inbound and outbound links.
Neighborhoods: This feature would show links to nearby neighborhoods, making it a good choice for businesses wanting to advertise to their local audiences.
Downgrade on Tag Operation: After the Brandy update, there was a change in SEO tag techniques. Google lowered the importance of heading tags (h1, h2) in favor of the LSI system and link building, which are more difficult for the masses to manipulate.
For webmasters, this update meant Google could detect a site's authority from its outbound links. When creating outbound links on your website, make sure they are relevant to your niche.
Esmeralda was the last of the Google Dance updates; it was replaced by Fritz the following month.
Google Fritz marked an important change in Google's indexing system. Instead of the huge monthly index update (the Google Dance), Fritz made incremental, daily updates: a percentage of the index was refreshed every night. These daily updates, known as "everflux," led to more stable search results and faster indexing.
With the end of Google's monthly indexes, Fritz also ended the alphabetically ordered naming convention that had begun with the Boston algorithm update a few months earlier.
The Google Florida algorithm update was created to address long-standing SEO concerns.
If you've noticed your page dropping out of Google's top rankings over the past 2-3 months, it means your page is not working with Google's new algorithm. Running a filter test can help you in the short term.
While many pages dropped in rank after the Google Florida update, some pages consequently increased. So continue to make original content on your site to be rewarded via Google’s search algorithm.
During this update, Google ran two algorithms at once: the old and the new algorithm both handled user search queries, with the newer algorithm requiring more processing power.
Google stated that AdSense doesn't have an impact on site rankings. However, paid listings go through the Google Sponsored Listings Program, a paid service whose index ensures those websites are indexed in Google's algorithm.
This update created the split between paid Google ads and free organic listings. Remember, free listings stay the same: Google's search algorithm indexes the web pages and ranks the best ones according to its criteria (i.e., page quality, user engagement, proper SEO tags).
If you're a website owner who notices drops in your reach, note that around this update Google's share of total searches fell from roughly three-quarters back to its usual 50%. Google's control over the internet's search traffic fluctuates over time, so your website is bound to see changes as well.
Here are some tips to help you optimize your Webpage:
Reduce Spam: Your website should be clear and to the point, so the users who interact with your page aren't bothered by constant emails, ads, etc.
Link Building: You can look for links from websites that have similar content to yours.
HTML Tagging: Create a descriptive HTML tag. Make sure that it reflects the key search phrases you want your website to be found on.
This update in Google’s algorithm started the foundations of link building. Here are the three main things you should consider when sending out links.
Get links from the audiences and webpages you want to reach.
Link to websites because you want your visitors to know about them.
Only buy links if the number of visitors coming from the link will outweigh the costs.
Removing the shopping cart shouldn’t have any negative effects on your website rankings. In fact, most sites have shopping carts for e-commerce purposes.
Google created the Google Dance update in response to spam in SEO.
Google Dance is a term that refers to the volatility your website experiences while Google is ranking and indexing the page. Simply put, the Google Dance is when Google notices your web page and adds it to the SERPs (Search Engine Results Pages). This is Google's way of testing whether your page will go higher or lower in its search rankings.
Put another way, Google tests your page by changing its page rank (without affecting other ranking factors, such as keywords).
Throughout the "Google Dance," your website will fluctuate in SERP page ranking. Finally, Google will determine where it will rank your website. The page ranking will also move depending on what your competitors are doing.
At the beginning of the website stage, you have to understand that:
Volatility = Normal.
Your website is going to go through a series of changes as the months progress. Volatility is a good thing, as it shows that Google is paying attention to your site.
The old Google Dance would push a major update onto the SERPs, and an entire index would move around. This was a period of high volatility: it would take 3-5 days after an algorithm update for a site's page rank to start changing. As Google continued to grow, it improved its algorithm to deliver faster updates.
Age: Is it a new page or website? (Newer = more "Google Dancing".)
Links: You can decide whether to go for new links or strong links to affect the Google Dance; strength comes from longer-established links.
Have you made any changes to your website? Fresh changes help it get re-crawled and ranked faster. You can create changes through:
Website Redesign - Changes to your website's appearance will prompt Google to re-index the site.
Website Launch - When a full website is launched, it receives a boost in Google’s site rankings.
Webpage Launch - Think short, single-page websites. If it's original, chances are Google will rank it high in its search engine.
Competitiveness - How competitive is the industry/locale/keyword? The more competitive it is, the more SERP movement the page will experience.
More competitive web pages are danced more drastically in Google's search rankings. If your site doesn't land in the right page-rank position afterward, you can re-dance later once you've started to build links to your website.
Let’s say you’ve created a new website and you’re ready to create Google Ads with it. When looking at the Google Analytics UA dashboard, you notice a volatile change in your page’s bounce rate.
This is a good thing, as it shows Google's Dance mechanism at work. The volatility is a sign that Google is considering increasing your website's ranking.
When the Google Dance occurs, it's best to keep improving your site with the best SEO practices. Don't panic when the metrics change in a volatile fashion; it's a sign that your website is going to rank well.
Google released the first Google Toolbar™ during this update. This was the first release that propelled Google toward the tech giant it is today. The Toolbar was a browser add-on that let users search the internet from anywhere using Google's high-speed technology.
If you use Google Chrome today, the Toolbar's functionality already comes built in. The Toolbar allows users to quickly highlight and jump to any search term.
Searches over 1.3 billion web pages.
Access to Google’s cached web pages if the link is offline or broken.
Automatically locate your search terms on the page you're browsing.
Determine the importance of each web page browsed. This is backed by Google’s Page-Rank (™) Technology.
The Google Toolbar is free to all users and runs on recent PC versions of Windows 95, 98, 2000, or NT, with Internet Explorer 5 or higher.
Google is still running strong as the world's best search engine. After 20 years of updates, fixes, and announcements, Google has continued to revolutionize the web search experience. It is through Google's help that online businesses can continue to grow.
To conclude, Google's algorithm will continue to grow and adapt over time. So make sure your company's website and brand are tuned for the next update Google releases!