Everything About Latest Google Spam Update

Google occasionally rolls out spam updates to make sure its search results don’t get filled with spam.

That matters because if users keep clicking on spammy results, it creates a terrible experience, and people would eventually stop using Google.

Although Google doesn’t publish much information about its spam updates, there are two things we can do to help figure out what changed.

First, we track over 900,000,000 domains across the internet, which allows us to identify patterns.

The second, and more pertinent to this update, is that at our agency, The Yellow Strawberry, we run 100 experimental websites built with AI-written material. These sites are not intended to “game” Google; they are designed to find out how Google treats AI-written content. The results were fascinating, so I will detail what happened to those sites in this post.

The Purpose of The Update

As the name indicates, Google’s primary goal with this update is to reduce spam. Google has released multiple spam updates over the years; in July 2021, for example, it released a link spam update.

For this update, Google did not specify whether it targeted link spam specifically or spam in general.

However, when we take a look at the more than 900 million domains we monitor, these are the top categories that were most affected globally.

[Chart: SERP volatility by industry category across the 900M+ domains we track]

The chart shows that news and sports sites were the most affected. Those sites were closely followed by technology, arts, and community sites.

These sites tend to be heavy on content and not product-oriented.

Here’s what we discovered when we examined the affected sites:

  • Many of them had thin content. To clarify, thin content doesn’t necessarily mean a low word count; it refers to content that didn’t really provide any value. The content was superficial, and you didn’t gain any insights, actionable points, or value once you finished reading it.

  • Poorly written meta tags – Many affected websites had duplicate meta tags across pages, or tags that were clearly written for search engines rather than humans.

  • Keyword stuffing is still a problem. Most affected sites didn’t keyword stuff, but around 3.89% did, using keywords in their meta tags or content in a way that hurt the user experience. (I’ll share a quick audit sketch right after this list.)
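
To make those last two checks concrete, here is a minimal Python sketch of the kind of spot-check we’re describing: it flags pages that share the same title tag and words that make up a suspiciously large share of a page’s text. The URLs and the 2% density threshold are illustrative assumptions, not values from this study, and a real audit would use a crawler rather than regexes.

```python
# Spot-check sketch: duplicate <title> tags and crude keyword density.
# The page list and the 2% threshold are illustrative assumptions.
import re
import urllib.request
from collections import Counter

PAGES = [  # hypothetical URLs; swap in pages from your own sitemap
    "https://example.com/",
    "https://example.com/about",
    "https://example.com/blog/post-1",
]
DENSITY_THRESHOLD = 0.02  # flag any single word above 2% of body text

def fetch(url):
    with urllib.request.urlopen(url, timeout=10) as resp:
        return resp.read().decode("utf-8", errors="replace")

def title_of(html):
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    return match.group(1).strip() if match else ""

def keyword_density(html):
    text = re.sub(r"<[^>]+>", " ", html)            # crude tag stripping
    words = re.findall(r"[a-z]{4,}", text.lower())  # skip short stopwords
    counts = Counter(words)
    total = max(sum(counts.values()), 1)
    return {word: n / total for word, n in counts.items()}

titles = {}
for url in PAGES:
    html = fetch(url)
    titles.setdefault(title_of(html), []).append(url)
    top_words = sorted(keyword_density(html).items(),
                       key=lambda kv: kv[1], reverse=True)[:5]
    for word, density in top_words:
        if density > DENSITY_THRESHOLD:
            print(f"{url}: '{word}' makes up {density:.1%} of body text")

for title, urls in titles.items():
    if len(urls) > 1:
        print(f"Duplicate <title> '{title}' on: {', '.join(urls)}")
```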

SEO is not just about keywords; there are many factors involved, but we couldn’t find any other patterns. On the surface, some of the things Google looked at seem very similar to the helpful content update.

We also examined the backlinks of the sites that lost the most traffic, but we didn’t see any patterns. That doesn’t necessarily mean Google ignored spammy links in this update; we just couldn’t find any patterns.

Here’s where the fun begins…

AI-Generated Content

Remember those 100 experimental websites? They span a range of industries, and each has at least 60 pages of content, all of it AI-generated. We manually built links to help the websites rank (we didn’t buy them, and you should never buy links).

In reality, we have over 681 AI-generated websites, but most of them don’t get enough SEO traffic for analysis. It is difficult to spot patterns when a site receives only 1,000 visitors per month from Google.

The 100 AI-driven websites in this experiment, however, each receive at least 3,000 visits per month from Google.

They don’t sell anything or collect leads.

53 of the 100 websites have content created entirely by AI, with their meta tags, headings, and article titles also generated entirely by AI.

These pages do not link to external sites or internal pages. AI content generation tools don’t add links.

One thing to keep in mind is that AI tools rarely create content longer than 500 words unless the user adjusts the content or has the AI writer generate it paragraph by paragraph.

We didn’t allow humans to modify or alter any content on the first 53 AI sites. We used only the AI-created content, including any meta tags the AI writer produced.

We had 47 sites in our second batch. AI was used to create the content for these sites, and then a human modified it to make it better and more valuable. The human editors also added external and internal links, modified the meta tags to make them more user-friendly, and embedded videos and images where appropriate.

We didn’t increase the length. Data from our Ubersuggest AI writer shows that most people use AI-written content as-is, and even when they do make slight changes, they don’t add much to the word count.

We wanted to duplicate what other marketers do with AI on our websites to get an idea of what Google is trying to solve.

Guess what happened to these sites?

The AI-written content with no human intervention performed worse than the AI-written content that humans modified.

Traffic & Ranking

The sites with purely AI-written content saw an average 17.29% decrease in traffic, and their keyword rankings fell by an average of 7.9 positions. While this may seem like a lot, it is not: none of these sites held the number one ranking for any popular keyword.

Traffic to the second group (human-modified AI content) dropped by 6.38%, and rankings fell by an average of 3.3 positions.
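
For clarity on how figures like these are typically calculated, here is a quick sketch. It assumes the averages above were computed by averaging each site’s percentage drop (rather than measuring the drop in the sites’ combined traffic, which would weight big sites more heavily); the traffic numbers are made up for illustration.

```python
# Average per-site traffic drop, the way a figure like 17.29% is
# typically computed. Traffic numbers below are made up for illustration.
sites = [
    {"pre_update": 4200, "post_update": 3100},
    {"pre_update": 3600, "post_update": 3350},
    {"pre_update": 5100, "post_update": 4400},
]

drops = [
    (s["pre_update"] - s["post_update"]) / s["pre_update"] * 100
    for s in sites
]
print(f"Average per-site traffic drop: {sum(drops) / len(drops):.2f}%")
```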

When we dug deeper, we discovered that not all of the sites had been affected by the update.

The update affected 14 of the 53 sites with content that was not edited by humans. Their traffic dropped by between 31.44% and 73.18%, with an average drop of 51.65%.

In the second group, where humans modified the content, eight sites were hit. Their traffic dropped by between 29.52% and 81.43%, an average decrease of 42.17%.

It’s interesting to note that some sites in both buckets experienced much smaller traffic drops, and some even saw a slight increase in SEO traffic of up to 4%. These comparisons are against pre-update traffic, measured with a 48-hour lag.

Here’s the interesting part: 13 of the 14 affected sites also experienced traffic drops due to the helpful content update. The helpful content update also affected all eight sites in the second group.

Keep in mind that not much time passed between when Google started rolling out the update and when I published this blog post. I also didn’t want to compare Sunday traffic to Wednesday traffic; to remove day-of-week bias, you should compare Sunday traffic to Sunday traffic. The statistics and ranking drops above confirm that the update affected these sites.
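
Here is a minimal pandas sketch of that same-weekday comparison, assuming you have a daily organic-traffic export. The file name, column names, and rollout date are assumptions for illustration, not details from our dataset.

```python
# Compare Sunday traffic to Sunday traffic to remove day-of-week bias.
# File name, column names, and rollout date are assumptions.
import pandas as pd

df = pd.read_csv("daily_organic_traffic.csv", parse_dates=["date"])
ROLLOUT_START = pd.Timestamp("2022-10-19")  # assumed rollout start

sundays = df[df["date"].dt.dayofweek == 6]  # Monday=0, so Sunday=6
pre = sundays.loc[sundays["date"] < ROLLOUT_START, "visits"].mean()
post = sundays.loc[sundays["date"] >= ROLLOUT_START, "visits"].mean()
print(f"Sunday-to-Sunday change: {(post - pre) / pre:+.2%}")
```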

Conclusion

From what we can see, most of the update focused on meta tags and keyword stuffing. This doesn’t necessarily mean Google ignored other factors like duplicate content or links, but the most significant patterns we found related to the factors I listed above and to AI-generated material.

Focusing on the user is the best way to win long-term. You may not be ahead in the short term, but you will be in the long term.

Ask yourself these questions: Is this content going to be useful to users? Are multiple pages using the same meta tags? Does my website offer enough value for people to want to link to it?

You are essentially just doing your best to spot-check yourself and do what is best for users.
