> “If you don’t have at least X% unique content, you can’t rank in Google.”
— Myself, as well as every SEO ever
This phrase, or some variation of it, is one that many of us in the SEO world have heard, read, or even told others. As it turns out, it is not only incorrect, it doesn’t have a leg to stand on. Google’s own Search Console Help documentation contains an in-depth article that outlines several examples of duplicate content and their potential impact on users, but nowhere in the guideline does it state that search results will automatically be hindered if a substantial amount of content is found on more than one site.
> “Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results.”
There are a few stipulations to cover before continuing. There is one specific situation where having the same content will not benefit you: when the copies compete for the same users. For example, if you write an article or webpage about, say, the best website hosting platforms for small business owners, and use that exact same content on a totally separate webpage anywhere else, the two pages will compete for the same users’ searches. This does not end in your favor for rankings, because Google has to decide which instance of the article to show a user, and it is usually smart enough to figure out which came first. If you just want to syndicate your article across a few platforms, make sure you use canonical tags properly so that search engines know which article is the original.
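In practice, a cross-domain canonical looks like this (a minimal sketch; the URLs are placeholders, not real sites): each syndicated copy carries a `<link rel="canonical">` element in its `<head>` pointing back at the original, so search engines consolidate the ranking signals onto that one URL.

```html
<!-- In the <head> of the syndicated copy (e.g., a repost on another platform), -->
<!-- point back to the original article. The URL below is a placeholder. -->
<head>
  <link rel="canonical" href="https://www.example.com/original-article" />
</head>
```

Note that a cross-domain canonical is treated as a strong hint rather than a binding directive, so it works best when the syndicated copy stays otherwise consistent with the original.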
But what if these pages/articles are not competing for the same traffic? We’ll get into a more specific example and my findings below, but to keep it simple to start, take two different websites: Site A and Site B. In our example, both of these websites:
- Are part of the same industry
- Are in two completely different locations
- Each rank for decently competitive, localized terms
- Have extremely similar content, because both used what I will lovingly refer to as a “Cookie-Cutter Industry SEO”
If you have been in the search engine optimization game for long enough, you know that some web & SEO companies are specialized and only work within a single industry. While there is nothing inherently wrong with that notion, and for scalability purposes, it makes a lot more sense than just taking on any kind of client (more on that in a future article), it can come with some problems. Many of these companies do exactly as their title says and use overly cookie-cutter websites and strategies. In some cases, this gets bad enough that they use a pre-built website and swap out things like the logo, colors, names, location information, and services while keeping the vast majority of the descriptive, “rich” content the same. This is a situation that I recently came across on a large scale, and I found a few things to my surprise:
As long as the companies are not going after the exact same terms, content that is exactly the same across dozens of domains does not necessarily cause a ranking penalty.
My (new) client was ranking at the very top of page two for a competitive term in the area and dozens of other companies by this same Cookie-Cutter Industry SEO ranked for similarly-competitive terms in many cities across the US as high as the middle of page 1 – even with this duplicate text from their cookie-cutter website. Below, you can see the phrase I highlighted above that shows on many other websites:
> “Duplicate content generally refers to substantive blocks of content within or across domains that either completely match other content or are appreciably similar.”
— First paragraph of https://support.google.com/webmasters/answer/66359?hl=en
While there are certainly instances out there where duplicate content has been a ranking determinant (especially in competitive industries), in this not-so-small sample size I was not able to find a single one. Obviously, if somebody ran a well-targeted SEO campaign for each of those sites, they would certainly outrank the copied text, because the content could be keyed much more precisely; but for entirely templated content, it really was not that bad, and it was definitely not a case of Google saying, “We must wipe them from the search results because they share the same content.”
The takeaway is that as long as the pages with near-verbatim content are not competing for the same terms or audience, the search engine does not have to choose between them and can show each more or less normally in the SERPs for its respective target keywords.
With that said, all of the content for my client has been rewritten, because I see no value in sharing the same verbiage with so many other websites. You simply cannot get the same results from templated content that you can from unique, polished articles with proper keywording and the personal touch of an attentive, personally-invested writer.
The short and simple things that we can learn from this are as follows:
- Duplicate content is still a bad idea from a user-friendliness perspective.
- Duplicate content probably won’t hurt your site (at the moment) if you are not competing with whatever site has the same content.
- Use canonical tags to indicate the original version of a piece of content.