What happens when content on your website (and blog) is an exact match to multiple sites?
-
In general, I understand that having duplicate content on your website is a bad thing. But I see a lot of small businesses (specifically dentists in this example) who hire the same company to provide content to their site. They end up with the EXACT same content as other dentists. Here is a good example:
http://www.hodnettortho.com/blog/2013/02/valentine’s-day-and-your-teeth-2/
http://www.braces2000.com/blog/2013/02/valentine’s-day-and-your-teeth-2/
http://www.gentledentalak.com/blog/2013/02/valentine’s-day-and-your-teeth/
If you Google the title of that blog post, you'll find the same article all over the place.
So, overall, doesn't this make the content on these blogs irrelevant? Does this hurt the SEO on these sites at all? What is the value of having completely unique content on your site/blog vs having duplicate content like this?
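As a rough, hypothetical illustration of how duplicated these posts are (not something from the thread itself), you can compare the text of two pages with Python's standard library; `difflib.SequenceMatcher` gives a quick 0-to-1 similarity ratio. The sample strings below are stand-ins — in practice you'd fetch each blog post and strip the HTML first.

```python
from difflib import SequenceMatcher

def text_similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity ratio between two text blobs."""
    return SequenceMatcher(None, a, b).ratio()

# Stand-in strings for illustration; real use would compare the
# extracted article text from each dentist's blog post.
post_a = "Valentine's Day and your teeth: candy can cause cavities."
post_b = "Valentine's Day and your teeth: candy can cause cavities."
post_c = "Our office offers custom whitening for every patient."

print(text_similarity(post_a, post_b))  # identical copies -> 1.0
print(text_similarity(post_a, post_c))  # unique content -> much lower
```

A ratio at or near 1.0 across several domains is exactly the situation described above: word-for-word syndicated content.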
-
Thanks to everyone who commented on this!
Meta, your answer seems to have valid points on different levels. I appreciate the insight!
-
Hey Morgan, I've seen this often with professional sites of all sorts. The vendor is selling a content service but the buyer is either not aware that the same content is being sold to all their clients, or not aware that it makes a difference. Often, the buyer is on the hook for the service for a year or so.
Here's the thing: competing in the search engines is about differentiating your website and getting people to engage with your content--and it's hard to do either of those things with content that's common to hundreds or thousands of other sites. In answer to your question, the duplication doesn't necessarily make your site irrelevant; it just doesn't give search engines a reason to rank it higher than the next dentist's.
What that content does do is give your local visitors the impression that your practice is up to date with news and technology, and that can be an advantage over a site that lacks any fresh content--you'll just have to drum up those visitors somewhere other than organic search.
One of those other places is local search. With or without duplicate content, you can still focus on strengthening your local results, and it can be argued that, for many dentists, that's better than trying to show up in the organic results.
-
These dentists seem to be satisfied with pedestrian content on a generic website. They probably rank OK in local search if they're competing in Soldotna or Bugtussle and have someone who knows how to work local listings.
If they face stiffer competition, especially in organic SERPs, they probably won't compete very well.
If I were a dentist, I would want my own content and photos on the site... just because.
-
If all these dentists have exactly the same content, how is a prospective customer going to decide which one is best?
"We're just like the next guy" isn't a Unique Value Proposition and isn't going to help your business stand apart from the crowd.
Unique content is harder to produce, but it's so much better than generic "insert your practice name here" boilerplate content.
-
Thanks, James!
Anyone else have any thoughts on this type of thing?
-
It may not be earning them a manual penalty, but it's definitely not helping them either. Creating unique, useful content is the only way to keep gaining organic search traffic in the long run.