What happens when content on your website (and blog) is an exact match to multiple sites?
-
In general, I understand that having duplicate content on your website is a bad thing. But I see a lot of small businesses (specifically dentists in this example) who hire the same company to provide content for their sites. They end up with the EXACT same content as other dentists. Here is a good example:
http://www.hodnettortho.com/blog/2013/02/valentine’s-day-and-your-teeth-2/
http://www.braces2000.com/blog/2013/02/valentine’s-day-and-your-teeth-2/
http://www.gentledentalak.com/blog/2013/02/valentine’s-day-and-your-teeth/
If you google the title of that blog article you find tons of the same article all over the place.
So, overall, doesn't this make the content on these blogs irrelevant? Does this hurt the SEO on these sites at all? What is the value of having completely unique content on your site/blog vs having duplicate content like this?
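One quick way to see how "duplicate" two articles really are (beyond just googling the title) is to compare overlapping word shingles with Jaccard similarity. This is a minimal sketch of the kind of signal a search engine might use, not Google's actual algorithm, and the example strings below are made up for illustration:

```python
# Compare two article texts by the overlap of their 5-word shingles.
# A score of 1.0 means word-for-word identical; near 0.0 means unrelated.

def shingles(text, n=5):
    """Return the set of n-word windows in the text, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b, n=5):
    """Jaccard similarity between the shingle sets of two texts."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not (sa or sb):
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Hypothetical snippets standing in for the syndicated blog posts:
article_a = "candy from valentines day can stick to your braces and cause decay"
article_b = "candy from valentines day can stick to your braces and cause decay"

print(jaccard(article_a, article_b))  # identical text scores 1.0
```

If you fetched the body text of the three URLs above and ran them through something like this, syndicated posts would score at or near 1.0 against each other, which is exactly the sameness that gives a search engine no reason to prefer one site over another.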
-
Thanks to everyone who commented on this!
Meta, your answer seems to have valid points on different levels. I appreciate the insight!
-
Hey Morgan, I've seen this often with professional sites of all sorts. The vendor is selling a content service but the buyer is either not aware that the same content is being sold to all their clients, or not aware that it makes a difference. Often, the buyer is on the hook for the service for a year or so.
Here's the thing: Competing in the search engines is about differentiating your website and getting people to engage with your content--and it's hard to do either of those things with content that's common to hundreds or thousands of other sites. In answer to your question, the duplication doesn't necessarily make your site irrelevant; it just doesn't give search engines a reason to rank it higher than the next dentist.
What that content does do is give your local visitors the sense that your practice is up to date with news and technology, and that can be an advantage over a site that lacks any updated content--you'll just have to drum up those visitors from somewhere other than organic search.
One of those other places is local search. With or without dupe content, you can still focus on making your local results stronger, and it can be argued that for many dentists that's better than showing up in the organic results.
-
These dentists seem to be satisfied with pedestrian content on a generic website. They probably rank OK in local search if they are competing in Soldotna or Bugtussle and have someone who knows how to work local.
If they face stiffer competition, especially in organic SERPs, then they will probably not compete very well.
If I were a dentist, I would want my own content and photos on the site... just because.
-
If all these dentists have exactly the same content, how is a prospective customer going to decide which one is best?
"We're just like the next guy" isn't a Unique Value Proposition and isn't going to help your business stand apart from the crowd.
Unique content is harder, but it's so much better than generic "insert your practice name here" boilerplate content.
-
Thanks, James!
Anyone else have any thoughts on this type of thing?
-
It may not be getting them a manual penalty, but it's definitely not helping them in the long term either. Creating unique and useful content is the only way to keep gaining organic search traffic in the long run.