What happens when content on your website (and blog) is an exact match to multiple sites?
-
In general, I understand that having duplicate content on your website is a bad thing. But I see a lot of small businesses (dentists, in this example) hire the same company to provide content for their sites, and they end up with the EXACT same content as other dentists. Here is a good example:
http://www.hodnettortho.com/blog/2013/02/valentine’s-day-and-your-teeth-2/
http://www.braces2000.com/blog/2013/02/valentine’s-day-and-your-teeth-2/
http://www.gentledentalak.com/blog/2013/02/valentine’s-day-and-your-teeth/
If you google the title of that blog article, you'll find tons of copies of the same article all over the place.
So, overall, doesn't this make the content on these blogs irrelevant? Does this hurt the SEO on these sites at all? What is the value of having completely unique content on your site/blog vs having duplicate content like this?
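For a rough sense of how duplicated two pages actually are, you can compare their body text yourself. Here's a minimal Python sketch using word-shingle Jaccard similarity, one common approximation of how duplicate-content checkers score overlap. The page texts below are made up for illustration, not pulled from the sites above:

```python
# Estimate how similar two pages' body text is using word n-gram
# "shingles" and Jaccard similarity. Example texts are hypothetical.

def shingles(text, n=3):
    """Return the set of lowercase word n-grams in `text`."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b, n=3):
    """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
    sa, sb = shingles(a, n), shingles(b, n)
    if not sa and not sb:
        return 1.0
    return len(sa & sb) / len(sa | sb)

page_a = "Valentine's Day candy can stick to braces and damage your teeth."
page_b = "Valentine's Day candy can stick to braces and damage your smile."
page_c = "Our Seattle practice offers custom whitening plans for every patient."

print(round(jaccard(page_a, page_b), 2))  # near-duplicate: 0.8
print(round(jaccard(page_a, page_c), 2))  # unrelated: 0.0
```

Scores near 1.0 mean near-duplicates; syndicated articles like the ones linked above would score very high against each other, which is exactly the signal search engines can use to collapse them into one result.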
-
Thanks to everyone who commented on this!
Meta, your answer seems to have valid points on different levels. I appreciate the insight!
-
Hey Morgan, I've seen this often with professional sites of all sorts. The vendor is selling a content service but the buyer is either not aware that the same content is being sold to all their clients, or not aware that it makes a difference. Often, the buyer is on the hook for the service for a year or so.
Here's the thing: Competing in the search engines is about differentiating your website and getting people to engage with your content--and it's hard to do either of those things with content that's common to hundreds or thousands of other sites. In answer to your question, the duplication doesn't necessarily make your site irrelevant; it just doesn't give search engines a reason to rank it higher than the next dentist's.
What that content does do is give your local visitors a feeling that your practice is up to date with news and technology, and that can be an advantage over a site that lacks any updated content--you'll just have to drum up those visitors from somewhere other than organic search.
One of those other places is local search. With or without dupe content, you can still focus on making your local results stronger, and for many dentists that's arguably more valuable than showing up in the organic results anyway.
-
These dentists seem to be satisfied with pedestrian content on a generic website. They probably rank OK in local search if they are competing in Soldotna or Bugtussle and have someone who knows how to work local.
If they face stiffer competition, especially in organic SERPs, then they will probably not compete very well.
If I were a dentist, I would want my own content and photos on the site... just because.
-
If all these dentists have exactly the same content, how is a prospective customer going to decide which one is best?
"We're just like the next guy" isn't a Unique Value Proposition and isn't going to help your business stand apart from the crowd.
Unique content is harder to produce, but it's so much better than generic "insert your practice name here" boilerplate content.
-
Thanks, James!
Anyone else have any thoughts on this type of thing?
-
It may not be getting them a manual penalty, but it's definitely not helping them in the long term either. Creating unique and useful content is the only way to keep gaining organic search traffic in the long run.