What happens when content on your website (and blog) is an exact match to multiple sites?
-
In general, I understand that having duplicate content on your website is a bad thing. But I see a lot of small businesses (specifically dentists in this example) who hire the same company to provide content to their site. They end up with the EXACT same content as other dentists. Here is a good example:
http://www.hodnettortho.com/blog/2013/02/valentine’s-day-and-your-teeth-2/
http://www.braces2000.com/blog/2013/02/valentine’s-day-and-your-teeth-2/
http://www.gentledentalak.com/blog/2013/02/valentine’s-day-and-your-teeth/
If you Google the title of that blog post, you'll find tons of copies of the same article all over the place.
So, overall, doesn't this make the content on these blogs irrelevant? Does this hurt the SEO on these sites at all? What is the value of having completely unique content on your site/blog vs having duplicate content like this?
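For anyone who wants to measure the overlap rather than eyeball it, here is a minimal sketch (standard-library Python only) that pulls the visible text from two pages and scores how many word-for-word phrases they share. The two URLs, the 5-word shingle size, and the 0.9 "near duplicate" cutoff are placeholder assumptions, not anything from the thread; swap in two of the actual blog posts above.
```
# Sketch: estimate how much visible text two pages share.
# Assumptions: placeholder URLs, 5-word shingles, 0.9 "near duplicate" cutoff.
import urllib.request
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self._skip = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())


def page_text(url):
    """Fetch a URL and return its visible text, lowercased."""
    req = urllib.request.Request(url, headers={"User-Agent": "dup-check/0.1"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.chunks).lower()


def shingles(text, n=5):
    """Break text into overlapping n-word phrases."""
    words = text.split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}


def jaccard(a, b):
    """Share of phrases the two pages have in common (0.0 to 1.0)."""
    return len(a & b) / len(a | b) if (a | b) else 0.0


if __name__ == "__main__":
    # Placeholder URLs -- replace with two of the dentist blog posts above.
    url_a = "http://example-practice-a.com/blog/valentines-day-and-your-teeth/"
    url_b = "http://example-practice-b.com/blog/valentines-day-and-your-teeth/"
    score = jaccard(shingles(page_text(url_a)), shingles(page_text(url_b)))
    print("5-word shingle overlap: {:.0%}".format(score))
    if score > 0.9:
        print("These pages are effectively the same article.")
```
A score near 100% means the posts are word-for-word copies. The exact overlap at which a search engine treats two pages as duplicates isn't public, so treat the score as a rough gauge, nothing more.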
-
Thanks to everyone who commented on this!
Meta, your answer seems to have valid points on different levels. I appreciate the insight!
-
Hey Morgan, I've seen this often with professional sites of all sorts. The vendor is selling a content service but the buyer is either not aware that the same content is being sold to all their clients, or not aware that it makes a difference. Often, the buyer is on the hook for the service for a year or so.
Here's the thing: competing in the search engines is about differentiating your website and getting people to engage with your content--and it's hard to do either of those things with content that's common to hundreds or thousands of other sites. In answer to your question, the duplication doesn't necessarily make your site irrelevant; it just doesn't give search engines a reason to rank it higher than the next dentist.
What that content does do is give your local visitors the impression that your practice is up to date with news and technology, and that can be an advantage over a site that lacks any fresh content--you'll just have to drum up those visitors from somewhere other than organic search.
One of those other places is local search. With or without dupe content, you can still focus on making your local results stronger, and for many dentists that's arguably more valuable than showing up in the organic results.
-
These dentists seem to be satisfied with pedestrian content on a generic website. They probably rank OK in local search if they are competing in Soldotna or Bugtussle and have someone who knows how to work local.
If they face stiffer competition, especially in organic SERPs, then they will probably not compete very well.
If I were a dentist, I would want my own content and photos on the site... just because.
-
If all these dentists have exactly the same content, how is a prospective customer going to decide which one is best?
"We're just like the next guy" isn't a Unique Value Proposition and isn't going to help your business stand apart from the crowd.
Unique content is harder to produce, but it's so much better than generic "insert your practice name here" boilerplate content.
-
Thanks, James!
Anyone else have any thoughts on this type of thing?
-
It may not be earning them a manual penalty, but it's definitely not helping them in the long run either. Creating unique, useful content is the only way to keep gaining organic search traffic over time.