Duplicate content across hundreds of Local sites and they all rank #1
-
Usually when we discuss duplicate content, we're addressing the topic of penalties or non-indexing. In this case, we're discussing ranking high with duplicate content.
I've seen lots of dental, chiropractic, and veterinary sites built by companies that give them cookie-cutter sites with the same copy, and they all rank #1 or #2.
Here are two companies that do that:
http://www.rampsites.com/rampsites/home_standard.asp?sectionid=4
http://mysocialpractice.com/about/
The latter uses external blogs to provide inbound links to their clients' sites, but not all services do that; in fact, this is the first time I've seen one use external blogs. Usually the blog with the duplicate copy is ON SITE, and the sites still rank #1.
Query "Why Your Smile Prefers Water Over Soft Drinks" to see duplicate content on external blogs.
Or "Remember the Mad Hatter from the childhood classic, Alice in Wonderland? Back then, the process of making hats involved using mercury compounds. Overexposure could produce symptoms referred to as being" for duplicate content on chiropractor sites that rank high.
I've seen well-optimized sites rank below them even though those sites have just as much quality content, all of it original, with more engagement and more inbound links.
It appears to me that Google is turning a blind eye to duplicate content here. Maybe because these are local businesses with local clientele, it doesn't care that a chiropractor in NY has the same content as one in CA, just as the visitor doesn't care, because a visitor in CA generally isn't looking at a NY chiropractor's site.
So maybe geo-targeting the site has something to do with it. As a test, I should take the same copy and put it on a non-geo-targeted site and see if it will get indexed.
I asked another Local SEO expert, probably the best in my opinion, whether she has run across this. She has, and she finds it difficult to rank above these sites as well. It's almost as if Google is favoring them.
So the question is: should all dentists, chiropractors, and veterinarians just give up and hand things over to these services? I shudder to think that, but hey, it's working, and it's a whole lot less work - and maybe expense - for them.
-
It looks like these services are just taking the content from their clients' sites and putting it onto other blogs to generate backlinks. It's hard to imagine links from all that duplicate content helping much, but apparently it's working well enough for them. What keywords are they ranking for?
To answer your question, though, you absolutely should not give up on these niches. These sites barely have any PR. You should be able to easily outrank them with a decent site and a few authoritative links.
Related Questions
-
Ranking Sub Categories on Ecommerce Site
Hi, I haven't tested this yet, so before I do I wanted to see if anyone has experience with this. I have lower-level categories I want to rank for SEO. For example, say I want to rank 'Standard Metal Lockers' - with the way our site is set up, I have to work within a classification, which isn't always easy. So it would be categorised as follows: Cupboards & Lockers > Lockers > Standard Lockers > Standard Metal Lockers. The URL structure would remain /standard-metal-lockers, and I would link this from the 'Lockers' page. Is this too deep in the site structure to rank? I think if it's linked properly and promoted it will be fine, but I'd like to see if anyone else has had this issue. Becky
Intermediate & Advanced SEO | BeckyKey
-
Content Aggregation Site: How much content per aggregated piece is too much?
Let's say I set up a section of my website that aggregated content from major news outlets and bloggers around a certain topic. For each piece of aggregated content, is there a bad, fair, and good range of word count that should be stipulated? I'm asking this because I've been mulling it over—both SEO (duplicate content) issues and copyright issues—to determine what is considered best practice. Any ideas about what is considered best practice in this situation? Also, are there any other issues to consider that I didn't mention?
Intermediate & Advanced SEO | kdaniels
-
Duplicate content - how to diagnose duplicate content from another domain before publishing pages?
Hi, 🙂 My company has a new distributor contract, and we are starting to sell products on our own webshop. Biotechnology is the industry in question, with over 1,000 products. Writing product descriptions from scratch would take many hours, so the plan is to re-write them. With permission from our contractors we will import their product descriptions to our webshop, but I am concerned about being penalized by Google for duplicate content. If we re-write them we should be fine, I guess, but how can we be sure? Is there any good tool for comparing only the text (because I don't want to publish the pages just to compare URLs)? (See the sketch after this question for one way to compare two blocks of text locally.) What else should we be aware of besides checking product descriptions for duplicate content? Duplicate content is a big issue for all of us; I hope these answers will be helpful for many of us. Keep up the hard work and thank you very much for your answers. Cheers, Dusan
Intermediate & Advanced SEO | Chemometec
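Since the question above asks about comparing text for near-duplication before pages go live, here is a minimal sketch of the kind of check such a tool runs, using only the Python standard library. The two strings are hypothetical stand-ins for a supplier description and a rewrite, and the 0.8 threshold is an arbitrary assumption.

```python
from difflib import SequenceMatcher

# Hypothetical stand-ins: the supplier's original copy and your rewritten version.
supplier_copy = (
    "This antibody is validated for Western blot and ELISA, and is supplied "
    "in PBS with 0.02% sodium azide."
)
rewritten_copy = (
    "Validated for ELISA and Western blot applications, the antibody ships "
    "in PBS containing 0.02% sodium azide as a preservative."
)

# Ratio of matching characters between the two texts (1.0 = identical).
similarity = SequenceMatcher(None, supplier_copy.lower(), rewritten_copy.lower()).ratio()

print(f"Similarity: {similarity:.0%}")
if similarity > 0.8:
    print("Very close to the original - rewrite more aggressively before publishing.")
else:
    print("Substantially different from the original.")
```

The point is simply that the comparison can be done text-to-text on your own machine, without publishing the pages and comparing URLs first.
-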
Is legacy duplicate content an issue?
I am looking for some proof, or at least evidence, as to whether or not sites are being hurt by duplicate content. The situation is that there were 4 content-rich newspaper/magazine style sites that were basically just reskins of each other. [ a tactic used under a previous regime 😉 ] The least busy of the sites has since been discontinued & 301'd to one of the others, but the traffic on the discontinued site was so low as to be lost in the noise, so it is unclear if that was any benefit. Now for the last ~2 years all the sites have had unique content going up, but there are still the archives of articles that are on all 3 remaining sites. I would like to know whether to redirect, remove or rewrite the content, but it is a big decision - the number of duplicate articles? 263,114! Is there a chance this is hurting one or more of the sites? Is there any way to prove it, short of actually doing the work?
Intermediate & Advanced SEO | Fammy
-
Opinion on Duplicate Content Scenario
So there are 2 pest control companies owned by the same person - Sovereign and Southern. (The two companies serve different markets.) They have two different website URLs, but the website code is actually all the same... the code is hosted in one place... it just uses an if/else structure with dynamic PHP which determines whether the user sees the Sovereign site or the Southern site... know what I am saying? (A rough sketch of that kind of host-based switch follows this question.) Here are the two sites: www.sovereignpestcontrol.com and www.southernpestcontrol.com. This is a duplicate content SEO nightmare, right?
Intermediate & Advanced SEO | MeridianGroup
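For readers unfamiliar with the setup described above, here is a minimal sketch of serving two brands from one codebase by switching on the request host. It is illustrative only, written in Python with the standard library's WSGI server rather than the PHP the question mentions; the domain names simply echo the ones in the question.

```python
from wsgiref.simple_server import make_server

# One shared codebase; only the branding differs per host.
BRANDS = {
    "www.sovereignpestcontrol.com": "Welcome to Sovereign Pest Control",
    "www.southernpestcontrol.com": "Welcome to Southern Pest Control",
}

def app(environ, start_response):
    host = environ.get("HTTP_HOST", "").split(":")[0].lower()
    heading = BRANDS.get(host, "Unknown host")
    # Everything below the heading is the same shared copy on both domains -
    # which is exactly the duplicate content concern raised in the question.
    body = f"<h1>{heading}</h1><p>...shared page copy...</p>".encode("utf-8")
    start_response("200 OK", [("Content-Type", "text/html; charset=utf-8")])
    return [body]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```

The branching (here a dict lookup) only swaps the branding; the body copy stays identical across domains, which is why both sites end up with the same indexed content.
-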
Duplicate Content/ Indexing Question
I have a real estate WordPress site that uses an IDX provider to add real estate listings to my site. A new page is created as a new property comes to market, and then the page is deleted when the property is sold. I like the functionality of the service, but it creates a significant number of 404s, and I'm also concerned about duplicate content because anyone else using the same service here in Las Vegas will have 1000s of the exact same property pages that I do. Any thoughts on this, and is there a way that I can have the search engines only index the core 20 pages of my site and ignore future property pages? Your advice is greatly appreciated. See link for example http://www.mylvcondosales.com/mandarin-las-vegas/
Intermediate & Advanced SEO | AnthonyLasVegas
-
Blog Duplicate Content
Hi, I have a blog, and like most blogs I have various search options (subject matter, author, archive, etc.) which produce the same content via different URLs. Should I implement the rel=canonical tag AND the meta robots tag (noindex, follow) on every page of duplicate blog content, or simply choose one or the other? What's best practice? Thanks Mozzers! Luke
Intermediate & Advanced SEO | McTaggart
-
Duplicate Content across 4 domains
I am working on a new project where the client has 5 domains, each with identical website content. There is no rel=canonical. There is a great variation in the number of pages in the index for each of the domains (from 1 to 1250). OSE shows a range of linking domains from 1 to 120 for each domain. I will be strongly recommending to the client to focus on one website and 301 everything from the other domains. I would recommend focusing on the domain that has the most pages indexed and the most referring domains, but I've noticed the client has started using one of the other domains in their offline promotional activity and it is now their preferred domain. What are your thoughts on this situation? Would it be better to 301 to the client's preferred domain (and lose a level of ranking power through the 301 reduction factor + wait for other pages to get indexed), or stick with the highest-ranking/most-linked domain even though it doesn't match the client's preferred domain used for email addresses etc.? Or would it be better to use cross-domain canonical tags? Thanks
Intermediate & Advanced SEO | bjalc2011