Can't get my head around this duplicate content dilemma!
-
Hi,
Let's say you have a cleaning company with a services page covering window cleaning, carpet cleaning, etc., and the content on this page adds up to around 750 words.
Now let's say you would like to create new pages that target location-specific keywords in your area.
The easiest way would be to copy the services page and just change all the tags to the location-specific term, but now you have duplicate content.
If I wanted to target 10 locations, does this now mean I need to generate 750 words of unique content for each page, which is basically the services page rewritten?
Cheers
-
That's great, ActivitySuper.
Just stage the project in a reasonable manner. Your copywriter can't do it all at once, but he/she can do it over time. Good luck!
Miriam
-
Yeah, I did find this very helpful. It's always good to know how someone has actually tackled this problem, and it also reassures me that I'm not being silly and that there is a better way of doing it.
Looks like unique content for each page is the only way to go.
The only difference is I might add, say, 200 words for each page from the start and then add 100 more words each month; I think this might make it easier to write more about each location.
I've got a copywriter, which is half the battle won for me.
-
Thanks, Alan. Glad you found this helpful. I hope ActivitySuper will, too.
-
Thank you Miriam, this is excellent advice, thank you for taking the time to do it. I wish I could give you 10 thumbs up!
-
Hi ActivitySuper!
Thanks for coming to Q&A with what is actually a very important question. I sympathize with your puzzlement here because I hear many local business owners saying the same thing: what am I supposed to write about?
Members here are giving you good advice - you've either got to be ready to make the effort/investment, or be satisfied with simply mentioning your services and locations and crossing your fingers. If you are the only game in town, that might get you somewhere, but if you've got even one local competitor, such an approach will not lead to the dominance that you no doubt seek.
Here is what I do for my clients (some of whom, coincidentally, are carpet cleaning companies!). This advice is given with the understanding that, like most business owners in the cleaning industries, you have one actual physical location but serve a variety of neighboring cities. If that's correct, read on. If that isn't correct and you've got multiple physical offices, let me know.
1. Implement the major local hooks on the website for the physical location - Google is always going to see you as most relevant to your city of location, not other cities within the service radius. In addition to doing the on-site Local SEO, get the business properly listed with a violation-free Google Place Page and other local business directory listings.
2. Create a list of your 5-10 main services. Make a menu on the site of a page for each of these services, optimized for the services + your city of location. The content must be good, strong and unique.
3. Create a list of your 5-10 main service cities. Create a city landing page for each of these cities (including your city of location), giving an overview of your work in that city on each page. Make a menu of these pages on the site. Again, the content must be good, strong and unique. No cutting and pasting!
At this point you will have developed 10-20 pages of unique, creative content for your website. Depending on the competitiveness of your industry in your region, this may get you enough rankings to satisfy you and get phones ringing. However, in most cases, you will want to do more. Move on to step 4.
4. Now, create a big list of all possible combinations. This might look like:
Carpet Cleaning City A
Carpet Cleaning City B
Carpet Cleaning City C
Window Cleaning City A
Window Cleaning City B
Window Cleaning City C
Tile and Grout Cleaning City A
etc.
Create a timeline for beginning to write articles over a set number of months to cover each of these phrases. You're not going to do this all at once. My clients have most typically requested that I do anywhere from 3-10 articles a month for them. A blog is terrific for this sort of thing, by the way. If the client has hired me to do 10 articles a month, then in 3 months we've covered 30 terms, in 6 months we've covered 60 terms, etc.
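Step 4's combination list is just every service paired with every city, chunked into a monthly writing schedule. Here is a minimal Python sketch of that idea; the service names, city names, and articles-per-month figure are placeholders, not taken from any real client plan:

```python
from itertools import product

# Placeholder lists, as in the example above.
services = ["Carpet Cleaning", "Window Cleaning", "Tile and Grout Cleaning"]
cities = ["City A", "City B", "City C"]

# Every service/city phrase to cover (step 4's "big list").
phrases = [f"{service} {city}" for service, city in product(services, cities)]

def schedule(phrases, per_month):
    """Split the phrase list into batches of `per_month` articles."""
    return [phrases[i:i + per_month] for i in range(0, len(phrases), per_month)]

# 9 phrases at 3 articles a month is a 3-month writing plan.
months = schedule(phrases, 3)
```

At 10 articles a month, the same list would be covered in a single month; the point of the sketch is only that the schedule falls straight out of the list, so you can commit to a fixed monthly pace up front.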
The client has to participate in this. If he simply paid some penny copywriter to write a bunch of boring, generic content for this large number of terms, chances are, he wouldn't end up with a very pleasant or persuasive website. Rather, he needs to be photographing his projects in the different cities and coming to me with photos, testimonials from clients in the service cities, anecdotes and what have you. I take this, combine it with a solid knowledge of the city and the service/products used, find some other photographs and maybe maps and turn each article into a very solid piece of content. The approach is quite authentic and results in the ongoing creation of an ever-growing library of content about the client's work in each of his cities.
Remember, the whole point of this approach is to obtain secondary visibility (typically organic) for terms outside of his city of location. It should be seen as an ongoing project, and I've seen this approach work time and again for my clients.
You're at an important point of decision right now. You need to decide whether you have the creativity and time to do this right on your own, whether to hire a Local SEO-skilled copywriter to do it for you, or whether you just can't do either. Sincerely wishing you luck!
Miriam -
I agree, Alan. No matter how hard you try, it is going to carry some level of duplication - you would be better off trying to target all 10 locations on the main services page than trying to re-spin the same 750 words. Your first suggestion is the approach I would take as well.
-
What you just said is only true if you have nothing to write about.
I should have made it clear that I only advocate doing that if you have something to say that is meaningful.
If you don't, then don't do it. If you have something relevant and useful to say, that is better than repeating the same information, whether you rewrite it or not.
Spinning the 750 word story into 10 different versions is a really bad idea, in my opinion.
-
Yeah, that is an option, but you're now looking at creating semi-relevant content (and that's only if you can find something semi-relevant to write about for each location).
But your suggestion is an option, so thanks.
-
Only if you want them to be indexed.
Alternatively, you can write 150 to 200 words that apply specifically to the location and link off to the original page with 750 words of stunning content.
For example, here is an idea:
Window Cleaning and Carpet Cleaning in Murfreesboro
Murfreesboro and the surrounding areas sometimes present a problem for carpet cleaners because of the high incidence of termites. These termites..... blah blah blah.
....
Our Murfreesboro carpet cleaning crews are all locals, so they have an intimate knowledge of ... blah blah blah.
....
Read about how our carpet cleaning service fits your unique needs.
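The idea above, a short location-specific page that links back to the one full-length services page, can be sketched as a simple template. Everything here is hypothetical: the `/services` URL, the field names, and the sample text are placeholders for illustration only:

```python
# A short (150-200 word) location page: unique local copy up top,
# then a link off to the original services page instead of
# duplicating its 750 words.
PAGE_TEMPLATE = """<h1>Window Cleaning and Carpet Cleaning in {city}</h1>
<p>{local_detail}</p>
<p><a href="/services">Read about how our carpet cleaning service fits your unique needs.</a></p>"""

def render_location_page(city, local_detail):
    """Fill the template for one service city."""
    return PAGE_TEMPLATE.format(city=city, local_detail=local_detail)

page = render_location_page(
    "Murfreesboro",
    "Our Murfreesboro carpet cleaning crews are all locals...",
)
```

The design point is that only `local_detail` varies substantively between locations, so each page stays unique where it matters while the heavy content lives once, on the linked services page.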