Can't get my head around this duplicate content dilemma!
-
Hi,
Let's say you have a cleaning company with a services page that covers window cleaning, carpet cleaning, etc., and the content on this page adds up to around 750 words.
Now let's say you would like to create new pages that target location-specific keywords in your area.
The easiest way would be to copy the services page and just change all the tags to the location-specific term, but now you have duplicate content.
If I wanted to target 10 locations, does this now mean I need to generate 750 words of unique content for each page, which is basically the services page rewritten?
Cheers
-
That's great, ActivitySuper.
Just stage the project in a reasonable manner. Your copywriter can't do it all at once, but he/she can do it over time. Good luck!
Miriam
-
Yeah, I did find this very helpful. It's always good to know how someone has actually tackled this problem, and it reassures me I'm not being silly and that there is a better way of doing it.
Looks like unique content for each page is the only way to go.
The only difference is I might add, say, 200 words for each page from the start and then add 100 more words each month; I think this might make it easier to write more about each location.
I've got a copywriter, which is half the battle done for me.
-
Thanks, Alan. Glad you found this helpful. I hope ActivitySuper will, too.
-
Thank you, Miriam, this is excellent advice; thank you for taking the time to write it up. I wish I could give you 10 thumbs up!
-
Hi ActivitySuper!
Thanks for coming to Q&A with what is actually a very important question. I sympathize with your puzzlement here because I hear many local business owners saying the same thing: what am I supposed to write about?
Members here are giving you good advice - you've either got to be ready to make the effort/investment, or be satisfied with simply mentioning your services and locations and crossing your fingers. If you are the only game in town, that might get you somewhere, but if you've got even one local competitor, such an approach will not lead to the dominance that you no doubt seek.
Here is what I do for my clients (some of whom, coincidentally, are carpet cleaning companies!). This advice is given with the understanding that, like most business owners in the cleaning industry, you have one actual physical location but serve a variety of neighboring cities. If that's correct, read on. If that isn't correct and you've got multiple physical offices, let me know.
1. Implement the major local hooks on the website for the physical location - Google is always going to see you as most relevant to your city of location, not to other cities within your service radius. In addition to doing the on-site local SEO, get the business properly listed with a violation-free Google Place Page and other local business directory listings.
2. Create a list of your 5-10 main services. Make a menu on the site linking to a page for each of these services, optimized for the service + your city of location. The content must be good, strong and unique.
3. Create a list of your 5-10 main service cities. Create a city landing page for each of these cities (including your city of location), giving an overview of your work in each city on each page. Make a menu of these pages on the site. Again, the content must be good, strong and unique. No cutting and pasting!
At this point you will have developed 10-20 pages of unique, creative content for your website. Depending on the competitiveness of your industry in your region, this may get you enough rankings to satisfy you and get phones ringing. However, in most cases, you will want to do more. Move on to step 4.
4. Now, create a big list of all possible combinations. This might look like:
Carpet Cleaning City A
Carpet Cleaning City B
Carpet Cleaning City C
Window Cleaning City A
Window Cleaning City B
Window Cleaning City C
Tile and Grout Cleaning City A
etc.
Create a timeline for writing articles over a set number of months to cover each of these phrases. You're not going to do this all at once. My clients have most typically asked me to do anywhere from 3-10 articles a month for them. A blog is terrific for this sort of thing, by the way. If a client has hired me to do 10 articles a month, in 3 months we've covered 30 terms, in 6 months we've covered 60 terms, etc.
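The combination list and monthly schedule above can be sketched in a few lines. This is just an illustration; the service and city names are placeholders for your real lists:

```python
from itertools import product

# Placeholder lists - substitute your real services and service cities.
services = ["Carpet Cleaning", "Window Cleaning", "Tile and Grout Cleaning"]
cities = ["City A", "City B", "City C"]

# Every service + city combination becomes a target phrase.
phrases = [f"{service} {city}" for service, city in product(services, cities)]

# Spread the phrases over a writing calendar, e.g. 3 articles a month.
per_month = 3
schedule = [phrases[i:i + per_month] for i in range(0, len(phrases), per_month)]

for month, batch in enumerate(schedule, start=1):
    print(f"Month {month}: {batch}")
```

With 10 services and 10 cities you would get 100 phrases; at 10 articles a month that is a 10-month calendar, which is exactly why this has to be staged rather than done all at once.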
The client has to participate in this. If he simply paid some penny copywriter to write a bunch of boring, generic content for this large number of terms, chances are, he wouldn't end up with a very pleasant or persuasive website. Rather, he needs to be photographing his projects in the different cities and coming to me with photos, testimonials from clients in the service cities, anecdotes and what have you. I take this, combine it with a solid knowledge of the city and the service/products used, find some other photographs and maybe maps and turn each article into a very solid piece of content. The approach is quite authentic and results in the ongoing creation of an ever-growing library of content about the client's work in each of his cities.
Remember, the whole point of this approach is to obtain secondary visibility (typically organic) for terms outside of his city of location. It should be seen as an ongoing project, and I've seen this approach work time and again for my clients.
You're at an important point of decision right now. You need to decide whether you have the creativity and time to do this right on your own, whether to hire a local-SEO-skilled copywriter to do it for you, or whether you just can't do either. Sincerely wishing you luck!
Miriam
-
I agree, Alan. No matter how hard you try, it is going to carry some level of duplication - you would be better off targeting all 10 locations on the main services page than trying to re-spin the same 750 words. Your first suggestion is the approach I would take as well.
-
What you just said is only true if you have nothing to write about.
I should have made it clear that I only advocate doing that if you have something to say that is meaningful.
If you don't, then don't do it. If you have something relevant and useful to say, that is better than repeating the same information, whether you rewrite it or not.
Spinning the 750 word story into 10 different versions is a really bad idea, in my opinion.
-
Yeah, that is an option, but now you're looking at creating semi-relevant content (and that's only if you can find something semi-relevant to write about for each location).
But your reply is an option, so thanks.
-
Only if you want them to be indexed.
Alternatively, you can write 150 to 200 words that apply specifically to the location and link off to the original page with 750 words of stunning content.
For example, here is an idea:
_Window Cleaning and Carpet Cleaning in Murfreesboro_
Murfreesboro and surrounding areas sometimes present a problem for carpet cleaners because of the high incidence of termites. These termites..... blah blah blah.
....
Our Murfreesboro carpet cleaning crews are all locals, so they have an intimate knowledge of ... blah blah blah.
....
Read about how our carpet cleaning service fits your unique needs.
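A quick way to sanity-check pages built this way is to confirm that each location page carries some minimum amount of its own copy and actually links back to the main services page. A rough standard-library sketch; the `/services/` URL and the 150-word threshold are made-up values for illustration:

```python
import re

def check_location_page(html, main_services_url):
    """Return (word_count, links_back) for a location landing page."""
    links_back = main_services_url in re.findall(r'href="([^"]+)"', html)
    text = re.sub(r"<[^>]+>", " ", html)  # strip tags (crude, but fine for a check)
    word_count = len(text.split())
    return word_count, links_back

page = (
    "<h1>Window Cleaning and Carpet Cleaning in Murfreesboro</h1>"
    "<p>" + "local detail " * 80 + "</p>"
    '<p><a href="/services/">Read about how our carpet cleaning service '
    "fits your unique needs.</a></p>"
)

words, linked = check_location_page(page, "/services/")
print(words >= 150, linked)  # prints: True True
```

Running this over all ten location pages before publishing catches the thin, orphaned pages that tend to creep in when the writing is staged over months.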