New Client Wants to Keep Duplicate Content Targeting Different Cities
-
We've got a new client whose website has about 300 pages that are identical except for the cities being targeted. So far the site has not been affected by Penguin or Panda updates, and the client wants to keep the pages because they bring in a lot of traffic for those cities.
We are concerned about duplicate content penalties; do you think we should get rid of these pages or keep them?
-
This is a tough situation. I tend to agree with Ricky - these are exactly the kinds of pages that have been hit by Panda, and there's real risk. In the old days, the biggest risk was that the pages would just stop getting traffic. Now, the impact could hit the rest of the site as well, and it's a lot more dangerous.
The problem is that it's working for now, and you're asking them to give up traffic in the short-term to avoid losing it in the long-term. Again, I think the long-term risk is serious (and it's not that easy to recover from), but the short-term pain to the client is very real.
What's the scope of the 300 pages compared to the rest of the site (are we talking a 400 page site or a 40,000 page site)? How many of these city pages are getting real traffic? My best alternative solution is to pin down the 10-20% of the city pages getting most of the traffic, temporarily NOINDEX the rest, and then beef up those well-trafficked city pages with unique content (so, maybe you're talking about 30 pages). Then, build out from there.
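If it helps, the temporary NOINDEX step might look like this - a minimal sketch, assuming you can edit the `<head>` template for the city pages:

```html
<!-- Placed in the <head> of each low-traffic city page: keeps the page
     crawlable (and its links followed) but removes it from the index. -->
<meta name="robots" content="noindex, follow">
```

Once a page has been rebuilt with unique content, the tag comes off and the page can be re-indexed.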
Give these pages real value - it's not only good for SEO, but it will probably improve conversion, too. The other problem with pages that just swap out a city is that they're often low quality - they may draw traffic in, but then have high bounce rates and low conversion. If you can show that you can improve the value, even with some traffic loss, it's easier to win this fight.
-
Do the analytics show city-specific search terms landing on those city-specific pages, or on the home page (or on the canonical version of the duplicated page)?
If it is the latter, then you should certainly move those city-specific keyword terms into the single version of the duplicated content in some creative fashion.
Regardless, you should still remove the duplicate content, preferably sooner rather than later, because they are certainly low-value pages!
-
I agree with Ricky - I would slowly make all those pages unique in some way. I still find it beneficial to rank separate city pages as long as they have strong content. Otherwise, Google will eventually sift its way through the site and flag those pages as spam.
-
It seems to me that Google would see all of that duplicate content and simply rank one page as the canonical version. If they are seeing organic traffic and rankings for multiple pages, I am not sure how long that will last. From what I understand, it would be best to start the slow process of making the content on each page somewhat unique.
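If you do consolidate, a rel=canonical tag makes that choice explicit rather than leaving it to Google - a sketch, with a made-up URL:

```html
<!-- In the <head> of each near-duplicate city page, pointing at the
     version you want Google to treat as the original: -->
<link rel="canonical" href="http://www.example.com/our-services/">
```

Bear in mind that canonicalization consolidates ranking signals onto one URL, so the duplicate city pages would generally stop ranking on their own city terms.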
Related Questions
-
Duplicate content on product pages
Hi, do you think these pages have duplicate content:
https://www.nobelcom.com/Afghanistan-phone-cards/from-Romania-235-2.html
https://www.nobelcom.com/Afghanistan-phone-cards-2.html
https://www.nobelcom.com/Afghanistan-Cell-phone-cards-401.html
https://www.nobelcom.com/Afghanistan-Cell-phone-cards/from-Romania-235-401.html
And how much impact will it have on a Panda update? I'm trying to figure out whether all the product pages (which are structured the same way as the ones above) are the reason for a Panda penalty.
On-Page Optimization | Silviu
-
Is tracking code added to the end of a URL considered duplicate content
I have two URLs, one with a tracking code and one without: http://www.towermarketing.net/lets-talk-ux-baby and http://www.towermarketing.net/lets-talk-ux-baby/#.U6ghgLEz64I. My question is whether Google will treat these as two separate URLs - two pages with duplicate content. Any recommendations would be much appreciated.
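Worth noting: everything after "#" is a fragment, which browsers keep client-side and never send to the server, and which Google drops when indexing. A quick sketch with Python's standard library:

```python
from urllib.parse import urldefrag

url_plain = "http://www.towermarketing.net/lets-talk-ux-baby"
url_tracked = "http://www.towermarketing.net/lets-talk-ux-baby/#.U6ghgLEz64I"

# Split the URL at "#": the fragment is purely client-side.
base, fragment = urldefrag(url_tracked)
print(base)      # http://www.towermarketing.net/lets-talk-ux-baby/
print(fragment)  # .U6ghgLEz64I

# What *does* differ server-side is the trailing slash:
print(base != url_plain)  # True - these are two distinct URLs
```

So the #.U6ghgLEz64I piece (it looks like a share-widget tracker) shouldn't cause duplicate content by itself, but it is worth checking that the with-slash and without-slash versions of the page resolve to a single URL.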
On-Page Optimization | TowerMarketing
-
E-commerce store with the same content on different language pages
Hello, I have an e-commerce store running on PrestaShop. I have four languages: Fr, De, En and Nl. The URL for each page changes like this:
Example.com/en/product1
Example.com/fr/Product1
Example.com/de/product1
Example.com/nl/Product1
All these pages have the same content about product1, translated into the respective language. Is this considered duplicate content? Should I write different content for each page?
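For what it's worth, Google has said that properly translated pages are not treated as duplicate content, and hreflang annotations let you state explicitly which URL serves which language - a sketch using the URL patterns from the question (Example.com stands in for the real domain):

```html
<!-- In the <head> of every language version of product1, telling
     search engines these are translations of one another: -->
<link rel="alternate" hreflang="en" href="http://example.com/en/product1">
<link rel="alternate" hreflang="fr" href="http://example.com/fr/Product1">
<link rel="alternate" hreflang="de" href="http://example.com/de/product1">
<link rel="alternate" hreflang="nl" href="http://example.com/nl/Product1">
```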
On-Page Optimization | MozAddict
-
Does schema.org assist with duplicate content concerns
The issue of duplicate content has been well documented, and there are lots of articles suggesting that you noindex archive pages on WordPress-powered sites. Schema.org allows us to mark up our content, including marking a component's URL. So my question, simply: is no-indexing archive (category/tag) pages still relevant when considering duplicate content? These pages are in essence a list of articles, which can be marked up as an article or blog posting, with the URL of the main article and all the other cool stuff the schema gives us. Surely Google et al are smart enough to recognise these article listings as gateways to the main content, therefore removing duplicate content concerns. Of course, whether or not doing this is a good idea will be subjective and based on individual circumstances - I'm just interested in whether or not the search engines can handle this appropriately.
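To make the question concrete, here is a sketch of what one entry in such an archive listing might look like with schema.org microdata (all names and URLs are made up). As far as I know, though, structured data describes the content but is not treated as a canonicalization signal, so it wouldn't by itself replace noindex or rel=canonical:

```html
<!-- One entry in a category/tag archive, marked up as a posting whose
     url property points at the full article: -->
<div itemscope itemtype="http://schema.org/BlogPosting">
  <a itemprop="url" href="http://example.com/my-full-article/">
    <span itemprop="headline">My Full Article</span>
  </a>
  <p itemprop="description">A short excerpt from the article...</p>
</div>
```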
On-Page Optimization | MarkCA
-
Duplicate content
Hi everybody, I have been thrown into an SEO project for a website with a duplicate content problem caused by having both a www and a non-www version. The strange thing is that the www version has more than ten times as many backlinks but is not in the organic index. Here are my questions: 1. Should I keep using the non-www version as the primary resource? 2. Which kind of redirect is best for passing the most link juice? Thanks in advance, Sebastian
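Assuming an Apache server (a guess - the equivalent exists for other stacks), a server-side 301 is the redirect that passes the most link equity; a minimal .htaccess sketch forcing the non-www version, with a placeholder domain:

```apache
# .htaccess sketch: 301-redirect every www URL to its non-www equivalent.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```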
On-Page Optimization | Naturalmente
-
New booking-engine URL, what would you do?
A client of mine is introducing a new and improved booking engine. They're launching it on a different URL than the existing one. The existing one needs to stay online a little longer for affiliate purposes. The old engine's URL has a sitelink in the SERPs and ranks well for a few terms. I'm wondering what you would do in this case? They want the new URL to rank as quickly as possible, also as a sitelink of course. Any help greatly appreciated. I have some thoughts of my own, of course 🙂 but to keep the discussion as open as possible, I'll wait a bit before adding them.
On-Page Optimization | YannickVeys
-
Duplicate content on homepage?
Hi, I have just created a new campaign and it states that I have duplicate page content, which would affect search rankings. Basically it is counting www.mydomain.com and www.mydomain.com/index.php as two separate pages. How can I make it so that only www.mydomain.com is visible, reducing the duplicate content issue? Many thanks
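Assuming Apache (a guess), one common fix is a 301 redirect from /index.php to the root, alongside updating any internal links that point at /index.php - a sketch:

```apache
# .htaccess sketch: 301-redirect direct requests for /index.php to "/".
# The THE_REQUEST condition only matches when the visitor actually asked
# for /index.php, so the internal DirectoryIndex lookup for "/" doesn't
# trigger a redirect loop.
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php(\?|\ )
RewriteRule ^index\.php$ http://www.mydomain.com/ [R=301,L]
```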
On-Page Optimization | idv
-
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:

- Course (starter, main, salad, etc.)
- Cooking Method (fry, bake, boil, steam, etc.)
- Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)

Here are some examples of how URLs may look when searching for a recipe:

find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour

There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search-results URL variations. This all works well on the site; however, it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions, such as:

- Setting a canonical tag
- Adding these URL variables to Google Webmaster Tools to tell Google to ignore them
- Changing the title tag in the head dynamically based on which URL variables are present

However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but that isn't the case here, as the search results are always different. Adding the URL variables to Google Webmaster Tools won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution to problems like this, as I imagine others will have come across it before, but I cannot find the ideal solution. Any help would be much appreciated. Kind regards

On-Page Optimization | smaavie
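One way to think about the canonical option: pick a reduced set of "canonical" parameters and map every search-results URL onto it (dropping pagination, normalizing parameter order), then emit that URL in the rel=canonical tag. A hypothetical sketch - the helper name is made up, and the parameter names are copied from the question, including the "preperation-time" spelling the URLs actually use:

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

# Filter parameters allowed to appear in a canonical URL; pagination
# ("start") and anything unrecognized gets dropped.
FILTER_PARAMS = {"course", "cooking-method", "preperation-time"}

def canonical_url(url: str) -> str:
    parts = urlparse(url)
    # Keep only whitelisted filters, then sort so that equivalent
    # filter combinations always map to the same URL.
    params = sorted((k, v) for k, v in parse_qsl(parts.query)
                    if k in FILTER_PARAMS)
    return urlunparse(parts._replace(query=urlencode(params)))

print(canonical_url("http://example.com/find-a-recipe.php?course=salad&start=30"))
# http://example.com/find-a-recipe.php?course=salad
```

Whether to go further and canonicalize everything to the bare find-a-recipe.php depends on whether any filtered combinations deserve to rank in their own right.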