New Client Wants to Keep Duplicate Content Targeting Different Cities
-
We've got a new client whose website has about 300 pages that are identical except for the cities being targeted. So far the site has not been affected by Penguin or Panda updates, and the client wants to keep the pages because they bring in a lot of traffic for those cities.
We are concerned about duplicate content penalties; do you think we should get rid of these pages or keep them?
-
This is a tough situation. I tend to agree with Ricky - these are exactly the kinds of pages that have been hit by Panda, and there's real risk. In the old days, the biggest risk was that the pages would just stop getting traffic. Now, the impact could hit the rest of the site as well, and it's a lot more dangerous.
The problem is that it's working for now, and you're asking them to give up traffic in the short term to avoid losing it in the long term. Again, I think the long-term risk is serious (and it's not that easy to recover from), but the short-term pain to the client is very real.
What's the scope of the 300 pages compared to the rest of the site (are we talking a 400 page site or a 40,000 page site)? How many of these city pages are getting real traffic? My best alternative solution is to pin down the 10-20% of the city pages getting most of the traffic, temporarily NOINDEX the rest, and then beef up those well-trafficked city pages with unique content (so, maybe you're talking about 30 pages). Then, build out from there.
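The "keep the top 10-20%, NOINDEX the rest" triage described above can be sketched as a quick script. This is a minimal illustration, assuming you can export per-page visit counts from your analytics tool; the page paths and numbers below are invented:

```python
# Hypothetical sketch: split city pages into a "keep and improve" set that
# covers most of the traffic and a "temporarily NOINDEX" long tail.

def split_by_traffic(page_visits, coverage=0.8):
    """Return (keep, noindex) lists. `keep` is the smallest set of pages,
    taken in descending traffic order, reaching `coverage` of total visits."""
    ranked = sorted(page_visits.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(page_visits.values())
    keep, running = [], 0
    for page, visits in ranked:
        if running >= coverage * total:
            break
        keep.append(page)
        running += visits
    noindex = [page for page, _ in ranked if page not in keep]
    return keep, noindex

# Made-up analytics export for illustration only.
visits = {
    "/city/austin": 900,
    "/city/boston": 600,
    "/city/dayton": 50,
    "/city/eugene": 30,
    "/city/fresno": 20,
}
keep, noindex = split_by_traffic(visits)
# `keep` gets unique content; `noindex` gets a robots meta noindex for now.
```

With these sample numbers, the two biggest pages cover over 80% of the traffic, so only they would be kept indexed while the other three are temporarily noindexed and rebuilt.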
Give these pages real value - it's not only good for SEO, but it will probably improve conversion, too. The other problem with pages that just swap out a city is that they're often low quality - they may draw traffic in, but then have high bounce rates and low conversion. If you can show that you can improve the value, even with some traffic loss, it's easier to win this fight.
-
Do the analytics show city-specific search terms landing on those city-specific pages, or going to the home page (or the canonical version of the duplicate-content page)?
If it is the latter, then you certainly should move those city-specific keyword terms into the single version of the duplicate content in some creative fashion.
Regardless, you should still remove the duplicate content, preferably sooner rather than later, because they are certainly low-value pages!
-
I agree with Ricky - I would slowly make all of those pages unique in some way. I still find it beneficial to rank different city pages as long as they have strong, unique content. Otherwise, Google will eventually sift through them and flag those pages as spam.
-
It seems to me that Google would see all of that duplicate content and simply rank one page as the canonical version. If they are currently seeing organic traffic and rankings for multiple pages, I am not sure how long that will last. From what I understand, it would be best to start the slow process of making the content on each page somewhat unique.
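One quick way to confirm that the city pages really are "the same except the city" - and would collapse to a single canonical in Google's eyes - is to strip the city name out of each page's text and fingerprint what remains. This is a hedged sketch, not how Google actually canonicalizes; the page texts below are invented for illustration:

```python
import hashlib

def template_fingerprint(text, city):
    """Replace the city name with a placeholder and hash the result.
    Pages that share a fingerprint differ only by the swapped-in city."""
    normalized = text.replace(city, "{city}").lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Invented example page bodies.
pages = {
    "Austin": "Top plumbing services in Austin. Call our Austin team today.",
    "Boston": "Top plumbing services in Boston. Call our Boston team today.",
    "Dayton": "Dayton plumbing done right, with 24/7 emergency service.",
}
fingerprints = {city: template_fingerprint(text, city)
                for city, text in pages.items()}
# Austin and Boston share a fingerprint (pure template swaps);
# Dayton does not, because its copy was actually rewritten.
```

Pages that share a fingerprint are the ones most urgently in need of the unique-content rewrite discussed above.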