New Client Wants to Keep Duplicate Content Targeting Different Cities
-
We've got a new client whose website has about 300 pages that are identical except for the cities they target. So far the site hasn't been affected by the Penguin or Panda updates, and the client wants to keep the pages because they bring in a lot of traffic for those cities.
We are concerned about duplicate content penalties; do you think we should get rid of these pages or keep them?
-
This is a tough situation. I tend to agree with Ricky - these are exactly the kinds of pages that have been hit by Panda, and there's real risk. In the old days, the biggest risk was that the pages would just stop getting traffic. Now, the impact could hit the rest of the site as well, and it's a lot more dangerous.
The problem is that it's working for now, and you're asking them to give up traffic in the short-term to avoid losing it in the long-term. Again, I think the long-term risk is serious (and it's not that easy to recover from), but the short-term pain to the client is very real.
What's the scope of the 300 pages compared to the rest of the site (are we talking a 400 page site or a 40,000 page site)? How many of these city pages are getting real traffic? My best alternative solution is to pin down the 10-20% of the city pages getting most of the traffic, temporarily NOINDEX the rest, and then beef up those well-trafficked city pages with unique content (so, maybe you're talking about 30 pages). Then, build out from there.
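One rough way to pin down that top 10-20% slice is to sort the city pages by traffic from an analytics export and flag the long tail for a temporary NOINDEX. A minimal sketch, assuming a hypothetical CSV export with `url` and `pageviews` columns (the URLs and numbers below are invented for illustration):

```python
import csv
from io import StringIO

# Hypothetical analytics export: one row per city page.
EXPORT = """url,pageviews
/widgets-chicago,1200
/widgets-boston,950
/widgets-peoria,14
/widgets-dayton,9
/widgets-tulsa,7
"""

def split_pages(csv_text, keep_fraction=0.2):
    """Return (keep, noindex): the top slice by traffic vs. the long tail."""
    rows = sorted(csv.DictReader(StringIO(csv_text)),
                  key=lambda r: int(r["pageviews"]), reverse=True)
    cutoff = max(1, round(len(rows) * keep_fraction))
    keep = [r["url"] for r in rows[:cutoff]]
    noindex = [r["url"] for r in rows[cutoff:]]
    return keep, noindex

keep, noindex = split_pages(EXPORT)
print("rewrite with unique content:", keep)
print("temporarily NOINDEX:", noindex)
```

The `keep` list is the set of pages worth beefing up with unique content first; everything in `noindex` gets a robots meta noindex until it can be rewritten or retired.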
Give these pages real value - it's not only good for SEO, but it will probably improve conversion, too. The other problem with pages that just swap out a city is that they're often low quality - they may draw traffic in, but then have high bounce rates and low conversion. If you can show that you can improve the value, even with some traffic loss, it's easier to win this fight.
-
Do your analytics show city-specific search terms landing on those city-specific pages, or on the home page (or the canonical version of the duplicate content page)?
If it's the latter, then you certainly should fold those city-specific keyword terms into the single version of the duplicate content in some creative fashion.
Regardless, you should still remove the duplicate content, preferably sooner rather than later, because they are certainly low-value pages!
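If the low-value city pages do get removed, 301-redirecting each one to the surviving canonical page preserves whatever link equity and residual traffic they have, rather than serving 404s. A sketch that emits Apache-style `Redirect 301` rules, using invented paths for illustration:

```python
# Hypothetical thin city pages slated for removal, plus the single
# canonical page that should absorb their traffic and link equity.
CITY_PAGES = ["/services-peoria", "/services-dayton", "/services-tulsa"]
CANONICAL = "/services"

def redirect_rules(pages, target):
    """Emit one Apache mod_alias 'Redirect 301' rule per removed page."""
    return [f"Redirect 301 {page} {target}" for page in pages]

for rule in redirect_rules(CITY_PAGES, CANONICAL):
    print(rule)
```

The same mapping could just as easily be emitted as nginx `rewrite` rules or a CMS redirect table; the point is that every removed URL should resolve somewhere useful.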
-
I agree with Ricky - I would slowly make all those pages unique in some way. It can still be beneficial to rank different city pages, as long as they have strong, unique content. Otherwise, Google will eventually sift through and flag those pages as spam.
-
It seems to me that Google would see all of that duplicate content and simply rank one page as the canonical version. If you are seeing organic traffic and rankings for multiple pages now, I am not sure how long that will last. From what I understand, it would be best to start the slow process of making the content on each page somewhat unique.