How do I create a strategy to get rid of dupe content pages but still keep the SEO juice?
-
We have about 30,000 pages that are variations of "<product-type>-prices/<type-of-thing>/<city>-<state>".
These pages are bringing us lots of free conversions because when somebody searches for this exact phrase for their city/state, they are pretty low-funnel.
The problem that we are running into is that the pages are showing up as dupe content.
One solution we were discussing is to 301-redirect or canonicalize all the city-state pages back to just the "<type-of-thing>" level, and then create really solid unique content for the few hundred pages we would have at that point.
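For illustration, a minimal sketch of what the canonical option could look like (the domain, URL pattern, and Python helper below are assumptions made up for this example, not our actual stack): every city/state variation keeps rendering, but declares its <type-of-thing> page as the canonical version.

```python
# Sketch: each city/state page declares its parent <type-of-thing> page as
# canonical. Domain and URL pattern are hypothetical placeholders.

def canonical_tag(product_type: str, type_of_thing: str) -> str:
    """Build the rel=canonical element shared by all city/state variations."""
    canonical_url = f"https://www.example.com/{product_type}-prices/{type_of_thing}/"
    return f'<link rel="canonical" href="{canonical_url}" />'

# Every city/state variation of the same type-of-thing emits the same tag:
print(canonical_tag("widget", "premium-widgets"))
# -> <link rel="canonical" href="https://www.example.com/widget-prices/premium-widgets/" />
```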
My concern is this: I still want to rank for the city-state terms, because as I look through our best-converting search terms, they nearly always include the city and state, so the search is some variation of "<product-type> <type-of-thing> <city> <state>".
One thing we thought about doing is dynamically changing the meta-data & headers to add the city-state info there.
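A rough sketch of that idea (the field names and copy below are invented for illustration; we have not built this): the title, meta description, and H1 get templated per city/state, so each of the 30,000 pages differs at least in its metadata.

```python
# Sketch: city/state-specific metadata generated from a template, so each of
# the 30,000 pages differs at least in its title, meta description, and H1.

def page_metadata(product_type: str, type_of_thing: str, city: str, state: str) -> dict:
    location = f"{city}, {state}"
    return {
        "title": f"{type_of_thing.title()} {product_type.title()} Prices in {location}",
        "meta_description": (
            f"Compare {type_of_thing} {product_type} prices in {location}. "
            "Local pricing for your area."
        ),
        "h1": f"{type_of_thing.title()} {product_type.title()} Prices in {location}",
    }

print(page_metadata("widgets", "premium", "Austin", "TX")["title"])
# -> Premium Widgets Prices in Austin, TX
```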
Are there other potential solutions to this?
-
Thanks for getting back to me!
Even if you do the dynamic metadata, it does not sound like much duplication can be avoided across the 30,000 city/state pages.
That's the reason we are considering this new strategy, where we have essentially 300 unique pages, but we dynamically generate the city/state.
Is the content on these pages unique? I mean the set of products returned: is it a unique collection for each page, or not?
The content is unique, to a point. The pricing varies by region, but the actual products are the same.
You said these pages are showing up as duplicate content? Where are they showing up as duplicate content?
They are showing up as dupe content in the SEOmoz report. Not sure if that's what you mean?
Do your stronger pages (homepage, category pages) rank for your head keywords? Do they rank very well?
Not sure what you're asking exactly. How do I find out what my head keywords are?
It really depends on your overall domain authority and what else is going on with your website.
Our PR is about a 5 right now.
-
Related Questions
-
Dynamic Pages with No Results Causing Thin Content
Hi Mozers, We have dynamic listing pages that pull in clinical trial results for specific disease types. Sometimes diseases have clinical trials and sometimes they don't. This means that sometimes the page will have zero results and sometimes it will return results. We have a sizable number of these, so when there are zero results the pages look like thin content. What is the recommended method of dealing with this? Is there a way of doing a conditional noindex, where the page is indexed if results are pulled in and not indexed when the page returns zero results? If we can do this, should we? Will it confuse Google and send negative signals? Any guidance/thoughts are much appreciated! Yael
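One way the conditional noindex could be wired up, sketched here with a hypothetical Flask view (not necessarily the site's actual stack): the robots meta tag is chosen per request based on whether any trials came back.

```python
# Sketch: conditional noindex - the listing page is indexable only when it
# actually returns results. Flask and the route are used purely for illustration.
from flask import Flask, render_template_string

app = Flask(__name__)

TEMPLATE = """<html><head>
<meta name="robots" content="{{ robots }}">
</head><body>{{ count }} trial(s) found.</body></html>"""

def fetch_trials(disease: str) -> list:
    """Placeholder for the real clinical-trial lookup."""
    return []

@app.route("/trials/<disease>")
def trials(disease):
    results = fetch_trials(disease)
    robots = "index,follow" if results else "noindex,follow"
    return render_template_string(TEMPLATE, robots=robots, count=len(results))
```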
-
Blog Content Displayed on Multiple Pages
We are developing an online guide that will provide information and listings for a few different cities in Canada and the US. We have blog content that will be pulled into each city's blog articles page. Some articles are location-agnostic and can be displayed for any city, and other articles are city-specific and will only appear under a particular city:
www.mysite.com/blog/seattle/article1
www.mysite.com/blog/portland/article1
From what I know of SEO, it seems that this is a perfect example for the use of canonicalization. So for an article that will appear in multiple city guides, should there be a tag that points to a home for that article, www.mysite.com/blog/article1? Thanks
-
Does it make sense to create new pages with friendlier URLs then redirect old pages to new?
Hi Moz! My client has messy URLs. Does it make sense to write new, clean URLs and then 301 redirect all old URLs to the new ones? Thanks for reading!
-
Loading Content Asynchronously for Page Speed Purposes?
Pages for my company's play process load slowly because the process is heavy. Below the play process there is a block of text, put there mostly for SEO purposes. R&D are proposing to load the SEO area only after the play process has loaded. This seems like a very bad solution, because loading the SEO area asynchronously will make the content unreadable to Google. Am I missing something?
-
Changing content in a well established page.
I have a question: I rank well for "O'Fallon lawn care" but I don't rank well for "O'Fallon, MO lawn care." Is it OK to go into that page and add some content optimizing it around "O'Fallon, MO lawn care," or is that a bad idea? I appreciate any feedback. Thanks, everyone.
-
NOINDEX content still showing in SERPS after 2 months
I have a website that was likely hit by Panda or some other algorithm change. The hit finally occurred in September of 2011. In December my developer set the following meta tag on all pages that do not have unique content: <meta name="robots" content="noindex" />. It's been 2 months now and I feel I've been patient, but Google is still showing 10,000+ pages when I do a search for site:http://www.mydomain.com. I am looking for a quicker solution. Adding this many pages to robots.txt does not seem like a sound option. The pages have been removed from the sitemap (for about a month now). I am trying to determine the best of the following options, or find better ones:
1. 301 all the pages I want out of the index to a single URL based on the page type (location and product). The 301 worries me a bit because I'd have about 10,000 or so pages all 301ing to one or two URLs. However, I'd get some link juice to that page, right?
2. Issue an HTTP 404 code on all the pages I want out of the index. The 404 code seems like the safest bet, but I am wondering if that will have a negative impact on my site, with Google seeing 10,000+ 404 errors all of a sudden.
3. Issue an HTTP 410 code on all pages I want out of the index. I've never used the 410 code, and while most of those pages are never coming back, eventually I will bring a small percentage back online as I add fresh new content. This one scares me the most, but I'm interested to know if anyone has ever used a 410 code.
Please advise, and thanks for reading.
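For reference, a minimal sketch of how the 404 vs. 410 choice might look in code (a hypothetical Flask handler, not the poster's setup): both keep the URL out of the index over time, but 410 explicitly signals the page is gone for good.

```python
# Sketch: return 410 Gone for deliberately retired thin pages, and a plain
# 404 for URLs that never existed. Routes and paths are hypothetical.
from flask import Flask, abort

app = Flask(__name__)

RETIRED_PATHS = {"/old-location-page", "/old-product-page"}  # made-up examples

@app.route("/<path:slug>")
def page(slug):
    if "/" + slug in RETIRED_PATHS:
        abort(410)  # explicitly gone; crawlers tend to drop these faster than 404s
    abort(404)      # unknown URL
```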
-
How do I fix the error duplicate page content and duplicate page title?
On my site www.millsheating.co.uk I have the error message as per the question title. The conflict is coming from these two pages, which are effectively the same page:
www.millsheating.co.uk
www.millsheating.co.uk/index
I have added an htaccess file to the root folder, as I thought (hoped) it would fix the problem, but it doesn't appear to have done so. This is the content of the htaccess file:
Options +FollowSymLinks
RewriteEngine On
# Redirect non-www requests to the www hostname
RewriteCond %{HTTP_HOST} ^millsheating.co.uk
RewriteRule (.*) http://www.millsheating.co.uk/$1 [R=301,L]
# Redirect direct requests for /index.html to the root
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP/
RewriteRule ^index\.html$ http://www.millsheating.co.uk/ [R=301,L]
AddType x-mapp-php5 .php
-
Getting 260,000 pages re-indexed?
Hey there guys, I was recently hired to do SEO for a big forum to move the site to a new domain and to get them back up to their ranks after this move. This all went quite well, except for the fact that we lost about 1/3rd of our traffic. Although I expected some traffic to drop, this is quite a lot and I'm wondering what it is. The big keywords are still pulling the same traffic but I feel that a lot of the small threads on the forums have been de-indexed. Now, with a site with 260,000 threads, do I just take my loss and focus on new keywords? Or is there something I can do to get all these threads re-indexed? Thanks!