How do I create a strategy to get rid of dupe content pages but still keep the SEO juice?
-
We have about 30,000 pages that are variations of "<product-type>-prices/<type-of-thing>/<city>-<state>".
These pages are bringing us lots of free conversions because when somebody searches for this exact phrase for their city/state, they are pretty low-funnel.
The problem that we are running into is that the pages are showing up as dupe content.
One solution we were discussing is to 301-redirect or canonicalize all the city-state pages back to just the "<type-of-thing>" level, and then create really solid unique content for the few hundred pages we would have at that point.
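To make that concrete, here's a minimal sketch of the consolidation mapping (the exact URL shapes and the example.com domain are assumptions for illustration, not your real structure); the same mapping can back either a 301 redirect or a rel=canonical tag:

```python
# Map a city-state URL back to its <type-of-thing> parent page.
# Assumes (hypothetically) URLs shaped like:
#   /<product-type>-prices/<type-of-thing>/<city>-<state>
def canonical_url(path: str) -> str:
    parts = path.strip("/").split("/")
    if len(parts) == 3:  # product-type-prices / type-of-thing / city-state
        return "/" + "/".join(parts[:2]) + "/"
    return path  # already at (or above) the type-of-thing level

def canonical_tag(path: str) -> str:
    # Emit a rel=canonical tag pointing at the consolidated page.
    return '<link rel="canonical" href="https://example.com%s" />' % canonical_url(path)
```

Whether you emit this as a `Location` header (301) or as the canonical tag is then just a deployment choice; the mapping logic is identical.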
My concern is this: I still want to rank for the city-state, because as I look through our best-converting search terms, they nearly always include the city-state, so the search is some variation of "<product-type> <type-of-thing> <city> <state>".
One thing we thought about doing is dynamically changing the meta-data & headers to add the city-state info there.
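As a rough illustration of that idea (the template wording and field names here are invented, not your actual copy), the city/state could be injected into the title, meta description, and H1 from a single function:

```python
# Build localized meta data for a city-state page.
# All wording below is purely illustrative placeholder copy.
def localized_meta(product_type: str, thing: str, city: str, state: str) -> dict:
    place = f"{city}, {state}"
    return {
        "title": f"{product_type} {thing} Prices in {place}",
        "description": f"Compare {product_type} {thing} prices in {place}.",
        "h1": f"{product_type} {thing} Prices in {place}",
    }
```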
Are there other potential solutions to this?
-
Thanks for getting back to me!
Even if you use dynamic meta data, it doesn't sound like much duplication can be avoided between the 30,000 city/state pages.
That's the reason we are considering this new strategy, where we have essentially 300 unique pages, but we dynamically generate the city/state.
Is the content on these pages unique? I mean the set of products returned: are they unique collections, or what?
The content is unique, to a point. The pricing varies by region, but the actual products are the same.
You said these pages are showing up as duplicate content? Where are they showing as duplicate content?
They are showing up as dupe content in the SEOmoz report. Not sure if that's what you mean?
Do your stronger pages (homepage, category pages) rank for your head keywords? Do they rank very well?
Not sure what you're asking exactly. How do I find out what my head keywords are?
It really depends on the overall domain authority and what else is going on with your website.
Our PR is about a 5 right now.
Related Questions
-
Page with metatag noindex is STILL being indexed?!
Hi Mozers, there are over 200 pages on our site that have a "noindex" meta tag but are STILL being indexed. What else can I do to remove them from the index?
Intermediate & Advanced SEO | yaelslater
-
Home Page Disappears From Google - But Rest of Site Still Ranked
As the title suggests, we are running into a serious issue of the home page disappearing from Google search results while the rest of the site remains. We search for it naturally and cannot find a trace, then use a "site:" command in Google, and still the home page does not come up. We go into Webmasters and inspect the home page, and even Google states that the page is indexable. We then run "Request Indexing" and the site comes back on Google. This is having a damaging effect and we would like to understand why this issue is happening. Please note this is not happening on just one of our sites but has happened to three, which are all located on the same server. One of our brands which has the issue is: www.henweekends.co.uk
Intermediate & Advanced SEO | JH_OffLimits
-
Is it good or bad to add noindex for empty pages which will get content dynamically after some days?
We have followers, following, friends, etc. pages for each user who creates an account on our website. So when a new user signs up, he may have 0 followers, 0 following and 0 friends, but over a period of time those lists can grow. We have separate pages for followers, following and friends which Google is allowed to index. When a user doesn't have any followers/following/friends, those pages look empty and we get issues of duplicate content and too-short descriptions. So is it better to add noindex to those pages temporarily and remove the noindex tag when there are at least 2 or more people on them? What are the side effects of adding noindex when there is no data on those pages, and what are the benefits?
Intermediate & Advanced SEO | swapnil12
-
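The temporary noindex described in that question could be implemented as a simple conditional, sketched below (the threshold of 2 comes from the question itself; the function name and everything else are assumptions):

```python
# Return the robots meta tag for a followers/following/friends page.
# Pages with fewer than `min_entries` entries get noindexed until they fill up.
def robots_meta(entry_count: int, min_entries: int = 2) -> str:
    if entry_count < min_entries:
        # "follow" keeps link equity flowing even while the page is noindexed
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'
```

Because the tag is computed per request, the noindex disappears automatically once the list grows, with no manual cleanup step.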
SEO - Massive duplication of the same page, but different links.
Hi! I'm dealing with a big client whose site has a big (approx. 39,000) duplication of the "same" page (same content), but each page has a different URL. The duplicated page is a "become a member" page.
I've checked the backlinks in Google Search Console and there are no sites linking to any of the duplicated pages.
The developers have no clue where or how the pages came to be duplicated, but my guess is that every time a new customer sets up an account the page becomes duplicated. The client wants us to just remove the pages and sort out the duplication, but removing the pages might cause a big drop in backlinks/traffic and whatnot. I would much rather redirect the duplicated pages to the original page, but given that there are 39,000 pages it might mess with the site speed. Looking for ideas and suggestions on what the next step should be: remove or redirect.
Thanks so much!
Intermediate & Advanced SEO | jennisprints
-
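On the redirect route in that question: 39,000 URLs don't need 39,000 individual rules; a single pattern match can collapse them all onto the original page, so site speed isn't really a concern. A sketch (the /become-a-member/ URL shape is a guess at the structure, not the client's real one):

```python
import re
from typing import Optional

# Collapse every duplicated become-a-member URL onto the one original page.
# Assumes (hypothetically) the duplicates differ only by a trailing token,
# e.g. /become-a-member/abc123
MEMBER_DUPE = re.compile(r"^/become-a-member/[^/]+/?$")

def redirect_target(path: str) -> Optional[str]:
    # Return the 301 target, or None if the URL needs no redirect.
    if MEMBER_DUPE.match(path):
        return "/become-a-member/"
    return None
```

Most web servers support the same idea natively (a single regex rewrite rule), so the lookup cost is constant regardless of how many duplicate URLs exist.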
What's the best way to noindex pages but still keep backlinks equity?
Hello everyone. Maybe it is a stupid question, but I'll ask the experts... What's the best way to noindex pages but still keep backlink equity from those noindexed pages? For example, let's say I have many pages that look similar to a "main" page which I solely want to appear on Google, so I want to noindex all pages with the exception of that "main" page... but what if I also want to transfer any possible link equity present on the noindexed pages to the main page? The only solution I have thought of is to add a canonical tag pointing to the main page on those noindexed pages... but will that work, or wreak havoc in some way?
Intermediate & Advanced SEO | fablau
-
New in-depth page for content marketing - feedback?
I finally figured out that to EARN quality links I need some great content that I can share, content that is valuable to people. So I created an (in-depth?) article with some pretty sweet infographics. This is NEW to me, so I was hoping I could get some feedback on this before I attempt to promote it to media and industry publishers. Here is the link: http://www.titanium-jewelry.com/jewelry-insurance-info.html Would love your feedback and suggestions! Thanks, Ron
Intermediate & Advanced SEO | yatesandcojewelers
-
Faceted Navigation and Dupe Content
Hi, we have a Magento website using layered navigation. It has created a lot of duplicate content, and I did ask Google in GWT to "No URLs" most of the query strings except "p", which is for pagination. After reading how to tackle this issue, I tried using a combination of meta noindex, robots, and canonical, but it was still a snowball I was trying to control. In the end, I opted for using Ajax for the layered navigation: no matter what option is selected, there are no parameters latched on to the URL, so no dupe/near-dupe URLs are created. So please correct me if I am wrong, but no new links flow to those extra URLs now, so presumably in due course Google will remove them from the index? Am I correct in thinking that? Plus these extra URLs have meta noindex on them too, yet I still have tens of thousands of pages indexed in Google. How long will it take for Google to remove them from the index? Will having meta noindex on the pages that need to be removed help? Any other way of removing thousands of URLs from GWT? Thanks again, B
Intermediate & Advanced SEO | bjs2010
-
"Original Content" Dynamic Hurting SEO? -- Strategies for Differentiating Template Websites for a Nationwide Local Business Segment?
The Problem
I have a stable of clients spread around the U.S. in the maid service/cleaning industry. Each client is a franchisee; however, their business is truly 'local', with a local service area, local phone/address, unique business name, and virtually complete control over their web presence (URL, site design, content; apart from a few branding guidelines). Over time I've developed a website template with a high lead conversion rate, and I've rolled this website out to 3 or 4 dozen clients. Each client has exclusivity in their region/metro area. Lately my white-hat backlinking strategies have not been yielding the results they were one year ago, including legitimate directories, customer blogging (as compelling as maid service/cleaning blogs can really be!), and some article writing. This is expected, or at least reflected in articles on SEO trends and directory/article strategies. I am writing this question because I see sites with seemingly much weaker backlink profiles outranking my clients (using the SEOmoz toolbar and Site Explorer stats, and factoring in general quality vs. quantity dynamics).
Questions
Assuming general on-page optimization and linking factors are equal: Might my clients be suffering because they're using my oft-repeated template website (albeit with some unique 'content' variables)? If I choose to differentiate each client's website, how much differentiation makes sense? Specifically: even if primary content (copy, essentially) is differentiated, will Google still interpret the matching code structure as 'the same website'? Are images as important as copy in differentiating content? From a 'machine' or algorithm perspective evaluating unique content, I wonder if strategies will be effective such as saving the images in a different format, altering them slightly in Photoshop, or using unique CSS selectors or slightly different table structures for each site (differentiating the code).
Considerations
My understanding of Google's "duplicate content" dynamics is that it mainly applies to de-duping search results at a query-specific level and choosing which result to show from a pool of duplicate results. My clients' search terms most often contain client-specific city and state names. Despite the "original content" mantra, I believe my clients, being local businesses who have opted to use a template website (an economical choice), still represent legitimate and relevant matches for their target user searches. It is in this spirit I ask these questions, not to 'game' Google with malicious intent. In an ideal world my clients would all have their own unique website developed, but these are Main St business owners balancing solutions with economics, and I'm trying to provide them with scalable solutions.
Thank You!
I am new to this community; thank you for any thoughts, discussion and comments!
Intermediate & Advanced SEO | localizedseo