Reducing pages with canonical & redirects
-
We have a site with a ridiculous number of pages. It's a directory of service providers organized by city and by sub-category of the vertical. Every provider appears on the main city page; when you click on a category, it shows only the providers who offer that subcategory of the service.
example:
- colorado/denver - main city page
- colorado/denver/subcat1 - subcategory page
There are 37 subcategories. So, 38 pages that essentially have the same content - minus a provider or two - for each city.
There are approx 40K locations in our database. So rough math puts us at 1.5 million results pages, with 97% of those pages being duplicate content!
This is clearly a problem. But many of these obscure pages do rank and get traffic. A fair amount when you aggregate all these pages together.
We are about to go through a redesign and want to consolidate pages so we can reduce the dupe content, get crawl budget allocated to more meaningful pages, etc.
Here's what I'm thinking we should do with this site, and I would love to have your input:
- Canonicalize
Before the redesign, use the canonical tag on all the sub-category pages to push the value from those pages (colorado/denver/subcat1, /subcat2, /subcat3... etc) to the main city page (colorado/denver)
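For reference, the canonical step is just a link element in the head of each sub-category page pointing at the city page (the domain here is illustrative):

```html
<!-- On colorado/denver/subcat1 (and every other subcategory page
     for that city), point the canonical at the main city page -->
<link rel="canonical" href="https://www.example.com/colorado/denver" />
```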
- 301 Redirect
On the new site (we're moving to a new CMS), we don't publish the duplicate sub-category pages, and we 301 redirect the sub-category URLs to the main city page URLs.
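The redirect step can be one pattern rule rather than 1.5 million individual entries. A sketch in Apache mod_rewrite, assuming the /state/city/subcategory URL structure described above:

```apache
# Sketch (assumes Apache mod_rewrite and the /state/city/subcat pattern):
# 301 any three-segment directory URL to its two-segment city page.
RewriteEngine On
RewriteRule ^([a-z-]+)/([a-z-]+)/([a-z0-9-]+)/?$ /$1/$2 [R=301,L]
```

In practice you'd constrain the pattern (or generate the rules from the location database) so it only matches directory pages and not other three-segment URLs on the site.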
We'd still have the sub-categories (keywords) on-page and use some JavaScript filtering to narrow results.
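The JavaScript filtering itself can be simple. A minimal sketch, assuming each provider record carries a list of the subcategories it offers (the data shape is hypothetical):

```javascript
// Hypothetical data shape: each provider lists its subcategories.
const providers = [
  { name: "Acme Plumbing", subcategories: ["drain-cleaning", "water-heaters"] },
  { name: "Best Pipes", subcategories: ["water-heaters"] },
  { name: "City Rooter", subcategories: ["drain-cleaning"] },
];

// Return only the providers offering the selected subcategory;
// with no selection, show the full city listing.
function filterBySubcategory(list, subcat) {
  if (!subcat) return list;
  return list.filter((p) => p.subcategories.includes(subcat));
}

console.log(filterBySubcategory(providers, "water-heaters").map((p) => p.name));
// → [ 'Acme Plumbing', 'Best Pipes' ]
```

On the page you'd re-render the results list from the filtered array, so the full set of providers (and the subcategory keywords) stays in the HTML that search engines crawl.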
We could cut to the chase and just do the redirects, but I'd like to use canonicalization as an internal proof of concept that getting rid of these pages is a good thing, or at least won't have a negative impact on traffic; i.e., by the time we're ready to relaunch, traffic and value will have transferred to the /state/city pages.
Trying to create the right plan and build my argument. Any feedback you have will help.
-
Hi! We're going through some of the older unanswered questions and seeing if people still have questions or if they've gone ahead and implemented something and have any lessons to share with us. Can you give an update, or mark your question as answered?
Thanks!
-
The best way is to make sure you're using the tag properly and that you have all your angles covered.
There are actually some good posts on SEOmoz about canonicalization; I'll try to find those for you.
-
Awesome feedback! Thanks, David. I'd like to hear your thoughts on proper canonicalization when you have a moment. Thanks again.
-
Your plan sounds good but here are a few things I'd like to add.
-
Make sure the dupe pages you're getting rid of are not your main traffic sources. If they are, redirect only a few at a time and fix things gradually. You don't want to switch to the new CMS, throw up redirects, and lose 85% of your traffic overnight.
-
Make sure you use the proper methods of canonicalization. Don't half-ass it.
-
On the new site, because you have a large and deep site, make sure you have a proper XML sitemap that is regenerated regularly, with sensible priority weights and a clean structure. Fewer levels = better.
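On the sitemap point, a fragment of what weighted entries might look like (the domain and priority value are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- City pages get the higher priority; the retired subcategory
       URLs are left out entirely since they now 301. -->
  <url>
    <loc>https://www.example.com/colorado/denver</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```

Note that the priority field is only a hint to crawlers, not a directive.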
-
Watch your Webmaster Tools.
That's all I have. I think you'll be fine.
-
Related Questions
-
Internal search pages (and faceted navigation) solutions for 2018! Canonical or meta robots "noindex,follow"?
There seems to be conflicting information on how best to handle internal search results pages. To recap: they are problematic because every kind of search appends lots of query parameters to the URL string, whilst the title, meta description and general framework of the page remain the same, which is flagged in Moz Pro Site Crawl as duplicate meta descriptions/h1s etc. The general advice these days is NOT to disallow these pages in robots.txt anymore, because there is still value in their being crawled for all the links that appear on the page. But in order to handle the duplicate issues, the advice splits into two camps: 1. Add a meta robots tag with "noindex,follow" to the page
Intermediate & Advanced SEO | | SWEMII
This means the page will not be indexed with all its myriad queries and parameters, and so takes care of any duplicate meta/markup issues, but any other links from the page can still be crawled and indexed = better crawling and indexing of the site; however, you lose any value the page itself might bring.
This is the advice Yoast recommends in 2017 : https://yoast.com/blocking-your-sites-search-results/ - who are adamant that Google just doesn't like or want to serve this kind of page anyway... 2. Just add a canonical link tag - this will ensure that the search results page is still indexed as well.
All the different query string URLs, and the array of results they serve - are 'canonicalised' as the same.
However, this seems a bit duplicitous, as the results in the page body could all be very different. Also, all the paginated results pages would be 'canonicalised' to the main search page, which we know Google states is not a correct implementation of the canonical tag
https://webmasters.googleblog.com/2013/04/5-common-mistakes-with-relcanonical.html this picks up on this older discussion here from 2012
https://moz.com/community/q/internal-search-rel-canonical-vs-noindex-vs-robots-txt
Where the advice was leaning towards using canonicals because the user was seeing a percentage of inbound traffic into these search result pages - but I wonder if that is still the case? As the older discussion is now 6 years old, I'm wondering if there is any newer approach, or how others have chosen to handle internal search. I think a lot of the same issues occur with faceted navigation, as discussed here in 2017
https://moz.com/blog/large-site-seo-basics-faceted-navigation
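The two options in that question boil down to one line of markup each; illustrative snippets (the URL is hypothetical):

```html
<!-- Option 1: keep the search results page out of the index,
     but let the links on it pass value and be crawled -->
<meta name="robots" content="noindex,follow">

<!-- Option 2: consolidate all parameterised search URLs onto one page -->
<link rel="canonical" href="https://www.example.com/search" />
```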
Why does Google display the home page rather than a page which is better optimised to answer the query?
I have a page which (I believe) is well optimised for a specific keyword (URL, title tag, meta description, H1, etc.), yet Google chooses to display the home page instead of the page more suited to the search query. Why is Google doing this and what can I do to stop it?
Intermediate & Advanced SEO | | muzzmoz0 -
Pages that 301 redirect to a 404
We are going through a website redesign that involves changing the URLs for the pages on our site. Currently all our pages are in the format domain.com/example.html and we are moving to strip off the .html file extension so it would just be domain.com/example. We have thousands of pages as the site deals with news, so building a redirect for each individual page isn't really feasible. My plan is to have a generic rewrite rule that redirects any page ending in .html to the stripped-off version. A problem I can see with this is that it will also redirect pages that don't exist. So for example, domain.com/non-existent-page.html would 301 to domain.com/non-existent-page, which would then return a 404 status. What would the SEO repercussions be for this? Obviously if a page doesn't already exist then it shouldn't show up in the search engine indexes and shouldn't be a problem, but I'm a bit worried about how old pages that currently legitimately 404 will be treated when they start to 301 redirect to a 404 instead. Not sure if there are any other potential issues from this that I've missed either? Thanks!
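A generic rule of that sort is straightforward; a sketch in Apache mod_rewrite:

```apache
# Sketch: strip the .html extension site-wide with a single pattern rule.
RewriteEngine On
RewriteRule ^(.+)\.html$ /$1 [R=301,L]
# A request like /non-existent-page.html will 301 to /non-existent-page
# and 404 there; search engines generally treat a 301-to-404 chain the
# same as a direct 404, so the old URL still drops out of the index.
```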
Intermediate & Advanced SEO | | sbb0240 -
Will thousands of redirected pages have a negative impact on the site?
A client site has thousands of pages with unoptimized urls. I want to change the url structure to make them a little more search friendly. Many of the pages I want to update have backlinks to them and good PR so I don't want to delete them entirely. If I change the urls on thousands of pages, that means a lot of 301 redirects. Will thousands of redirected pages have a negative impact on the site? Thanks, Dino
Intermediate & Advanced SEO | | Dino641 -
How do I reduce internal links & cannibalisation from primary navigation?
The SEOmoz tools report each page on our site as containing in excess of 200 internal links, mostly from our primary navigation menu, which it says is too many. This also causes cannibalisation on the word "towels", which I would like to avoid if possible. Is there a way to reduce the number of internal links whilst maintaining a good structure to allow link juice to filter through the site, and also reduce cannibalisation?
Intermediate & Advanced SEO | | Towelsrus0 -
Home page url 301 redirect suggestion
Hello, on our site we have already done a 301 redirect from http:// to http://www. However, the home page links still come in 2 ways: http://www.mycarhelpline.com/ and http://www.mycarhelpline.com/index.php?option=com_newcar&view=search&Itemid=2 Need a suggestion: we have already used rel canonical; should another 301 redirect be used to maintain the home page PR, from an SEO point of view? Does Google still take both URLs as separate URLs and find duplicate content?
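For that specific Joomla-style home URL, a single conditional 301 is a common approach; a sketch, assuming Apache mod_rewrite:

```apache
# Sketch: 301 the CMS's internal home-page URL to the canonical root.
# The trailing "?" in the substitution discards the old query string.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^option=com_newcar&view=search&Itemid=2$
RewriteRule ^index\.php$ http://www.mycarhelpline.com/? [R=301,L]
```

Until a redirect (or the rel=canonical) is honoured, Google can indeed treat the two addresses as separate, duplicate URLs; with both in place, the signals should consolidate onto the root.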
Intermediate & Advanced SEO | | Modi0 -
301 - should I redirect entire domain or page for page?
Hi, We recently enabled a 301 on our domain from our old website to our new website. On the advice of fellow mozzers, we copied the old site exactly to the new domain, then did the 301 so that the sites are identical. The question is: should we be doing the 301 as a whole-domain redirect, i.e. www.oldsite.com is now > www.newsite.com, or individually setting each page, i.e. www.oldsite.com/page1 is now www.newsite.com/page1, etc., for each page on our site? Remember that both old and new sites (for now) are identical copies. Also, we set the 301 about 5 days ago and have verified it's working, but haven't seen a single change in rank from either the old site or the new - is this because Google likely hasn't re-indexed yet? Thanks, Anthony
Intermediate & Advanced SEO | | Grenadi0 -
Use of rel=canonical to view all page & No follow links
Hey, I have a couple of questions regarding e-commerce category pages and filtering options: I would like to implement rel=canonical to the view-all page as suggested in this article on googlewebmastercentral. If you go to one of my category pages you will see that both the "next page" link and the "view all" link are nofollowed. Is that a mistake? How does nofollow combine with a canonical view-all? Is it a good idea to nofollow the "sort by" pages, or should I also use noindex for them?
Intermediate & Advanced SEO | | Ypsilon0