Forced to remove Categories with high volume & revenue
-
Hi everyone
I've been forced to remove level 4 & 5 categories (e.g. example.com/level-2/level-3/**level-4/level-5/**) from our website, even though they drive plenty of traffic and revenue and rank for some of our keywords. The argument is that customers were using refinements/filters more than clicking into categories, and a new backend system coming into the business means these need to be removed anyway.
We've done this before and seen a drop in visibility, revenue & traffic in these areas, but we're going ahead with another batch of removals anyway. I was wondering if anyone has experience fixing a problem like this? I've been told the categories will not be returning and I have to 301 them, so I need to find a workaround to become eligible to rank for these keywords again.
I've been looking at using the refinements to mimic a category (when a refinement is clicked: change the URL to a clean one, update the page title, meta description and H1, and remove the core category text), but I'm not sure what knock-on effects this will have, or if it will even work!
Hope you can help! I've probably missed some details so let me know if you need more info!!!
Thanks
-
Very hard to prove these things before they're done - good luck with getting buy-in for what you need to do and in undoing the worst of the damage.
-
Thanks Will! Yep, that sounds similar to what I've sent on to Development, where the filters actually become those sub-category pages. Unfortunately they think it's going to be a huge amount of work, so now I need to show the value of creating these pages before they start on it. On the macro side, unfortunately I had no choice but to redirect, and those 301s are all in place now. Painful to do when you know it's going to damage performance, and after a couple of weeks the stats suggest it already has.
But great to have your feedback - it will definitely give weight to my pitch to get those filters working for us! The top-level idea might actually be a great workaround for now too!
-
Hi Frankie,
Sorry for the slow reply to this one. I hope it's still relevant to offer some thoughts.
First, at the top level, I would say that the stated reasons don't necessarily mean that you should not have the kinds of pages you describe. My first preference would be to modify the functionality so that the filters you describe users actually using are those sub-category pages. Even if this meant changing URLs (and hence 301 redirecting the pages you currently have), it is possible to have filter / facet pages be indexable and have unique URLs and meta information.
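For illustration, here's a minimal sketch of what indexable facet pages could look like under the hood. The category names, URL patterns, and the whitelist approach are my own assumptions for the example, not details from this site:

```python
# Sketch: map a filter/facet selection onto a clean, indexable
# "virtual category" URL with its own title and H1, while keeping
# long-tail filter combinations out of the index.
# All names and URL patterns here are hypothetical.

# Whitelist of facet combinations worth indexing as sub-categories.
INDEXABLE_FACETS = {
    ("mens-shoes", "colour=black"): {
        "url": "/mens-shoes/black/",
        "title": "Black Men's Shoes | Example Store",
        "h1": "Black Men's Shoes",
    },
}

def facet_to_category(category: str, facet_query: str) -> dict:
    """Return a clean URL + meta for a whitelisted facet, or a
    parameterised, noindexed URL for everything else."""
    key = (category, facet_query)
    if key in INDEXABLE_FACETS:
        page = dict(INDEXABLE_FACETS[key])
        page["robots"] = "index,follow"
        return page
    # Non-whitelisted filter combinations keep a query-string URL
    # and stay out of the index to avoid crawl bloat.
    return {
        "url": f"/{category}/?{facet_query}",
        "robots": "noindex,follow",
    }
```

The point of the whitelist is that only facet pages with genuine search demand get the clean URL and unique meta treatment; everything else stays a plain filter.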
If that's not possible for whatever reason, I would separate my efforts into the micro and the macro:
- Micro: apply an 80:20 or 90:10 rule to the pages you are losing - find the small number of most important, highest traffic / conversion pages and find a way to keep versions of those pages (again - even if you have to 301 redirect them, you could recreate them as static content pages targeting those keywords or something if you had to)
- Macro: where you simply have no choice but to lose these pages, I think your best bet will be to redirect them to the absolutely best (/ next best!) page on the site for those queries - these might be other (sub-)category pages or they might be individual products or content pages, but at least for the highest traffic end, it'd be worth specific research effort to identify the best redirect targets
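The micro/macro split above can be sketched roughly like this (the URLs, traffic figures, and 80% threshold are purely illustrative, not data from the thread):

```python
# Sketch: apply an 80:20 rule to pages being removed - keep (or
# rebuild as static pages) the few URLs driving most of the traffic,
# and map the long tail to next-best 301 targets instead.

def split_micro_macro(pages, traffic_share=0.8):
    """pages: list of (url, sessions) tuples.
    Returns (keep, redirect): `keep` covers roughly `traffic_share`
    of total sessions; `redirect` is everything else."""
    ranked = sorted(pages, key=lambda p: p[1], reverse=True)
    total = sum(sessions for _, sessions in ranked)
    keep, running = [], 0
    for url, sessions in ranked:
        if running < traffic_share * total:
            keep.append(url)
            running += sessions
        else:
            break
    redirect = [url for url, _ in ranked if url not in keep]
    return keep, redirect

# Illustrative traffic data for the doomed category pages.
pages = [
    ("/cat/a/", 900), ("/cat/b/", 600), ("/cat/c/", 300),
    ("/cat/d/", 120), ("/cat/e/", 50), ("/cat/f/", 30),
]
keep, redirect = split_micro_macro(pages)
```

Everything in `keep` is worth the effort of rebuilding as a static page; everything in `redirect` just needs a well-researched 301 target.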
One final thought: it's not always the case that the URL has to represent every level in the hierarchy. I don't know your underlying technology, but it might be possible to recreate some of these sub-categories as top-level categories if products are allowed by your CMS to be in more than one category at once. I wrote this article about the difference between URL structures and site architecture that might give more clarity on what I mean here.
Related Questions
-
Negative SEO & How long does it take for Google to disavow
Following on from a previous problem of 2 of our main pages completely dropping from the index, we have discovered that 150+ spam/porn domains have been directed at our pages (sometime in the last 3-4 months, don't have an exact date). Does anyone have experience of how long it may take Google to take notice of a new disavow list? Any estimates would be very helpful in determining our next course of action.
Intermediate & Advanced SEO | | Vuly1 -
Impact of Removing 60,000 Pages from Sites
We currently have a database of content across about 100 sites. All of this content is exactly the same on all of them, and it is also found all over the internet in other places. So it's not unique at all and it brings in almost no organic traffic. I want to remove this bloat from our sites. Problem is that this database accounts for almost 60,000 pages on each site and it is all currently indexed. I'm a little bit worried that flat out dumping all of this data at once is going to cause Google to wonder what in the world we are doing and we are going to see some issues from it (at least in the short run). My thought now is to remove this content in stages so it doesn't all get dropped at once. But would deindexing all of this content first be better? That way Google would still be able to crawl it and understand that it is not relevant user content and therefore minimize impact when we do terminate it completely? Any other ideas for minimizing SEO issues?
Intermediate & Advanced SEO | | MJTrevens1 -
Competitor Ranking in Positions 1 & 2
I have seen increasing instances where the same company, one of our main competitors, is ranking in positions 1 and 2 for the same search phrase. It appears that both the homepage and the dedicated service page relevant to the search term are ranking, but surely having them at positions 1 & 2 is not something search engines like Google want to encourage? I have also seen other instances of the same company ranking twice on page 1 but not necessarily at #1 or #2. Is this an anomaly or just something I have to live with?
Intermediate & Advanced SEO | | Kevbyrne2 -
Should I remove pages to concentrate link juice?
So our site is database-powered and had up to 50K pages in Google's index 3 years ago. After a re-design that number is down to about 12K currently. Legacy URLs that now generate 404s have mostly been redirected to appropriate pages (some 13K 301 redirects currently). Trafficked content accounts for about 2K URLs in the end, so my question is: in the context of concentrating link juice on the most valuable pages, should I:
- remove non-important / least trafficked pages from the site and just have them show 404
- no-index non-important / least trafficked pages from the site but still have them visible
- 1 or 2 above, plus remove from the index via Webmaster Tools
- none of the above, but rather something else?
Thanks for any insights/advice!
Intermediate & Advanced SEO | | StratosJets0 -
How do you reduce duplicate content for tags and categories in Wordpress?
Is it possible to avoid a duplicate content error without limiting a post to only one category or tag?
Intermediate & Advanced SEO | | Mivito0 -
301's & Link Juice
So let's say we have a site that has 0 PageRank (kind of new) and few incoming links, nothing significant compared to the other sites. Now, from what I understand, link juice flows throughout the site. This site is a news site, and writes sports previews, predictions and what not. After a while, a game from 2 months ago gets 0 hits, 0 search queries - nobody cares. Wouldn't it make sense to take that type of expired content and 301 it to a different page? That way the more relevant content gets the juice, giving it a better ranking. Just wondering what everybody's thoughts are on this link juice thing, and what am I missing?
Intermediate & Advanced SEO | | ravashjalil0 -
Silo This! Siloing issue with KW targets and multiple categories
I am having a difficult time determining how to silo the content for this website (Downpour). The issue I am having is that, as I see it, there are several different top-level keyword targets to put at the top of the silos; however, due to the nature of the products, they fit in almost every one of the top-level categories. For instance, our main keyword term is "Audio Books" (and derivatives thereof), but we also want to target "Audiobook Downloads" and "Books on CD". Due to the nature of the products, almost every product would fit in all 3 categories. It gets even worse when you consider normal book taxonomy. The normal breakdown would be from audiobooks > Fiction (or Nonfiction). Now each product also belongs to one of these categories, as well as "Download", "CD", and "Audiobook". And still worse, our navigation menus link every page on the site back to all of these categories (except audiobooks, as we don't really have a landing page for that besides the home page, which is lacking in optimized content but is linked from every page on the site). So I am finding that siloing, or developing a cross-linking plan that makes sense, is very difficult. It's much easier at the lower levels, but at the top things become muddy. Throw in the idea that we may eventually get e-books as well, and it gets even muddier. I have some ideas of how to deal with some of this, such as putting the site navigation in an iframe, instituting basic breadcrumbs, and building landing pages, but I'm open to any advice or ideas that might help, especially with the top-level taxonomy structure. TIA!
Intermediate & Advanced SEO | | DownPour0 -
Removing Duplicate Content Issues in an Ecommerce Store
Hi All. OK, I have an ecommerce store and there is a load of duplicate content, which is pretty much the norm with ecommerce store setups. E.g. this is my problem:
http://www.mystoreexample.com/product1.html
http://www.mystoreexample.com/brandname/product1.html
http://www.mystoreexample.com/appliancetype/product1.html
http://www.mystoreexample.com/brandname/appliancetype/product1.html
http://www.mystoreexample.com/appliancetype/brandname/product1.html
All of the above lead to the same product. I also want to keep the breadcrumb path to the product. Here's my plan:
- Add a canonical URL to the product page, e.g. http://www.mystoreexample.com/product1.html - this way I have a short product URL
- Noindex all duplicate pages but do follow the internal links so the pages are spidered
What are the other options available and recommended? Does that make sense? Is this what most people are doing to remove duplicate content pages? Thanks 🙂
Intermediate & Advanced SEO | | ChriSEOcouk0
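For what it's worth, the canonical part of a plan like this can be sketched as a simple URL normalisation step. The URLs are the example ones from the question; the helper names are hypothetical:

```python
# Sketch: collapse the brand/appliance-type URL variants down to the
# short canonical product URL, and emit the matching
# <link rel="canonical"> tag. Helper names are illustrative.

from urllib.parse import urlparse

def canonical_url(url: str) -> str:
    """Any /.../product1.html variant maps to /product1.html."""
    parsed = urlparse(url)
    slug = parsed.path.rstrip("/").split("/")[-1]  # e.g. "product1.html"
    return f"{parsed.scheme}://{parsed.netloc}/{slug}"

def canonical_tag(url: str) -> str:
    """The tag each duplicate page would carry in its <head>."""
    return f'<link rel="canonical" href="{canonical_url(url)}" />'

variants = [
    "http://www.mystoreexample.com/product1.html",
    "http://www.mystoreexample.com/brandname/product1.html",
    "http://www.mystoreexample.com/appliancetype/brandname/product1.html",
]
```

Every variant normalises to the same short URL, so each duplicate page points consolidation signals at one canonical version.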