Forced to remove Categories with high volume & revenue
-
Hi everyone
I've been forced to remove our level 4 & 5 categories (e.g. example.com/level-2/level-3/**level-4/level-5/**) from our website, even though they get plenty of traffic and revenue and rank for some of our keywords. The argument is that customers were using refinements/filters more than clicking into categories, and a new backend system coming into the business means these need to be removed anyway.
We've done this before and seen a drop in visibility, revenue and traffic in those areas, but we're going ahead with another batch of removals anyway. Does anyone have experience fixing a problem like this? I've been told the categories will not be returning and that I have to 301 them, so I need to find a workaround to become eligible to rank for these keywords again.
I've been looking at using the refinements to mimic a category page (when a refinement is clicked, change the URL to a clean one, update the page title, meta description and H1, and remove the core page text), but I'm not sure what knock-on effects this would have, or whether it would even work!
Hope you can help! I've probably missed some details, so let me know if you need more info.
Thanks
-
Very hard to prove these things before they're done. Good luck with getting buy-in for what you need to do, and with undoing the worst of the damage.
-
Thanks Will! Yep, that sounds similar to what I've sent on to Development, where the filters actually become those sub-category pages. Unfortunately they think it will be a huge amount of work, so now I need to demonstrate the value of creating these pages before they start on it. On the macro side, unfortunately I had no choice but to redirect, and those 301s are all in place now. It's painful to do when you know it's going to damage performance, and after a couple of weeks the stats suggest it already has.
But it's great to have your feedback; it will definitely add weight to my pitch to get those filters working for us! The top-level category idea might actually be a great workaround for now, too!
-
Hi Frankie,
Sorry for the slow reply to this one. I hope it's still relevant to offer some thoughts.
First, at the top level, I would say that the stated reasons don't necessarily mean that you should not have the kinds of pages you describe. My first preference would be to modify the functionality so that the filters you describe users actually using are those sub-category pages. Even if this meant changing URLs (and hence 301 redirecting the pages you currently have), it is possible to have filter / facet pages be indexable and have unique URLs and meta information.
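To make the whitelisting idea concrete, here's a rough sketch (all filter names, URLs and copy below are made up for illustration, not your actual setup): only an approved set of filter combinations gets a clean, crawlable URL with its own title and H1, while every other combination keeps its parameterised URL, a canonical back to the base category, and a noindex.

```python
# Sketch: promote selected filter combinations to clean, indexable
# "virtual category" URLs with unique meta information. The filter
# names and copy here are hypothetical placeholders.

# Only whitelisted facets become indexable pages.
INDEXABLE_FACETS = {
    ("mens-shoes", "colour=black"): {
        "path": "/mens-shoes/black/",
        "title": "Men's Black Shoes | Example Store",
        "h1": "Men's Black Shoes",
    },
    ("mens-shoes", "brand=acme"): {
        "path": "/mens-shoes/acme/",
        "title": "Acme Men's Shoes | Example Store",
        "h1": "Acme Men's Shoes",
    },
}

def resolve_facet(category: str, filter_query: str) -> dict:
    """Return routing/meta info for a category + filter combination."""
    page = INDEXABLE_FACETS.get((category, filter_query))
    if page:
        # Whitelisted: serve at the clean URL, fully indexable.
        return {"indexable": True, **page}
    # Not whitelisted: keep the parameterised URL, point the
    # canonical at the unfiltered category, and noindex it.
    return {
        "indexable": False,
        "path": f"/{category}/?{filter_query}",
        "canonical": f"/{category}/",
    }

print(resolve_facet("mens-shoes", "colour=black")["path"])  # /mens-shoes/black/
print(resolve_facet("mens-shoes", "size=9")["indexable"])   # False
```

The key design point is that the whitelist is editorial: you choose which facet combinations have search demand worth an indexable page, rather than letting every filter click spawn a crawlable URL.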
If that's not possible for whatever reason, I would separate my efforts into the micro and the macro:
- Micro: apply an 80:20 or 90:10 rule to the pages you are losing. Find the small number of most important, highest traffic / conversion pages and find a way to keep versions of them (again, even if you have to 301 redirect the current URLs, you could recreate them as static content pages targeting those keywords if you had to)
- Macro: where you simply have no choice but to lose these pages, your best bet is to redirect each one to the absolute best (or next best!) page on the site for those queries. These might be other (sub-)category pages, individual products, or content pages, but at least for the highest-traffic end it's worth specific research effort to identify the best redirect targets
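As a quick illustration of the micro/macro split (the URLs and traffic numbers below are entirely made up): rank the doomed pages by traffic, keep rebuilding the head that covers ~80% of sessions, and build a 1:1 redirect map for the tail.

```python
# Sketch: apply the 80:20 rule to pages being removed, then build
# a 1:1 301 redirect map for the long tail. URLs and session
# counts are hypothetical placeholders.

doomed_pages = {  # url -> monthly organic sessions (made up)
    "/shoes/mens/running/trail/": 4200,
    "/shoes/mens/running/road/": 3100,
    "/shoes/mens/boots/hiking/": 900,
    "/shoes/mens/boots/work/": 120,
    "/shoes/mens/sandals/sport/": 80,
}

def split_by_traffic_share(pages: dict, share: float = 0.8):
    """Split URLs into a 'keep/rebuild' head covering roughly
    `share` of traffic, and a 'redirect' tail for the rest."""
    total = sum(pages.values())
    head, tail, running = [], [], 0
    for url, sessions in sorted(pages.items(), key=lambda kv: -kv[1]):
        if running < share * total:
            head.append(url)
        else:
            tail.append(url)
        running += sessions
    return head, tail

head, tail = split_by_traffic_share(doomed_pages)
# head -> worth recreating as static pages targeting the same keywords
# tail -> 301 each to the next-best page, researched per URL
redirect_map = {url: "/shoes/mens/" for url in tail}  # placeholder targets
```

In practice the redirect targets would come from per-keyword research (which live page ranks, or could rank, for each query), not a blanket parent-category fallback as in this placeholder.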
One final thought: it's not always the case that the URL has to represent every level in the hierarchy. I don't know your underlying technology, but it might be possible to recreate some of these sub-categories as top-level categories if products are allowed by your CMS to be in more than one category at once. I wrote this article about the difference between URL structures and site architecture that might give more clarity on what I mean here.
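To show what I mean about URLs not needing to mirror the hierarchy, here's a tiny sketch (product and category names are invented): a product can belong to several categories at once, and a former level-4 sub-category can be served at a short top-level URL without changing where it sits in the catalogue.

```python
# Sketch: decouple URLs from the category hierarchy. A product
# lives at a flat URL and can belong to multiple categories; a
# former deep sub-category gets a top-level URL. All names here
# are hypothetical.

products = {
    "trail-runner-3000": {
        "url": "/product/trail-runner-3000/",  # flat, hierarchy-free
        "categories": ["trail-running-shoes", "mens-running", "sale"],
    },
}

categories = {
    # Previously /shoes/mens/running/trail/ (level 4); now top-level.
    "trail-running-shoes": {"url": "/trail-running-shoes/"},
}

def products_in_category(slug: str) -> list:
    """List product URLs for every product tagged with this category."""
    return [p["url"] for p in products.values() if slug in p["categories"]]

print(products_in_category("trail-running-shoes"))
```

Whether this is feasible depends entirely on the CMS allowing many-to-many product/category relationships, which is the first thing worth checking with your developers.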