Am I losing link juice with 302-redirected faceted navigation?
-
My site has faceted navigation that allows shoppers to filter category page results by things like brand, size, price range, etc. These filter pages 302 redirect to the same page they came from, which already includes a canonical meta tag. I added the rel="nofollow" attribute to the facet links and added the line "Disallow: /category_filter/" to robots.txt.
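For context, here is a simplified sketch of what one of these facet links and the canonical tag look like on a category page (the URLs and brand name are placeholders, not our real ones):

<!-- in the <head> of the category page -->
<link rel="canonical" href="https://www.example.com/category1/" />

<!-- a facet link in the page body; following it 302s back to /category1/ -->
<a href="/category_filter/category1?brand=acme" rel="nofollow">Acme</a>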
One of our SEO consultants told me that this setup is likely diluting the page's link juice, since it is divided among all of the page's links, including the ones I am instructing crawlers to disregard.
Can anybody tell me whether I am following the best practices for links that redirect to the same page?
-
I considered this, but our shopping cart software has a lot of "black box" features, including this one, so I have no control over how this feature is handled. Also, we use SLI Search with Site Champion, which provides a very similar auto-generated landing page function for category facets, so building this function again would be redundant and could dilute our indexed results.
-
Why are you 302 redirecting in the first place? Doesn't make much sense to me.
Why aren't your filter links simple hyperlinks?
The correct way to do this would be:
- Set up URL rewrites that match your filter expressions (e.g. /category1/brand or /category3/xxl); a rough sketch follows this list
- In your category results page, output hyperlinks that point at the rewritten URLs (you build these dynamically)
This eliminates the 302s and avoids redirects altogether.
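Purely as an illustration (this assumes Apache with mod_rewrite, and the category/brand/size patterns are made up, so adapt them to whatever your cart actually expects), the rewrites could look something like this in .htaccess:

RewriteEngine On

# Internally serve the filtered view for a clean facet URL.
# No [R] flag, so Apache rewrites the request internally and never issues a 302.
# e.g. /category1/acme is answered by /category1/?brand=acme behind the scenes
RewriteRule ^category1/([a-z0-9-]+)/?$ /category1/?brand=$1 [L,QSA]

# Same idea for a size facet, e.g. /category3/xxl
RewriteRule ^category3/(xs|s|m|l|xl|xxl)/?$ /category3/?size=$1 [L,QSA]

The key point is that these are internal rewrites, not redirects, so the clean URL is the only one visitors and crawlers ever see.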
An alternative to this would be server-side filtering of the data on page postback, which also avoids redirects.
I'm not sure how correct your SEO consultant is about link juice being divided among all links, including those that are nofollowed. My understanding is that if a link is nofollowed, the search engine essentially ignores it.
-
It sounds to me like you are in the clear. The use of the canonical tag would prevent PageRank dispersion, the nofollow attribute would as well, and the robots.txt rule should prevent the crawling of those pages. Is Google indexing any of the category filter pages?
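A quick way to check (with example.com standing in for your domain) is a search like:

site:example.com inurl:category_filter

If that returns results, the filter pages are in the index; keep in mind that robots.txt only blocks crawling, so URLs that were already indexed can linger for a while.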