Can using nofollow on Magento layered navigation hurt?
-
Howdy Mozzers!
We would like to use noindex, nofollow on our Magento layered navigation pages once any two filters are selected. (We are using single-filter pages as landing pages, so we would like those indexed.)
Is it OK to use noindex, nofollow on these filter pages? Are there disadvantages to using nofollow on internal pages?
Matt Cutts mentioned refraining from using nofollow internally: https://www.youtube.com/watch?v=4SAPUx4Beh8
But we would like to conserve crawl bandwidth and PR flow on potentially hundreds of thousands of irrelevant/duplicate filter pages.
-
I understand I might be a little late, but I experienced this issue first-hand with a Magento site. Once I added a wildcard exclusion to the robots.txt file, my impressions and clicks improved noticeably.
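For anyone landing here later, the wildcard exclusion looked something like this (the parameter names below are just examples - use whatever query parameters your layered navigation actually appends to the URL):

```
User-agent: *
Disallow: /*?*price=
Disallow: /*?*color=
Disallow: /*?*manufacturer=
```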
-
Hi,
That is quite a few pages!
If the main issue is crawl-related, then robots.txt is probably the best way to go; the meta tags will still allow the pages to be crawled (they have to be crawled for the tag to be read). Check out the comments in this and this post for wildcard matching in robots.txt, which should do what you need. If the pages are already indexed, it might be wise to leave a bit of time for the noindex tags to be picked up, and then implement the crawl blocking in robots.txt (and test in GWT to make sure you are not accidentally blocking more than you think). In that case I think you could still leave out the nofollow meta tag, but this might just be personal opinion now - I'm not sure it would make much difference in practice once you have noindexed the pages and blocked crawling!
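As a quick sanity check before touching the live robots.txt, you can approximate Google's wildcard matching in a few lines. This is a simplified sketch, not a full robots.txt parser, and the rules shown are hypothetical - real verification should still happen in GWT as mentioned above:

```python
import re

def rule_to_regex(pattern):
    """Translate a robots.txt path pattern (with * and $) into a regex.
    Simplified sketch of the wildcard extensions Google supports."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"   # $ anchors the end of the URL
    return re.compile(regex)

def is_blocked(url_path, disallow_rules):
    """True if any Disallow rule matches the start of the URL path."""
    return any(rule_to_regex(rule).match(url_path) for rule in disallow_rules)

# Hypothetical rules aimed at filter-parameter URLs.
rules = ["/*?*price=", "/*?*color="]
print(is_blocked("/shoes.html?color=red", rules))   # True - blocked
print(is_blocked("/shoes.html", rules))             # False - still crawlable
```

Running a list of known-good URLs through a check like this helps catch a wildcard that accidentally blocks more than you think.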
-
Hi Lynn,
Thank you for your valuable input on the matter. Yes, we are using meta tags in the header. We are currently submitting the filter pages that we want indexed through the sitemap, so Googlebot should be able to reach those pages. Also, we are applying the noindex, nofollow tags only on filter pages with a combination of more than two filters selected, as we do not need to go any deeper than that.
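In other words, the rule could be sketched like this (an illustrative Python sketch only - the real logic would live in the Magento layout/theme, and the filter names are made up):

```python
def robots_meta(active_filters):
    """Return the meta robots content for a layered-navigation page,
    per the rule described above: more than two active filters means
    the page is noindexed."""
    if len(active_filters) > 2:
        return "noindex, nofollow"   # Lynn's alternative: "noindex, follow"
    return "index, follow"

print(robots_meta(["color"]))                    # index, follow
print(robots_meta(["color", "size", "price"]))   # noindex, nofollow
```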
I understand your point about using noindex, follow instead of noindex, nofollow to prevent unexpected crawl issues. But on the contrary, don't you think we could conserve crawl bandwidth by using noindex, nofollow on filter pages that serve no purpose being crawled and probably won't be externally linked to either?
We currently have around 7 filters, some with many values. This can create combinations of more than 500,000 filter pages...
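To illustrate the scale, here is a rough back-of-the-envelope count. The filter names and value counts below are invented, and each filter page is modelled as a non-empty subset of filters with one value chosen per selected filter:

```python
from itertools import combinations
from math import prod

# Hypothetical filters and how many values each offers.
filters = {"brand": 12, "colour": 10, "size": 7, "price": 5,
           "material": 4, "style": 3, "rating": 3}

def total_filter_pages(filters):
    """Count every non-empty combination of filters, with one
    value selected per chosen filter."""
    names = list(filters)
    return sum(
        prod(filters[f] for f in combo)
        for k in range(1, len(names) + 1)
        for combo in combinations(names, k)
    )

print(total_filter_pages(filters))  # 549119 - well past 500,000
```

Even with modest value counts per filter, the combinations multiply quickly, which is why the page count explodes.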
Thanks
-
Hi,
I assume you mean in a meta robots tag in the header of these pages? As a general rule I would avoid using nofollow and simply noindex the pages in question. If you implement this with a meta tag, the pages will be reached from the layered navigation links anyway, so they would then be a dead end for both PR and the crawler - with the potential to cause unexpected crawl problems rather than optimising anything.
As long as you are addressing as best you can any duplicate content issues caused by the layered navigation (check out this post for a good rundown on the various solutions) then I would leave the noindex in place and let the crawler follow the links as normal.