Should 'View All' filters on ecommerce sites be indexable?
-
Hi,
I’m looking at a site at the moment that has a lot of products. For some of their category pages they have a ‘View All’ feature available. The URL uses this structure:
domain.com/category/sub-category/product
domain.com/category/sub-category/view-all (currently noindex applied)
Should the view-all page be available for indexing? The individual sub-categories and products are indexable.
My immediate reaction is no, so long as the individual sub-categories are indexable. Is that right?
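For reference, I believe the noindex is applied as a standard robots meta tag on the view-all URL, roughly like the snippet below, with follow left on so crawlers can still reach the product links (hypothetical markup - I haven't confirmed whether it's a meta tag or an X-Robots-Tag header):
<!-- domain.com/category/sub-category/view-all -->
<meta name="robots" content="noindex, follow">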
Related Questions
-
URLs too long, but run an eCommerce site
Hi, When I started out I was pretty green to SEO, and didn't consider the usability/SEO impact of URL structure. Flash forward, I'm 5 years deep into using the following: mysite.com/downloads/category/premium-downloads/sub-category/ ("category" is quite literally one segment of the URL - thanks, WordPress - whereas "sub-category" is a placeholder). I run a digital downloads store, and I now have 100s of internal links beholden to this hideous category linking structure. Not to mention external links at Google Ads, etc. I would LOVE to change this, but if I were to do so, what should I consider? For instance, is there a checklist for making a change like this? I was thinking of changing it to something like the following: mysite.com/shop/c/premium/sub-category/ And also, how much damage, if any, would this be doing to my SEO? Thanks in advance, Lou
Technical SEO | LouCommaTheCreator
-
How to set up an iFrame to be indexed as the parent site
Hi, we are trying to move all of our website content from www.mysite.com to a subdomain (i.e. content.mysite.com), and make www.mysite.com nothing more than an iFrame displaying the content from content.mysite.com. We have about 10 pages linking from the home page, all indexed separately, so I understand we'll have to do this for every one of them (www.mysite.com/contact will be an iframe containing the content from content.mysite.com/contact, and we'll need to do this for every page). How do we do this so Google continues to index the content hosted at content.mysite.com with the parent page in organic results (www.mysite.com)? We want all users to enter the site through www.mysite.com or www.mysite.com/xxxxxx, which will contain no content except for iFrames pulling in content from content.mysite.com. Our fear is that Google will start directing users directly to content.mysite.com, rather than continue feeding them to www.mysite.com. If we use www1.mysite.com or www2.mysite.com as the location of the content, instead of say content.mysite.com, would these subdomain names work better for passing credit for the iFramed content to the parent page (www.mysite.com)? Thanks!
SIDE NOTE: Before someone asks why we need to do this: the content on mysite.com ranks very well, but the site has a huge bounce rate due to a poorly designed CMS serving the content. The CMS does not load the page in pieces (like most pages load), but instead presents the visitor with a 100% blank page while the page loads in the background for about 5-10 seconds, and then boom, 100% of the page shows up. We've been back and forth with our CMS provider about doing something about this for 5 years now, and we have given up. We tested moving our AdWords links to xyz.mysite.com, where users are immediately shown a loading indicator, with our site (www.mysite.com) behind it in an iFrame. The immediate result was resounding success... our bounce rate PLUMMETED, and the root domain www.mysite.com saw a huge boost in search results. The problem with this is that our site still comes up in organic results as www.mysite.com, which does not have any kind of spinning-disk loading indicator, and still has a very high bounce rate.
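To illustrate, each shell page on the www subdomain would be little more than this (simplified, hypothetical markup):
<!-- www.mysite.com/contact -->
<iframe src="https://content.mysite.com/contact" style="width:100%; height:100vh; border:0;"></iframe>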
Technical SEO | vezaus
-
Can anyone tell me why some of the top referrers to my site are porn sites?
We noticed today that 4 of the top referring sites are actually porn sites. Does anyone know what that is all about? Thanks!
Technical SEO | thinkcreativegroup
-
My old URLs are still being indexed when I have redirected all of them - why is this happening?
I have built a new website and have redirected all my old URLs to their new ones, but for some reason Google is still indexing the old URLs. Also, the page authority for all of my pages has dropped to 1 (apart from the homepage), but before it was between 12 and 15. Can anyone help me with this?
Technical SEO | One2OneDigital
-
Homepage de-indexed, rest of site all there...
This is a random issue that I've been trying to get to the bottom of over the last few months. First I thought it might be that I have a spammy host, so I changed it. My site loads a little faster, but the homepage is still totally non-visible. Other pages and posts index no problem... It's really quite frustrating. http://bit.ly/1hA8DqV Any suggestions welcome. Standard WP, running WordPress SEO by Joost and a few other basic plugins...
Technical SEO | duncm
-
Duplicate content issue: index.html vs non-index.html
Hi, I have an issue. In my client's profile, I found that the "index.html" URLs are mostly more authoritative than the non-"index.html" ones, and that the www version is more authoritative than the non-www version. The problem is that I also find the opposite situation, where non-"index.html" is more authoritative than "index.html", or non-www is more authoritative than www. My logic would tell me to still redirect the non-"index.html" to "index.html". Am I right? And in the cases where I find the opposite happening, does it matter if I still redirect the non-"index.html" to "index.html"? The same question goes for the www vs non-www versions. Thank you
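For context, a canonical tag on the duplicate versions pointing at whichever URL is kept would look something like this (hypothetical, with example.com as a placeholder):
<link rel="canonical" href="https://www.example.com/index.html">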
Technical SEO | Ideas-Money-Art
-
I am trying to block robots from indexing parts of my site...
I have a few websites that I mocked up for clients to check out my work and get a feel for the style I produce, but I don't want them indexed, as they have lorem ipsum placeholder text and aren't really optimized... I am in the process of optimizing them, but for the time being I would like to block them. Most of my warnings and errors on my SEOmoz dashboard are from these sites, and I was going to upload the following to the robots.txt file, but I want to make sure this is correct:
User-agent: *
Disallow: /salondemo/
Disallow: /salondemo3/
Disallow: /cafedemo/
Disallow: /portfolio1/
Disallow: /portfolio2/
Disallow: /portfolio3/
Disallow: /salondemo2/
Is this all I need to do? Thanks, Donny
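(For what it's worth, since a robots.txt Disallow blocks crawling rather than removing already-discovered pages from the index, I'm also weighing a per-page tag like the one below as an alternative - just a hypothetical option, not something in place yet:)
<meta name="robots" content="noindex, follow">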
Technical SEO | Smurkcreative
-
After entire site is noindex'd, how long to recover?
A programmer 'accidentally' put <meta name="robots" content="noindex" /> into every single page of one of my sites (articles, landing pages, home page, etc.). This happened on Monday, and we just noticed today. Ugh... We've fixed the issue; how long will it take to get reindexed? Will we instantly retain our same positions for keywords? Any tips?
Technical SEO | EricPacifico