What reasons exist to use noindex / robots.txt?
-
Hi everyone. I realise this may appear to be a bit of an obtuse question, but that's only because it is an obtuse question. What I'm after is a cataloguing of opinion - what reasons have SEOs had to implement noindex or add pages to their robots.txt on the sites they manage?
-
Many reasons. You don't want the admin pages of your site indexed, for example. You may not want every query people run through your site search to be indexed. On an ecommerce site, you don't want or need the cart and checkout pages indexed. If you have both a print version and a web version of the same document, you exclude the print version so the two don't compete as duplicates. And if your site is in development, you don't want it indexed before it's ready.
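For the print-version case in particular, a meta robots tag in the page's head is the usual approach - a minimal sketch (noindex keeps the page out of results, while follow still lets crawlers pass through its links):

```html
<!-- In the <head> of the print version of the document: -->
<!-- "noindex" keeps this copy out of search results;
     "follow" still lets crawlers follow its links. -->
<meta name="robots" content="noindex, follow">
```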
For robots.txt in particular, some search engines now respect wildcards, so you can exclude session IDs via robots.txt. osCommerce is notoriously bad about generating session IDs and getting them indexed, leaving you with tons of different URLs indexed for the same page.
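As a sketch of the wildcard idea (support varies by engine - Google honours `*` patterns, but not every crawler does), a robots.txt blocking osCommerce-style session IDs might look like this, assuming the default `osCsid` parameter name (adjust for your store's configuration):

```text
# Block any URL carrying an osCommerce session ID
User-agent: *
Disallow: /*osCsid=
```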
http://www.cogentos.com/bloggers-guide-to-using-robotstxt-and-robots-meta-tags-to-optimise-indexing/ is a post that explains some of the reasons to use robots.txt and noindex on a WordPress site.
-
A couple come to mind from my time working at an agency. One client had some temporary pages they didn't want indexed, explaining a problem a product was having at the time. We wanted the pages to be live, but we didn't want the product's problems showing up in the search engines, since the situation was temporary.
Also, pages that target the same keywords as a main page, where you don't want to delete or redirect them. You want to keep them live, but at the same time you don't want them competing with the main page, so you just block them from the search engines.
Hope this helps!
-
I really should have worded my question better.
I'll try again.
**What reasons do people have for not wanting their pages to show in search results?**
I've got a few reasons of my own, but I'm interested in seeing if there are any I haven't thought of.
-
For pages you don't want to show up in search results. =P