What reasons exist to use noindex / robots.txt?
-
Hi everyone. I realise this may appear to be a bit of an obtuse question, but that's only because it is an obtuse question. What I'm after is a cataloguing of opinion - what reasons have SEOs had to implement noindex or add pages to their robots.txt on the sites they manage?
-
Many reasons. You don't want the admin pages of your site indexed, for example. You may not want every query people run on your site search to be indexed. On an ecommerce site, you don't want or need the cart and checkout pages indexed. If you have a print version and a web version of the same document, you exclude the print version so only one copy gets indexed. Or your site is in development and you don't want it indexed before it's ready.
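To illustrate those use cases, a robots.txt along these lines keeps compliant crawlers out of the sections mentioned above (the paths here are hypothetical examples, not any particular platform's defaults):

```
User-agent: *
Disallow: /admin/
Disallow: /checkout/
Disallow: /print/
```

For the development-site case, a meta tag like `<meta name="robots" content="noindex">` on each page is often the safer choice, since robots.txt only blocks crawling and a blocked URL can still appear in results if other sites link to it.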
For robots.txt in particular, some search engines now respect wildcards, so you can exclude session-ID URLs via robots.txt. osCommerce is really bad about creating session IDs and getting them indexed, and then you have tons of different URLs indexed for the same page.
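For anyone curious how those wildcard rules behave, here is a minimal sketch of Google-style pattern matching, where `*` matches any run of characters and a trailing `$` anchors the pattern to the end of the URL. This is an illustration of the matching logic only, not any engine's actual implementation, and `osCsid` is used as an example session parameter:

```python
import re

def blocked_by_pattern(url_path: str, pattern: str) -> bool:
    """Return True if url_path matches a robots.txt Disallow pattern,
    treating '*' as a wildcard and a trailing '$' as an end anchor."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Build a regex: escape everything except '*', which becomes '.*'
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    regex = "^" + regex + ("$" if anchored else "")
    return re.match(regex, url_path) is not None

# A rule like "Disallow: /*osCsid=" catches session-ID URLs:
print(blocked_by_pattern("/product.php?osCsid=abc123", "/*osCsid="))  # True
print(blocked_by_pattern("/product.php", "/*osCsid="))                # False
```

The same logic explains why `Disallow: /*.pdf$` blocks `/file.pdf` but not `/file.pdf?download=1`: the `$` anchor requires the URL to end exactly at the pattern.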
http://www.cogentos.com/bloggers-guide-to-using-robotstxt-and-robots-meta-tags-to-optimise-indexing/ is a post that explains some of the reasons to use robots.txt and noindex on a WordPress site.
-
A couple come to mind from when I used them while working for an agency. One client had some temporary pages they didn't want indexed, which explained a problem a product was having at the time. We wanted the pages to be live, but didn't want the product's problems showing up in the search engines, since the situation was only temporary.
Also, pages targeting the same keywords as another page: you don't want to delete or redirect them, and would rather keep them live, but you also don't want them competing with the main page. You just block them from the search engines.
Hope this helps
-
I really should have worded my question better. I'll try again.
**What reasons do people have for not wanting their pages to show up in search results?**
I've got a few reasons of my own, but I'm interested in seeing if there's any I hadn't thought of.
-
For pages you don't want to show up in search results. =P