What reasons exist to use noindex / robots.txt?
-
Hi everyone. I realise this may appear to be a bit of an obtuse question, but that's only because it is an obtuse question. What I'm after is a cataloguing of opinion - what reasons have SEOs had to implement noindex or add pages to their robots.txt on the sites they manage?
-
Many reasons. A few common ones:
- You don't want the admin pages of your site indexed.
- You may not want all of the search queries people perform on your site search to be indexed.
- On an ecommerce site, you don't want or need your cart and checkout pages indexed.
- You don't want a print version and a web version of the same document both indexed, so you exclude the print version.
- Your site is in development and you don't want it indexed before it is ready.
For robots.txt in particular, some search engines now respect wildcards, so you can exclude session-ID URLs via robots.txt. osCommerce is notorious for creating session IDs and getting them indexed, leaving you with tons of different URLs indexed for the same page.
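To make that concrete, here is a minimal robots.txt sketch covering several of the cases above. The paths and the session-ID parameter are hypothetical - substitute whatever your own platform actually uses:

```
User-agent: *
# Admin and checkout pages
Disallow: /admin/
Disallow: /checkout/
# Internal site-search result pages
Disallow: /search
# Print versions of pages
Disallow: /*?print=
# Session-ID URLs (example parameter name, matched with a wildcard)
Disallow: /*?sid=
```

Bear in mind that wildcard support is an extension to the original robots.txt standard, so, as noted above, only some search engines honour it.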
http://www.cogentos.com/bloggers-guide-to-using-robotstxt-and-robots-meta-tags-to-optimise-indexing/ is a post that explains some of the reasons to use robots and no-index on a Wordpress site.
-
A couple come to mind from when I used them while working for an agency. One client had some temporary pages they didn't want indexed, explaining a problem with a product at that time. We wanted the page to be live, but didn't want the problems the product was having to show up in the search engines, since the situation was only temporary.
Also, pages that target the same keywords as another page, which you don't want to erase or redirect: you want to keep them live, but at the same time you don't want them competing with the main page. You just block them from the search engines.
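For that kind of "live for users, but out of the results" page, the usual mechanism is a robots meta tag in the page's head - a minimal sketch:

```html
<!-- Visitors can still reach the page; search engines are asked not to index it.
     "follow" lets crawlers continue to follow links on the page. -->
<meta name="robots" content="noindex, follow">
```

The page does need to stay crawlable (not blocked in robots.txt) for search engines to actually see the tag.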
Hope this helps
-
I really should have worded my question better. I'll try again.
**What reasons do people have for not wanting their pages to show in search results?**
I've got a few reasons of my own, but I'm interested in seeing if there are any I hadn't thought of.
-
For pages you don't want to show up in search results. =P
-
Related Questions
-
Category pages, should I noindex them?
Hi there, I have a question about my blog that I hope you guys can answer. Should I noindex the category and tag pages of my blog? I understand they are considered duplicate content, but what if I try to target the keyword of that category? What would you do? I am looking forward to reading your answers 🙂
-
Exclude sorting options using nofollow to reduce duplicate content
I'm getting reports of duplicate content for pages that have different sorting options applied, e.g.:
/trips/dest/africa-and-middle-east/
/trips/dest/africa-and-middle-east/?sort=title&direction=asc&page=1
/trips/dest/africa-and-middle-east/?sort=title&direction=des&page=1
I have the added complication of having pagination combined with these sorting options. I also don't have the option of a view-all page. I'm considering adding rel="nofollow" to the sorting controls so they are just taken out of the equation, then using rel="next" and rel="prev" to handle the pagination as per Google's recommendations (using the default sorting options). Has anyone tried this approach, or have an opinion on whether it would work?
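For reference, a rough sketch of the approach being described, using the example URLs above (anchor text and surrounding markup are made up):

```html
<!-- Sort controls carry rel="nofollow" so the sorted variants are taken out of the equation -->
<a href="/trips/dest/africa-and-middle-east/?sort=title&amp;direction=asc&amp;page=1" rel="nofollow">Title A-Z</a>
<a href="/trips/dest/africa-and-middle-east/?sort=title&amp;direction=des&amp;page=1" rel="nofollow">Title Z-A</a>

<!-- In the <head> of page 2 of the default sort order, pagination is declared with rel="prev"/"next" -->
<link rel="prev" href="/trips/dest/africa-and-middle-east/?page=1">
<link rel="next" href="/trips/dest/africa-and-middle-east/?page=3">
```
-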
Can I robots.txt an entire site to get rid of Duplicate content?
I am in the process of implementing Zendesk and will have two separate Zendesk sites with the same content to serve two separate user groups (for the same product - B2B and B2C). Zendesk does not allow me the option to change canonicals (nor meta tags). If I robots.txt one of the Zendesk sites, will that cover me for duplicate content with Google? Is that a good option? Is there a better option? I will also have to change some of the canonicals on my site (mysite.com) to use the Zendesk canonicals (zendesk.mysite.com) to avoid duplicate content. Will I lose ranking by changing the established page canonicals on my site to point to the new subdomain (the only option offered through Zendesk)? Thank you.
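For what it's worth, the two mechanisms being weighed up would look roughly like this (hostnames follow the examples in the question; the article path is hypothetical):

```
# robots.txt at the root of the duplicate Zendesk site - blocks crawling of the whole site
User-agent: *
Disallow: /
```

```html
<!-- On a page of mysite.com, pointing the canonical at the Zendesk subdomain version -->
<link rel="canonical" href="https://zendesk.mysite.com/some-help-article">
```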
-
Does using keywords in the top-level domain make any difference to SEO rankings?
I am setting up a new company and I need to register a domain name. Is there any advantage to using the full name of the company in the domain name? I know it used to be important, but does it still make a difference? If possible I would like to just use, say, the first letter of each of the 3 words instead of what will amount to 23 letters, but if it makes a difference then 23 letters it will have to be. Thanks
-
Silly question about noindex and canonical
Hi, This is probably going to sound a bit stupid, but I nevertheless want to check. We have a site that's going to have identical pages (really not my choice) for a sales reason. The two examples would be example.com/profile-name and example.com/location/profile-name. Users using the onsite navigation will always end up in the latter example naturally, as they have to select a location before viewing content (plus having the location in the URL is nice, as there are multiple profiles across different locations that have the same name). However, it's easier to sell our services when we can offer just example.com/profile-name to users for their own marketing reasons. I'd like to make example.com/profile-name noindex, follow, and have just example.com/location/profile-name indexed, but I'm not sure if it would be better to implement canonical tags instead. Can anyone see any potential pitfalls of using either method, or does it not really make a difference (which is what I suspect, but I'd rather look stupid than get this wrong)? Thanks!
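As a point of reference, the two options being compared would sit in the head of example.com/profile-name roughly like this (URLs are the placeholders from the question):

```html
<!-- Option 1: keep the short URL out of the index, but still let crawlers follow its links -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: leave it indexable, but declare the location URL as the preferred version -->
<link rel="canonical" href="https://example.com/location/profile-name">
```

With the second option the short URL stays live for visitors, while ranking signals are meant to consolidate on the location URL.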
-
Writing Service/Product Descriptions
Hi, I work for a site that allows people to book a variety of different services in different locations (mainly hair and beauty related). The site is still in development, so I can't link to it I'm afraid. My colleague is about to start writing these descriptions for each of the beauty salons we have signed up, and I thought I'd take the opportunity to check what everyone else thought about these descriptions. As far as I'm concerned, a near-perfect example can be found at http://www.toptable.co.uk/fishers-in-the-city. We have about 100 words at the most, so I was thinking that as long as we get in the name of the salon, the location (being more descriptive than the general area our services search function allows for) and the USP of each salon - their specialty services - we should be covered. Is there anything else you'd include? Foremost, I want this to be as descriptive as possible to offer more detailed information about the salon. Thanks!
-
Using meta robots 'noindex'
Alright, so I would consider myself a beginner at SEO. I've been doing merchandising and marketing for ecommerce sites for about a year and a half now and am just now starting to apply some intermediate SEO techniques to the sites I work on, so bear with me. We are currently redoing the homepage of our site and I am evaluating what links to have on it. I don't want to lose precious link juice to pages that don't need it, but there are certain pages that we need to have on the homepage that people just won't search for. My question is: would it be a good move to add the meta robots 'noindex' tag to these pages? Is my understanding correct that if the only link on the page is back to the homepage, it will pass back the link juice? Also, how many homepage links are too many? We have a fairly large ecommerce site with a lot of categories we'd like to feature, but don't want to overdo the homepage. I appreciate any help!
-
Using Transcriptions
Hi everyone, I've spent a long time trying to figure this one out, so I'm looking forward to your insights. I've recently started having our videos transcribed and keyworded. The videos are hosted on YouTube and already embedded on our website. Each embedded video is accompanied by an existing keyword-rich article that covers pretty much the same content as the video, but in a little more detail. The reason I started doing this was essentially to lengthen the article and get more keywords on the page. Question A. My concern is that the transcription covers the same content as the article, so it doesn't add that much for the reader. That's why, when I post the transcription (below the embedded video), I use a little JavaScript link for people to click if they want to read it. Then it becomes visible; otherwise it's not visible. Note that I am NOT trying to hide it from Google by doing this - it will still show up for people who don't have JavaScript on - so I'm not trying to cheat Google at all, and I think I'm doing it the way they want it done. You can see an example here: http://www.healthyeatingstartshere.com/nutrition/healthy-diet-plan-mistakes. So my first question is: do you think the JavaScript method is a good way of doing it? Question B. Does anyone have any insight on whether it would be better to put the transcription:
1. On the same page as the embedded video/article (which I am doing now), or
2. On a different page, linked to from the above page, or
3. On various other websites (WordPress, Blogspot, web 2.0 sites) that link back to the video/article on our site. I know it's usually best practice to put it on the same page as the video, but I'm wondering, from an SEO point of view, if I'm wasting a 500-word transcription by posting it on the same page as a 500-word article that covers the same topic and uses the same keywords, and I wonder if it would be better to use the transcription elsewhere. Do you have any thoughts on which of the above methods would be best? Thanks so much for reading and any advice you may have.
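For what it's worth, here is a minimal sketch of the kind of JavaScript toggle described in Question A (the IDs, link text and placeholder transcript are made up). The transcript is in the HTML for every visitor and crawler; the script only hides it when JavaScript is available and reveals it on click, so visitors without JavaScript still see it:

```html
<a href="#transcript" id="transcript-toggle">Read the transcript</a>
<div id="transcript">
  <p>Full transcript text goes here...</p>
</div>

<script>
  // Hide the transcript only when JavaScript is running,
  // so non-JS visitors (and crawlers) still get the full text in the page.
  var transcript = document.getElementById('transcript');
  transcript.style.display = 'none';
  document.getElementById('transcript-toggle').onclick = function () {
    transcript.style.display = 'block';
    return false; // stop the link from jumping to the anchor
  };
</script>
```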