Robots.txt advice
-
Hey guys,
Have you ever seen directives like this in a robots.txt file? I have never seen a Noindex rule in robots.txt before. Have you?
User-agent: AhrefsBot
User-agent: trovitBot
User-agent: Nutch
User-agent: Baiduspider
Disallow: /

User-agent: *
Disallow: /WebServices/
Disallow: /*?notfound=
Disallow: /?list=
Noindex: /?*list=
Noindex: /local/
Disallow: /local/
Noindex: /handle/
Disallow: /handle/
Noindex: /Handle/
Disallow: /Handle/
Noindex: /localsites/
Disallow: /localsites/
Noindex: /search/
Disallow: /search/
Noindex: /Search/
Disallow: /Search/
Disallow: ?
Any pointers?
-
Never seen it, and I doubt it's any use: Noindex isn't among the statements any search engine recommends for robots.txt. Since it doesn't appear in the robots.txt documentation, I don't think it would have any impact on what search engine robots look at.
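One way to see how a standards-following parser treats these lines: Python's stdlib robots.txt parser simply drops directives it doesn't recognise. This is only a sketch of that behaviour (Google's own parser may act differently), using a cut-down version of the rules from the question:

```python
from urllib import robotparser

# A trimmed-down version of the rules in the question.
rules = """\
User-agent: *
Disallow: /local/
Noindex: /handle/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The Disallow line is honoured...
print(rp.can_fetch("*", "https://example.com/local/page"))   # False
# ...but the unrecognised Noindex line is silently ignored,
# so the path remains fetchable as far as this parser knows.
print(rp.can_fetch("*", "https://example.com/handle/page"))  # True
```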
-
Best I could find was:
"Unlike disallowed pages, noindexed pages don't end up in the index and therefore won't show in search results. Combine both in robots.txt to optimise your crawl efficiency: the noindex will stop the page showing in search results, and the disallow will stop it being crawled."
From: https://www.deepcrawl.com/blog/best-practice/robots-txt-noindex-the-best-kept-secret-in-seo/
Related Questions
-
Pages blocked by robots in Webmaster Tools
A mistake was made in the software. How can I solve the problem quickly? Help me.
Intermediate & Advanced SEO | mihoreis
-
Subdomain optimization advice
Hi, I need some specific advice on the best way to optimize a subdomain of a main domain, beyond meta title, description, etc. Br.
Intermediate & Advanced SEO | Tormar
-
Panda penalty removal advice
Hi everyone! I'm after a second (or third, or fourth!) opinion here. I'm working on the website www.workingvoices.com, which has a Panda penalty dating from the late March 2012 update. I have made a number of changes to remove potential Panda issues but haven't seen any rankings movement in the last 7 weeks, and was wondering if I've missed something. The main issues I identified and fixed were:
- Keyword-stuffed, near-duplicate title tags: fixed with relevant, unique title tags
- Copies of the website on other domains creating duplicate content issues: fixed by taking these offline
- Thin content: fixed by adding content to some pages, and noindexing other thin/tag/category pages
Any thoughts on other areas of the site that might still be setting off the mighty Panda are appreciated! Cheers, Damon.
Intermediate & Advanced SEO | Digitator
-
I need to put static text on every page (600 words), need advice
I need to put static text (a brief about our company, roughly 600 words) in the content section of every page of our website. I know duplicate content is bad for SEO, but I need to tell Google that this is my static content and NOT to crawl it, or something like that. A canonical tag applies to the whole page, but I need to set something up for certain sections of every page. Is that possible?
Intermediate & Advanced SEO | nopsts
-
The "webmaster" disallowed all ROBOTS to fight spam! Help!!
One of the companies I do work for has a Magento site. I am simply the SEO guy, and they work the website through some developers who hold access to their systems VERY tightly. Using Google Webmaster Tools, I saw that the robots.txt file was blocking ALL robots. I immediately e-mailed out and received a long reply about foreign robots and scrapers slowing down the website. They told me I would have to provide a list of only the good robots to allow in robots.txt. Please correct me if I'm wrong, but isn't robots.txt optional? Won't a bad scraper or bot still bog down the site? Shouldn't that be handled in .htaccess or something similar? I'm not new to SEO, but I'm sure some of you who have been around longer have run into something like this and could provide some suggestions or resources I could use to plead my case! If I'm wrong, please help me understand how we can meet both needs: allowing bots to visit the site while preventing the 'bad' ones. Their claim is that the site is bombarded by tons and tons of bots that have slowed down performance. Thanks in advance for your help!
Intermediate & Advanced SEO | JoshuaLindley
-
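For reference, the allowlist the developers asked for could be sketched like this (the bot names here are placeholders, not a vetted "good bots" list; an empty Disallow value means "no restriction"):

```text
# Named crawlers may fetch everything (empty Disallow = allow all).
User-agent: Googlebot
User-agent: Bingbot
Disallow:

# Every other robot is asked to stay out entirely.
User-agent: *
Disallow: /
```

Note this only restrains bots that choose to obey it: scrapers that are actually hammering a server ignore robots.txt, so throttling or blocking them belongs in .htaccess, a firewall, or a CDN, which supports the poster's point.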
Robots.txt Blocked Most Site URLs Because of Canonical
Had a bit of a "gotcha" in Magento. We had the Yoast Canonical Links extension, which worked well, but then we installed Mageworx SEO Suite, which broke canonical links. Unfortunately, it started putting www.mysite.com/catalog/product/view/id/516/ as the canonical link, and all URLs matching /catalog/productview/* are blocked in robots.txt. So, unfortunately, we told Google that the correct page is also a blocked page. They haven't been removed as far as I can see, but traffic has certainly dropped. At the same time, we also made some site changes, grouping some pages and adding 301 redirects. We resubmitted the sitemap and did a Fetch as Google. Any other ideas? Any idea how long it will take to become unblocked?
Intermediate & Advanced SEO | s_EOgi_Bear
-
Advice Needed: Why Is My Site Not Ranking Despite All The White Hat I Have Done?
Hi all, I have tried all white hat ways to make my local business website rank well in Google. We have done:
1.) Good quality content for our site on a regular basis
2.) Submitted a sitemap to Google
3.) Linked in an ethical way
4.) Posted on social media sites
No Google Panda content farming, no Google Penguin unnatural linking. In fact, we have more quality articles to share compared to other laundry dry cleaning websites in Singapore. Can anyone advise me on why my site is not ranking well?
Site: http://www.drycleaning.com.sg
Intermediate & Advanced SEO | chanel27
-
Will disallowing in robots.txt noindex a page?
Google has indexed a page I wish to remove. I would like to add a meta noindex, but the CMS isn't allowing me to right now. Would a Disallow in robots.txt simply stop them crawling, as I expect, or is it also an instruction to noindex? Thanks
Intermediate & Advanced SEO | Brocberry
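A Disallow is only a crawling directive, not an indexing one: a disallowed URL can still be indexed from external links (as a URL-only listing), and because the page can't be crawled, a meta noindex on it would never even be seen. A minimal sketch with Python's stdlib parser (the URL and rules here are hypothetical):

```python
from urllib import robotparser

# Hypothetical rules: /private/ is disallowed for everyone.
rules = "User-agent: *\nDisallow: /private/".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

url = "https://example.com/private/page"  # hypothetical URL
if not rp.can_fetch("Googlebot", url):
    # Crawling is blocked, but nothing here removes the URL from the
    # index -- and any on-page noindex tag stays invisible to the bot.
    print(f"{url}: blocked from crawling, not from indexing")
```

To get a page out of the index, the usual route is to let it be crawled and serve a meta noindex or an X-Robots-Tag response header, rather than disallowing it.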