Question about Syntax in Robots.txt
-
If I want to block any URL that contains a particular parameter from being indexed, what is the best way to put this in the robots.txt file?
Currently I have:
Disallow: /attachment_id
Where "attachment_id" is the parameter. The problem is that I still see these URLs indexed, and the rule has been in the robots.txt for over a month now. I am wondering if I should just do
Disallow: attachment_id or Disallow: attachment_id= but figured I would ask you guys first.
Thanks!
-
That's excellent, Chris.
Use the Remove Page function as well - it might help speed things up for you.
-Andy
-
I don't know how, but I completely forgot I could just pop those URLs into GWT and see whether they were blocked, and sure enough, Google says they are. I guess this is just a matter of waiting. Thanks much!
-
I have previously looked at both of those documents, and the issue remains that they don't exactly address how best to block parameters. I could do this through GWT, but I'm curious about the correct and preferred syntax for the robots.txt as well. I guess I could just look at sites like Amazon or other big sites to see what the common practices are. Thanks though!
-
Problem is I still see these URLs indexed and this has been in the robots now for over a month. I am wondering if I should just do
It can take Google some time to remove pages from the index.
The best way to test whether this has worked is to hop into Webmaster Tools and use the Test robots.txt function. If it has blocked the required pages, then you know it's just a case of waiting. You can also remove pages from within Webmaster Tools, although this isn't immediate.
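If you'd like to sanity-check the matching logic offline before the tool confirms it, Google-style patterns are simple to emulate: `*` matches any run of characters, a trailing `$` anchors the end of the URL, and an unanchored rule is a prefix match. Here's a rough Python sketch (just the pattern matching, not a full robots.txt parser) that also shows why a plain `Disallow: /attachment_id` rule never catches a query-string parameter:

```python
import re

def googlebot_blocked(pattern: str, url_path: str) -> bool:
    """Return True if url_path matches one Google-style Disallow pattern.

    '*' matches any character sequence and a trailing '$' anchors the
    end of the URL. Simplified sketch, not a full robots.txt parser.
    """
    anchored = pattern.endswith("$")
    core = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
    if not anchored:
        regex += ".*"  # an unanchored rule is a prefix match
    return re.fullmatch(regex, url_path) is not None

# The rule from this thread only matches paths that START with /attachment_id:
print(googlebot_blocked("/attachment_id", "/post?attachment_id=123"))    # False
# A wildcard rule matches the parameter anywhere in the query string:
print(googlebot_blocked("/*?attachment_id=", "/post?attachment_id=123")) # True
```

That's why the original rule sat in robots.txt for a month with no effect: it only blocks URLs whose path begins with /attachment_id.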
-Andy
-
Hi there
Take a look at Google's resource on robots.txt, as well as Moz's. You can get all the information you need there. You can also let Google know which URLs to exclude from its crawls via Search Console.
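As a rough sketch of the syntax those resources cover, a pair of wildcard rules like this should catch the parameter whether it is the first or a later item in the query string (Googlebot honors the `*` and `$` extensions, but not every crawler does):

```
User-agent: *
Disallow: /*?attachment_id=
Disallow: /*&attachment_id=
```

Note the leading `/*` — without it the rule is treated as a path prefix and won't match a parameter.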
Hope this helps! Good luck!
-
I'm not a robots.txt expert by a long shot, but I found this article, which is a little dated, that explained it to me in terms I could understand.
https://sanzon.wordpress.com/2008/04/29/advanced-usage-of-robotstxt-w-querystrings/
There is also a feature in Google Webmaster Tools called URL Parameters that lets you tell Google how to handle URLs with set parameters, for all sorts of reasons such as avoiding duplicate content. I haven't used it myself, but it may be worth looking into.