Will a robots.txt disallow apply to a 301ed URL?
-
Hi there,
I have a robots.txt question I haven't run into before, and as we're nearing a busy sales period, I'm hesitant to just roll the change out to live!
Say for example, in my robots.txt I disallow the URL 'example1.html'.
In reality, 'example1.html' 301s/302s to 'example2.html'. Would the robots.txt disallow also apply to 'example2.html', or, since it's a separate URL, would the directive simply not match it?
My feeling is that, as it's a separate URL, the disallow directive won't apply. However, I just thought I'd sense-check with the community.
-
I'd say example2.html wouldn't be affected by the robots.txt, because a bot can reach example2.html directly without ever visiting example1.html — especially once the page has been discovered, the bot no longer needs to go through example1.html at all.
-
I would have to agree. Using your example: if example1.html is blocked via robots.txt and, when visited, it 301s to example2.html, then example2.html is not blocked, as long as it's discoverable via other indexed pages linking to it. Note too that a compliant crawler never fetches a disallowed URL in the first place, so it won't even see the redirect.
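The point above is easy to verify: robots.txt rules are matched against the URL path a crawler is about to request, never against where that URL redirects. A minimal sketch with Python's standard-library `urllib.robotparser`, using the hypothetical URLs and rule from the question:

```python
from urllib import robotparser

# Hypothetical robots.txt from the question: only /example1.html is disallowed.
rules = """User-agent: *
Disallow: /example1.html
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The disallow matches the requested path only. A compliant crawler never
# fetches the blocked URL, so it never even sees the 301 to example2.html,
# and the redirect target itself remains crawlable.
print(rp.can_fetch("*", "https://www.example.com/example1.html"))  # False
print(rp.can_fetch("*", "https://www.example.com/example2.html"))  # True
```

So the only way a rule would cover example2.html is if robots.txt disallowed that path explicitly.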
Does anyone else in the community have some insight they would like to share?