Robots.txt on a site with a 301 redirect
-
We currently have a series of help pages that we would like to disallow in our robots.txt.
The thing is that these help pages are located on our old website, which now has a 301 redirect to the current site.
What is the proper way to go about this?
1- Add the pages we want to disallow to the robots.txt of the new website?
2- Break the redirect momentarily and add the pages to the robots.txt of the old one?
Thanks
-
In that case, you'd need to add the robots meta tag at the page level, inside the <head> of each page.
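For example, a page-level block would look something like this (a minimal sketch; use whichever combination of noindex/nofollow you actually need):
<code>
<head>
  <meta name="robots" content="noindex, nofollow">
</head>
</code>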
-
Hey, for some time we will keep the files on the old domain. Should we break the redirect and add the disallows to the robots.txt of the old site?
-
So, the problem is that the robots.txt file can't be accessed because of the 301 redirect to the new domain?
Do you plan to keep the help files on the old domain, or will they be removed completely?
-
Hi Laura,
Thanks for your reply. I don't want to disallow the URLs these pages are being redirected to. These URLs are on the old version of the site and can still be accessed. To put it simply, this is my case:
1- This was our old website: www.kilgray.com (it now has a 301 redirect)
2- This is our new website: www.memoq.com
3- I would like to disallow the following links on the old website, which are still accessible (they haven't been redirected); a sketch of what I have in mind follows them:
http://kilgray.com/memoq/2015-100/help-en/index.html
http://kilgray.com/memoq/2014/help-en/
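Assuming the robots.txt at kilgray.com can still be served without being redirected, something like this is what I mean (just a sketch using relative paths):
<code>
User-agent: *
Disallow: /memoq/2015-100/help-en/
Disallow: /memoq/2014/help-en/
</code>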
-
Do you want to disallow the URLs that these pages are being redirected to? If not, there's no need to add anything to the robots.txt file.
If you do want to disallow the URLs that these pages are being redirected to, use relative URLs in your robots.txt file. For example, let's say olddomain.com/old-help-page/ is being redirected to newdomain.com/new-help-page/. If that's the case, add the following to the new site's robots.txt file:
Disallow: /new-help-page/
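In other words, the full robots.txt at newdomain.com would look something like this (a minimal sketch; the User-agent line simply applies the rule to all crawlers):
<code>
User-agent: *
Disallow: /new-help-page/
</code>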
There's no need to disallow the specific URLs that are being redirected to something else. Are you trying to get them removed from Google's index or something? If so, Google will update their index eventually based on your 301 redirects.
Related Questions
-
Our protected pages 302 redirect to a login page if not a member. Is that a problem for SEO?
We have a membership site with links to protected pages on our unprotected pages. If a non-member clicks one of these links, they get a 302 redirect to the login/join page. Is this an issue for SEO? Thanks!
Technical SEO | rimix
-
Where to put 301 redirects in my Wordpress htaccess file?
I have about 25 301 redirects in my WordPress htaccess file that look like this:
<code>Redirect 301 /store/index.html https://www.notesinspanish.com/store-home/</code>
At the moment they are at the bottom of my htaccess file, below the usual WordPress rewrite rules:
<code>
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
</code>
So they are below all that. Above my WP rewrite rules I have a number of other rules from plugins (caching, SSL). Are my 301s OK where they are, at the very bottom of the file? They are working and redirecting pages correctly. Should they be somewhere else? Many thanks for any help.
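In other words, the overall layout of the file at the moment is roughly this (just a sketch of the order described above, with most lines elided):
<code>
# rules added by plugins (caching, SSL)
# ...

# BEGIN WordPress
<IfModule mod_rewrite.c>
# ... standard WordPress rewrite rules ...
</IfModule>
# END WordPress

# my 25 or so redirects
Redirect 301 /store/index.html https://www.notesinspanish.com/store-home/
# ...
</code>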
Technical SEO | Benspain
-
301 Redirect domain with penalty
Wondering if I could get a few views on this, please... I have added an affiliate store to a domain I own; however, I forgot to noindex the product pages, which were duplicate content of the merchant's. Despite a good deal of backlink building, the site will not do much in the engines at all; it doesn't even come up on the first few pages for its own name! This suggests to me that I have a duplicate content penalty. Try as I may, I cannot get it removed, so I am thinking of cloning the site to a new domain. However, I do not want to lose the links I have collected, so I am planning on 301ing them. While I will not get all the link power moved over, I should at least get credit for some of it, which will kick-start the new domain. Can anyone foresee any potential issues with doing this? Is there a danger, when 301ing a site with a penalty, that the penalty would be carried over? I know there is no penalty on the links and no WMT warnings, etc.; it is the content causing the issue. Thanks, Carl
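For reference, the kind of domain-level redirect I have in mind is roughly this (an .htaccess sketch with placeholder domain names, assuming the old site runs on Apache):
<code>
RewriteEngine On
# send every URL on the old (penalised) domain to the same path on the new domain
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-domain.com/$1 [R=301,L]
</code>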
Technical SEO | Grumpy_Carl
-
Redirect a 301 Redirect
Does any link juice get passed through a 301 redirect that points to another 301 redirect (i.e. a redirect chain)? If so, are there any studies that indicate an estimated percentage?
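To make the scenario concrete, this is the kind of chain I mean (an .htaccess sketch with hypothetical paths, assuming Apache):
<code>
# the original permanent redirect
Redirect 301 /old-page/ https://www.example.com/interim-page/
# the target is later 301 redirected again
Redirect 301 /interim-page/ https://www.example.com/final-page/
# a request for /old-page/ now passes through two 301s before reaching /final-page/
</code>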
Technical SEO | RedCaffeine
-
Can I remove 301 redirects after some time?
Hello, We have a very large number of 301 redirects on our site and would like to find a way to remove some of them. Is there a time frame after which Google no longer needs a 301? For example, if A is 301 redirected to B, does Google know after a while not to serve A any more, and replace any requests for A with B? How about any links that go to A? Or is the only option to have all links that pointed to A point to B, after which the 301 can be removed? Thank you for your help!
Technical SEO | Veva
-
Using Robots.txt
I want to block or prevent pages from being accessed or indexed by Googlebot. Please tell me if Googlebot will NOT access any URL that begins with my domain name, followed by a question mark, followed by any string, if I use the robots.txt below. Sample URL: http://mydomain.com/?example
<code>
User-agent: Googlebot
Disallow: /?
</code>
Technical SEO | semer
-
Robots.txt Sitemap with Relative Path
Hi Everyone, In robots.txt, can the sitemap be indicated with a relative path? I'm trying to roll out a robots file to ~200 websites, and they all have the same relative path for a sitemap but each is hosted on its own domain. Basically I'm trying to avoid needing to create 200 different robots.txt files just to change the domain. If I do need to do that, though, is there an easier way than just trudging through it?
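In other words, the question is whether a single shared file like this would work on every domain (example.com is just a placeholder):
<code>
User-agent: *
Disallow:

# relative path, identical on all ~200 sites
Sitemap: /sitemap.xml

# versus the absolute form, which would have to change per domain:
# Sitemap: https://www.example.com/sitemap.xml
</code>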
Technical SEO | MRCSearch
-
IIS Workaround for 301 Redirects
We are redirecting page-level content (about 500 pages) from several subdomains to our main site. With IIS, it's my understanding that file locations must match. For example:
subdomain/pathA/filename1
mainsite/pathA/filename1
Since the subdomain files are not on the main site, this means we'd create up to 500 zero-byte dummy files on the new server and replicate the subdomain directory structure. With IIS, is there a workaround for handling page-level redirects without duplicating the file location? In the case of white papers, videos and case studies, we'll implement directory-level redirection (a sketch of the kind of rule I mean follows below). Thanks in advance.
Technical SEO | DigitalMkt
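The sketch mentioned above: a rough web.config rule using the IIS URL Rewrite module (host and path names are placeholders, and this assumes the module is installed):
<code>
<system.webServer>
  <rewrite>
    <rules>
      <!-- 301 one subdomain URL to the matching path on the main site,
           with no dummy file needed at the old location -->
      <rule name="Redirect pathA filename1" stopProcessing="true">
        <match url="^pathA/filename1$" />
        <action type="Redirect" url="https://www.mainsite.com/pathA/filename1" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
</code>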