Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be available to view - we have locked both new posts and new replies.
Robots.txt on a site with a 301 redirect
-
We currently have a series of help pages that we would like to disallow in our robots.txt.
The thing is that these help pages are located on our old website, which now has a 301 redirect to the current site.
Which is the proper way to go about this?
1- Add the pages we want to disallow to the robots.txt of the new website?
2- Break the redirect momentarily and add the pages to the robots.txt of the old one?
Thanks
-
In that case, you'd need to add the robots meta tag at the page level, inside the <head> section of each page.
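For reference, a page-level block is normally just a single meta tag in the head of each help page. A minimal sketch (illustrative only; the exact directive depends on whether you also want links on those pages ignored):
<meta name="robots" content="noindex">
or, to also stop crawlers from following links on those pages:
<meta name="robots" content="noindex, nofollow">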
-
Hey, we will keep the files on the old domain for some time. Should we break the redirect and add the disallows to the robots.txt of the old site?
-
So, the problem is that the robots.txt file can't be accessed because of the 301 redirect to the new domain?
Do you plan to keep the help files on the old domain, or will they be removed completely?
-
Hi Laura,
Thanks for your reply. I don't want to disallow the URLs these pages are being redirected to. Actually, these URLs are on the old site and can still be accessed. So, to put it simply, this is my case:
1- This was our previous website: www.kilgray.com (it now has a 301 redirect)
2- This is our new website: www.memoq.com
3- I would like to disallow the following links on the old website that are still visible (haven't been redirected):
http://kilgray.com/memoq/2015-100/help-en/index.html
http://kilgray.com/memoq/2014/help-en/
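For reference, if the old domain's robots.txt were served directly rather than being 301-redirected, blocking those two help sections would come down to a couple of Disallow rules based on the paths above, for example:
User-agent: *
Disallow: /memoq/2015-100/help-en/
Disallow: /memoq/2014/help-en/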
-
Do you want to disallow the URLs that these pages are being redirected to? If not, there's no need to add anything to the robots.txt file.
If you do want to disallow the URLs that these pages are being redirected to, use relative URLs in your robots.txt file. For example, let's say olddomain.com/old-help-page/ is being redirected to newdomain.com/new-help-page/. If that's the case, add the following to the new domain's robots.txt file:
Disallow: /new-help-page/
There's no need to disallow the specific URLs that are being redirected to something else. Are you trying to get them removed from Google's index or something? If so, Google will update their index eventually based on your 301 redirects.
Related Questions
-
What to do with old content after 301 redirect
Technical SEO | LindsayE
I'm going through all our blog and FAQ pages to see which ones are performing well and which ones are competing with one another. Basically doing an SEO content clean-up. Is there any SEO benefit to keeping the page published vs trashing it after you apply a 301 redirect to a better performing page?
-
301 Redirect for multiple links
Technical SEO | shawnbeaird
I just relaunched my website and changed the permalink structure for several pages where only a subdirectory name changed. What 301 redirect code do I use to redirect the following? I have dozens of these where I need to change just the directory name from "urban-living" to "urban", and I want to catch all of the following in one redirect command. Here is an example of the structure that needs to change.
Old:
domain.com/urban-living (single page w/ content)
domain.com/urban-living/tempe (single page w/ content)
domain.com/urban-living/tempe/the-vale (single page w/ content)
New:
domain.com/urban
domain.com/urban/tempe
domain.com/urban/tempe/the-vale
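A single pattern-based rule can usually handle a directory rename like this in one go. A sketch, assuming an Apache .htaccess with mod_alias available (the domain and paths are taken from the example above), might look like:
# Send /urban-living and everything beneath it to /urban, preserving the rest of the path
RedirectMatch 301 ^/urban-living(/.*)?$ /urban$1
-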
301 Redirects, Sitemaps and Indexing - How to hide redirected urls from search engines?
Technical SEO | HeroDesignStudio
We have several pages on our site like this one, http://www.spectralink.com/solutions, which redirect to a deeper page, http://www.spectralink.com/solutions/work-smarter-not-harder. Both urls are listed in the sitemap and both pages are being indexed. Should we remove those redirecting pages from the sitemap? Should we prevent the redirecting url from being indexed? If so, what's the best way to do that?
-
2 sitemaps on my robots.txt?
Technical SEO | Webicultors
Hi, I thought that I could link only one sitemap from my site's robots.txt, but I may be wrong. So, I need to confirm whether this kind of implementation is right or wrong:
robots.txt for Magento Community and Enterprise ...
Sitemap: http://www.mysite.es/media/sitemap/es.xml
Sitemap: http://www.mysite.pt/media/sitemap/pt.xml
Thanks in advance
-
301 redirect adding trailing slash to url
Technical SEO | TimHolmes
I am looking into a .htaccess file for a site I look after and have noticed that the urls are all 301 redirected from a non-slash directory to a trailing-slashed directory/folder, e.g. www.domain.com/folder gets 301 redirected to www.domain.com/folder/. Will this do much harm, and will the effect of the page and of any links pointing to the site be lessened? Secondly, I am not sure what part of my .htaccess is causing the redirect.
RewriteCond %{HTTP_HOST} !^www.domain.co.uk [NC]
RewriteCond %{HTTP_HOST} !^$
RewriteRule ^(.*) http://www.domain.co.uk/$1 [L,R,NE]
RewriteCond %{THE_REQUEST} ^.*/index.php
RewriteRule ^(.*)index.php$ /$1 [R=301,L]
Or could a WordPress IfModule block be causing the problem? Any info would be appreciated.
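Neither of the rules above appends a trailing slash; that behaviour usually comes from mod_dir (DirectorySlash is on by default) or from the CMS's own canonical redirect. For comparison, a rule that adds the slash explicitly typically looks something like this (an illustrative sketch, not taken from the file in question):
# Append a trailing slash when an existing directory is requested without one
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
-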
Remove html file extension and 301 redirects
Technical SEO | ulefos
Hi, recently I asked a company to do some work on my website, but I am not sure what they've done is right.
What I wanted was html file extensions to be removed, like /ash-logs.html to /ash-logs, and also index.html to www.timports.co.uk.
I have done a crawl diagnostics and have duplicate page content and 32 page title duplicates. This is doing my head in, please help. This is what is in the .htaccess file:
<IfModule pagespeed_module>
ModPagespeed on
ModPagespeedEnableFilters extend_cache,combine_css,collapse_whitespace,move_css_to_head,remove_comments
</IfModule>
<IfModule mod_headers.c>
Header set Connection keep-alive
</IfModule>
<IfModule mod_rewrite.c>
Options +FollowSymLinks -MultiViews
</IfModule>
DirectoryIndex index.html
RewriteEngine On
# Rewrite valid requests on .html files
RewriteCond %{REQUEST_FILENAME}.html -f
RewriteRule ^ %{REQUEST_URI}.html?rw=1 [L,QSA]
# Return 404 on direct requests against .html files
RewriteCond %{REQUEST_URI} .html$
RewriteCond %{QUERY_STRING} !rw=1 [NC]
RewriteRule ^ - [R=404]
AddCharset UTF-8 .html
# <FilesMatch "\.(js|css|html|htm|php|xml|swf|flv|ashx)$">
# SetOutputFilter DEFLATE
# </FilesMatch>
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/gif "access plus 1 years"
ExpiresByType image/jpeg "access plus 1 years"
ExpiresByType image/png "access plus 1 years"
ExpiresByType image/x-icon "access plus 1 years"
ExpiresByType image/jpg "access plus 1 years"
ExpiresByType text/css "access 1 years"
ExpiresByType text/x-javascript "access 1 years"
ExpiresByType application/javascript "access 1 years"
ExpiresByType image/x-icon "access 1 years"
</IfModule>
<Files 403.shtml>
order allow,deny
allow from all
</Files>
redirect 301 /PRODUCTS http://www.timports.co.uk/kiln-dried-logs
redirect 301 /kindling_firewood.html http://www.timports.co.uk/kindling-firewood.html
redirect 301 /about_us.html http://www.timports.co.uk/about-us.html
redirect 301 /log_delivery.html http://www.timports.co.uk/log-delivery.html
redirect 301 /oak_boards_delivery.html http://www.timports.co.uk/oak-boards-delivery.html
redirect 301 /un_edged_oak_boards.html http://www.timports.co.uk/un-edged-oak-boards.html
redirect 301 /wholesale_logs.html http://www.timports.co.uk/wholesale-logs.html
redirect 301 /privacy_policy.html http://www.timports.co.uk/privacy-policy.html
redirect 301 /payment_failed.html http://www.timports.co.uk/payment-failed.html
redirect 301 /payment_info.html http://www.timports.co.uk/payment-info.html
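If the aim is for visitors to land on extension-less URLs, the 301 targets would normally point at the extension-less versions as well, along these lines (a sketch based on one of the rules above, not a verified fix for this site):
redirect 301 /kindling_firewood.html http://www.timports.co.uk/kindling-firewood
-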
How do I fix a 301 Redirect Loop?
Technical SEO | Gotmoxie
Saturday I was doing some correcting of some duplicate titles, including nofollowing tags, etc. (my main problem was duplicate titles due to tags and categories being indexed). Now this morning I see that one of my pages refuses to load, citing a 301 redirect loop: http://www.incredibleinfant.com/feeding/switching-baby-formula/. Originally, the page was posted under the wrong category: http://www.incredibleinfant.com/uncategorized/switching-baby-formula. I re-saved it under the correct category (feeding) and now it won't load. Can someone help me figure out how to correct this mess? Thanks so much, Heather
-
Index.php and 301 redirect with Joomla
Technical SEO | NaescentAdam
Hi, I'm running Joomla 1.7 with SEF on and I'm trying to do a .htaccess redirect which fails. I have approximately 100 in effect so far and all working fine, but I have one snag: index.php is not working as I need it to when it's redirected to www.myurl.com/. If I turn on the index.php-to-root redirect using this code:
# index.php to root
RewriteCond %{HTTP_HOST} ^myurl.com$ [OR]
RewriteCond %{HTTP_HOST} ^www.myurl.com$
RewriteRule ^index.php$ "http://www.myurl.com/" [R=301,L]
and then go to www.myurl.com/test.html, I'm redirected to the homepage. I think this is because all pages are index.php in Joomla. SEOmoz and Google both think that index.php and the root are duplicate pages. Does anyone have any advice for overcoming this? Thanks, Adam
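For context, a common way around that in Apache is to test the original client request rather than the internally rewritten URI, so only literal requests for /index.php get redirected. A sketch, reusing the host name from the snippet above:
# Redirect only when the browser actually asked for /index.php
RewriteCond %{THE_REQUEST} ^[A-Z]{3,}\s/index\.php [NC]
RewriteRule ^index\.php$ http://www.myurl.com/ [R=301,L]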