Robots.txt Sitemap with Relative Path
-
Hi Everyone,
In robots.txt, can the sitemap be indicated with a relative path? I'm trying to roll out a robots file to ~200 websites, and they all have the same relative path for a sitemap but each is hosted on its own domain.
Basically I'm trying to avoid needing to create 200 different robots.txt files just to change the domain. If I do need to do that, though, is there an easier way than just trudging through it?
-
Hi Nicholas,
Unfortunately not. The sitemap reference has to be an absolute URL. (You can confirm this by using the crawler access tool within Google Webmaster Tools.)
I'd suggest creating a PHP script that generates the robots.txt file with the correct domain for each site, rather than maintaining 200 files by hand.
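For example, here is a minimal sketch of that approach. It assumes requests for /robots.txt can be routed to the script (the Apache rewrite shown in the comment is one way to do it), and it uses /sitemap.xml as a stand-in for whatever your shared relative sitemap path actually is:

<?php
// robots.php - one shared script that serves robots.txt for every domain.
// Assumed routing (Apache): RewriteRule ^robots\.txt$ /robots.php [L]

header('Content-Type: text/plain; charset=utf-8');

// Build the absolute sitemap URL from whichever host was requested.
$scheme = (!empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off') ? 'https' : 'http';
$host   = $_SERVER['HTTP_HOST'];

echo "User-agent: *\n";
echo "Disallow:\n";
echo "\n";
// /sitemap.xml is a placeholder - swap in your sites' shared relative path.
echo "Sitemap: {$scheme}://{$host}/sitemap.xml\n";

Deploy that same script to all ~200 sites and each domain will advertise a fully qualified sitemap URL without any per-site edits.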
Good luck!
Related Questions
-
How to submit a Google XML sitemap properly in 2016?
Hello everyone! I'm new to the field of SEO. I'm looking for a guideline or tutorial on submitting an XML sitemap, but there is no proper guideline; all of the tutorials are about WordPress websites. What should I do for my PHP website? Can I submit an XML sitemap without the help of a developer? Please help me.
Technical SEO | | SEObd
-
Best way to create robots.txt for my website
How can I create a robots.txt file for my website, guitarcontrol.com? It has a login area and guitar lessons.
Technical SEO | | zoe.wilson17
-
What are the negative implications of listing URLs in a sitemap that are then blocked in the robots.txt?
In running a crawl of a client's site I can see several URLs listed in the sitemap that are then blocked in the robots.txt file. Other than perhaps using up crawl budget, are there any other negative implications?
Technical SEO | | richdan
-
How to use robots.txt to block areas on a page?
Hi, Across the category/product pages on our site there is an archives/shipping info section, and the text is always the same. Would this be treated as duplicate content and be harmful for SEO? How can I alter robots.txt to tell Google not to crawl that particular text? Thanks for any advice!
Technical SEO | | LauraHT
-
301 redirect relative or absolute path?
Hello everyone, Recently we changed the URL structure on our website, and of course we had to 301 redirect the old URLs to the corresponding new ones. The way the technical guys did this is: "http://www.domain.com/old-url.html" 301 redirects to "/new-url.html", i.e. as a relative redirect path, not an absolute one like this: "http://www.domain.com/old-url.html" 301 redirects to "http://www.domain.com/new-url.html". This happened for a few thousand URLs, and the fact is that organic traffic dropped for those pages after this change (no other changes were made on these pages, and the new URLs are as SEO-friendly as possible: an A grade in On-Page Grader). The question is: does the relative redirect negatively affect SEO, or does it count the same as an absolute-path redirect? Thanks,
S.
Technical SEO | | Silviu
-
Exclude Noindex, Followed pages from sitemap?
Hello everyone! This is a question about my site, which runs on WordPress. Currently my category pages have the noindex, follow attributes, as they have little unique content. I do currently have them in my sitemap.xml file, however. Should I remove them from the sitemap, since Google technically shouldn't index them? Thanks for your help!
Technical SEO | | Zachary_Russell
-
Empty Meta Robots Directive - Harmful?
Hi, We had a coding update, and a side-effect was that our meta robots directive was emptied; in other words, it now reads as an empty directive across the whole site. I've since noticed that Google's cache date on all of the pages - at least, the ones I tested - is no later than 17 December '12, which is the Monday after the directive was emptied en masse. So, A: does anyone have solid evidence of an empty directive causing problems? Past experience, a Matt Cutts or Fishkin quote, etc. And then B: it seems fairly well correlated, but does my entire site's homogeneous cache date point to this tag change? Or is it fairly normal to have a particular cache date across a large site (we're a large ecommerce site)? Our site: http://www.zando.co.za/ I'm having the directive reinstated as soon as Dev time permits. And then, for extra credit, is there a way with Google's API, or perhaps some other tool, to run through an arbitrary list of URLs and retrieve their cache dates? I'd want to do this for diagnosis purposes, and preferably in a way that's OK with Google. I'd rather avoid cURLing for the cached URLs and scraping out the dates with Bash, or anything of that kind. Cheers,
Technical SEO | | RocketZando
-
Partial mobile sitemap
Hi, We have a main www website with a standard sitemap. We also have an m. site for mobile content (but the m. site covers only our top pages and doesn't include the entire site). If a mobile client accesses one of our www pages, we redirect them to the m. page; if we don't have an m. version, we keep them on the www site. Currently we block robots from the mobile site. Since our m. site only contains the top pages, I'm trying to determine the boost we might get from creating a mobile sitemap. I don't want to create the "partial" mobile sitemap and somehow have it hurt our traffic. Here is my plan: update the m. pages to point rel canonical at the appropriate www page (to make sure we don't dilute SEO across m. and www), then create the mobile sitemap and allow all robots to access the site. Our www pages already rank fairly highly, so I just want to verify whether there are any concerns, given that m. is not a complete version of www.
Technical SEO | | NicB1