Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
2 sitemaps in my robots.txt?
-
Hi,
I thought I could only link one sitemap from my site's robots.txt, but... I may be wrong.
So I need to confirm whether this kind of implementation is right or wrong:
robots.txt for Magento Community and Enterprise
...
Sitemap: http://www.mysite.es/media/sitemap/es.xml
Sitemap: http://www.mysite.pt/media/sitemap/pt.xml
Thanks in advance,
-
We recently changed our protocol to https.
Our robots.txt already includes the link to our new https sitemap.
Our agency is recommending that we add another sitemap to our robots.txt file pointing to our insecure (http) sitemap while Google is reindexing our secure protocol. They recommend this as a way for all search engines to pick up the 301 redirects and swap out unsecured results in the index more efficiently.
Do you agree with this?
I am in the camp that we should only have our https sitemap and Google will figure it out. Having two sitemaps in our robots.txt, one pointing to our old http URLs and one to our new https URLs, seems redundant and may be viewed as duplicate content rather than as something that helps search engines see the 301s and reindex the secure links.
What's your take? Let me know if I need to explain more.
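For reference, a minimal sketch of the two-entry setup being debated (the example.com URLs are placeholders, not the site in question):
User-agent: *
Disallow:
# New sitemap on the secure protocol
Sitemap: https://www.example.com/sitemap.xml
# Old http sitemap, kept temporarily while the 301s are being recrawled
Sitemap: http://www.example.com/sitemap.xml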
-
Well, if both sitemaps are for the same site, then it's OK. But it's much better to implement hreflang, as explained here: https://support.google.com/webmasters/answer/2620865?hl=en
I'm not sure whether Magento can do this out of the box, but you can always hire a third-party developer to build a plugin/module for it.
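For illustration, hreflang annotations inside an XML sitemap could look roughly like this (the page paths below are made up; only the two domains come from the question):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- Spanish page, with a pointer to its Portuguese equivalent (paths are illustrative) -->
  <url>
    <loc>http://www.mysite.es/ejemplo-pagina/</loc>
    <xhtml:link rel="alternate" hreflang="es" href="http://www.mysite.es/ejemplo-pagina/"/>
    <xhtml:link rel="alternate" hreflang="pt" href="http://www.mysite.pt/exemplo-pagina/"/>
  </url>
  <!-- Portuguese page, carrying the same set of alternate links -->
  <url>
    <loc>http://www.mysite.pt/exemplo-pagina/</loc>
    <xhtml:link rel="alternate" hreflang="es" href="http://www.mysite.es/ejemplo-pagina/"/>
    <xhtml:link rel="alternate" hreflang="pt" href="http://www.mysite.pt/exemplo-pagina/"/>
  </url>
</urlset>
Each language version's sitemap repeats the same alternates, so either domain's sitemap tells search engines about both versions of a page.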
-
OK, just one detail: these domains are for a multilanguage site.
I mean, both have roughly the same content: one in Spanish and the other in Portuguese.
Thanks a lot.
-
You can also have multiple sitemaps, even hosted on third-party sites. Look at Moz's robots.txt:
Sitemap: https://moz.com/blog-sitemap.xml
Sitemap: https://moz.com/ugc-sitemap.xml
Sitemap: https://moz.com/profiles-sitemap.xml
Sitemap: http://d2eeipcrcdle6.cloudfront.net/past-videos.xml
Sitemap: http://app.wistia.com/sitemaps/36357.xml
Also, Google.com's robots.txt:
Sitemap: http://www.gstatic.com/culturalinstitute/sitemaps/www_google_com_culturalinstitute/sitemap-index.xml
Sitemap: http://www.gstatic.com/dictionary/static/sitemaps/sitemap_index.xml
Sitemap: http://www.gstatic.com/earth/gallery/sitemaps/sitemap.xml
Sitemap: http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml
Sitemap: http://www.gstatic.com/trends/websites/sitemaps/sitemapindex.xml
Sitemap: https://www.google.com/sitemap.xml
Also, Bing.com's robots.txt:
Sitemap: http://cn.bing.com/dict/sitemap-index.xml
Sitemap: http://www.bing.com/offers/sitemap.xml
So using multiple sitemaps is OK, and they can also be hosted on a third-party server.
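If the list of sitemaps grows long, another option (a sketch with placeholder URLs, not a requirement) is to point robots.txt at a single sitemap index file:
Sitemap: https://www.example.com/sitemap_index.xml
The index file itself then references the individual sitemaps:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Placeholder child sitemaps; swap in the real URLs -->
  <sitemap>
    <loc>https://www.example.com/sitemaps/blog.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemaps/products.xml</loc>
  </sitemap>
</sitemapindex>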
-
Hello,
Yes, multiple sitemaps are okay, and sometimes even advised!
You can read Google's official response here: "...it's fine for multiple Sitemaps to live in the same directory (as many as you want!)..."
And you can see a case study on Moz showing how multiple sitemaps have helped traffic.
Hope this helps,
Don
Related Questions
-
Getting 'Indexed, not submitted in sitemap' for around a third of my site. But these pages ARE in the sitemap we submitted.
As in the title, we have a site with around 40k pages, but around a third of them are showing as "Indexed, not submitted in sitemap" in Google Search Console. We've double-checked the sitemaps we have submitted and the URLs are definitely in the sitemap. Any idea why this might be happening? Example URL with the error: https://www.teacherstoyourhome.co.uk/german-tutor/Egham Sitemap it is located on: https://www.teacherstoyourhome.co.uk/sitemap-subject-locations-surrey.xml
Technical SEO | TTYH
-
My site ranking has dropped in the last 2 weeks?
I have a quick question to ask you guys. I have my site (www.webuycarstoday.co.uk). It used to rank very well for all my target keywords, but recently all keywords dropped massively. I can't seem to find a good reason for this drop. We have not changed anything that may have caused such a drop. Any suggestion or advice on getting to the bottom of the problem? I would appreciate your kind responses.
Technical SEO | Mustansar
-
Do I need a separate robots.txt file for my shop subdomain?
Hello Mozzers! Apologies if this question has been asked before, but I couldn't find an answer, so here goes... Currently I have one robots.txt file hosted at https://www.mysitename.org.uk/robots.txt. We host our shop on a separate subdomain, https://shop.mysitename.org.uk. Do I need a separate robots.txt file for my subdomain? (Some Google searches are telling me yes and some no, and I've become awfully confused!)
Technical SEO | sjbridle
-
Robots.txt blocking Addon Domains
I have this site as my primary domain: http://www.libertyresourcedirectory.com/ I don't want to give spiders access to the site at all so I tried to do a simple Disallow: / in the robots.txt. As a test I tried to crawl it with Screaming Frog afterwards and it didn't do anything. (Excellent.) However, there's a problem. In GWT, I got an alert that Google couldn't crawl ANY of my sites because of robots.txt issues. Changing the robots.txt on my primary domain, changed it for ALL my addon domains. (Ex. http://ethanglover.biz/ ) From a directory point of view, this makes sense, from a spider point of view, it doesn't. As a solution, I changed the robots.txt file back and added a robots meta tag to the primary domain. (noindex, nofollow). But this doesn't seem to be having any effect. As I understand it, the robots.txt takes priority. How can I separate all this out to allow domains to have different rules? I've tried uploading a separate robots.txt to the addon domain folders, but it's completely ignored. Even going to ethanglover.biz/robots.txt gave me the primary domain version of the file. (SERIOUSLY! I've tested this 100 times in many ways.) Has anyone experienced this? Am I in the twilight zone? Any known fixes? Thanks. Proof I'm not crazy in attached video. robotstxt_addon_domain.mp4
Technical SEO | eglove
-
IIS 7.5 - Duplicate Content and Totally Wrong robots.txt
Well, here goes! My very first post to SEOmoz. I have two clients that are hosted by the same hosting company. Both sites have major duplicate content issues and appear to have no internal links. I have checked this both here with our awesome SEOmoz tools and with the IIS SEO Toolkit. After much waiting, I have heard back from the hosting company, and they say that they have "implemented redirects in IIS7.5 to avoid duplicate content" based on the following article: http://blog.whitesites.com/How-to-setup-301-Redirects-in-IIS-7-for-good-SEO__634569104292703828_blog.htm. In my mind this article covers things better: www.seomoz.org/blog/what-every-seo-should-know-about-iis. What do you guys think? Next issue: both clients (as well as other sites hosted by this company) have a robots.txt file that is not their own. It appears that they have taken one client's robots.txt file and used it as a template for other client sites. I could be wrong, but I believe this is causing the internal links to not be indexed. There is also a sitemap, again not one for each client, but rather the one for the client that the original robots.txt file was created for. Again, any input on this would be great. I have asked that the files just be deleted, but that has not occurred yet. Sorry for the messy post... I'm at the hospital waiting to pick up my bro and could be called to get him any minute. Thanks so much, Tiff
Technical SEO | TiffenyPapuc
-
2 questions about linkbuilding
1. Are these types of sites bad to submit a link to? http://www.mompack.com/mom2mom/
2. If I submit my product for another blog to review (in turn they write a post for me with links to my website), is this GOOD?
Look forward to hearing back from you, thanks.
Technical SEO | ChrisTS
-
Best practice for XML sitemap depth
We run an eCommerce site for education products, with 20 or so subject-based catalogues (Maths, Literacy etc.), each catalogue having numerous ranges (Counting, Maths Games etc.), then products within those. We carry approximately 15,000 products. My question is about the sitemap we submit (nightly) and its depth. It is currently set to cover home, the catalogues and ranges, plus all static content (about us etc.). Should we be submitting sitemaps that include product pages as well? Does it matter, or would it not make much difference in terms of search? Thanks in advance.
Technical SEO | TTS_Group
-
How to allow one directory in robots.txt
Hello, is there a way to allow a certain child directory in robots.txt but keep all others blocked? For instance, we've got external links pointing to /user/password/, but we're blocking everything under /user/. And there are too many /user/somethings/ to just block every one BUT /user/password/. I hope that makes sense... Thanks!
Technical SEO | poolguy
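For the last question above, a minimal robots.txt sketch of the pattern being described, assuming the goal is exactly as stated (block everything under /user/ except /user/password/). Google treats the longer, more specific rule as the winner, but Allow support in other crawlers is worth verifying:
User-agent: *
# Block everything under /user/ ...
Disallow: /user/
# ...except the password directory, which stays crawlable
Allow: /user/password/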