Would a search engine treat a sitemap hosted in the cloud the same way as if it were simply at /sitemap.htm?
-
Mainly to allow updates without needing to republish the site - would Google interpret it any differently?
Thanks
-
How can you submit them to Search Console if they don't live on your root domain? I understand that you can reference the cloud sitemap URL in robots.txt, but without it being in Search Console you lose visibility into errors and indexing issues.
-
I can second this; it doesn't really seem to matter where your sitemaps live. Definitely not if you link to them from your robots.txt file, as that is proof that you can influence their location.
-
I haven't run any experiments on this, but it can be done by referencing the sitemap file from robots.txt. You can read more here -> https://www.sitemaps.org/protocol.html#sitemaps_cross_submits. Basically, you provide the link to the cloud-hosted file and tell crawlers that it is the sitemap for a given website. I don't think Google treats these files any differently.
In robots.txt:
Sitemap: https://yourcloudprovider.com/sitemap.htm (or .xml, or whatever)
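To make the mechanics concrete, here is a minimal sketch of how a crawler could pick up cross-submitted sitemaps from a robots.txt file. The robots.txt content and the cloud-provider URL are hypothetical examples, not anything from an actual site:

```python
# Hypothetical robots.txt body. Per the sitemaps.org protocol, a Sitemap
# directive may point to any absolute URL, including one on a different
# host (cross-submission) such as a cloud storage provider.
robots_txt = """\
User-agent: *
Disallow: /private/

Sitemap: https://yourcloudprovider.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-news.xml
"""

def extract_sitemaps(text):
    """Return the absolute sitemap URLs declared in a robots.txt body."""
    urls = []
    for line in text.splitlines():
        # The 'Sitemap:' field name is case-insensitive.
        if line.lower().startswith("sitemap:"):
            # Split on the first colon only, so the URL's own
            # 'https:' colon is left intact.
            urls.append(line.split(":", 1)[1].strip())
    return urls

print(extract_sitemaps(robots_txt))
# -> ['https://yourcloudprovider.com/sitemap.xml',
#     'https://www.example.com/sitemap-news.xml']
```

The point is that the directive is just an absolute URL; nothing in the format ties it to the host that serves the robots.txt file, which is why the cloud-hosted location works.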
Hope this helps.