Search Console rejecting XML sitemap files as HTML files, despite them being XML
-
Hi Moz folks,
We have launched an international site that uses subdirectories for regions, and we have had trouble getting pages outside of the USA and Canada indexed.
Google Search Console accounts have finally been verified, so we can submit each regional sitemap to the relevant Search Console account.
However, when submitting non-USA and CA sitemap files (e.g. AU, NZ, UK), we are receiving a submission error that states, "Your Sitemap appears to be an HTML page," despite them being .xml files, e.g. http://www.t2tea.com/en/au/sitemap1_en_AU.xml.
Searches on this error suggest it's a W3 Total Cache plugin problem, but we aren't using WordPress; the site runs on Demandware.
Can anyone guide us on why Google Search Console is rejecting these sitemap files? Page indexation is a real issue.
Many thanks in advance!
-
Thanks, both. We'll explore a better solution with Demandware.
-
agree
-
Quite sure that's the case. When I follow the URL, the site also redirects me to a normal page. Most likely the same thing is happening to Google's bots.
-
Extra thought: we're wondering whether it's a bigger issue involving the redirect mechanic. Currently, users from a specific country are automatically redirected to their respective locale (e.g. US users trying to access Australian URLs are redirected to /en/us/). Could it be that Googlebot can't reach the AU, NZ and UK subdirectories and sitemap files because it crawls from North America?
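A quick way to test this theory is to request the sitemap directly and look at the status code, any redirect Location header, and the Content-Type, both with a normal browser user agent and a Googlebot-style one. Below is a minimal sketch, assuming Python with the requests library is available; the user-agent strings are only illustrative, and a geo-IP redirect will still depend on where you run the script from:

```python
import requests

# AU sitemap URL from the question.
SITEMAP_URL = "http://www.t2tea.com/en/au/sitemap1_en_AU.xml"

# Fetch the sitemap with a normal browser UA and with a Googlebot-like UA,
# without following redirects, to see exactly what each client is served.
USER_AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot-like": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}

for label, ua in USER_AGENTS.items():
    resp = requests.get(SITEMAP_URL, headers={"User-Agent": ua}, allow_redirects=False)
    print(label)
    print("  status:      ", resp.status_code)
    print("  location:    ", resp.headers.get("Location"))      # present if geo-redirected
    print("  content-type:", resp.headers.get("Content-Type"))  # should be XML, not text/html
    print("  body starts: ", resp.text.lstrip()[:60])           # '<?xml' vs. '<!DOCTYPE html>'
```

If the response is a 301/302 to /en/us/, or the body starts with an HTML doctype, the geo-redirect is almost certainly what Search Console is reporting; the usual remedy is to stop geo-redirecting sitemap URLs (and ideally search-engine crawlers in general), which on this site would have to be done in Demandware rather than via a WordPress plugin.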
Related Questions
-
Is sitemap required on my robots.txt?
Hi, I know that linking your sitemap from your robots.txt file is a good practice. OK, but... may I just send my sitemap to Search Console and forget about adding it to my robots.txt? That's my situation: one multilang platform, which means two sets of pages, one for each lang, of course. But my CMS (Magento) only allows me to have one robots.txt file. So, again: may I have a robots.txt file with no sitemap AND not suffer any potential SEO loss? Thanks in advance, Juan Vicente Mañanas Abad
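For what it's worth, the robots.txt format allows any number of Sitemap directives, each with an absolute URL, so a single Magento robots.txt can reference one sitemap per language. A minimal sketch with hypothetical filenames (the real paths depend on your setup):

```
# Hypothetical example: one robots.txt referencing both language sitemaps.
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap_en.xml
Sitemap: https://www.example.com/sitemap_es.xml
```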
Technical SEO | Webicultors
-
Google ranks my sitemap.xml instead of blog post
Hello, for some reason Google shows sitemap results when I search for my blog URL website.com/blog/post. Why is Google ranking my sitemap but not the post, especially when I search for the full URL? Thanks
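One commonly suggested way to keep the sitemap file itself out of the results (without affecting how Google reads the URLs inside it) is to serve it with an X-Robots-Tag: noindex response header. A sketch assuming an Apache server with mod_headers enabled; the filename is an assumption and should match your actual sitemap:

```apacheconf
<Files "sitemap.xml">
  # Keep the sitemap file out of the index; crawlers can still fetch and parse it.
  Header set X-Robots-Tag "noindex"
</Files>
```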
Technical SEO | KentR
-
Should component pages be visible in the search result?
Hi everyone, my question is: suppose I have a blog with 200 paginated pages arranged in the footer, like the SEOmoz blog, and when I move to the 2nd page the URL is http://www.seomoz.org/blog?page=2. When I search for that exact URL on Google, should the page be visible in the search results or not? Since all of the SEOmoz blog's component pages are visible, I think this shouldn't be a problem, but when I look at other popular blogs like SEJ and Seroundtable, none of their component pages are visible in the search results. By the way, I am using rel=prev and next, but not robots noindex, follow.
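For reference, rel=prev/next is declared with link elements in the head of each paginated page; a minimal sketch for the page-2 URL mentioned above (the page-1 and page-3 URLs are assumptions):

```html
<!-- In the <head> of http://www.seomoz.org/blog?page=2 -->
<link rel="prev" href="http://www.seomoz.org/blog?page=1">
<link rel="next" href="http://www.seomoz.org/blog?page=3">
```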
Technical SEO | himanshu301989
-
XML Feed
If a site has an XML feed that is being used by 100 companies to create the content on their sites, will those 100 sites receive any link juice? Could the content be classed as duplicate across these sites? And should the page on the site the XML feed comes from be indexed first?
Technical SEO | jazavide
-
Is my robots.txt file working?
Greetings from medieval York, UK 🙂 Every time you enter my name & Liz, this page is returned in Google:
http://www.davidclick.com/web_page/al_liz.htm
But I have the following robots.txt file, which has been in place a few weeks:
User-agent: *
Disallow: /york_wedding_photographer_advice_pre_wedding_photoshoot.htm
Disallow: /york_wedding_photographer_advice.htm
Disallow: /york_wedding_photographer_advice_copyright_free_wedding_photography.htm
Disallow: /web_page/prices.htm
Disallow: /web_page/about_me.htm
Disallow: /web_page/thumbnails4.htm
Disallow: /web_page/thumbnails.html
Disallow: /web_page/al_liz.htm
Disallow: /web_page/york_wedding_photographer_advice.htm
Allow: /
So my question is, please: why is this page appearing in the SERPs when it's blocked in the robots.txt file, e.g. Disallow: /web_page/al_liz.htm? Any insights welcome 🙂
Technical SEO | Nightwing
-
Do I need an XML sitemap?
I have an established website that ranks well in Google. However, I have just noticed that no XML sitemap has been registered in Google Webmaster Tools, so the likelihood is that it hasn't been registered with the other search engines either. However, there is an HTML sitemap listed on the website. Seeing as the website is already ranking well, do I still need to generate and submit an XML sitemap? Could there be any detriment to current rankings in doing so?
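If you do decide to add one, a sitemap file is just a short XML document that you then submit in Webmaster Tools; a minimal sketch with a hypothetical URL:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal hypothetical sitemap: one <url> entry per indexable page. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
  </url>
</urlset>
```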
Technical SEO | pugh
-
Changing .html to .asp in URLs
Hi Mozzers, I have a question. The webmaster of a client of mine needs to make changes to some files which will affect the URLs. Essentially everything stays the same, but the end of each URL will change from .html to .asp. This is because the site will be dynamically loading content (perhaps from a database, e.g. latest news pulled from their blog). In order to do this we would need to change the filenames of the whole website (i.e. personnel.html would become personnel.asp). Changing URLs can harm indexation, but with such a small change to the end, would Google drop these pages? A 301 redirect is not possible from old URL to new. What impact would this have on rankings? Thanks, Gareth
Technical SEO | Bush_JSM