Search Console rejecting XML sitemap files as HTML files, despite them being XML
-
Hi Moz folks,
We have launched an international site that uses subdirectories for regions and have had trouble getting pages outside of the USA and Canada indexed.
Google Search Console accounts have finally been verified, so we can submit the correct regional sitemap to the relevant search console account.
However, when submitting non-USA and CA sitemap files (e.g. AU, NZ, UK), we are receiving a submission error that states, "Your Sitemap appears to be an HTML page," despite them being .xml files, e.g. http://www.t2tea.com/en/au/sitemap1_en_AU.xml.
Queries on this suggest it's a W3 Total Cache plugin problem, but we aren't using WordPress; the site runs on Demandware.
Can anyone guide us on why Google Search Console is rejecting these sitemap files? Page indexation is a real issue.
Many thanks in advance!
-
Thanks, both. We'll explore a better solution with Demandware.
-
Agreed.
-
Quite sure that's the case. When I follow the URL, the site redirects me to a normal page, so it's likely the same thing is happening to Google's bots.
-
Extra thought: we're wondering if this is part of a bigger issue with the redirect mechanic. Currently, users from a specific country are automatically redirected to their respective locale (e.g. US users trying to access Australian URLs are redirected to /en/us/). Could it be that Googlebot can't access the AU, NZ and UK subdirectories and sitemap files because it's crawling from North America?
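One quick way to test that theory is to request the sitemap directly, without following redirects, and compare what a browser and a Googlebot user-agent get back. A minimal sketch (the script and user-agent strings are illustrative; the URL is the one from the question), assuming Python with the requests library:

```python
import requests

SITEMAP_URL = "http://www.t2tea.com/en/au/sitemap1_en_AU.xml"

USER_AGENTS = {
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
}

for label, ua in USER_AGENTS.items():
    # allow_redirects=False shows the redirect itself instead of its target
    resp = requests.get(SITEMAP_URL, headers={"User-Agent": ua},
                        allow_redirects=False)
    print(f"{label}: HTTP {resp.status_code}, "
          f"Content-Type: {resp.headers.get('Content-Type')}")
    if resp.is_redirect or resp.is_permanent_redirect:
        print(f"  redirects to: {resp.headers.get('Location')}")
```

A healthy sitemap should return 200 with an XML content type; a 30x to a locale page, or a text/html response, would explain the "appears to be an HTML page" error. Bear in mind the geo-redirect keys off IP rather than user-agent, so running this from North America reproduces what Googlebot (which crawls mostly from US IPs) would see.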
Related Questions
-
My pages are being crawled, but not indexed according to Search Console
According to Google Search Console, my pages are being crawled but not indexed. We use Shopify, and about two weeks ago I set traffic from all our domains to redirect to our primary domain, so everything from www.url.com, https://url.com and so on redirects to one URL. I've attached an image from Search Console.
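If it helps to verify that side of things, here is a rough sketch (placeholder domains, assuming Python with the requests library) for checking that every variant ends up at the primary URL via a single clean 301:

```python
import requests

PRIMARY = "https://www.example.com/"   # placeholder primary domain
VARIANTS = [                           # placeholder variants to check
    "http://www.example.com/",
    "http://example.com/",
    "https://example.com/",
]

for url in VARIANTS:
    resp = requests.get(url)           # follows redirects by default
    hops = [r.status_code for r in resp.history]
    ok = resp.url == PRIMARY and all(code == 301 for code in hops)
    print(f"{url} -> {resp.url} via {hops} [{'OK' if ok else 'CHECK'}]")
```

Chained or 302 redirects here would be worth ruling out before digging further into the indexing side.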
Technical SEO | HariOmHemp
-
How to remove Parameters from Google Search Console?
Hi all, here is the parameter configuration in Search Console:
Parameter: fl
Does this parameter change page content seen by the user? - Yes: changes, reorders, or narrows page content.
How does this parameter affect page content? - Narrows
Which URLs with this parameter should Googlebot crawl? - Let Googlebot decide (default)
Query: it is actually a filter parameter, and I have already set a canonical on the filter pages. I am now tracking the filter pages via the data layer and Tag Manager, but because of this parameter I cannot see the filter URLs in Google Analytics, so I want to delete the parameter. Can anyone please help me? Thanks!
Technical SEO | adamjack
-
HTML Improvements: Missing title tags
Hi, suddenly today Webmaster Tools is showing hundreds of posts with missing title tags under HTML Improvements (https://www.google.com/webmasters/tools/html-suggestions?hl=en&siteUrl=). Please see the image: http://imgur.com/zHsXVJN. Can anyone help with why this happened all of a sudden, what it means, and what the solution is? Thanks
Technical SEO | mtthompsons
-
Sitemaps and "noindex" pages
Experimenting a little to recover from Panda, we added a "noindex" tag to quite a few pages. Obviously we now need Google to re-crawl them ASAP and de-index them. Should we leave these pages in the sitemaps (with an updated "lastmod") for that, or just wait patiently? 🙂 What's the common/best way?
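If you do keep them in the sitemap, here is a hypothetical sketch (placeholder URLs, Python) of regenerating those entries with a fresh lastmod so crawlers revisit them sooner:

```python
from datetime import date

# Placeholder URLs for the pages that just received the "noindex" tag
noindexed_urls = [
    "https://www.example.com/thin-page-1",
    "https://www.example.com/thin-page-2",
]

today = date.today().isoformat()
entries = "\n".join(
    f"  <url>\n    <loc>{u}</loc>\n    <lastmod>{today}</lastmod>\n  </url>"
    for u in noindexed_urls
)
print('<?xml version="1.0" encoding="UTF-8"?>\n'
      '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
      f"{entries}\n"
      '</urlset>')
```

The idea is only to speed up recrawling: once the pages have dropped out of the index, the entries can be removed from the sitemap again.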
Technical SEO | LocalLocal
-
Pagination and SEO: how do I handle paginated pages with search parameters?
Today I watched a very interesting video on YouTube about pagination and SEO, and I have implemented pagination with rel="next" and rel="prev" on my paginated pages. You can get an idea from the following pages: www.vistastores.com/patio-umbrellas www.vistastores.com/patio-umbrellas?p=2 www.vistastores.com/patio-umbrellas?p=3 I have added a NOINDEX, FOLLOW attribute to page 2, page 3 and so on. A simple question from my side: can I remove the NOINDEX, FOLLOW attribute from the paginated pages or not? My big confusion is with paginated URLs that contain search parameters, e.g. http://www.vistastores.com/patio-umbrellas?dir=asc&order=name&p=2 http://www.vistastores.com/patio-umbrellas?dir=asc&order=name&p=3 What is the best suggestion for this kind of page?
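For the parameterized series, the usual pattern at the time was to keep rel="prev"/"next" pointing within the same parameter set and let canonicals handle duplication. An illustrative Python sketch (the helper and its assumptions, such as page 1 omitting the p parameter, are mine, not from the question):

```python
from urllib.parse import urlencode

def _url(base, query):
    return f"{base}?{urlencode(query)}" if query else base

def pagination_links(base_url, sort_params, page, last_page):
    """Build rel=prev/next tags that stay inside one parameterized series."""
    links = []
    if page > 1:
        prev_q = dict(sort_params)
        if page - 1 > 1:              # assume page 1 is the bare sorted URL
            prev_q["p"] = page - 1
        links.append(f'<link rel="prev" href="{_url(base_url, prev_q)}">')
    if page < last_page:
        next_q = {**sort_params, "p": page + 1}
        links.append(f'<link rel="next" href="{_url(base_url, next_q)}">')
    return links

# e.g. page 2 of the name-sorted series from the question
for tag in pagination_links("http://www.vistastores.com/patio-umbrellas",
                            {"dir": "asc", "order": "name"},
                            page=2, last_page=3):
    print(tag)
```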
Technical SEO | CommercePundit
-
URL restructure and phasing out HTML sitemap
Hi SEOMozzies, love the Q&A resource; I've already found lots of useful stuff! I just started as an in-house SEO at a retailer, and my first main challenge is to tidy up the complex URL structures and remove the ugly sub-sitemap approach currently used. I've found a number of suggestions, but it looks like I'm dealing with several challenges that I need to resolve in a single release.

Here is the current setup: the website is an ecommerce site (department store) with around 30k products, using multi-select navigation (non-Ajax). The main website uses a third-party search engine to power the multi-select navigation, and that search engine has a very ugly URL structure. For example, www.domain.tld/browse?location=1001/brand=100/color=575&size=1 (plus various other params), or for multi-select URLs www.domain.tld/browse?location=1001/brand=100,104,506/color=575&size=1 (plus various other unused URL params). URLs are easily up to 200 characters long and not at all descriptive to our users. Many of these URLs are indexed by search engines (we currently have 1.2 million of them indexed, including session IDs and all other nasty URL params).

Next to this, the site uses a "sub site" that is sort of optimized for SEO; I'm not 100% sure this is cloaking, but it smells like it. It has a simplified navigation structure and a better URL structure for products. The layout is similar to our main site, but all complex HTML elements like multi-select and large top navigation menus are removed. Many of these links are indexed by search engines and rank higher than links from our main website. The URL structure is www.domain.tld/1/optimized-url; currently 64,000 of these URLs are indexed. We link to this sub site in the footer of every page, but a normal customer would never reach it unless they come from organic search. Once a user lands on one of these pages we try to push them back to the main site as quickly as possible.

My planned approach to improve this:

1. Tidy up the URL structure in the main website (e.g. www.domain.tld/women/dresses and www.domain.tld/diesel-red-skirt-4563749). I plan to use Solution 2 as described in http://www.seomoz.org/blog/building-faceted-navigation-that-doesnt-suck to block multi-select URLs from being indexed, and would like to use the URL param "location" as an indicator for search engines to ignore the link. A risk here is that all my currently indexed URLs (1.2 million) will be blocked immediately after I put this live. I cannot redirect those URLs to the optimized URLs, as the old URLs should still be accessible.

2. Remove the links to the sub site (www.domain.tld/1/optimized-url) from the footer and redirect (301) all those URLs to the newly created SEO-friendly product URLs. URLs that cannot be matched, because there is no similar catalog location in the main website, will be redirected (301) to our homepage (a sketch of this mapping follows below).

I wonder if this is a correct approach, and whether it would be better to do this in a phased way rather than the currently planned big bang? Any feedback would be highly appreciated; let me know if anything is unclear. Thanks! Chris
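For step 2, a lookup table with a homepage fallback keeps that redirect logic explicit. A rough Python sketch (the paths are made up for illustration):

```python
# Map legacy sub-site paths to the new SEO-friendly URLs
REDIRECT_MAP = {
    "/1/women-dresses": "/women/dresses",
    "/1/diesel-red-skirt": "/diesel-red-skirt-4563749",
}

def redirect_target(old_path, homepage="/"):
    """Return the 301 target for a legacy sub-site path."""
    return REDIRECT_MAP.get(old_path, homepage)

assert redirect_target("/1/women-dresses") == "/women/dresses"
assert redirect_target("/1/no-matching-category") == "/"   # falls back home
```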
Technical SEO | eCommerceSEO
-
Indexing of Flash files
When Google indexes a Flash file, do they use a library for that purpose? What set me thinking was this (admittedly old) blog post, which states: "we expanded our SWF indexing capabilities thanks to our continued collaboration with Adobe and a new library that is more robust and compatible with features supported by Flash Player 10.1."
Technical SEO | seoug_2005
-
Partial mobile sitemap
Hi, we have a main www website with a standard sitemap. We also have an m. site for mobile content, but m. only covers our top pages, not the entire site. If a mobile client accesses one of our www pages, we redirect them to the m. page; if we don't have an m. version, we keep them on the www site. Currently we block robots from the mobile site. Since our m. site only contains the top pages, I'm trying to determine the boost we might get from creating a mobile sitemap. I don't want to create the "partial" mobile sitemap and somehow have it hurt our traffic. Here is my plan:

1. Update m. pages to point rel canonical to the appropriate www page (making sure we don't dilute SEO across m. and www).
2. Create the mobile sitemap and allow all robots to access the site.

Our www pages already rank fairly highly, so I just want to verify whether there are any concerns, since m. is not a complete version of www.
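If you do go ahead, the alternate/canonical pairing can also be declared in the www sitemap itself via xhtml:link, alongside the rel=canonical on each m. page from step 1. A hedged sketch (placeholder domains, Python generating the XML):

```python
# Placeholder (www, m.) pairs: only the pages that have a mobile version
PAIRS = [
    ("https://www.example.com/", "https://m.example.com/"),
    ("https://www.example.com/top-page", "https://m.example.com/top-page"),
]

MEDIA = "only screen and (max-width: 640px)"
entries = "\n".join(
    "  <url>\n"
    f"    <loc>{www}</loc>\n"
    f'    <xhtml:link rel="alternate" media="{MEDIA}" href="{m}"/>\n'
    "  </url>"
    for www, m in PAIRS
)
print('<?xml version="1.0" encoding="UTF-8"?>\n'
      '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
      '        xmlns:xhtml="http://www.w3.org/1999/xhtml">\n'
      f"{entries}\n"
      '</urlset>')
```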
Technical SEO | NicB1