Submitting a new sitemap index file. Only one file is getting read. What is the error?
-
Hi community,
I am working to submit a new sitemap index file, which references about 5 files of 50,000 SKUs each. Webmaster Tools is reporting that only 50k SKUs have been submitted.
Google Webmaster Tools is accepting the index; however, only the first file is getting read. I have 2 warnings and need to know if they are the reason the additional files are not getting read.
Errors:
| # | Type | Error | Details |
| 1 | Warning | Invalid XML: too many tags | Too many tags describing this tag. Please fix it and resubmit. |
| 2 | Warning | Incorrect namespace | Your Sitemap or Sitemap index file doesn't properly declare the namespace. |
Here is the url I am submitting: http://www.westmarine.com/sitemap/wm-sitemap-index.xml
-
thank you,
I grabbed this and put it into a text file for my developer to see, and we have made sense of your corrections.
-
The forum is butchering my formatting, but I hope you get the idea.
-
Try changing the first line of your index file to:
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
That should get rid of the second warning. For the first warning, you are not using enough <sitemap> tags: you need to wrap each <loc> tag in one, so your sitemap index should look like:
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.westmarine.com/sitemap/wm-products01.xml.gz</loc></sitemap>
  <sitemap><loc>http://www.westmarine.com/sitemap/wm-products02.xml.gz</loc></sitemap>
  <sitemap><loc>http://www.westmarine.com/sitemap/wm-brands01.xml.gz</loc></sitemap>
  <sitemap><loc>http://www.westmarine.com/sitemap/wm-categories01.xml.gz</loc></sitemap>
  <sitemap><loc>http://www.westmarine.com/sitemap/wm-content-pages01.xml.gz</loc></sitemap>
  <sitemap><loc>http://www.westmarine.com/sitemap/wm-commerce-pages01.xml.gz</loc></sitemap>
</sitemapindex>
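If it helps to sanity-check the file before resubmitting, here is a minimal sketch using only Python's standard library. It builds a sitemap index with the correct namespace and one <sitemap><loc> wrapper per child file, then parses it back to confirm both points the warnings complain about. The first three file names are taken from the example above; the helper name and structure are my own, not anything Google or Moz provides.

```python
# Sketch: build a sitemap index with the proper namespace declaration and
# one <sitemap> wrapper per child file, then parse it back as a check.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

child_files = [
    "http://www.westmarine.com/sitemap/wm-products01.xml.gz",
    "http://www.westmarine.com/sitemap/wm-products02.xml.gz",
    "http://www.westmarine.com/sitemap/wm-brands01.xml.gz",
]

def build_sitemap_index(urls):
    """Return the sitemap index as an XML string."""
    root = ET.Element("sitemapindex", xmlns=SITEMAP_NS)
    for url in urls:
        sitemap = ET.SubElement(root, "sitemap")   # one <sitemap> per file
        ET.SubElement(sitemap, "loc").text = url   # <loc> nested inside it
    return ET.tostring(root, encoding="unicode")

xml_text = build_sitemap_index(child_files)

# Parse it back: ElementTree folds a properly declared namespace into every
# tag name, so checking the tag is a quick way to confirm the declaration.
root = ET.fromstring(xml_text)
assert root.tag == "{%s}sitemapindex" % SITEMAP_NS
locs = root.findall("{%s}sitemap/{%s}loc" % (SITEMAP_NS, SITEMAP_NS))
print(len(locs))  # one <loc> found per child sitemap
```

If the namespace were missing or a <loc> sat directly under the root instead of inside a <sitemap>, the assertion or the findall count here would catch it.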