Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we’re not completely removing the content - many posts will still be possible to view - we have locked both new posts and new replies. More details here.
Submitting multiple sitemaps
-
I recently moved over from HTML to WordPress. I have the Google sitemap plugin on the new WordPress site, but in Webmaster Tools it's only showing 71 pages, and I have hundreds, many of them still plain HTML.
Is it okay to submit an HTML sitemap as well as the WP sitemap that's already in there?
-
I agree with that. If you go with multiple XML sitemaps, you do have to wait a while after submission.
I have very good experience with multiple XML sitemaps.
I am working on an eCommerce website and submitted 24 XML sitemaps to Google Webmaster Tools.
Just look into the multiple XML sitemaps for Lamps Lighting and More!
You can see that Google Webmaster Tools shows very few indexed URLs at first.
I had a similar experience with another eCommerce website, where I submitted 7K+ URLs and 300+ were indexed by Google within 15 days.
-
Can someone help me here?
I used the sitemap generator and got 500-plus pages.
I uploaded the file to the root of my server, submitted it to Google a second time, and got:
Parsing error
We were unable to read your Sitemap. It may contain an entry we are unable to recognize. Please validate your Sitemap before resubmitting.
I don't know how to fix this.
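A parsing error like this usually means the file isn't well-formed XML — for example, an unescaped `&` inside a URL. As a first check, you can run the file through any XML parser before resubmitting. A minimal sketch in Python, assuming the generated sitemap is saved locally as `sitemap.xml` (the file name is just a placeholder):

```python
# Quick well-formedness check for an XML sitemap before resubmitting it.
import xml.etree.ElementTree as ET

def check_sitemap(path):
    """Return (True, url_count) if the file parses, else (False, error text)."""
    try:
        tree = ET.parse(path)
    except (ET.ParseError, OSError) as e:
        return False, str(e)
    # Count <loc> entries under the sitemaps.org namespace.
    ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
    urls = tree.getroot().findall(f"{ns}url/{ns}loc")
    return True, len(urls)

if __name__ == "__main__":
    ok, info = check_sitemap("sitemap.xml")
    print("parsed OK, URLs found:" if ok else "parse error:", info)
```

This only confirms the XML is well-formed; Google's validator can still reject entries that parse but don't match the sitemaps.org schema.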
-
Well, I created a new sitemap using the generator above, renamed it, uploaded it to the server, and submitted it to Google, but Google rejected it with an error.
-
I'm not saying the sitemap is HTML, I'm saying the pages are HTML. And I already have one XML sitemap that is auto-generated by the new WordPress platform, but I have a ton of HTML pages that the new sitemap is not picking up.
So do I just create another sitemap and add all those pages? Then there will be two sitemaps.
Edit: Just ran the sitemap generator. Pretty cool. Now there are some duplicates. Do I need to go in and remove the pages that already show in the first sitemap, or is it okay to have them in both sitemaps?
-
Google does not support HTML sitemaps and will only crawl them like any other webpage. But you can submit multiple XML sitemaps to both Bing and Google. I personally use a program called Sitemap Generator.
-
Oh, and add both of them to your robots.txt file, or create a sitemapindex.xml file that lists both and then include just that index file in robots.txt.
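For illustration, the two pieces would look something like this (domain and file names are placeholders). The robots.txt line:

```
# robots.txt at the site root
Sitemap: http://www.example.com/sitemapindex.xml
```

And the sitemap index file it points to, per the sitemaps.org format:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemaphtml.xml</loc>
  </sitemap>
</sitemapindex>
```

With the index in place, you can submit just the one index URL and the search engines discover both child sitemaps from it.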
-
You can create one manually or use a sitemap generator. Just be sure to call it something other than the name of your existing WordPress-generated sitemap.xml file, so it could be sitemaphtml.xml or sitemap2.xml.
They need to be in the XML format outlined by sitemaps.org to be recognized by Google Webmaster Tools. Also submit both to Bing Webmaster Tools.
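For reference, a minimal second sitemap in the sitemaps.org XML format (saved, say, as sitemaphtml.xml) would look like this; the URLs are placeholders, and `<lastmod>` is optional:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/old-page.html</loc>
    <lastmod>2012-01-15</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/another-old-page.html</loc>
  </url>
</urlset>
```

Note that `&` in URLs must be escaped as `&amp;`, which is a common cause of the parsing errors mentioned above.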
-
Well, the current sitemap Google is recognizing is the newer WordPress one, which is a .xml file.
So how can I create an additional one that lists all the HTML pages, so Google can easily find them?
-
I'm not sure about your HTML sitemap; I don't think HTML sitemaps are a supported format for submission to Google (I don't see them on sitemaps.org). You just need Google to crawl this page and all the pages it links to? There is a plain text format (see here) that is allowed for sitemaps. You could probably convert your HTML sitemap to that format pretty easily.
I'm pretty sure you're allowed to submit multiple sitemaps, but I can't find anything concrete saying you can or can't. The Google Webmaster Tools UI seems to support it, so my guess is that it would be fine. Try it and see if it works? You could also create a sitemap index file that references both sitemaps.
You can read more about sitemaps on sitemaps.org. According to the Google help doc here, Google adheres to those standards.
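The plain text format is just one fully qualified URL per line, saved as UTF-8. If you already have an HTML sitemap page, one way to produce that file is to extract the page's links with a short script. A rough sketch in Python using only the standard library; the input and output file names, and the base URL, are assumptions for illustration:

```python
# Extract links from a saved HTML sitemap page and write a plain-text
# sitemap: one absolute URL per line, as allowed by sitemaps.org.
import os
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collect every href found on <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def html_links_to_text_sitemap(html, base_url):
    """Return de-duplicated absolute http(s) URLs, in page order."""
    parser = LinkCollector()
    parser.feed(html)
    seen, out = set(), []
    for href in parser.links:
        url = urljoin(base_url, href)  # resolve relative links
        if url.startswith("http") and url not in seen:
            seen.add(url)
            out.append(url)
    return out

if __name__ == "__main__":
    # Hypothetical file saved from the existing HTML sitemap page.
    if os.path.exists("html-sitemap.html"):
        with open("html-sitemap.html", encoding="utf-8") as f:
            lines = html_links_to_text_sitemap(f.read(), "http://www.example.com/")
        with open("sitemap.txt", "w", encoding="utf-8") as f:
            f.write("\n".join(lines))
```

The resulting sitemap.txt can be uploaded and submitted just like an XML sitemap, though it carries no lastmod or priority information.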