Submitting multiple sitemaps
-
I recently moved over from HTML to WordPress. I have the Google sitemap plugin on the new WordPress site, but Webmaster Tools is only showing 71 pages, and I have hundreds, many of which are still HTML.
Is it okay to submit an HTML sitemap as well as the WP sitemap that's already in there?
-
I agree with that. If you want to go with multiple XML sitemaps, you just have to be patient after submission.
I have a lot of experience with multiple XML sitemaps.
I am working on an eCommerce website and submitted 24 XML sitemaps to Google Webmaster Tools.
Just look at the multiple XML sitemaps for Lamps Lighting and More!
You can see that Google Webmaster Tools still shows very few indexed URLs.
I had a similar experience with another eCommerce website, where I submitted 7K+ URLs and Google indexed 300+ within 15 days.
-
Can someone help me here?
I used the sitemap generator and got 500-plus pages.
I uploaded it to the root of my server, submitted it to Google a second time, and got:
Parsing error
We were unable to read your Sitemap. It may contain an entry we are unable to recognize. Please validate your Sitemap before resubmitting.
I don't know how to fix this.
-
Well, I created a new sitemap using the generator above, renamed it, uploaded it to the server, and submitted it to Google, but Google did not accept it and returned an error.
-
I'm not saying the sitemap is HTML, I'm saying the pages are HTML. And I already have one XML sitemap that is auto-generated by the new WordPress platform, but I have a ton of HTML pages the new sitemap is not picking up.
So do I just create another one and add all those pages? Then there will be two sitemaps.
Edit: Just ran the sitemap generator. Pretty cool. Now there are some duplicates, so do I need to go in and remove the pages that already show in the first sitemap, or is it okay to have them in both sitemaps?
-
Google does not support HTML sitemaps and will only crawl them like any other webpage. But you can submit multiple XML sitemaps to both Bing and Google. I personally use a program called Sitemap Generator.
-
Oh, and add both of them to your robots.txt file, or create a sitemapindex.xml file that lists both and then include just that index file in the robots.txt.
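As a rough sketch (the example.com domain and file names here are placeholders, so swap in your own), a sitemap index is just a small XML file that points at each individual sitemap:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemapindex.xml: lists each individual sitemap so you only need to reference one file -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap2.xml</loc>
  </sitemap>
</sitemapindex>
```

And in robots.txt you can either list both sitemaps or just the index that references them:

```
# robots.txt: either reference each sitemap directly...
Sitemap: http://www.example.com/sitemap.xml
Sitemap: http://www.example.com/sitemap2.xml
# ...or only the index file that lists them
Sitemap: http://www.example.com/sitemapindex.xml
```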
-
You can create one manually or use a sitemap generator. Just be sure to call it something other than the name of your existing WP-generated sitemap.xml file, so it could be sitemaphtml.xml or sitemap2.xml.
Both files need to be in the XML format outlined by sitemaps.org to be recognized by Google Webmaster Tools, and you should also submit both to Bing Webmaster Tools.
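For reference, a minimal sketch of that sitemaps.org format (the URLs and dates here are placeholders for your actual HTML pages; only the loc tag is required, the others are optional):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap2.xml: one <url> entry per HTML page -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/old-page-1.html</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>monthly</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/old-page-2.html</loc>
  </url>
</urlset>
```

Note that special characters in URLs, like &, have to be XML-escaped (as &amp;), which is one of the most common causes of the "Parsing error" message mentioned above.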
-
Well, the current sitemap Google is recognizing is the WordPress one (the newer one), which is an .xml file.
So how can I create an additional one that lists all the HTML pages, so Google can easily find them?
-
I'm not sure about your HTML sitemap; I don't think HTML sitemaps are a supported format for submission to Google (I don't see them on sitemaps.org). Do you just need Google to crawl that page and all the pages it links to? There is a plain text format (see here) that is allowed for sitemaps, and you could probably convert your HTML sitemap to that format pretty easily.
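As a sketch of that plain text option, the file is just one fully qualified URL per line, UTF-8 encoded, with nothing else in it (these URLs are placeholders):

```
http://www.example.com/about-us.html
http://www.example.com/services.html
http://www.example.com/contact.html
```

Save it as something like sitemap-html.txt, upload it to the site root, and submit it the same way you would an XML sitemap.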
I'm pretty sure you're allowed to submit multiple sitemaps, but I can't find anything concrete saying you can or can't. The Google Webmaster Tools UI seems to support it, so my guess is that it would be fine. Try it and see if it works? You could also create a sitemap index file that references both these sitemaps.
You can read more about sitemaps on sitemaps.org. According to the Google help doc here, they adhere to these standards.