Which is the best structure for multiple XML sitemaps?
-
I read some great blog posts about multiple XML sitemaps about a week ago.
I have created multiple XML sitemaps for my eCommerce website with the following structure and submitted them to Google Webmaster Tools:
http://www.vistastores.com/main_sitemap.xml
http://www.vistastores.com/products_sitemap.xml
However, I am not satisfied with my second XML sitemap because it contains more than 7,000 product page URLs, and Google seems to be crawling it very slowly.
I want to split my XML sitemap using one of the following structures.
By root-level category:
http://www.vistastores.com/outdoor_sitemap.xml
http://www.vistastores.com/furniture_sitemap.xml
http://www.vistastores.com/kitchen_dining_sitemap.xml
http://www.vistastores.com/home_decor_sitemap.xml
Or by end-level category:
http://www.vistastores.com/table_lamps_sitemap.xml
http://www.vistastores.com/floor_lamps_sitemap.xml
...etc.
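For reference, each of these category files would use the standard sitemaps.org urlset format. A minimal sketch of one entry (the product URL and values below are placeholders, not real pages from my site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per product page in this end-level category -->
  <url>
    <loc>http://www.vistastores.com/table-lamps/example-product</loc>
    <lastmod>2012-05-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```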
So, which is the best structure for multiple XML sitemaps?
-
I have developed multiple XML sitemaps for my eCommerce website (Lamps Lighting and More) and submitted them to Google Webmaster Tools. You can see the attached image for more details.
I have noticed that Google is not crawling the site effectively since then. Please check the image and let me know what you think about this issue.
-
I think you already gave the answer yourself. That would be a great strategy in my opinion. You will also see directly which of your product pages are working (for Google) and which are not, so you can optimize the rest of your product pages even better.
Happy optimizing!
-
I am thinking the same way. If I create an individual XML sitemap for my Table Lamps category, it will help me focus more on that specific category.
If Google does not index a particular category well, I can focus more on the quality of those product pages.
What do you think about it?
-
Hi,
Great question. We faced the same one a year ago when we started working with sitemaps for our products. I cannot tell you definitively which is best, but for now I would recommend the end-level category structure for your sitemaps.
This will give you more useful insights into the crawling and indexation of your product URLs, filtered at the end-level category level. You can then see, for example, that your table lamps are indexed better than your floor lamps, which makes the data far more actionable than it would be at the root-level category level. You could, for instance, shift your link-building activities to another end-level category.
By choosing the root-level category, it seems to me you only split up the number of URLs you submit to Google, and you still have to dig harder to find the information you need or would like to see. That is probably also why you created the categories in the first place: to make searching more usable for your users instead of giving them an overview of 7,000 products.
Have you also thought about using a sitemap index file? It could be useful for keeping track of all your sitemaps and, in this case, would make submitting them a lot easier.
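As a rough sketch, a sitemap index for your proposed end-level structure could look like this (the file names are taken from your question; the format is defined by the sitemaps.org protocol):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <sitemap> entry per end-level category sitemap -->
  <sitemap>
    <loc>http://www.vistastores.com/table_lamps_sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.vistastores.com/floor_lamps_sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```

You would then submit just this one index file to Google Webmaster Tools, and each category sitemap should still get its own submitted/indexed counts. Note that a single sitemap can hold up to 50,000 URLs, so splitting your 7,000 product URLs is purely for reporting insight, not a size requirement.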
But I would definitely like to read more comments!