Indexing product attributes in sitemap
-
Hey Mozzers!
I'm battling a few questions about the sitemap for my ecommerce store. Could you help me out?
- Is it necessary to include your product attributes in the sitemap? I'm not sure why it would matter to have a sitemap that lists everything in the color cherry. Also, if the attributes were included in the sitemap, would it count as duplicate content when the same products show up under multiple attributes?
- Is there any benefit to submitting the sitemaps individually? For example, submitting /product-sitemap.xml, /product_brand-sitemap.xml versus just /sitemap.xml?
Any other best practices for managing my ecommerce sitemap, or great resources, would be very helpful.
Thank you!
-
Hello Localwork,
By "product attributes" do you mean URLs associated with product variants, like color and size? From the context of your question, I'll assume for now you mean that each product attribute / variant appears on it's own URL (e.g. /?color=red and /?color=blue) and you want to know whether these should be included in the sitemap.
As Andy mentions below, more information is needed before prescribing a best practice for your specific situation. In this case, however, you should probably include only the one "canonical" version of each product URL in the sitemap (i.e. without variants). There are many ways to handle product variants, and I recommend Googling "SEO for product variants" to familiarize yourself with the pros and cons of each.
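To illustrate, a sitemap along those lines lists only the canonical product URLs and leaves the variant query strings out entirely. A minimal sketch (the URLs are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Canonical product pages only; variant URLs such as
       /products/shirt/?color=cherry are deliberately omitted -->
  <url>
    <loc>https://www.example.com/products/shirt/</loc>
  </url>
  <url>
    <loc>https://www.example.com/products/trousers/</loc>
  </url>
</urlset>
```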
To answer your question about sitemap segmentation: yes, it is a good thing to do, for several reasons. The most important is easier diagnosis of crawl issues, such as spotting which "sections" of your site have indexation problems. It also helps large sites stay under the 50,000-URL limit for a single sitemap file, and it gives people and machines a more logical, tree-like structure to follow than one sitemap containing every URL.
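As a sketch of what that segmentation looks like in practice, the top-level /sitemap.xml becomes a sitemap index pointing at the per-type files (using the filenames from your question; the domain is hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each child sitemap can be checked separately in Webmaster Tools,
       which makes it easy to see which section has indexation problems -->
  <sitemap>
    <loc>https://www.example.com/product-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/product_brand-sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```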
-
Hi,
Without knowing a little more detail it's hard to say with 100% certainty, but I can't see why the sitemap should contain every iteration of a product. Those pages (the ones produced by an attribute change) should rel=canonical back to the main product page anyway, and that will handle the duplication.
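For reference, that means each variant page carries a tag like this in its <head> (hypothetical URL):

```html
<!-- On /products/shirt/?color=cherry -->
<link rel="canonical" href="https://www.example.com/products/shirt/" />
```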
And unless you have many, many thousands of products in each sitemap, you wouldn't need to split them up like this, although you can rationalize this somewhat depending on the products and the site.
Just remember that the sitemap is only there as an aid to help Google crawl; there is no actual SEO benefit beyond that. Do whatever makes the most sense for the site and for Google.
-Andy
Edit: Just Tweeted this out as well to see if others wish to chime in

Related Questions
-
Google is indexing bad URLs
Hi All, The site I am working on is built on WordPress. The plugin Revolution Slider was installed; while no longer used, it remained on the site for some time and began creating hundreds of URLs containing nothing but code. I noticed these URLs were being indexed by Google. The URLs follow the structure: www.mysite.com/wp-content/uploads/revslider/templates/this-part-changes/ I have done the following to prevent these URLs from being created and indexed:
1. Added a directive in my .htaccess to 404 all of these URLs
2. Blocked /wp-content/uploads/revslider/ in my robots.txt
3. Manually de-indexed each URL using the GSC tool
4. Deleted the plugin
However, new URLs still appear in Google's index, despite being blocked by robots.txt and resolving to a 404. Can anyone suggest next steps? Thanks!
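As a point of reference, the robots.txt rule from step 2 would look something like the sketch below. Note that once a path is disallowed in robots.txt, Google generally stops crawling it and so may never see the 404s, which can leave already-indexed URLs lingering in the index:

```
# robots.txt
User-agent: *
Disallow: /wp-content/uploads/revslider/
```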
Technical SEO | Tom3_15
-
URL Structure On Site - Currently it's domain/product-name NOT domain/category/product name is this bad?
I have an eCommerce site and the site structure is domain/product-name rather than domain/product-category/product-name. Do you think this will have a negative impact SEO-wise? I have seen that some of my individual product pages get better rankings than my categories.
Technical SEO | the-gate-films
-
Automate XML Sitemaps
Quick question: what is the best method people have for automating sitemaps? We publish around 200 times a day, and I would like the sitemap to be updated as soon as we publish. What is the best method of updating a sitemap so it reflects new content immediately after publication?
Technical SEO | mattdinbrooklyn
-
My video sitemap is not being index by Google
Dear friends, I have a videos portal. I created a video sitemap.xml and submitted it to GWT, but after 20 days it has not been indexed. I have verified in Bing Webmaster as well. All videos are fetched dynamically from the server. All my static pages have been indexed, but not the videos. There are no separate pages for single videos; all the content comes dynamically from the server. Please help me figure out where I am making the mistake. Your answers will be much appreciated. Thanks
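For context, a Google video sitemap entry generally follows the sketch below, and each <loc> must be a distinct, crawlable page where the video actually appears; if the videos have no pages of their own, that is likely the core problem (all URLs hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <!-- Must be a real, indexable page hosting the video -->
    <loc>https://www.example.com/videos/my-video/</loc>
    <video:video>
      <video:thumbnail_loc>https://www.example.com/thumbs/my-video.jpg</video:thumbnail_loc>
      <video:title>My video</video:title>
      <video:description>A short description of the video.</video:description>
      <video:content_loc>https://www.example.com/media/my-video.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```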
Technical SEO | docbeans
-
How to determine which pages are not indexed
Is there a way to determine which pages of a website are not being indexed by the search engines? I know Google Webmaster Tools has a sitemap area that tells you how many URLs have been submitted and how many of those are indexed. However, it doesn't necessarily show which URLs aren't being indexed.
Technical SEO | priceseo
-
CDN Being Crawled and Indexed by Google
I'm doing an SEO site audit, and I've discovered that the site uses a Content Delivery Network (CDN) that's being crawled and indexed by Google. Two sub-domains from the CDN are being crawled and indexed, and a small number of organic search visitors have come through them, so in a few cases the CDN-based content is out-ranking the root domain. It's a huge duplicate content issue (tens of thousands of URLs being crawled). What's the best way to prevent the crawling and indexing of a CDN like this? Exclude via robots.txt? Additionally, the use of relative canonical tags (instead of absolute) appears to be contributing to the problem: as I understand it, these canonical tags are telling the SEs that each sub-domain is the "home" of the content/URL. Thanks! Scott
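To illustrate the relative-vs-absolute point: a relative canonical resolves against whichever host served the page, so the same markup effectively self-canonicalizes on each CDN sub-domain, while an absolute canonical always points back to the root domain (hypothetical URLs):

```html
<!-- Relative: served from cdn1.example.com, this resolves to the CDN host -->
<link rel="canonical" href="/some-page/" />

<!-- Absolute: resolves to the root domain regardless of which host serves it -->
<link rel="canonical" href="https://www.example.com/some-page/" />
```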
Technical SEO | Scott-Thomas
-
Hosting sitemap on another server
I was looking into XML sitemap generators, and one that seems to be recommended quite a bit on the forums is xml-sitemaps.com. They have a few versions, though. I'll need more than 500 pages indexed, so it's a case of whether I go for their paid version and install it on our server, or go for their pro-sitemaps.com offering. For pro-sitemaps.com they say: "We host your sitemap files on our server and ping search engines automatically." My question is: will this be less effective from an SEO perspective because the sitemap is no longer on our root domain?
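For what it's worth, the sitemaps.org protocol does permit a sitemap to live on another host (a "cross submit"), as long as the site that owns the URLs references it from its own robots.txt, along these lines (the sitemap URL is hypothetical):

```
# robots.txt on www.example.com
Sitemap: https://pro-sitemaps.com/sitemaps/example-com-sitemap.xml
```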
Technical SEO | design_man
-
Block a sub-domain from being indexed
This is a pretty quick and simple (I'm hoping) question. What is the best way to completely block a sub-domain from being indexed by all search engines? One item I cannot use is the meta "no follow" tag. Thanks! - Kyle
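One commonly suggested approach that avoids on-page meta tags is to send the noindex directive as an HTTP header for everything on the sub-domain. A sketch for Apache, assuming mod_headers is enabled:

```apache
# In the sub-domain's vhost config or .htaccess
Header set X-Robots-Tag "noindex, nofollow"
```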
Technical SEO | kchandler