Is this a good sitemap hierarchy for a big eCommerce site (50k+ pages)?
-
Hi guys, hope you're all good.
I am currently in the process of designing a new sitemap hierarchy to ensure that every page on the site gets indexed and is accessible via Google. It's important that our sitemap file is well structured, divided and organised into relevant sub-categories to improve indexing.
I just wanted to make sure it's all good before forwarding it on to the development team to consider. At the moment the site has everything thrown into /sitemap.xml and it exceeds the 50k URL limit. Here is what I have come up with:
A primary sitemap.xml referencing other sitemap files. Each of the following areas will have its own sitemap, referenced by /sitemap.xml. In other words, sitemap.xml will contain six links, all of which point to other sitemaps:
- Product pages;
- Blog posts;
- Categories and sub categories;
- Forum posts, pages etc;
- TV specific pages (we have a TV show);
- Other pages.
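For reference, a sitemap index along those lines might look like this (the domain and file names below are placeholders, not your actual URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap.xml: the index file that points at the six child sitemaps -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://www.example.com/sitemap-products.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-blog.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-categories.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-forum.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-tv.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-other.xml</loc></sitemap>
</sitemapindex>
```

Note that each child sitemap is itself capped at 50,000 URLs and 50 MB uncompressed, so the product sitemap may need splitting into several numbered files (sitemap-products-1.xml, sitemap-products-2.xml, and so on) as the catalogue grows.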
Is this format correct? Once it has been implemented I can then go ahead and submit all six separate sitemaps to Search Console and add a sitemap link to the footer of the site.
All comments are greatly appreciated - if you know of a site which has a good sitemap architecture, please send the link my way!
Brett
-
Have a read of what Google says about them here.
And yes, image search is huge. As for the way it's used, I can't comment on what everyone else does.
-Andy
-
Interesting, I haven't ever come across someone who said I should put image URLs in a sitemap. Do users really search via Google Images though? If they do, aren't they just looking to copy and/or download an image?
I can't see the site generating qualified leads through image based searches.
-
Duplicate content is when two or more URLs show the same content.
I was referring to the fact that sometimes categories, tags or subcategories show the same content. By that I mean the same posts. Just to clarify: imagine you have a category, Dogs, and a subcategory, Puppies, and the last 5 articles/posts are assigned to both the category and the subcategory.
Visiting the main page of either one (category or subcategory) will show the same content: the same 5 posts/articles. Did I make myself clear?
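One common way to handle that kind of overlap (a sketch only; whether it fits depends on your CMS and on which page you actually want ranking) is a rel=canonical tag on the subcategory page pointing at the category page, so only one version gets indexed:

```html
<!-- In the <head> of the subcategory page, e.g. /dogs/puppies/
     (hypothetical URLs used for illustration): -->
<link rel="canonical" href="https://www.example.com/dogs/" />
```

The alternative GR hints at is simpler: just leave the duplicate-generating pages out of the sitemaps you submit.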
-
Thanks for getting back to me so quickly Gaston, I appreciate it.
You mentioned duplicate content - what do you mean by that? If a page has already been indexed, won't Google just skip or re-crawl it? Not too sure what you're getting at.
Brett
-
Hi Brett,
Don't forget to add an images sitemap, as Google is pretty hot on those, and make sure you do some good image marketing as well.
But what you suggest is absolutely fine. From the main Sitemap, Google will find all of the others as well.
Just as a note, do make sure you flag which pages need more crawling by using the last modified date. This will help Google know which pages it should be recrawling more often.
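Putting those two suggestions together, a single entry in one of the child sitemaps can carry both a lastmod date and image URLs (the example.com URLs below are placeholders):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com/products/widget-123</loc>
    <!-- Update lastmod whenever the page genuinely changes -->
    <lastmod>2017-05-30</lastmod>
    <!-- Image sitemap extension: list the images on the page -->
    <image:image>
      <image:loc>https://www.example.com/images/widget-123.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```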
-Andy
-
Hi Brett,
Yep, the hierarchy is OK. Just a reminder: keep in mind to only submit for indexing the pages that are of interest to you and that don't generate duplicate content.
Then, just submit every sitemap to Search Console.
Hope it helps.
GR.