How do you create a sitemap for a large site (ecommerce type) that has thousands, if not 100,000+, pages?
-
I know this is kind of a newbie question, but I am having an amazing amount of trouble creating a sitemap for our site, Bestride.com. We just did a complete redesign (look and feel, functionality, the works), and now I am trying to create a sitemap. Most of the generators I have used "break" after reaching a certain number of pages. I am at a loss as to how to create the sitemap. Any help would be greatly appreciated!
Thanks
-
I agree with Chris. With websites this large, it's advisable to have a sitemap index and then split the site into individual sitemaps such as pages, products, categories, images, media, tags, etc.
-
The easiest thing I can think of is to write a script that works with your dispatcher to create a sitemap. The approach I would use is to add the page and all of the "product images" on that page to the map, then move on to the next page. At the same time, I would use an auto-incrementing variable to keep track of how many lines you have written. When you get to around 50k, write out the name of the next sitemap file that the program will create, and chain them together this way. A rough sketch of the idea is below.
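For illustration only, here is a minimal Python sketch of that approach. It assumes your dispatcher can hand the script an iterable of page URLs; the function name and the sitemap-N.xml file names are hypothetical placeholders, not anything specific to your platform.

```python
# Minimal sketch only: assumes an iterable of page URLs is available from your
# dispatcher; function and file names here are hypothetical placeholders.
import xml.etree.ElementTree as ET

MAX_URLS_PER_SITEMAP = 50000  # per-file limit in the sitemaps.org protocol


def write_sitemap_files(urls, prefix="sitemap"):
    """Split a stream of URLs into numbered sitemap files of up to 50,000 URLs each."""
    files = []      # names of the sitemap files written so far
    urlset = None   # the <urlset> element currently being filled
    count = 0

    def flush():
        # Write the current <urlset> (if it has any entries) and start a new one.
        nonlocal urlset
        if urlset is not None and len(urlset) > 0:
            name = "{}-{}.xml".format(prefix, len(files) + 1)
            ET.ElementTree(urlset).write(name, encoding="utf-8", xml_declaration=True)
            files.append(name)
        urlset = ET.Element("urlset",
                            xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")

    flush()  # create the first empty <urlset>
    for url in urls:
        ET.SubElement(ET.SubElement(urlset, "url"), "loc").text = url
        count += 1
        if count % MAX_URLS_PER_SITEMAP == 0:
            flush()  # hit the 50k mark, roll over to the next sitemap file
    flush()          # write whatever is left over
    return files
```

Each run produces sitemap-1.xml, sitemap-2.xml, and so on; those file names can then be listed in the sitemap index that the other replies here describe, which is how the files get tied together.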
-
That's a great help, Chris, thank you! And thanks to all for your help!
-
Typically, a sitemap is going to include every page on the site. As Francesca said, each sitemap can be up to 50K URLs, and if you need multiple sitemaps, you create a sitemap index that points to the rest of the sitemaps.
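As a rough, hedged illustration, the index file itself could be generated along the same lines as the sketch earlier in the thread; the base URL and file names below are placeholders, not your real setup.

```python
# Sketch only: assumes the child sitemap files already live at the site root;
# base_url and the file names are placeholders, not the site's real layout.
import xml.etree.ElementTree as ET
from datetime import date


def write_sitemap_index(sitemap_files, base_url="http://www.example.com/",
                        out_file="sitemap_index.xml"):
    """Write a <sitemapindex> whose entries point at each child sitemap."""
    index = ET.Element("sitemapindex",
                       xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for name in sitemap_files:
        entry = ET.SubElement(index, "sitemap")
        ET.SubElement(entry, "loc").text = base_url + name
        ET.SubElement(entry, "lastmod").text = date.today().isoformat()
    ET.ElementTree(index).write(out_file, encoding="utf-8", xml_declaration=True)


# Example: write_sitemap_index(["sitemap-1.xml", "sitemap-2.xml", "sitemap-3.xml"])
```

You then submit just the index file to Google (in Webmaster Tools, or via a Sitemap: line in robots.txt) rather than every child file individually.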
-
Thanks for the feedback!
I will look into Screaming Frog for sure.
@Lesley - we are using a custom platform (in-house), so we don't have that functionality. The issue is that we have a lot of inventory (millions of cars). We have built (and are releasing today) new functionality to provide internal links so that Google can crawl all the inventory easily (users can too :). My question about sitemaps has boiled down to this: do we need to build the sitemap to include every single page (all the inventory), or do we provide a "map" so that Google can find the top pages and then crawl the inventory from there? Again, the site is bestride.com. If anyone wants to take a look at the site, that would be fantastic!
Thanks
-
Are you using a custom platform or an off-the-shelf e-commerce package? Most off-the-shelf packages have a module that can create a sitemap, and many let you run it on a cron schedule too.
-
Of course, you can also use Moz's Crawl Test report at http://pro.moz.com/tools/crawl-test
-
Hi Kristin,
Each sitemap.xml can hold a maximum of 50,000 URLs. So, if you have a site with more than 100K pages, it'd be better to create 2, 3, 4, etc. sitemap.xml files in order to contain all the URLs. Hope it is useful.
Kind regards!
Francesca
-
You can use Screaming Frog to create your sitemap. You just need to license it to crawl more than 500 URIs.