How would you create and then segment a large sitemap?
-
I have a site with around 17,000 pages and would like to create a sitemap and then segment it into product categories.
Is it best to create a sitemap and then edit it in something like XMLSpy, or is there a way to silo sitemap creation from the outset?
-
Thanks Saijo,
We are trying to silo product types/categories and break them into separate sitemaps. I'm familiar with Screaming Frog, but I don't think it will create sitemaps with the granularity we are looking for.
I'm using XMLSpy, but I'm finding it hard to break out blocks of content.
-
To my knowledge, Screaming Frog doesn't allow you to create an XML sitemap. Perhaps Excel can be used to format the output from Screaming Frog, but I'm not sure. I did find a utility called XMLSpy which, though pricey, allows me to do some of the sorting I was looking for. Once sorted, I can manually pull out sections to segment my sitemap. It is a pain in the neck because I can't identify a silo and break it out automatically. That being said, I think I can develop a sitemap template and have our new web programmer build a way to auto-generate a group of segmented sitemaps (a rough sketch of that kind of script is included below).
Anyone know if there is a canned solution that works with IIS?
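For reference, below is a minimal, untested sketch of the kind of auto-generation script described above. It assumes the product category is the first path segment of each URL, and the file names (crawled-urls.txt, sitemap-<category>.xml, sitemap-index.xml) are placeholders; an IIS/.NET version would follow the same grouping logic.

```python
# Minimal sketch (not a production script): group crawled URLs by their first
# path segment and write one sitemap per category plus a sitemap index.
# The category rule and file names are assumptions -- adjust to your URL structure.
from collections import defaultdict
from urllib.parse import urlparse
from xml.sax.saxutils import escape

def write_segmented_sitemaps(urls, base_url, out_dir="."):
    groups = defaultdict(list)
    for url in urls:
        path = urlparse(url).path.strip("/")
        category = path.split("/")[0] if path else "root"
        groups[category].append(url)

    index_entries = []
    for category, members in groups.items():
        filename = f"sitemap-{category}.xml"
        with open(f"{out_dir}/{filename}", "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in members:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write("</urlset>\n")
        index_entries.append(f"{base_url}/{filename}")

    # Sitemap index pointing at each category sitemap
    with open(f"{out_dir}/sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for loc in index_entries:
            f.write(f"  <sitemap><loc>{escape(loc)}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

if __name__ == "__main__":
    # Assumes a plain list of URLs exported from a crawl, one per line
    with open("crawled-urls.txt", encoding="utf-8") as f:
        urls = [line.strip() for line in f if line.strip()]
    write_segmented_sitemaps(urls, "http://www.example.com")
```

The resulting sitemap-index.xml is the file you would submit to the search engines (or reference in robots.txt), with each category sitemap listed inside it.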
-
If your site is structured so that the URLs contain the categories you wish to sort by, you can use something like Screaming Frog (http://www.screamingfrog.co.uk/seo-spider/) to export all the URLs, sort them into categories via Excel, and go that way (a scripted version of that sorting step is sketched after this reply).
NOTE: the free version has a 500 URL limit, so you might want to look at the paid version (ask them whether it can handle 17,000 URLs before buying) or look at http://home.snafu.de/tilman/xenulink.html (I haven't used it myself, so I don't know whether you can export its results to Excel).
Good luck, mate. It sounds like you have a big job ahead of you.
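To make that sorting step concrete, here is a rough, untested sketch. It assumes the crawler export is a plain list of URLs (one per line) and that the category is the first path segment; the file names screaming-frog-export.txt and urls-<category>.txt are placeholders.

```python
# Rough sketch of the "export and sort by category" step described above,
# assuming the export is one URL per line and the category is the first
# path segment. File names are placeholders.
from collections import defaultdict
from urllib.parse import urlparse

buckets = defaultdict(list)
with open("screaming-frog-export.txt", encoding="utf-8") as f:
    for line in f:
        url = line.strip()
        if not url:
            continue
        path = urlparse(url).path.strip("/")
        category = path.split("/")[0] if path else "uncategorised"
        buckets[category].append(url)

# One plain-text URL list per category, ready for Excel or a sitemap generator
for category, urls in sorted(buckets.items()):
    with open(f"urls-{category}.txt", "w", encoding="utf-8") as out:
        out.write("\n".join(urls) + "\n")
    print(f"{category}: {len(urls)} URLs")
```

Each per-category URL list can then be opened in Excel or fed into whatever sitemap generator you settle on.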