Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), we have locked both new posts and new replies. More details here.
How would you create and then segment a large sitemap?
-
I have a site with around 17,000 pages and would like to create a sitemap and then segment it into product categories.
Is it best to create a map and then edit it in something like XMLSpy, or is there a way to silo sitemap creation from the outset?
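For reference, the standard pattern for segmenting is one sitemap file per category plus a sitemap index that points at them, per the sitemaps.org protocol. A minimal sketch (the domain and category names here are placeholders, not from the site in question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap_index.xml: one entry per category-level sitemap -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-widgets.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-gadgets.xml</loc>
  </sitemap>
</sitemapindex>
```

Each child file is then an ordinary `<urlset>` sitemap containing only that category's URLs; you submit the index file and search engines discover the rest.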
-
Thanks Saijo,
We are trying to silo product types/categories and break them into different sitemaps. I'm familiar with SF but I don't think it will create sitemaps with the granularity that we are looking for.
I'm using XMLSpy but I'm finding it hard to break out blocks of content.
-
To my knowledge, Screaming Frog doesn't allow you to create an XML sitemap. Perhaps Excel allows you to format the output from SF, but I'm not sure. I did find a utility called XMLSpy which, though pricey, allows me to do some of the sorting I was looking for. Once sorted, I can manually pull out sections to segment my sitemap. It is a pain in the neck because I can't determine a silo and segment it automatically. That being said, I think I can develop a sitemap template and have our new web programmer develop a way to auto-generate a group of segmented sitemaps.
Anyone know if there is a canned solution that works with IIS?
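As a sketch of what that auto-generation could look like (platform-agnostic rather than IIS-specific; the category-from-URL rule, file names, and example domain are all assumptions for illustration), a short Python script that groups crawled URLs by their first path segment and emits one sitemap per category plus an index:

```python
from collections import defaultdict
from urllib.parse import urlparse
from xml.sax.saxutils import escape

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def category_of(url):
    """Assume the silo is the first path segment, e.g. /widgets/blue -> 'widgets'."""
    segments = [s for s in urlparse(url).path.split("/") if s]
    return segments[0] if segments else "root"

def build_sitemaps(urls, base="http://www.example.com"):
    """Return {filename: xml_text}: one sitemap per category plus an index."""
    groups = defaultdict(list)
    for url in urls:
        groups[category_of(url)].append(url)

    files = {}
    for category, members in sorted(groups.items()):
        entries = "\n".join(
            f"  <url><loc>{escape(u)}</loc></url>" for u in sorted(members)
        )
        files[f"sitemap-{category}.xml"] = (
            f'<?xml version="1.0" encoding="UTF-8"?>\n'
            f'<urlset xmlns="{SITEMAP_NS}">\n{entries}\n</urlset>'
        )

    # Build the index last so it lists every category file just written.
    index_entries = "\n".join(
        f"  <sitemap><loc>{base}/{name}</loc></sitemap>" for name in sorted(files)
    )
    files["sitemap_index.xml"] = (
        f'<?xml version="1.0" encoding="UTF-8"?>\n'
        f'<sitemapindex xmlns="{SITEMAP_NS}">\n{index_entries}\n</sitemapindex>'
    )
    return files
```

Feed it the URL list exported from a crawler, write each entry of the returned dict to disk, and submit sitemap_index.xml. The protocol caps each sitemap at 50,000 URLs, so 17,000 URLs split by category is comfortably within limits.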
-
If your site is structured such that the URLs contain the categories you wish to sort by, you can use something like Screaming Frog ( http://www.screamingfrog.co.uk/seo-spider/ ) to export all the URLs, then sort them into categories via Excel and go that way.
NOTE: the free version has a 500 URL limit, so you might want to look at the paid version (ask them if it can handle 17,000 URLs before getting it) or look at http://home.snafu.de/tilman/xenulink.html (I haven't used it myself, so I don't know if you can export to Excel from there).
Good luck mate, sounds like you have a big job ahead of you.
Related Questions
-
How does changing sitemaps affect SEO?
Hi all, I have a question regarding changing the size of my sitemaps. Currently I generate sitemaps in batches of 50k. A situation has come up where I need to change that size to 15k in order to be crawled by one of our licensed services. I haven't been able to find any documentation on whether or not changing the size of my sitemaps(but not the pages included in them) will affect my rankings negatively or my SEO efforts in general. If anyone has any insights or has experienced this with their site please let me know!
Technical SEO | Jason-Reid
-
Google Search Console says 'sitemap is blocked by robots'?
Google Search Console is telling me "Sitemap contains URLs which are blocked by robots.txt." I don't understand why my sitemap is being blocked. My robots.txt looks like this:
User-Agent: *
Disallow:
Sitemap: http://www.website.com/sitemap_index.xml
It's a WordPress site, with Yoast SEO installed. Is anyone else having this issue with Google Search Console? Does anyone know how I can fix this issue?
Technical SEO | Extima-Christian
-
Bingbot appears to be crawling a large site extremely frequently?
Hi All! What constitutes a normal crawl rate for daily bingbot server requests for large sites? Are any of you noticing spikes in Bingbot crawl activity? I did find a "mildly" useful thread at Black Hat World containing this quote: "The reason BingBot seems to be terrorizing your site is because of your site's architecture; it has to be misaligned. If you are like most people, you paid no attention to setting up your website to avoid this glitch. In the article referenced by Oxonbeef, the author's issue was that he was engaging in dynamic linking, which pretty much put the BingBot in a constant loop. You may have the same type or similar issue particularly if you set up a WP blog without setting the parameters for noindex from the get go." However, my gut instinct says this isn't it and that it's more likely that someone or something is spoofing bingbot. I'd love to hear what you guys think! Dana
Technical SEO | danatanseo
-
Removing images from site and Image Sitemap SEO advice
Hello again, I have received an update request where they want me to remove images from this site (as of now it's a bunch of thumbnails). Current page design: http://1stimpressions.com/portfolio/car-wraps/ . They want to turn it into a new design which utilizes a slider (such as this): http://1stimpressions.com/portfolio/ . They don't want the thumbnails on the page anymore. My question is, since my site has an image sitemap that has been indexed, will removing all the images hurt my SEO greatly? If so, what would the recommended steps be to reduce any SEO damage? Thank you again for your help, always great and very helpful feedback! 🙂 Cheers!
Technical SEO | allstatetransmission
-
Creating a CSV file for uploading 301 redirect URL map
Hi, if I'm bulk uploading 301 redirects, what's needed to create a CSV file? Is it just a case of creating an Excel spreadsheet with the old URLs in column A and the new URLs in column B, then converting to CSV and uploading? Or do I need to put in other details or parameters? Cheers, Dan
Technical SEO | Dan-Lawrence
-
How to create unique content for businesses with multiple locations?
I have a client that owns one franchise location of a franchise company with multiple locations. They have one large site where each location has its own page, which I feel is the best route. The problem is that each location page has basically duplicate content, resulting in around 80 pages of duplicate content. I'm looking for advice on how to create unique content for each location page. What types of information can we write about to make each page unique? You can only twist sentences and content around so much before it all sounds cookie-cutter and therefore offers little value.
Technical SEO | RonMedlin
-
How to create a delayed 301 redirect that still passes juice?
My company is merging one of our sites into another site. At first I was just going to create a 301 redirect from domainA.com to domainB.com but we decided that would be too confusing for customers expecting to see domainA.com so we want to create a page that says something like "We've moved. please visit domainB.com or be redirected after 10 seconds". My question is, how do I create a redirect that has a delay and will this still pass the same amount of juice that a regular 301 redirect would? I've heard that meta refreshes are considered spammy by Google.
Technical SEO | bewoldt
-
How do I create a Video Sitemap for Youtube Embedded Videos?
I've been seeing a lot of people recommend creating a video sitemap or Media RSS feed (mRSS) and submit to Google. We have videos hosted on Brightcove and most on YouTube. Brightcove can generate the sitemap for us. But does anyone know how to generate a YouTube Video Sitemap for those videos embedded on our pages? Note: I realize I could manually assemble the video sitemap, however manually assembling the sitemap is probably not an option for us due to the volume of videos we've published.
Technical SEO | LDS-SEO