Segmenting Website into XML Sitemaps
-
Hi all,
I'm about to begin the process of chopping up a 1,000-page website into separate sitemaps. I'm going for a three-tiered approach so that I can check indexation at each level:
Category, Subcategory, Product
What's the easiest way to create three separate XML sitemaps for this?
Thanks,
Nick
-
Hi Nick,
I sometimes use GSiteCrawler (http://gsitecrawler.com/).
After crawling, you can split the sitemap up however you want and export the separate files.
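If you'd rather script it than crawl, the sitemap protocol supports the tiered setup directly: one urlset file per tier, plus a sitemapindex that references all three, and Google then reports indexation for each file separately. A minimal Python sketch, assuming you can already export your URLs grouped by tier (the file names and example URLs are hypothetical):

```python
# Minimal sketch, not GSiteCrawler's output: write one <urlset> per tier
# and a sitemap index that references all three. File names and example
# URLs are hypothetical placeholders for your real export.
from datetime import date

NS = 'xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"'

tiers = {
    "sitemap-categories.xml": ["https://www.example.com/widgets/"],
    "sitemap-subcategories.xml": ["https://www.example.com/widgets/blue/"],
    "sitemap-products.xml": ["https://www.example.com/widgets/blue/gizmo-1/"],
}

# One <urlset> file per tier, so indexation can be checked per level.
for filename, urls in tiers.items():
    entries = "\n".join(f"  <url><loc>{u}</loc></url>" for u in urls)
    with open(filename, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write(f"<urlset {NS}>\n{entries}\n</urlset>\n")

# One <sitemapindex> referencing the three files; submit just this index.
refs = "\n".join(
    f"  <sitemap><loc>https://www.example.com/{name}</loc>"
    f"<lastmod>{date.today().isoformat()}</lastmod></sitemap>"
    for name in tiers
)
with open("sitemap_index.xml", "w", encoding="utf-8") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write(f"<sitemapindex {NS}>\n{refs}\n</sitemapindex>\n")
```

Submitting just the index keeps the three files grouped in one place, and the per-file submitted-vs-indexed counts give exactly the level-by-level check Nick is after.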
Related Questions
-
Missing XML tag error
Our XML sitemap is divided up into many smaller XML sitemaps so that we have fewer products per sitemap, in order to identify errors easily. A couple of weeks ago, we changed our XML sitemap by reordering some of the products. However, this has left some old XML sitemaps without any data, and they no longer appear in our sitemap index. But Google is still requesting these sitemaps since they once existed, and reporting errors because it can't locate them. Should we 404 those XML sitemaps, or is there a better way to handle this?
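One thing worth verifying before deciding: what the retired sitemap URLs actually return right now. A hard 404 (or 410) is a normal way to retire a sitemap; once the file is out of the index and keeps failing, Google stops requesting it, and you can also remove the stale entries in Webmaster Tools so the errors stop surfacing. A quick Python sketch with hypothetical URLs:

```python
# Hedged sketch: report what each retired sitemap URL currently returns.
# A 200 on an empty/old file keeps Google refetching it; a hard 404/410
# lets it age out. The URLs below are hypothetical placeholders.
import requests

old_sitemaps = [
    "https://www.example.com/sitemap-products-7.xml",
    "https://www.example.com/sitemap-products-8.xml",
]

for url in old_sitemaps:
    resp = requests.get(url, timeout=10)
    print(resp.status_code, url)
```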
Technical SEO | ang0
-
Sitemap issue - Tons of 404 errors
We've recreated a client site in a subdirectory (mysite.com/newsite) of his domain, and when it was ready to go live, we added code to the .htaccess file in order to display the revamped website on the main URL. These are the directions that were followed: http://codex.wordpress.org/Giving_WordPress_Its_Own_Directory and http://codex.wordpress.org/Moving_WordPress#When_Your_Domain_Name_or_URLs_Change. This has worked perfectly, except that we are now receiving a lot of 404 errors, and I'm wondering if this is the root of the problem. This is a self-hosted WordPress website, and we are actively using the WordPress SEO plugin, which creates multiple sitemap files with only 50 links in each. The sitemap_index.xml file tests well in Google Analytics but is pulling a number of links from the subdirectory folder. I'm wondering if it really is the manner in which we made the site live that is the issue, or if there is another problem that I cannot see yet. What is the best way to attack this issue? Any clues? The site in question is www.atozqualityfencing.com https://wordpress.org/plugins/wordpress-seo/
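One way to test whether the sitemap is feeding Google subdirectory URLs is to parse sitemap_index.xml and flag every loc entry that still contains the subdirectory path. A rough Python sketch; "/newsite" is a stand-in for whatever the real folder is called, and the index URL assumes the default location used by the plugin:

```python
# Rough diagnostic sketch: walk the sitemap index and flag URLs that
# still point at the staging subdirectory. "/newsite" is a stand-in
# for the real folder name.
import requests
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
INDEX = "https://www.atozqualityfencing.com/sitemap_index.xml"

index = ET.fromstring(requests.get(INDEX, timeout=10).content)
for loc in index.findall("sm:sitemap/sm:loc", NS):
    urlset = ET.fromstring(requests.get(loc.text, timeout=10).content)
    for url in urlset.findall("sm:url/sm:loc", NS):
        if "/newsite" in url.text:
            print("sitemap still references the subdirectory:", url.text)
```

If that prints nothing, the 404s are probably coming from old subdirectory URLs crawled before the move rather than from the sitemap itself.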
Technical SEO | JanetJ0
-
Website content has been scraped - recommended action
So whilst searching for link opportunities, I found a website that has scraped content from one of our websites. The website looks pretty low quality and doesn't link back. What would be the recommended course of action?
1. Email them and ask for a link back. I've got a feeling this might not be the best idea; the website does not have much authority (yet), and a link might look a bit dodgy considering the duplicate content.
2. Ask them to remove the content. It is duplicate content and could hurt our website.
3. Do nothing. I don't think our website will get penalised for it, since our content was here first and is on the better-quality website.
4. Possibly report them to Google for scraping?
What do you guys think?
Technical SEO | maxweb0
-
XML Sitemap Creation
I am looking for a tool where I can add a list of URLs and output an XML sitemap. Ideally this would be web-based or work on the Mac. Extra bonus if it handles video sitemaps. My alternative is XLS and a bunch of concatenates, but I'd rather something cleaner. It doesn't need to crawl the site. Thanks.
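For reference, the concatenation step is small enough to script directly if no tool fits: read one URL per line and wrap each in a url/loc entry. A minimal Python sketch (the file names are hypothetical); note that video sitemaps additionally need the video: namespace and per-video tags, so they are a separate job:

```python
# Minimal sketch: plain URL list in, standard urlset sitemap out.
# "urls.txt" and "sitemap.xml" are hypothetical file names.
from xml.sax.saxutils import escape

with open("urls.txt", encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

with open("sitemap.xml", "w", encoding="utf-8") as out:
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url in urls:
        # escape() handles &, <, > in URLs so the XML stays valid
        out.write(f"  <url><loc>{escape(url)}</loc></url>\n")
    out.write("</urlset>\n")
```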
Technical SEO | Jeff_Lucas0
-
Best practice for XML sitemap depth
We run an eCommerce site for education products with 20 or so subject-based catalogues (Maths, Literacy, etc.), each catalogue having numerous ranges (Counting, Maths Games, etc.), then products within those. We carry approximately 15,000 products. My question is about the sitemap we submit nightly and its depth. It currently covers home, catalogues, and ranges, plus all static content (About Us, etc.). Should we be submitting sitemaps that include product pages as well? Does it matter, or would it not make much difference in terms of search? Thanks in advance.
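On the size question, products aren't a problem for the sitemap itself: the protocol allows up to 50,000 URLs per file, so 15,000 products fit in one, and splitting them into smaller chunks lets you read submitted-vs-indexed per chunk. A sketch of the split, with an arbitrary chunk size and hypothetical file names:

```python
# Sketch: split product URLs into fixed-size chunks, one sitemap file per
# chunk, so indexed-vs-submitted can be read per chunk. CHUNK_SIZE and
# the file naming are arbitrary choices, not requirements.
CHUNK_SIZE = 5000

product_urls = [f"https://www.example.com/product-{i}/" for i in range(15000)]

chunks = [product_urls[i:i + CHUNK_SIZE]
          for i in range(0, len(product_urls), CHUNK_SIZE)]

for n, chunk in enumerate(chunks, start=1):
    # Writing the actual <urlset> is the same pattern as the sketches
    # above; here we just show the split.
    print(f"sitemap-products-{n}.xml: {len(chunk)} URLs")
```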
Technical SEO | TTS_Group0
-
Omitting URLs from XML Sitemap - Bad??
Hi all, We are working on an extremely large retail site with some major duplicate content issues that we are in the process of remedying. The site also does not currently have an XML sitemap. Would it be advisable to create a small XML sitemap with only the main category pages for the time being, and then upload the complete sitemap after our duplicate content issues are resolved? Or should we wait to upload anything until all work is complete down to the product-page level and canonicals are in place? Would uploading an incomplete sitemap be seen as fraudulent or misleading by the search engines and prompt a penalty, or would having at least the main pages mapped while we continue work be okay? Please let me know if more info is needed to answer! Thanks in advance!
Technical SEO | seo320
-
Optimizing a website which uses JavaScript and jQuery
Just a quick question (or 2): if I have divs which are hidden on my page, but are displayed when
1. a user clicks on a p tag and the hidden div is displayed using jQuery, or
2. a user clicks on an a tag and the hidden div is displayed using jQuery, with the href being cancelled,
then in both examples, will the hidden content be optimized, or will the fact that it is initially hidden make it harder to optimize? Thanks for any answers!
Technical SEO | PhatJP0
-
Website is extremely slow
A couple of days ago, one of our websites became extremely slow. I'm not sure if this is the right place to ask this question, but frankly I don't know where else to ask it. Our hosting provider mentioned it was a socket exploit, but even after removing all the infected files we are still seeing a strange wait time of 45 seconds (see attachments). This has major effects on SEO as well. The link is www[dot]schouw[dot]org. Hopefully there is someone who can help me out.
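For anyone debugging a wait like this, separating time-to-first-byte from total download time usually tells you whether the delay sits in the server/application or in the page payload; a 45-second TTFB with a small body points firmly at the server side. A hedged Python sketch:

```python
# Sketch: measure time-to-first-byte vs. total time for the slow site.
import time
import requests

URL = "http://www.schouw.org/"  # the site from the post

start = time.time()
resp = requests.get(URL, stream=True, timeout=120)  # returns once headers arrive
ttfb = time.time() - start

body = resp.content  # drain the response body
total = time.time() - start

print(f"status {resp.status_code}: TTFB {ttfb:.1f}s, "
      f"total {total:.1f}s, {len(body):,} bytes")
```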
Technical SEO | TiasNimbas0