Add selective URLs to an XML Sitemap
-
Hi!
Our website has a very large number of pages. I am looking to create an XML Sitemap that contains only the most important pages (category pages, etc.). However, when crawling the website in a tool like Xenu (the others have a 500-page limit), I am unable to control which pages get added to the XML Sitemap and which ones get excluded.
Essentially, I only want pages that are up to 4 clicks away from my homepage to show up in the XML Sitemap.
How should I create an XML sitemap and, at the same time, control which pages of my site I add to it (category pages) and which ones I remove (product pages, etc.)?
Thanks in advance!
Apurv
-
Thanks a lot for sharing, Travis. This is really helpful!
Appreciate your help here.
-
Hey Intermediate,
Here's my setup (image: http://screencast.com/t/qThC401hQVUp). Be careful of the line breaks if you want your sitemap to be pretty (I'm not sure if it also works if everything is on a single line).
Column A: <url><loc>
Column B: the page URL
Column C: </loc><lastmod>2013-08-27</lastmod>
Column D: <changefreq>always</changefreq>
Column E: <priority>1</priority></url>
Column F: =CONCATENATE(A2,B2,C2,D2,E2)
You will need to add these as the first 2 lines in your sitemap:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
and add </urlset> to the end, but you should be good to go!
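If you'd rather script the same assembly than run it through Excel, a minimal Python sketch along these lines should produce the same file (urls.txt, sitemap.xml and the lastmod/changefreq/priority values here are just placeholders - swap in your own):

```python
from xml.sax.saxutils import escape

# Placeholder file names - point these at your own URL list and output path.
URL_FILE = "urls.txt"      # one URL per line
OUT_FILE = "sitemap.xml"

HEADER = ('<?xml version="1.0" encoding="UTF-8"?>\n'
          '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
FOOTER = "</urlset>"

def build_sitemap():
    entries = []
    with open(URL_FILE) as f:
        for line in f:
            url = line.strip()
            if not url:
                continue
            # escape() handles characters like & that are not valid raw XML
            entry = ("  <url><loc>" + escape(url) + "</loc>"
                     "<lastmod>2013-08-27</lastmod>"
                     "<changefreq>always</changefreq>"
                     "<priority>1</priority></url>")
            entries.append(entry)
    with open(OUT_FILE, "w") as f:
        f.write("\n".join([HEADER] + entries + [FOOTER]) + "\n")

if __name__ == "__main__":
    build_sitemap()
```

Each line it writes is the same thing the CONCATENATE column builds, just with the header and closing tag added for you.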
I hope that helps!
-
Thanks Schwaab!
-
Hi Travis
That sounds like a smart way to go about this. Could you please guide me on how to add parameters like lastmod, priority, and changefreq to the XML sitemap, using the URLs that I have in the Excel sheet?
Thanks!
-
If you have a list of all the URLs on your site, it is easy to create a sitemap using Excel. I have a template that I use, and I can crank out a 50k-URL sitemap in 5 minutes.
-
I would recommend purchasing Screaming Frog. You can crawl the site and sort the URLs by level, remove the URLs that are too deep from the crawl, and export the rest to an XML sitemap. Screaming Frog is definitely worth the price to unlock all of its features and lift the crawl limit.
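If you end up working from a plain CSV export of the crawl instead of the built-in sitemap export, the depth filtering can also be scripted. A rough sketch is below; the crawl_export.csv name and the "Address" / "Crawl Depth" column headers are assumptions, so rename them to match whatever your crawler actually outputs:

```python
import csv
from xml.sax.saxutils import escape

# Assumed names - adjust to match your crawler's export.
CSV_FILE = "crawl_export.csv"
URL_COLUMN = "Address"
DEPTH_COLUMN = "Crawl Depth"
MAX_DEPTH = 4              # keep pages up to 4 clicks from the homepage
OUT_FILE = "sitemap.xml"

def filtered_sitemap():
    keep = []
    with open(CSV_FILE, newline="") as f:
        for row in csv.DictReader(f):
            depth = row.get(DEPTH_COLUMN, "").strip()
            if depth.isdigit() and int(depth) <= MAX_DEPTH:
                keep.append(row[URL_COLUMN])

    with open(OUT_FILE, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in keep:
            f.write("  <url><loc>{}</loc></url>\n".format(escape(url)))
        f.write("</urlset>\n")

if __name__ == "__main__":
    filtered_sitemap()
```

The same loop is also a convenient place to drop product-page URLs by matching on the URL pattern before adding them to the list.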
-
Related Questions
-
SEO URLs: 1. URLs in my language (Greek, Greeklish, or English)? 2. At the end, is it good to put .html? What is the best way to get great rankings?
Hello all, should I put URLs in my language (Greek or Greeklish) or in English? And at the end of the URL, is it good to put .html? For example, www.test.com/test/test-test.html? What is the best way to get great rankings? I am a new digital marketing manager, and it's my first time working with a programmer who doesn't know. I need to know as soon as possible, because they want to be "on air" tomorrow! Thank you very much for your help! Regards, Marios
Technical SEO | | marioskal0 -
50 Duplicate URLs, but not the same
Hi, according to my latest site crawl, many of my pages are showing up to 50 duplicate URLs. However, this isn't the case in real life. http://www.fortusgroup.com.au/browse-products/rubber-tracks/excavator-rubber-tracks/hitachi/ex-33mu.html is showing 31 duplicate URLs. Examples include http://www.fortusgroup.com.au/browse-products/rubber-tracks/excavator-rubber-tracks/parts/x430.html and http://www.fortusgroup.com.au/browse-products/rubber-tracks/excavator-rubber-tracks/case/cx-75sr.html. Obviously these URLs are very similar, and I know that Moz judges URLs by 90% of their similarity, but is this affecting my actual ranking on Google? If so, what can I do? These pages are also very similar in code and content, so they are showing as duplicate content as well. Worried that this is having an effect on my SERP rankings, as these pages aren't ranking particularly well. Thanks, Ellie
Technical SEO | | JDadd0 -
XML Sitemap Issue or not?
Hi everyone, I submitted a sitemap in Google Webmaster Tools and got a warning message with 38 issues. Issue: URL blocked by robots.txt. Description: Sitemap contains URLs which are blocked by robots.txt. Example: the ones that were given were URLs that we don't want to be indexed: Sitemap: www.example.org/author.xml, Value: http://www.example.org/author/admin/. My issue here is that the number of URLs indexed is pretty low, and I know for a fact that robots.txt blocks aren't good, especially if they block URLs that need to be indexed. Apparently the URLs that are blocked are ones that we don't want indexed, but it doesn't display all the URLs that are blocked. Do you think I'm having a major problem, or is everything fine? What should I do? How can I fix it? FYI: WordPress is what we use for our website. Thanks
Technical SEO | | Tay19860 -
Shorter URLs
Hi, is there real value in having the keywords in the URL structure? We could use the URL Mybrand.com/software/tablets/ipad/supertrader.html, or instead have the CMS create the shorter version mybrand.com/supertrader.html and just optimize this page for the keyword 'supertrader ipad software'.
Technical SEO | | FXDD1 -
Sitemap.xml - autogenerated by CMS is full of crud
Hi all, hope you can help. The Magento ecommerce system I'm working with autogenerates sitemap.xml - it's well formed, with priority and frequency parameters. However, it has generated lots of URLs that point to broken pages returning fatal errors, duplicate URLs (not canonicals), 404s, etc. I'm thinking of hand-creating sitemap.xml - the site has around 50 main pages, including products and categories, and I can get the main page URLs listed by Screaming Frog or Xenu. Then I'll have to get into hand-editing the crud pages with noindex, and the useful duplicates with canonicals. Is this the way to go, or is there another solution? Thanks in advance for any advice.
Technical SEO | | k3nn3dy30 -
Could somebody suggest a GOOD Wordpress XML sitemap generator?
We have been putzing around with the Google XML Sitemaps Generator plug-in for our WordPress blog, and we cannot get it to write an XML sitemap! Could somebody suggest a viable alternative that actually works? Thank you for your help! Jay
Technical SEO | | theideapeople0 -
How does a sitemap affect the definition of canonical URLs?
We are having some difficulty generating a sitemap that includes our SEO-friendly URLs (the ones we want to set as canonical), and I was wondering if we might be able to simply use the non-SEO-friendly, non-canonical URLs that the sitemap generator has been producing and then use 301 redirects to send them to the canonical. Is there a reason why we should not be doing this? We don't want search engines to think that the sitemap URLs are more important than the pages to which they redirect. How important is it that the sitemap URLs match the canonical URLs? We would like to find a solution outside of the generation of the sitemap itself as we are locked into using a vendor’s product in order to generate the sitemap. Thanks!
Technical SEO | | emilyburns0