Add selective URLs to an XML Sitemap
-
Hi!
Our website has a very large number of pages. I am looking to create an XML Sitemap that contains only the most important pages (category pages etc.). However, when crawling the website in a tool like Xenu (the others have a 500-page limit), I am unable to control which pages get added to the XML Sitemap and which ones get excluded.
Essentially, I only want pages that are up to 4 clicks away from my homepage to show up in the XML Sitemap.
How should I create an XML sitemap while controlling which pages of my site I add to it (category pages) and which ones I leave out (product pages etc.)?
Thanks in advance!
Apurv
-
Thanks a lot for sharing, Travis. This is really helpful!
Appreciate your help here.
-
Hey Intermediate,
Here's my setup - image - http://screencast.com/t/qThC401hQVUp Be careful of the line breaks if you want your sitemap to be pretty (I'm not sure if it also works if everything is on a single line).
Column A: <url><loc>
Column B: the URL itself
Column C: </loc><lastmod>2013-08-27</lastmod>
Column D: <changefreq>always</changefreq>
Column E: <priority>1</priority></url>
Column F: =CONCATENATE(A2,B2,C2,D2,E2)
You will need to add these as the first 2 lines of your sitemap:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
and add </urlset> to the end, but you should be good to go!
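If you'd rather script it than maintain the spreadsheet, here's a minimal Python sketch of the same idea. The file name urls.txt and the lastmod/changefreq/priority values are just placeholders mirroring the columns above; swap in your own.

# build_sitemap.py - minimal sketch: turn a plain list of URLs into an XML sitemap.
# Assumes urls.txt holds one absolute URL per line (hypothetical file name).
from xml.sax.saxutils import escape

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

with open("sitemap.xml", "w", encoding="utf-8") as out:
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url in urls:
        out.write("  <url>\n")
        out.write("    <loc>%s</loc>\n" % escape(url))    # escape &, <, > so the XML stays valid
        out.write("    <lastmod>2013-08-27</lastmod>\n")  # placeholder date
        out.write("    <changefreq>always</changefreq>\n")
        out.write("    <priority>1</priority>\n")
        out.write("  </url>\n")
    out.write("</urlset>\n")

Run it with python build_sitemap.py and upload the resulting sitemap.xml to your site root.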
I hope that helps!
-
Thanks, Schwaab!
-
Hi Travis,
That sounds like a smart way to go about this. Could you please guide me on how to add parameters like lastmod, priority, and changefreq to the XML sitemap, using the URLs that I have in the Excel sheet?
Thanks!
-
If you have a list of all the URLs on your site, it is easy to create a sitemap using Excel. I have a template that I use, and I can crank out a 50k-URL sitemap in 5 minutes.
-
I would recommend purchasing Screaming Frog. You can crawl the site, sort the URLs by crawl depth (level), remove the URLs that are too deep from the crawl, and export the rest to an XML sitemap. Screaming Frog is definitely worth the price to unlock all of its features and remove the crawl limit.
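If you go the Screaming Frog route, you can also filter the export with a short script instead of pruning the crawl by hand. A rough sketch, assuming you've saved the Internal tab export as internal_all.csv; the "Address" and "Crawl Depth" column names are assumptions based on a typical export and may differ in your version:

# filter_by_depth.py - sketch: keep only URLs within 4 clicks of the homepage,
# matching the original poster's requirement. Column names are assumptions
# based on a typical Screaming Frog "Internal" CSV export.
import csv

MAX_DEPTH = 4
keep = []
with open("internal_all.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        depth = row.get("Crawl Depth", "")
        if depth.isdigit() and int(depth) <= MAX_DEPTH:
            keep.append(row["Address"])

# One URL per line, ready for the spreadsheet (or script) approach above.
with open("urls.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(keep) + "\n")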
Related Questions
-
Adding your sitemap to robots.txt
Hi everyone, Best practice question: When adding your sitemap to your robots.txt file, do you add the whole sitemap at once or do you add different subcategories (products, posts, categories, ...) separately? I'm very curious to hear your thoughts!
Technical SEO | WeAreDigital_BE
-
What are the negative implications of listing URLs in a sitemap that are then blocked in the robots.txt?
In running a crawl of a client's site I can see several URLs listed in the sitemap that are then blocked in the robots.txt file. Other than perhaps using up crawl budget, are there any other negative implications?
Technical SEO | richdan
-
Best URL format for pagination
We're currently changing the URL format of our website search. We have been discussing a lot and cannot decide the best way to pass the pagination parameter for SEO. We narrowed it down to these options: www.website.com/apples/p2 - www.website.com/apples?page=2 - www.website.com/apples/page/2 What would give us the best ranking returns? What do you think?
Technical SEO | HelpSaude
-
301 on certain URL string
I have a few thousand old URLs with the string /content/ in them and am looking for a way to 301 batch redirect them. So for all the URLs that contain the word 'content' I would like to redirect to 1 specific page. I have tried the methods below without success. Regular 301s are working fine but this particular method is not working for me. I am running a Joomla site but I don't imagine that would have any impact. Any suggestions would be greatly appreciated. Redirect 301 ^content/.*$ http://www.mysite.com Redirect 301 ^content/ http://www.mysite.com
Technical SEO | omega
-
Do we need to manually submit a sitemap every time, or can we host it on our site as /sitemap and Google will see & crawl it?
I realized we don't have a sitemap in place, so we're going to get one built. Once we do, I'll submit it manually to Google via Webmaster tools. However, we have a very dynamic site with content constantly being added. Will I need to keep manually re-submitting the sitemap to Google? Or could we have the continually updating sitemap live on our site at /sitemap and the crawlers will just pick it up from there? I noticed this is what SEOmoz does at http://www.seomoz.org/sitemap.
Technical SEO | askotzko
-
URL paths and keywords
I'm recommending some on-page optimization for a home builder building in several new home communities. The site has been through some changes in the past few months and we're almost starting over. The current URL structure is http://homebuilder.com/oakwood/features where homebuilder = builder name, oakwood = name of the community (Oakwood Estates), and features = one of several sub-paths including site plan, elevations, floor plans, etc. The most attainable keyword phrases include the words 'home' and 'townname'. I want to change the URL path to: http://homebuilder.com/oakwood-estates-townname-homes/features Is there any problem with doing this? It just seems to make a lot of sense. Any input would be appreciated.
Technical SEO | mikescotty
-
URL restructure and phasing out HTML sitemap
Hi SEOMozzies, Love the Q&A resource and have already found lots of useful stuff too! I just started as an in-house SEO at a retailer and my first main challenge is to tidy up the complex URL structures and remove the ugly sub-sitemap approach currently used. I have already found a number of suggestions, but it looks like I am dealing with a number of challenges that I need to resolve in a single release. So here is the current setup:
The website is an ecommerce site (department store) with around 30k products. We are using multi-select navigation (non-Ajax). The main website uses a third-party search engine to power the multi-select navigation; that search engine has a very ugly URL structure. For example www.domain.tld/browse?location=1001/brand=100/color=575&size=1 plus various other URL params, or for multi-select URLs www.domain.tld/browse?location=1001/brand=100,104,506/color=575&size=1 plus various other unused URL params. URLs are easily up to 200 characters long and not at all descriptive to our users. Many of these URLs are indexed by search engines (we currently have 1.2 million of them indexed, including session IDs and all the other nasty URL params).
Next to this, the site is using a "sub site" that is sort of optimized for SEO; I'm not 100% sure this is cloaking, but it smells like it. It has a simplified navigation structure and a better URL structure for products. The layout is similar to our main site, but complex HTML elements like multi-select and the large top navigation menus are all removed. Many of these links are indexed by search engines and rank higher than links from our main website. The URL structure is www.domain.tld/1/optimized-url. Currently 64,000 of these URLs are indexed. We have links to this sub site in the footer of every page, but a normal customer would never reach this site unless they come from organic search. Once a user lands on one of these pages we try to push them back to the main site as quickly as possible.
My planned approach to improve this:
1.) Tidy up the URL structure in the main website (e.g. www.domain.tld/women/dresses and www.domain.tld/diesel-red-skirt-4563749). I plan to use Solution 2 as described in http://www.seomoz.org/blog/building-faceted-navigation-that-doesnt-suck to block multi-select URLs from being indexed, and would like to use the URL param "location" as an indicator for search engines to ignore the link. A risk here is that all my currently indexed URLs (1.2 million) will be blocked immediately after I put this live. I cannot redirect those URLs to the optimized URLs, as the old URLs should still be accessible.
2.) Remove the links to the sub site (www.domain.tld/1/optimized-url) from the footer and redirect (301) all those URLs to the newly created SEO-friendly product URLs. URLs that cannot be matched, since there is no similar catalog location in the main website, will be redirected (301) to our homepage.
I wonder if this is a correct approach, and whether it would be better to do this in a phased way rather than the currently planned big bang. Any feedback would be highly appreciated; also let me know if things are not clear. Thanks! Chris
Technical SEO | eCommerceSEO
-
URL Rewrite
Using the .htaccess file, how do I rewrite a URL from www.exampleurl.com/index.php?page=example to www.exampleurl.com/example (removing index.php?page=)? Any help is much appreciated.
Technical SEO | CraigAddyman