Add selective URLs to an XML Sitemap
-
Hi!
Our website has a very large number of pages. I am looking to create an XML sitemap that contains only the most important pages (category pages, etc.). However, when crawling the website in a tool like Xenu (the others have a 500-page limit), I am unable to control which pages get added to the XML sitemap and which ones get excluded.
Essentially, I only want pages that are up to 4 clicks away from my homepage to show up in the XML sitemap.
How should I create an XML sitemap while controlling which pages of my site I add to it (category pages) and which ones I leave out (product pages, etc.)?
Thanks in advance!
Apurv
-
Thanks a lot for sharing, Travis. This is really helpful!
Appreciate your help here.
-
Hey Intermediate,
Here's my setup - image - http://screencast.com/t/qThC401hQVUp Be careful with the line breaks if you want your sitemap to be pretty (I'm not sure if it also works if everything is on a single line).
Column A: <url><loc>
Column B: the URL
Column C: </loc><lastmod>2013-08-27</lastmod>
Column D: <changefreq>always</changefreq>
Column E: <priority>1</priority></url>
Column F: =CONCATENATE(A2,B2,C2,D2,E2)
You will need to add these as the first 2 lines in your sitemap:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
and add </urlset> to the end, but you should be good to go!
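If you'd rather script the same trick than maintain spreadsheet columns, here's a minimal Python sketch of the approach: wrap each URL in the standard sitemap tags, then add the header and footer lines. The lastmod date, changefreq, and priority values are placeholders, just like in the template, and the example URLs are made up.

```python
# Minimal sketch of the spreadsheet approach in Python: one <url> entry
# per line, wrapped in the standard sitemap header and footer.
# lastmod/changefreq/priority are placeholder values -- set your own.

def build_sitemap(urls, lastmod="2013-08-27", changefreq="always", priority="1"):
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for url in urls:
        lines.append(
            f"<url><loc>{url}</loc>"
            f"<lastmod>{lastmod}</lastmod>"
            f"<changefreq>{changefreq}</changefreq>"
            f"<priority>{priority}</priority></url>"
        )
    lines.append("</urlset>")
    return "\n".join(lines)


print(build_sitemap(["http://www.example.com/", "http://www.example.com/category-1/"]))
```

One thing the spreadsheet version quietly ignores: URLs containing characters like `&` must be XML-escaped before going into the file (Python's `xml.sax.saxutils.escape` can do that if you need it).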
I hope that helps!
-
Thanks Schwaab!
-
Hi Travis,
That sounds like a smart way to go about this. Could you please guide me on how to add parameters like lastmod, priority, changefreq, etc. to the XML sitemap, using the URLs that I have in the Excel sheet?
Thanks!
-
If you have a list of all the URLs on your site, it is easy to create a sitemap using Excel. I have a template that I use, and I can crank out a 50k-URL sitemap in 5 minutes.
-
I would recommend purchasing Screaming Frog. You can crawl the site, sort the URLs by level, remove the URLs that are too deep from the crawl, and export the rest to an XML sitemap. Screaming Frog is definitely worth the price to unlock all of its features and lift the crawl limit.
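If you have a crawl export but no depth filter handy, the "only pages within N clicks of the homepage" rule is just a breadth-first walk over the link graph. A rough Python sketch; the `{url: [outlinks]}` input shape and function name are illustrative, not tied to any particular crawler's export format:

```python
from collections import deque

# Sketch: given a link graph as {page_url: [linked_urls]}, find every page
# reachable within max_depth clicks of the homepage via breadth-first search.

def pages_within_depth(links, homepage, max_depth=4):
    depths = {homepage: 0}          # homepage is 0 clicks from itself
    queue = deque([homepage])
    while queue:
        page = queue.popleft()
        if depths[page] == max_depth:
            continue                # at the limit: don't follow links any deeper
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return sorted(depths)           # every URL within max_depth clicks

# Toy graph: a simple chain five clicks deep
chain = {"/": ["/a"], "/a": ["/b"], "/b": ["/c"], "/c": ["/d"], "/d": ["/e"]}
print(pages_within_depth(chain, "/"))  # "/e" is 5 clicks away, so excluded
```

The surviving list can then be fed straight into whatever you use to write the sitemap file.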
Related Questions
-
How to add 301s for many URLs
Hi, I need to redirect many URLs on a website, and I was wondering if, instead of doing it one by one, there is a way to get it the other way round: redirect all pages but a few. I have a feeling this is not possible, but I prefer asking just in case. Thanks for any feedback.
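For what it's worth, "redirect all pages but a few" is commonly done on Apache with mod_rewrite using negated conditions, so the blanket rule skips the exempt paths. A hedged sketch; the paths and target domain below are made up:

```apache
# .htaccess sketch: 301 every URL to the new domain EXCEPT the listed paths.
# Paths and target domain are illustrative only.
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/keep-this-page\.html$
RewriteCond %{REQUEST_URI} !^/also-keep/
RewriteRule ^(.*)$ http://www.newsite.com/$1 [R=301,L]
```

Both `RewriteCond` lines must match (they are ANDed) for the redirect to fire, so any request hitting an exempt path falls through untouched.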
Technical SEO | turismodevino10
-
Handling XML Sitemaps for Ad Classified Sites
Let's consider a scenario for a job classifieds site. So far, the way we are handling XML sitemaps is with consecutively numbered files containing only ads, kept historically:
http://site.com/sitemap_ads_1.xml
http://site.com/sitemap_ads_2.xml
http://site.com/sitemap_ads_99.xml
Those sitemaps are constantly updated as each ad is published, and they keep expired ads, but I'm sure there is a better way to handle them. For instance, we have other sources of content besides ad pages, like pages related to search results (careers, location, salary, level, type of contract, etc.) and blog content, but we are not adding them yet. So what I'm suggesting is to reduce the ad XML sitemaps to just one, including only the ads that are active (not expired), add another XML sitemap based on search results, another for blog content, another for images, and finally one for static content such as home, FAQ, contact, etc. Do you guys think this is the right way to go?
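A detail worth adding to a plan like the one above: per-content-type sitemaps are normally tied together with a sitemap index file, so only one URL needs to be submitted to the search engines. A hypothetical example; the filenames are illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap index: one child sitemap per content type -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://site.com/sitemap_ads.xml</loc></sitemap>
  <sitemap><loc>http://site.com/sitemap_search.xml</loc></sitemap>
  <sitemap><loc>http://site.com/sitemap_blog.xml</loc></sitemap>
  <sitemap><loc>http://site.com/sitemap_images.xml</loc></sitemap>
  <sitemap><loc>http://site.com/sitemap_static.xml</loc></sitemap>
</sitemapindex>
```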
Technical SEO | JoaoCJ0
-
Dynamic URL best approach
Hi, We are currently in the process of making changes to our travel site whereby, if someone does a search, this information can be stored, and the user can also take the URL and paste it into their browser later to find that search again. The URL will be dynamic for every search, so in order to stop duplicate content I wanted to ask what would be the best approach to creating the URLs. An example of the URL is: package-search/holidays/hotelFilters/?depart=LGW&arrival=BJV&sdate=20150812&edate=20150819&adult=2&child=0&infant=0&fsearch=first&directf=false&nights=7&tsdate=&rooms=1&r1a=2&r1c=0&r1i=0&&dest=3&desid=1&rating=&htype=all&btype=all&filter=no&page=1 I wanted to know if people have previous experience with something like this and what would be the best option for SEO. Will we need to create the URL with a # (as I read this stops Google crawling after the #)? Or block the folder in robots.txt? Are there any other areas I should be aware of in order to stop any duplicate content and 404 pages once the URL/holiday search is no longer valid? Thanks, E
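On the "block the folder in robots.txt" option mentioned above, a minimal sketch using the folder from the example URL (the rule shown is illustrative; note that a blocked URL can still end up indexed if it's linked from elsewhere, since robots.txt only stops crawling, not indexing):

```text
# robots.txt sketch: stop crawlers from fetching the dynamic search results
User-agent: *
Disallow: /package-search/holidays/hotelFilters/
```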
Technical SEO | Direct_Ram0
-
Sitemap URLs not being indexed
There is an issue on one of our sites where many of the sitemap URLs are not being indexed (at least 70% are not indexed). The URLs in the sitemap are normal URLs without any strange characters attached to them, but after looking into it, it seems a lot of the URLs get a "#." plus a number sequence attached to them once you actually go to that URL. We are not sure if the AddThis bookmark could cause this, or if it's another script doing it. For example: URL in the sitemap: http://example.com/example-category/0246 URL once you actually go to that link: http://example.com/example-category/0246#.VR5a Just for further information, the XML file does not have any style information associated with it and is in its most basic form. Has anyone had similar issues with their sitemap not being indexed properly? Could this be the cause of many of these URLs not being indexed? Thanks, all, for your help.
Technical SEO | GreenStone0
-
Changed URL of all web pages to a new updated one - keywords still pick up the old URL
A month ago we updated our website, and with that we created new URLs for each page. Under "On-Page", the keywords we track are still reporting information on the old URLs of our website. Slowly, some new URLs are popping up. I'm wondering if there's a way I can manually make the keywords report information from the new URLs.
Technical SEO | Champions0
-
Crawl reveals hundreds of URLs with multiple URLs in the URL string
The latest crawl of my site revealed hundreds of duplicate page content and duplicate page title errors. When I looked, they came from a large number of URLs with other URLs appended to the end. For example: http://www.test-site.com/page1.html/page14.html or http://www.test-site.com/page4.html/page12.html/page16.html Some of them go on for a hundred characters. I am totally stymied, as are the people at my ISP and the person who talked to me on the phone from SEOMoz. Does anyone know what's going on? Thanks so much for any help you can offer! Jean
Technical SEO | JeanYates0
-
Negative URL name?
I have a new client who has the letters "BB" at the start of his URL, bbzautorepair.com. He was told by someone at Google AdWords that the letters "BB" in his URL could hurt him in Google rankings, the reason being that Google red-flags anything or any website to do with firearms, guns, and ammunition. He was told that the letters "BB" could be mistaken or red-flagged for "BB gun". Seems a bit far-fetched. Has anyone ever heard of such a thing? Thanks
Technical SEO | fun52dig
Gary Downey
-
Canonical URL
In our campaign, I see this notice:
Tag value: florahospitality.com/ar/careers.aspx
Description: Using rel=canonical suggests to search engines which URL should be seen as canonical.
What does it mean? Because if I try to view the source code of our site, it clearly gives me the canonical URL.
Technical SEO | shebinhassan
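For reference, the notice is reporting the canonical hint found in the page's head section; it looks something like this (URL adapted from the notice, with the scheme added for illustration):

```html
<!-- rel=canonical tells search engines which URL is the preferred version of this page -->
<link rel="canonical" href="http://florahospitality.com/ar/careers.aspx" />
```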