XML and Disallow
-
I was just curious about any potential side effects for a client who is basically using a catch-all approach: generating their XML sitemap with a spider, and then disallowing some of the directories that appear in that sitemap via robots.txt.
i.e.
XML contains 500 URLs
50 URLs contain /dirw/
I don't want anything with /dirw/ indexed because those pages are fairly useless: no content, one image. They use the robots.txt file to "Disallow: /dirw/".
Let's say they do this for maybe three separate directories, making up roughly 30% of the URLs in the XML sitemap.
I am just advising that they redo the sitemaps, since that shouldn't be too difficult, but I am curious about the actual ramifications of this, if there are any, other than "it isn't a clear and concise signal to the search engines and therefore should be made one."
Thanks!
-
Hi Thomas,
I don't think there is technically a problem with adding URLs to a sitemap and then blocking some of them with robots.txt.
I wouldn't do it, however, and I would give the same advice as you did: regenerate the sitemap without this content. The main reason is that it goes against the main goals of a sitemap: helping bots crawl your site and providing valuable metadata (https://support.google.com/webmasters/answer/156184?hl=en). Another consideration is that Google reports the percentage of URLs in each sitemap that is indexed; from that perspective, URLs that are blocked from crawling have no place in a sitemap. Normally Webmaster Tools will generate errors to let you know that there are issues with the sitemap.
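For reference, this is the kind of metadata a sitemap entry can carry; a generic sketch of a minimal sitemap, where the domain and dates are placeholders rather than values from the site in question:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/products/</loc>
    <lastmod>2015-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Every `<url>` entry you spend on a blocked directory is metadata the engines can never act on.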
If you take it one step further, Google could consider you a bit of a lousy webmaster if you keep these URLs in the sitemap. I'm not sure whether that is actually the case, but for something that can easily be corrected, I wouldn't take the risk (even if it's a very minor one).
There are crawlers (like Screaming Frog) that can generate sitemaps while respecting the directives in robots.txt; in my opinion, that would be a better option.
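As a rough illustration of that approach, here is a minimal Python sketch, using only the standard library, that filters a crawled URL list against the robots.txt rules before writing the sitemap. The robots.txt content and URL list are made-up examples, not the client's actual data:

```python
import urllib.robotparser
import xml.etree.ElementTree as ET

# Hypothetical example data: in practice, load robots.txt from the live
# site and the URL list from the crawler's output.
robots_txt = """
User-agent: *
Disallow: /dirw/
"""

urls = [
    "https://www.example.com/",
    "https://www.example.com/about/",
    "https://www.example.com/dirw/page1/",
]

# Parse the robots.txt rules.
rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Keep only the URLs that a generic crawler is allowed to fetch.
allowed = [u for u in urls if rp.can_fetch("*", u)]

# Build a sitemap containing only the allowed URLs.
ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
urlset = ET.Element("urlset", xmlns=ns)
for u in allowed:
    url_el = ET.SubElement(urlset, "url")
    ET.SubElement(url_el, "loc").text = u

sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```

With this in place, the /dirw/ URLs never reach the sitemap in the first place, so the "blocked but submitted" mismatch can't occur.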
rgds,
Dirk
-
For syntax I think you'll want:
User-agent: *
Disallow: /dirw/

If the content of /dirw/ isn't worthwhile to the engines, then it should be fine to disallow it. It's important to note, though, that Google asks that CSS and JavaScript not be disallowed. Run the site through their PageSpeed tool to see how this setup currently impacts that interaction. Cheers!
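One way to sanity-check those rules before deploying them is Python's built-in robots.txt parser; a small stand-alone sketch, where the file paths are hypothetical examples:

```python
import urllib.robotparser

# The rules suggested above.
rules = """
User-agent: *
Disallow: /dirw/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# /dirw/ is blocked for all user agents...
print(rp.can_fetch("*", "https://www.example.com/dirw/photo.html"))       # False
# ...while CSS/JavaScript paths remain fetchable, as Google asks.
print(rp.can_fetch("Googlebot", "https://www.example.com/css/site.css"))  # True
```

A quick check like this catches an overly broad pattern (one that accidentally matches asset directories) before it ever reaches the live site.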