Should robots.txt include the sitemap?
-
Hello,
I see that Google, Facebook, Moz, and others include Sitemap lines at the end of their robots.txt files.
E.g., http://www.google.com.vn/robots.txt contains:
Sitemap: http://www.google.com/sitemaps_webmasters.xml
Sitemap: http://www.google.com/ventures/sitemap_ventures.xml
Should I include my sitemap file (sitemap.xml) at the end of my robots.txt, and why should I do this? Thanks,
-
For sure. The reason is that not all sitemap files are simply sitemap.xml; yours may be sitemap.gzip, sitemap.zip, or, in my case, sitemap_index.gzip. Also, some people may not be able to place their sitemap at the site root.
Including the exact sitemap location in robots.txt gives each search engine a clear directive on exactly where to find your sitemap. Google and Bing/Yahoo will have no issue finding it, as you probably submitted it to them directly, but crawlers like Ask.com will usually look at your robots.txt and skip your site if no sitemap reference is placed in it.
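As a sketch, referencing a sitemap from robots.txt just means appending one Sitemap line per file, using absolute URLs (the domain and file names below are placeholders, not the poster's actual files):

```
User-agent: *
Disallow: /admin/

# Sitemap directives can appear anywhere in the file and must use absolute URLs
Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap_index.gzip
```

Note that the Sitemap directive is independent of the User-agent groups above it; it applies to all crawlers that read the file.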
Related Questions
-
Should I include unnecessary pages in sitemap.xml?
I have a lot of pages that I don't want Google to index. For most of them I used canonical tags where they were duplicates, and noindex where I wanted to remove the pages. The question is: should I include these pages in sitemap.xml, or just the important pages? Also, should I include them in order to get the changes indexed faster by Google?
On-Page Optimization | Silviu
How do I tell Google via Webmaster Tools to recheck my sitemap, and how many times a day does it check?
How often does Google check whether the sitemap has changed, and how can I notify Google when a change happens?
On-Page Optimization | bsharath
Keyword stuffing when brand includes keyword
Hi. If you have managed to combine your brand name with your primary target keyword, do you still need to pay attention to on-page keyword stuffing, since one would expect plenty of brand references in the body copy? Or is it still best to reduce instances of the keyword? For example, if the site is called 'Franks Service Centres' and you have lots of (too many, according to the Moz on-page grader) instances of 'service centres' in the body copy, should you reduce some instances of the keyword? All the best,
Dan
On-Page Optimization | Dan-Lawrence
I have more pages in my sitemap being blocked by the robots.txt file than I have being allowed to be crawled. Is Google going to hate me for this?
I'm using rules to block all pages which start with "copy-of" on my website, because people have a bad habit of duplicating new product listings to create our refurbished, surplus, etc. listings for those products. To avoid Google seeing these as duplicate pages, I've blocked them in the robots.txt file, but of course they are still automatically generated in our sitemap. How bad is this?
On-Page Optimization | absoauto
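One way to measure the overlap described in that question is to run the sitemap URLs through Python's standard urllib.robotparser. This is a sketch with made-up rules and URLs, not the poster's actual site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking auto-duplicated "copy-of" pages
robots_txt = """\
User-agent: *
Disallow: /copy-of
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Hypothetical URLs as they might appear in the auto-generated sitemap
sitemap_urls = [
    "https://example.com/widget-2000",
    "https://example.com/copy-of-widget-2000",
    "https://example.com/copy-of-widget-2000-surplus",
]

# Count how many sitemap entries the robots rules actually block
blocked = [u for u in sitemap_urls if not parser.can_fetch("*", u)]
print(f"{len(blocked)} of {len(sitemap_urls)} sitemap URLs are blocked")
```

If the blocked count is high relative to the sitemap size, it is usually cleaner to exclude those URLs from the sitemap at generation time rather than leave the two files contradicting each other.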
Can Sitemap Be Used to Manage Canonical URLs?
We have a duplicate content challenge that has likely contributed to us losing SERPs, especially for generic keywords such as "audiobook," "audiobooks," "audio book," and "audio books." Our duplicate content exists on two levels. 1. The first level is at our web store, www.audiobooksonline.com. Audiobooks are sometimes published abridged, unabridged, on compact discs, or on MP3 CD by the same publisher. In this case we use the publisher's description of the story for each "flavor" = duplicate content. Can we use our sitemap to identify only one "flavor" so that a spider doesn't index the others? 2. The second level is that most online merchants of the same publisher's audio book use the same description of the story = lots of duplicate content on the Web. Given that we have 11,000+ audio book titles offered at our Web store, I expect Google sees us as having lots of content duplicated on the Web and devalues our site. Some of our competitors who rank very high for our generic keywords use the same publisher's descriptions. Any suggestions on how we could make our individual audio book title pages unique will be greatly appreciated.
On-Page Optimization | lbohen
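A sitemap can hint at which URL you prefer, but the standard mechanism for consolidating near-duplicate "flavor" pages is a rel=canonical link element in the head of each variant, pointing at the preferred edition. A sketch with a hypothetical page path:

```html
<!-- On the abridged-CD and MP3-CD variant pages, pointing at the preferred edition -->
<head>
  <link rel="canonical" href="https://www.audiobooksonline.com/example-title-unabridged.html" />
</head>
```

With this in place, search engines consolidate ranking signals onto the canonical URL, while the variant pages remain crawlable and usable by visitors.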
Tool to create a good XML sitemap
Hello lads, I need to create an XML sitemap for a website so I can add it to Google Webmaster and Bing Webmaster tools. What do you guys recommend? Tks in advance! PP
On-Page Optimization | PedroM
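If no off-the-shelf generator fits, a minimal valid sitemap can also be built with Python's standard library. A sketch with placeholder URLs:

```python
import xml.etree.ElementTree as ET

# Namespace required by the sitemaps.org protocol
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return sitemap XML (as a string) for the given list of page URLs."""
    urlset = ET.Element("urlset", xmlns=NS)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

# Placeholder URLs for illustration
sitemap_xml = build_sitemap([
    "https://example.com/",
    "https://example.com/about",
])
print(sitemap_xml)
```

When writing the result to disk, prepend an XML declaration (`<?xml version="1.0" encoding="UTF-8"?>`) and save it as UTF-8 before submitting the file to Google and Bing.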
The SEOmoz crawler is being blocked by robots.txt; need help
SEOmoz is showing me that robots.txt is blocking content on my site.
On-Page Optimization | CGR-Creative