Robots.txt and Multiple Sitemaps
-
Hello,
I have a hopefully simple question, but I wanted to get a "second opinion" on what to do in this situation. I'm working on a client's robots.txt and we have multiple sitemaps. Using Yoast, I have my sitemap_index.xml and I also have a sitemap-image.xml. I do submit them to Google and Bing by hand, but I wanted to have them added to the robots.txt for insurance. So my question is: when calling out multiple sitemaps in a robots.txt file, does it matter if one comes before the other? From my reading it looks like you can list multiple sitemaps, but I wasn't sure of the best practice when writing it up in the file.
Example:
User-agent: *
Disallow:
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-content/plugins/

Sitemap: http://sitename.com/sitemap_index.xml
Sitemap: http://sitename.com/sitemap-image.xml

Thanks a ton for the feedback, I really appreciate it! :)
J
-
Awesome! Yeah, I submitted them to Bing and Google by hand; I just figured it couldn't hurt to have them in my robots.txt too.
Appreciate the feedback!
-
Yes, what you have is the proper format, and the order of the Sitemap lines doesn't matter; each one is an independent declaration. The best way to submit sitemaps, of course, is via Google & Bing Webmaster Tools.
Sitemaps won't have much impact unless you have a really large site, so I wouldn't focus on them too much. The best way to get content crawled and indexed by Google is a good internal link structure and authoritative external links.
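For reference, a sitemap index like the one Yoast generates is just an XML file that points at the child sitemaps, per the sitemaps.org protocol. A minimal sketch (the child sitemap names here are illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <sitemap> entry points at one child sitemap file -->
  <sitemap>
    <loc>http://sitename.com/post-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://sitename.com/page-sitemap.xml</loc>
  </sitemap>
</sitemapindex>

If sitemap-image.xml isn't referenced by the index, listing it on its own Sitemap line in robots.txt, as you've done, is the right call.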
Related Questions
-
Removing Multiple 301 Redirects
During my last redesign (and migration to Drupal), some of the updated SEO-friendly URLs on the new site were misspelled. Rather than updating the 301 redirects to point to the correct page, the developer just added an additional 301 redirect, so it was redirected like this:

website.com/oldpage (301 to) website.com/new-paige (301 to) website.com/new-page

instead of:

website.com/oldpage (301 to) website.com/new-page

I'll be finishing another redesign and updating to HTTPS soon. Should I remove the redirect to the misspelled URL and just have one 301 from the original page? These multiple redirects have been up for over a year. Thanks for any specific advice!
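Collapsing the chain means pointing every legacy URL straight at the final destination in one hop. A sketch of what that could look like, assuming an Apache .htaccess, the example paths above, and the post-migration HTTPS URLs (Drupal's redirect module would be the equivalent place if redirects are managed there):

# Both the original URL and the misspelled interim URL
# now resolve in a single hop, with no chain in between.
Redirect 301 /oldpage https://website.com/new-page
Redirect 301 /new-paige https://website.com/new-page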
Technical SEO | talltrees
-
Robots.txt: How to block a specific file type in several subdirectories?
Hello everyone! I need help setting up a robots.txt. I'm trying to block all PDF files in particular directories. In the example below, the line blocks all .gif files across the entire site:

Disallow: /*.gif$

Two questions: can I use this command to target one particular directory in which I want to block PDF files, and will this line be recognized by Googlebot?

Disallow: /fileadmin/xxxxxxx/xxx/xxxxxxx/*.pdf$

Then I realized that I would have to write as many lines as there are directories in which I want to block PDF files. Let's say I want to block PDF files in all three of these directories:

/fileadmin/directory1
/fileadmin/directory1/sub1
/fileadmin/directory1/sub1/pdf

Is there a pattern-matching rule I could use to block access to PDF files in all subdirectories, instead of writing the above line once for each subdirectory? For example:

Disallow: /fileadmin/directory1*/

Many thanks in advance for any insight you may have.
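Googlebot does recognize * and $ in Disallow rules, and its * matches any sequence of characters, slashes included, so a single rule anchored at the parent directory covers every subdirectory beneath it. A sketch built on the example paths above:

User-agent: *
# Matches /fileadmin/directory1/file.pdf as well as
# /fileadmin/directory1/sub1/file.pdf and deeper paths.
Disallow: /fileadmin/directory1/*.pdf$

Keep in mind that * and $ are extensions honored by major crawlers like Googlebot and Bingbot, not part of the original robots.txt standard, so smaller bots may ignore them.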
Technical SEO | LabeliumUSA
-
How do I set up sitemaps for an international website?
I am adding translated versions of my site on subdomains, for example es.example.com. Will I add each subdomain to Google Webmaster Tools? Will each need its own sitemap?
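Each subdomain counts as a separate site in Google Webmaster Tools, so the usual approach is to verify each one and give it its own sitemap. The language versions can also point at each other with hreflang annotations inside the sitemap; a sketch, with illustrative URLs:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://example.com/page.html</loc>
    <!-- Each entry lists every language version, including itself -->
    <xhtml:link rel="alternate" hreflang="en" href="http://example.com/page.html"/>
    <xhtml:link rel="alternate" hreflang="es" href="http://es.example.com/page.html"/>
  </url>
</urlset>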
Technical SEO | EcommerceSite
-
Is there any value in having a blank robots.txt file?
I've read an audit where the writer recommended creating and uploading a blank robots.txt file; there was no file in place at the time. Is there any merit in having a blank robots.txt file? What is the minimum you would include in a basic robots.txt file?
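For what it's worth, the conventional minimum is an explicit allow-everything file rather than a truly blank one, so every crawler fetching it gets a valid, unambiguous set of instructions. A sketch (the Sitemap line is optional, and the URL is illustrative):

# An empty Disallow value means nothing is blocked.
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml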
Technical SEO | NicDale
-
Multiple URLs for posting multiple classified ads
I want to optimize referral traffic while at the same time keeping search engines happy and the ads posted. I have a client who advertises on several classified-ad sites around the globe. Which is better (post-Panda): having multiple identical URLs using canonicals to pass juice to the original URL? For example:

www.bluewidgets.com is the original
www.bluewidgetsusa.com
www.blue-widgets-galore.com

Or should the duplicate pages be directed to the original using a 301? We are currently using duplicate URLs and are not using "nofollow" tags on those pages.
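If the duplicate domains stay live, the canonical route means each copy of a page declares the original in its <head>. A sketch using the example domains above:

<!-- Placed in the <head> of the matching pages on
     bluewidgetsusa.com and blue-widgets-galore.com -->
<link rel="canonical" href="http://www.bluewidgets.com/" />

A 301, by contrast, takes the duplicate page out of circulation entirely; the canonical keeps it visible to visitors while consolidating the ranking signals on the original.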
Technical SEO | AllIsWell
-
Meta-robots Nofollow on logins and admins
In my Moz reports I am getting over 400 errors flagged as Meta-robots Nofollow. These all lead to my admin login page, which I do not want robots in. Should I put some code on these pages so the robots know this and don't attempt to crawl them, so I stop getting these errors in my reports?
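A meta robots tag on the login and admin pages keeps them out of the index and tells crawlers not to follow their links. A sketch of the tag as it would sit in each page's <head>:

<!-- In the <head> of the admin/login pages -->
<meta name="robots" content="noindex, nofollow">

Blocking the same paths in robots.txt also works, but note that a page blocked in robots.txt can't be crawled at all, so a meta tag on it will never be seen; pick one mechanism per page.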
Technical SEO | Endora
-
OK to block /js/ folder using robots.txt?
I know Matt Cutts suggests we allow bots to crawl CSS and JavaScript folders (http://www.youtube.com/watch?v=PNEipHjsEPU), but what if you have lots and lots of JS and you don't want to waste precious crawl resources? Also, as we update and improve the JavaScript on our site, we iterate the version number (?v=1.1... 1.2... 1.3... etc.), and the legacy versions show up in Google Webmaster Tools as 404s. For example:

http://www.discoverafrica.com/js/global_functions.js?v=1.1
http://www.discoverafrica.com/js/jquery.cookie.js?v=1.1
http://www.discoverafrica.com/js/global.js?v=1.2
http://www.discoverafrica.com/js/jquery.validate.min.js?v=1.1
http://www.discoverafrica.com/js/json2.js?v=1.1

Wouldn't it just be easier to prevent Googlebot from crawling the /js/ folder altogether? Isn't that what robots.txt was made for? Just to be clear: we are NOT doing any sneaky redirects or other dodgy JavaScript hacks. We're just trying to power our content and UX elegantly with JavaScript. What do you guys say: obey Matt, or run the JavaScript gauntlet?

Technical SEO | AndreVanKets
-
Multiple Google Places listings under review
I have a client with a waste removal business who had multiple listings on his Google Places account for different service locations. Over the last four months I have been creating separate listings for each service location, each under a different Google account and with a unique business name, address, phone number, and website URL. They have all been verified by postcard and listed separately in local directories so that they have citations. As I have been creating the new listings I have also been deleting the old ones to make sure they are not flagged as duplicates. Two months ago, all the listings in the client's Google Places account were placed under review. I made some changes and submitted it for re-review, but no go with Google. Now all the new listings I set up have also suddenly been placed under review. About a week ago I noticed that information in two separate Google Places listings was being mixed up; for example, the website URL for one listing was being shown in another listing. There is no connection between these two listings other than that they were both set up from the same IP address, so this seems very strange. I reported this to Google and asked them to sort it out, and then all of a sudden I found that ALL of the new listings had been placed under review. So now my client has no active listings at all. He can't afford to wait another two months for Google to review all the listings again, so I am wondering whether the best course of action would just be to delete everything and start over. Any advice would be most welcome!
Technical SEO | EssexGirl