Protecting sitemaps - Good idea or humbug?
-
Is there a way to protect your sitemap.xml so that only Google can read it and would it make sense to do this?
-
From a hacker's perspective, the first order of business is going to be gathering information on the target. Does a hacker, or someone with malicious intent, gain something by obtaining access to your sitemap?
Yes, they do, and that is more information on the layout of your site. How often would a sitemap actually contain something that could critically expose your VPS/shared hosting to compromise? Honestly, very rarely.
But yes, there was one time when I was doing an audit for a company and the sitemap pointed to a directory that was vulnerable to directory browsing. Fishing around in that directory, I was able to obtain pictures of the front and back of a PayPal MasterCard, because someone had snapped photos of the card and uploaded them to the site.
So there are benefits to hiding it, and it's relatively easy to do; but if you're lazy and don't want to, chances are you're fine.
-
Hi Herb,
Thank you for your feedback. I think you are right. We are dealing with very short-lived, up-to-date information, so it is vital that as few sites as possible have the information we have. For this reason I was considering "hiding" our sitemaps. Some of our competitors do that, but we probably need to find other measures to achieve our goal.
Cheers
Thomas
-
Hi Thomas,
You have not specified your web server platform, but assuming it is Apache, it would be easy to do with a regular expression in your .htaccess file.
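For example, a minimal .htaccess sketch (the filter below is an assumption of how one might do it, and note that the User-Agent header is trivially spoofed, so this only deters casual snooping, not a determined attacker):

```apache
# Return 403 Forbidden for sitemap.xml unless the client claims
# a Googlebot user-agent. User-Agent strings are easily faked.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} !Googlebot [NC]
RewriteRule ^sitemap\.xml$ - [F,L]
```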
However, I do not see any valid reason for doing so. Your sitemap should be a reflection of your public menu and internal public links. So, other than making it easier for search engines and other spiders to crawl your site, it does not expose any information that is not available by other means. Best practice says you should have an accurate sitemap, and unless you have a reason for hiding it that you did not mention, I would not hide it.
I will also tell you that you should not bother putting areas you do not want crawled in your robots.txt file; the bad folks will not respect the request.
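To illustrate that point: robots.txt is itself a public file, so listing sensitive areas only advertises them (the paths below are made-up examples):

```txt
# robots.txt -- compliant crawlers will skip these directories,
# but anyone reading this file is handed the paths directly.
User-agent: *
Disallow: /admin/
Disallow: /private-uploads/
```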
Take care,
Herb
Related Questions
-
Sitemap
I have a question about the links in a sitemap. WordPress works with a sitemap that first links to the different kinds of pages: pagesitemap.xml, categorysitemap.xml, productsitemap.xml, etc. Those links on the first page are clickable. We have a website that also links to the different pages, but the links are not clickable, just flat text. Is this an issue?
Technical SEO | Happy-SEO
-
Which sitemap to keep: HTTP or HTTPS (or both)?
Hi, I just finished upgrading my site to the SSL version (like so many other webmasters, now that it may be a ranking factor). Fixed all links, CDN links are now secure, etc., and 301-redirected all pages from HTTP to HTTPS. Changed the property in Google Analytics from HTTP to HTTPS and added the HTTPS version in Webmaster Tools. So far, so good. Now the question is: should I add the HTTPS version of the sitemap for the new HTTPS site in Webmaster Tools, or retain the existing HTTP one? Ideally, switching over completely to the HTTPS version by adding a new sitemap would make more sense, as the HTTP version of the sitemap would anyway now be redirected to HTTPS. But the last thing I want is to get penalized for duplicate content. Could you please advise, as I am still a rookie in this department? If I should add the HTTPS sitemap version to the new site, should I delete the old HTTP one, or is there no harm in retaining it?
Technical SEO | ashishb01
-
Sitemap international websites
Hey Mozzers, here is the case that I would appreciate your reply on: I will build a sitemap for a .com domain which has multiple domains for other countries (like Italy, Germany, etc.). The question is: can I put the hreflang annotations in sitemap 1 only, and have a sitemap 2 with all URLs for the EN/default version of the .com website, then put the two sitemaps in a sitemap index? The issue is that there are pages that go away quickly (in 1-2 days); they are localised, but I prefer not to give annotations for them, as I want to keep clean lang annotations in sitemap 1. That way, I can replace only sitemap 2 and keep sitemap 1 intact. Would it work, or had I better put everything in one sitemap? The second question is whether you recommend doing the same exercise for all subdomains and other domains. I have read a lot on the topic, but I am not sure whether it is worth the effort. The third question: if I have www.example.it and it.example.com, should I include both in my sitemap with hreflang annotations (the sitemap on www.example.com), using "it" for the subdomain and "it-it" for the .it domain (to specify lang, and lang + country)? Thanks a lot for your time and have a great day,
Ani
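For reference, the two-sitemap arrangement described above could look something like this sitemap index (filenames and domain are placeholders taken from the question, not a confirmed setup):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical index: sitemap1.xml holds the hreflang-annotated URLs,
     sitemap2.xml the plain .com URLs, so sitemap2.xml can be swapped out
     on its own while sitemap1.xml stays intact. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.example.com/sitemap1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap2.xml</loc>
  </sitemap>
</sitemapindex>
```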
Technical SEO | SBTech
-
Adding multi-language sitemaps to robots.txt
I am working on a revamped multi-language site that has moved to Magento. Each language runs off the core code, so there are no sub-directories per language. The developer has created sitemaps, which have been uploaded to their respective GWT accounts. They have placed the sitemaps in new directories such as: /sitemap/uk/sitemap.xml /sitemap/de/sitemap.xml I want to add the sitemaps to the robots.txt but can't figure out how to do it. Also, should they have placed the sitemaps in a single location, with the file name identifying each language: /sitemap/uk-sitemap.xml /sitemap/de-sitemap.xml What is the cleanest way of handling these sitemaps, and can/should I get them into robots.txt?
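One sketch of how such sitemaps are typically declared in robots.txt: the Sitemap directive takes a full absolute URL, so the files can live in any directory (the domain below is a placeholder, and the paths are the ones from the question):

```txt
# robots.txt at the site root. Multiple Sitemap lines are allowed,
# one per sitemap, each with an absolute URL.
Sitemap: https://www.example.com/sitemap/uk/sitemap.xml
Sitemap: https://www.example.com/sitemap/de/sitemap.xml
```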
Technical SEO | MickEdwards
-
My rankings keep going down - need some more ideas on why...
Hi everyone, I have been working on the website www.PetsSpark.com for a while now. I used to be on the first page for many of my keywords; now I am lucky if I am on the 2nd or 3rd. I stopped my SEO efforts for about 3 months while making changes to the website, and it's within these 3 months that I started dropping from the first page. The biggest thing I notice is the "type" of website now ranking for my main keywords: mostly forums, blogs, or product review websites. Take, for instance, "dog tear stain remover". I used to rank at about #5 for this keyword, and now I rank at #20. I'm not sure if my loss of ranking is a combination of things, like who Google is letting rank now, or if there is something wrong with my website, etc. Can anyone give me a little insight? Please... I will also be happy to give more information about what has been going on if needed.
Technical SEO | DTOSI
-
Same day our sitemap finished processing in G WebTools, our SERP results Tank!
Need a little help troubleshooting an SEO issue. The day our sitemap finished processing in Google Webmaster Tools, almost all of our keyword SERP results tanked. Our top 4 keywords routinely placed from 11-33 in SERP results, and now they're not even in the top 200. Would the sitemap processing have anything to do with this, or should I look somewhere else? FYI: the site is built in DNN, the sitemap is fine, and the robots.txt file is good. Open to all suggestions!
Technical SEO | Firecracker
-
Reciprocal link building a bad idea?
Reciprocal link building or one-way traffic? Good morning from 8 degrees C but very sunny Wetherby, UK. Regarding link acquisition, I've had it in my head that the best idea is to secure inbound links only, i.e. avoid reciprocal link building, so that when a source links to you, you don't give them a link back. But is this no longer valid when targeting inbound links to increase rank? Is it OK to link back, or will that "leak SEO juice" or whatever we decide to call it this week?
Technical SEO | Nightwing
-
Duplicate XML sitemaps - 404 or leave alone?
We switched over from our standard XML sitemap to a sitemap index. Our old sitemap was called sitemap.xml, and the new one is sitemapindex.xml. In Webmaster Tools it still shows the old sitemap.xml as valid. Also, when you land on our sitemap.xml, it displays the sitemap index, when really the index lives at sitemapindex.xml. The reason you can see the sitemap at both URLs is that this is set by the sitemap plugin. So the question is: should we change the plugin setting to let the old sitemap.xml 404, or should we allow the new sitemap index to be accessed at both URLs?
Technical SEO | Hakkasan