Removing sitemap.xml from the SERPs
-
Hi
What's the best way to remove sitemap.xml from the SERPs?
Thanks
-
Make sure you do this with an X-Robots-Tag HTTP header set to noindex rather than a robots.txt disallow: if crawlers can't fetch the sitemap at all, you'll completely defeat its purpose.
-
I'm not sure why you would want to remove sitemap.xml from the SERPs, but I'll answer your question anyway.
One option is to remove the URL via Google Webmaster Tools.
Another option is to serve an X-Robots-Tag HTTP header for the file and set it to noindex. Here is more info on how to set it up: http://googleblog.blogspot.com/2007/07/robots-exclusion-protocol-now-with-even.html and http://sebastians-pamphlets.com/handling-googles-neat-x-robots-tag-sending-rep-header-tags-with-php/
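One way to send that header, assuming the site runs on Apache with mod_headers enabled (adjust for your own server), is a small .htaccess rule:

```apache
# Serve X-Robots-Tag: noindex for the sitemap only,
# without blocking crawlers from fetching the file.
<Files "sitemap.xml">
  Header set X-Robots-Tag "noindex"
</Files>
```

The file stays fetchable, so the sitemap keeps doing its job; the header only asks search engines not to list the file itself in their results.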
-
Why would you want to remove it from the SERPs? There's no harm in it being there, and it's unlikely to outrank your actual pages for your keywords.
Related Questions
-
How to handle SEO redirection after removing www
Recently we moved our free classifieds USA website, https://bestclassifiedsusa.com, from the www to the non-www format. The old URLs still show in the Google SERPs, and clicking them returns 404 error pages because the old ads were removed. We redirect www to the non-www version for the main domain, but the old cached pages are still affected. How can we handle this dynamically, so that all old www ad pages redirect automatically to their relevant category pages?
Technical SEO | bcuclassifieds
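A minimal .htaccess sketch of a site-wide www to non-www 301, assuming Apache with mod_rewrite (hypothetical rules, not the poster's actual config):

```apache
RewriteEngine On
# 301-redirect any www request to the same path on the bare domain
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ https://%1%{REQUEST_URI} [L,R=301]
```

Mapping deleted ad URLs onto their category pages would need an extra set of rules, or application-level redirects, keyed to the URL structure of the ads.
-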
Homepage was removed from Google and got deranked
Hello experts, I have a problem. The front page of my site got severely deranked and now I am not sure how to get the ranking back. It started when I accidentally canonicalized the main page "https://kv16.dk" to a page that did not exist. Four months later the page got deranked, and you could not see the main page in the search results at all, not even when searching for "kv16.dk". Then we discovered the canonicalization mistake and fixed it, and were able to get the main page back in the search results when searching for "kv16.dk".
At first, after we made the correction, some weeks passed and the ranking didn't improve. Google Search Console recommended uploading a sitemap, so we did that. However, this sitemap contained a lot of thin-content pages: one WordPress attachment page for every image in an article, 91 of them in total, while the rest of the site consists of only two pages, the main page and an extra landing page. After that, Google began showing the attachment URLs in some searches. We tried fixing it by redirecting all the attachments to their simple form, e.g. redirecting each image attachment page straight to the image itself. Google has not yet removed these attachment pages, so the question is: do you think it will help to remove the attachments via Google Search Console, or will that not help at all? For example, when we search "kv16", an attachment URL named "birksø" is one of the first results.
Technical SEO | Christian_T
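For reference, a self-referencing canonical on the front page looks like this (using the poster's domain purely as the example):

```html
<head>
  <!-- points at the page itself, not at a non-existent URL -->
  <link rel="canonical" href="https://kv16.dk/" />
</head>
```
-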
Is there a maximum sitemap size?
Hi all, over the last month we've included all images, videos, etc. in our sitemap and now its loading time is rather high (http://www.troteclaser.com/sitemap.xml). Is there a maximum sitemap size recommended by Google?
Technical SEO | Troteclaser
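Yes: per the current sitemaps.org protocol, a single sitemap file may contain at most 50,000 URLs and be at most 50MB uncompressed. Larger sites split the sitemap and list the pieces in a sitemap index, roughly like this (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- each child sitemap must itself obey the 50,000 URL / 50MB limit -->
  <sitemap><loc>https://www.example.com/sitemap-pages.xml</loc></sitemap>
  <sitemap><loc>https://www.example.com/sitemap-images.xml</loc></sitemap>
</sitemapindex>
```
-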
XML Sitemap and unwanted URL parameters
We currently don't have an XML sitemap for our site. I generated one using Screaming Frog and it looks OK, but it also contains my tracking URL parameters (ref=), which I don't want Google to use, as specified in GWT. Cleaning it up would require time and effort which I currently don't have, and I also think that having one could help us on Bing. So my question is: is it better to submit a "so-so" sitemap than to have none at all, or are the risks just too high? Could you explain what could go wrong? Thanks!
Technical SEO | jfmonfette
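Stripping a single tracking parameter from a generated sitemap is also easy to script. A sketch in Python using only the standard library; the ref parameter name comes from the question, everything else here is hypothetical:

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def strip_param(url, param="ref"):
    """Remove one tracking parameter from a URL, keeping any others."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
             if k != param]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(query), parts.fragment))

def clean_sitemap(xml_text, param="ref"):
    """Rewrite every <loc> in a sitemap without the tracking parameter,
    dropping entries that collapse to an already-seen cleaned URL."""
    root = ET.fromstring(xml_text)
    seen = set()
    for url_el in list(root.findall(f"{{{NS}}}url")):
        loc = url_el.find(f"{{{NS}}}loc")
        cleaned = strip_param(loc.text.strip(), param)
        if cleaned in seen:
            root.remove(url_el)  # duplicate once the parameter is gone
        else:
            seen.add(cleaned)
            loc.text = cleaned
    ET.register_namespace("", NS)
    return ET.tostring(root, encoding="unicode")
```

Note that two URLs differing only in ref= collapse to one entry after cleaning, which is exactly what you want the engines to see.
-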
Weird SERPs
Hello Mozzers, I have a question about my SERPs that I just can't figure out; maybe one of you has encountered a similar problem.
I have a website about Scotland, with a page for each major city, so the structure is scotland/city/glasgow, scotland/city/edinburgh, scotland/city/aberdeen, and so on. Every city page is W3C validated and page-speed validated, has around 1,000 words of text, all with h1, h2, image alt text, and a nice title and description; all have grade A in my Moz campaign, original content, canonical links, and no ads or spammy links. The title, description, h1, h2 and image alt are the same on every page with only the city name swapped, so the pages are identical except for the city word. I have done no link building and no promotion, so there is no scenario where I have done more for one city than for the others.
My problem is that all the city pages are doing well in the SERPs, except that Glasgow climbed to position 14, then suddenly dropped to 74, and has stayed there for two weeks now. I know this sounds like a penalty, but it can't be, because I haven't done anything. I've tried every tool I can to analyze the Glasgow page, thinking it was a code problem, a broken link, or that Googlebot wasn't crawling the page, but everything is fine. Can anyone suggest anything I might do? I'm 100% sure I have no penalty; I've checked Webmaster Tools and Open Site Explorer to see whether anyone has pointed spammy links at my site (that has happened before: I once had about 1,000 viagra links from a competitor pointing to my site).
Technical SEO | asmedia
-
HTTPS Version of Homepage in SERPs
The https version of our homepage appears in Google's SERPs. We have a rel=canonical on the page pointing to the http version, and a redirect in our .htaccess that sends https to http. I thought this was just a fluke and it would be fixed by the next crawl, but it's been like this for a few weeks now. Not only that, but we're losing rank a bit and I'm afraid there's a correlation. Has this ever happened to anyone?
Technical SEO | UnderRugSwept
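A catch-all version of that redirect, assuming Apache terminates the TLS connection itself and mod_rewrite is enabled (a sketch, not the poster's actual rules):

```apache
RewriteEngine On
# 301 any https request back to plain http on the same host and path
RewriteCond %{HTTPS} on
RewriteRule ^ http://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
```

It is worth verifying that the existing redirect actually fires on every path, for example with curl -I against a few https URLs, since rules that only cover some paths would explain a lingering https result.
-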
When is it safe to remove 301 redirects?
I have created over 500 301 redirects in my .htaccess file, some of them are more than 2 years old now. Should I delete them? I don't like seeing the "notices" number in crawl diagnostics so high 😞
Technical SEO | danielshaw
-
Include pagination in sitemap.xml?
Curious about people's thoughts on this. Since restructuring our site we have seen a massive uplift in pages indexed and in organic traffic through our pagination, but we haven't yet included a sitemap.xml; it's an ancient site that never had one. Given that Google seems to be loving us right now, do we even need a sitemap.xml, aside from the analytical benefits in Webmaster Tools? Would you include pagination URLs (don't worry, we have no duplicate content) in the sitemap.xml? Cheers.
Technical SEO | sichristie