Noindex XML RSS feed
-
Hey,
How can I tell search engines not to index my XML RSS feed?
The RSS feed is created by Yoast on WordPress.
Thanks, Luke.
-
Hi there! It sounds like there is something wrong with the plugin (either the installation or the configuration). I know there was an issue about a year ago where the Yoast plugin was creating some weird links in sitemaps, but I haven't heard of anything like that recently. Do you have the most recent versions of both the plugin and WordPress installed? Also, are you able to share the URLs of your problematic sitemap and a few of the broken links reported in GWT?
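In the meantime, if the goal is simply to keep the feed URL itself out of the index, one generic option is to send an X-Robots-Tag header on feed requests. Here is a minimal sketch for a child theme's functions.php; it uses only core WordPress hooks and is an illustration of the idea rather than a Yoast-specific setting:

// Sketch: ask search engines not to index any feed URL on the site.
// Core WordPress only; verify it suits your setup before relying on it.
add_action( 'template_redirect', function () {
    // is_feed() covers the main RSS/Atom feeds as well as comment feeds.
    if ( is_feed() && ! headers_sent() ) {
        header( 'X-Robots-Tag: noindex, follow', true );
    }
} );

Crawlers that honor the header should eventually drop the feed URL from the index while still following the links inside it. Note that this only addresses indexing; it would not by itself explain or fix a 404 being reported for the feed, which is why the questions about your plugin version and sitemap URLs still matter.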
Thanks
-
Hey,
The reason is that I want to noindex the actual .xml file because it causes a 404 error in Google Webmaster Tools.
-
Hi there, thanks for your question! I am curious, do you mean to have your entire feed noindexed? If so, why?
Related Questions
-
Should I noindex my categories?
Hello! I have created a directory website with a pretty active blog. I probably messed this up, but I pretty much have categories (for my blog) and custom taxonomy (for different categories of services) that are very similar. For example, I have the blog category "anxiety therapists" and the custom taxonomy "anxiety". 1) Is this a problem for Google? Can it tell the difference between archive pages in these different categories even though the names are similar? 2) Should I noindex my blog categories, since the main purpose of my site is to help people find therapists, i.e., my custom taxonomy?
Intermediate & Advanced SEO | angelamaemae
-
Best server-side xml sitemap generator?
I have tried xml-sitemaps, which tends to crash when spidering my site(s) and requires multiple manual resumes, which aren't practical for our businesses. Please let me know if any other server-side generators exist that could be a good fit for multiple enterprise-sized websites. Image sitemaps would also be helpful. One with multiple starting URLs would help with spidering/indexing the most important sections of our sites. Also, has anyone heard of or used Dyno Mapper? This also looks like a good solution for us, but I was wondering if anyone has had any experience with this product.
Intermediate & Advanced SEO | recbrands
-
Sanity Check: NoIndexing a Boatload of URLs
Hi, I'm working with a Shopify site that has about 10x more URLs in Google's index than it really ought to. This equals thousands of URLs bloating the index. Shopify makes it super easy to make endless new collections of products, where none of the new collections has any new content... just a new mix of products. Over time, this makes for a ton of duplicate content. My response, aside from making other new/unique content, is to select some choice collections with KW/topic opportunities in organic and add unique content to those pages, while noindexing the other 90% of excess collection pages. The thing is, there's evidently no method that I could find of just uploading a list of URLs to Shopify to tag noindex. And it's too time consuming to do this one URL at a time, so I wrote a little script to add a noindex tag (not nofollow) to pages that share identical title tags, since many of them do. This saves some time, but I have to be careful not to inadvertently noindex a page I want to keep. Here are my questions: 1) Is this what you would do? To me it seems a little crazy that I have to do this by title tag, although it's faster than one at a time. 2) Would you follow it up with a deindex request (one URL at a time) with Google, or just let Google figure it out over time? 3) Are there any potential negative side effects from noindexing 90% of what Google is already aware of? 4) Any additional ideas? Thanks! Best... Mike
Intermediate & Advanced SEO | 94501
-
XML Sitemap Validators... Any Good Ones?
I was wondering if anyone had any suggestions for testing sitemaps before submitting them to Google?
Intermediate & Advanced SEO | alrockn
-
If other websites implement our RSS feed sitewide on their websites, can that hurt our own website?
Think about the changing anchor text of the backlinks and the hundreds of sitewide inbound links... I guess Google will understand that it's just an RSS feed, right?
Intermediate & Advanced SEO | Zanox
-
RSS "fresh" content with static page
Hi SEOmoz members, I am currently researching a competitor and noticed something that I don't really understand. They have hundreds of static pages that don't change; the content has been the same for over 6 months. Every time a customer orders a product, they use their RSS feed to publish: "Customer A just bought product 4". When I search in Google for product 4 in the last 24 hours, it's always there with a new publishing date but the same old content. Is this a good SEO tactic to implement on my own site?
Intermediate & Advanced SEO | MennoO
-
To noindex or not to noindex
Our website lets users test whether any given URL or keyword is censored in China. For each URL and keyword that a user looks up, a page is created, such as https://en.greatfire.org/facebook.com and https://zh.greatfire.org/keyword/freenet. From a search engine's perspective, all these pages look very similar. For this reason we have implemented a noindex function based on certain rules. Basically, only highly ranked websites are allowed to be indexed; all other URLs are tagged as noindex (for example https://en.greatfire.org/www.imdb.com). However, we are not sure that this is a good strategy, and so we are asking: what should a website with a lot of similar content do? 1) Don't noindex anything; let Google decide what's worth indexing and what's not. 2) Noindex most content, but allow some popular pages to be indexed. This is our current approach. If you recommend this one, we would like to know what we can do to improve it. 3) Noindex all the similar content. In our case, only let overview pages, blog posts, etc. with unique content be indexed. Another factor in our case is that our website is multilingual. All pages are available (and equally indexed) in Chinese and English. Should that affect our strategy? References: https://zh.greatfire.org, https://en.greatfire.org, https://www.google.com/search?q=site%3Agreatfire.org
Intermediate & Advanced SEO | GreatFire.org
-
Are there any disadvantages of switching from XML sitemaps to .asp sitemaps in GWT?
I have been using multiple XML sitemaps for products for over 6 months and they are indexing well in GWT. I have been having them manually amended when a product becomes obsolete or we no longer stock it. I now have the option to automate the sitemaps from a SQL feed, but using .asp sitemaps that I would submit the same way in GWT. I'd like your thoughts on the pros and cons of this. The plus for me is real-time updates; the con is that I perceive GWT to prefer XML files. What do you think?
Intermediate & Advanced SEO | robertrRSwalters