Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
RSS feeds: What are the secrets to getting them, and the links inside them, indexed and counted for SEO purposes?
-
RSS feeds, at least on paper, should be a great way to build backlinks and boost rankings. They are also very seductive from a link-builder's point of view: free, easy to create, and they let you specify anchor text. There are even several SEO articles, and a few products, extolling the virtues of RSS for SEO purposes.
However, I hear anecdotally that they are extremely ineffective at getting their internal links indexed. My success rate has been abysmal: perhaps 15% have ever been indexed, and so far I have never seen Google show an RSS feed as a source for a backlink. I have even thrown some token backlinks at RSS feeds to see if that helped get them indexed, but even that has a very low success rate.
I recently read a blog post saying that Google "hates RSS feeds" and "rarely spiders perhaps the first link or two." Yet there are many SEO advocates who claim that RSS feeds are a great untapped resource for SEO. I am rather befuddled.
Has anyone "cracked the code" on how to get them, and the links that they contain, indexed and helping rankings?
-
Actually, RSS feeds are also used as a defensive method of link building. Yoast makes a plugin for WordPress that everyone should use (if they use WordPress); one of its features is inserting text and links into your RSS feed.
Obnoxious scraper sites use RSS feeds to populate their websites. They do not monitor the content; it's all automated. By putting links and a citation in your RSS feed, you at least get a little benefit from their theft of your content.
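As a rough illustration of the idea (this is not the Yoast plugin itself, just a minimal Python sketch with made-up post data and URLs), appending a citation to each item before the feed is written out looks something like this:

```python
from xml.sax.saxutils import escape

# Made-up posts; in practice these come from your CMS.
posts = [
    {"title": "New widget released",
     "url": "https://www.example.com/new-widget/",
     "body": "<p>We just launched a new widget...</p>"},
]

def feed_item(post):
    # Append a citation back to the original article, so scrapers that
    # republish the feed verbatim also republish the link.
    citation = f'<p>Originally published at <a href="{post["url"]}">{escape(post["title"])}</a>.</p>'
    return (
        "<item>"
        f"<title>{escape(post['title'])}</title>"
        f"<link>{post['url']}</link>"
        # RSS <description> elements carry escaped HTML.
        f"<description>{escape(post['body'] + citation)}</description>"
        "</item>"
    )

rss = ('<?xml version="1.0"?><rss version="2.0"><channel>'
       + "".join(feed_item(p) for p in posts)
       + "</channel></rss>")
print(rss)
```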
Link Explorer shows FeedBurner and a couple of other RSS aggregator sites as high-value referring sites.
-
Why would anyone need this service? I believe the original question was about RSS feeds from the site owner being indexed. RSS feeds should be submitted to Google Webmaster Tools to be indexed by Google, and Bing offers a similar service to webmasters. After the initial submission, the webmaster never has to submit again.
If I wanted to push my content using RSS feeds, I would use Ping.fm to push my content and links to third-party sites and social media.
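To be concrete about what "pinging" is under the hood, it's just an XML-RPC call. Here's a minimal Python sketch (Ping-O-Matic is used as an example endpoint, and the site name and URL are placeholders):

```python
import xmlrpc.client

# Example ping endpoint; many ping services accept the standard
# weblogUpdates.ping(site_name, site_url) XML-RPC method.
PING_ENDPOINT = "http://rpc.pingomatic.com/"

def ping_new_content(site_name, site_url):
    server = xmlrpc.client.ServerProxy(PING_ENDPOINT)
    # The response is typically a struct with an error flag and a message.
    return server.weblogUpdates.ping(site_name, site_url)

if __name__ == "__main__":
    print(ping_new_content("Example Blog", "https://www.example.com/"))
```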
I am at a loss as to why a webmaster would use the Linklicious site.
-
Really detailed overview. It touches on everything nicely.
-
If I understand the question correctly, you would like your content to be spread to other sites through RSS feeds and then be indexed there with a backlink to your site?
Number 1: there must be a reason for the other site to index your content and create a backlink to your site.
Number 2: these links are almost always nofollow, so you need a very large number of them to have any real effect on the SERPs.
E.g.: you submit your site to several "ping" sites of your choosing that index certain content; then, when you publish a new story, these sites get pinged by your CMS and a nofollow backlink is created for you on that site.
Just make certain that the sites you ping actually have good content and fill a purpose for their visitors.
A better way to keep control over the material, though, is to create your own WordPress site where you write about your primary site as a blog. Just put a news section in a sidebar and put your RSS feed in there. WordPress sites are indexed extremely fast, and because you own the site, you can choose to use followed links in that section of the blog.
This should lead to faster indexing, and you create backlinks that serve a function; furthermore, you own the site linking to your primary site.
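A minimal sketch of pulling your primary site's feed into such a news section (standard library only; the feed URL is a placeholder):

```python
import urllib.request
import xml.etree.ElementTree as ET
from html import escape

FEED_URL = "https://www.example.com/feed/"  # your primary site's RSS feed

def latest_items(feed_url, limit=5):
    # Fetch and parse an RSS 2.0 feed: <rss><channel><item>...</item></channel></rss>
    with urllib.request.urlopen(feed_url) as response:
        root = ET.parse(response).getroot()
    for item in root.findall("./channel/item")[:limit]:
        yield item.findtext("title", ""), item.findtext("link", "")

def news_sidebar_html(feed_url):
    # Render the latest items as plain (followed) links for the sidebar.
    links = "".join(
        f'<li><a href="{link}">{escape(title)}</a></li>'
        for title, link in latest_items(feed_url)
    )
    return f'<ul class="news">{links}</ul>'

if __name__ == "__main__":
    print(news_sidebar_html(FEED_URL))
```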
A short summary:
RSS feeds are good to spread content and attract visitors. They're not a quick way to get backlinks.
-
We use an RSS feed for new product lists. There can be some lag time before a new product is placed in a category and can be browsed to on our site, so the RSS feed gives these new products a few days' head start getting into the search engines. We redirect all RSS links back to the main site URLs, which carry canonical tags for the main product pages.
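For anyone curious about the shape of that redirect, here's a minimal sketch (Flask is used purely for illustration; the route, slugs, and URLs are made up):

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical mapping from the tracking URLs used in the RSS feed
# to the canonical product pages on the main site.
FEED_TO_CANONICAL = {
    "new-widget": "https://www.example.com/products/new-widget/",
}

@app.route("/rss/item/<slug>")
def rss_item(slug):
    # Permanently redirect feed links back to the main product page,
    # which itself carries the rel="canonical" tag.
    return redirect(FEED_TO_CANONICAL.get(slug, "https://www.example.com/"), code=301)
```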
-
RSS should be designed primarily for your users, and secondarily for syndication, using RSS aggregators to distribute parts of your content (headlines and URLs).
Be careful about how much of the article content you include within the RSS feed itself. Whilst it is good for the user to include the full article within the feed, by doing so you are also giving scrapers an easy way to reproduce your content, and you might end up being penalised for duplicate content even though you are the original source (I've seen this happen).
I've used two techniques in the past. The first was to publish a short article stub with a call to action to follow the link to the original article. I then switched to publishing the full content within the feed just for my users, but I am thinking about changing it again: publishing part of the content within the feed with a call to action for the reader to visit my site for the full article, which will hopefully increase CTR on the feed whilst reducing the content duplication issue.
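A minimal sketch of that last approach (a truncated excerpt plus a call to action; the URL, title, and excerpt length are arbitrary):

```python
from xml.sax.saxutils import escape

def feed_description(body_text, url, title, excerpt_chars=300):
    # Truncate at a word boundary and append a call to action pointing at
    # the full article, rather than syndicating the entire post body.
    excerpt = body_text[:excerpt_chars].rsplit(" ", 1)[0]
    html = (
        f"<p>{escape(excerpt)}…</p>"
        f'<p><a href="{url}">Read the full article: {escape(title)}</a></p>'
    )
    # RSS <description> elements carry escaped HTML (or CDATA).
    return f"<description>{escape(html)}</description>"

print(feed_description("A long article body goes here " * 30,
                       "https://www.example.com/long-article/",
                       "A long article"))
```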
-
The link-building power of RSS feeds lies simply in getting other sites to feature and link to your content via RSS. There would be no utility for a bot to crawl your feed on its own; it would rather just look at the content itself. Try submitting your feed to RSS directories or having other webmasters feature your feed on their sites. I believe several web 2.0 sites like Squidoo allow for feed publishing as well. Hope that helps.
-
Sorry, I'm a little confused as well. Why would you want people linking to your RSS feed instead of your original posts? Why would you even want the RSS feed to be indexed and returned in search results rather than the original posts? Wouldn't Google want to send people to the original post rather than the RSS feed? An RSS feed is supposed to be a feed of content already on your site, so I don't see why Google would have much incentive to spider it or return it in search results.
-
You load links into it, and it then creates an RSS feed on their end that gets pinged. You can load any kind of link into it and it'll ping them.
-
Thanks, but Linklicious turns links into RSS feeds; it doesn't help get the RSS feeds, or their internal links, indexed as far as I know. Am I not understanding the service correctly?
-
This service works well, I've personally tested it: http://linklicious.me/
Try that or another pinging service, there are a ton of them out there.
Good luck!