RSS feeds: What are the secrets to getting them, and the links inside them, indexed and counted for SEO purposes?
-
RSS feeds, at least on paper, should be a great way to build backlinks and boost rankings. They are also very seductive from a link-builder's point of view: free, easy to create, and they let you specify anchor text. There are even several SEO articles, and a few products, extolling the virtues of RSS for SEO purposes.
However, I hear anecdotally that they are extremely ineffective at getting their internal links indexed, and my own success rate has been abysmal: perhaps 15% have ever been indexed, and so far I have never seen Google show an RSS feed as the source of a backlink. I have even thrown some token backlinks at RSS feeds to see if that helped get them indexed, but even that has a very low success rate.
I recently read a blog post saying that Google "hates RSS feeds" and "rarely spiders more than the first link or two." Yet there are many SEO advocates who claim that RSS feeds are a great untapped resource for SEO. I am rather befuddled.
Has anyone "crackedthe code" onhow to get them,and the links that they contain, indexed and helping rankings?
-
Actually, RSS feeds are also used as a defensive method of link building. Yoast makes a plugin for WordPress that everyone should use (if they use WordPress); one of its features is inserting text and links into your RSS feed.
Obnoxious scraper sites use RSS feeds to populate their websites; they don't monitor the content, it's all automated. Putting links and a citation in your RSS feed lets you at least get a little benefit from their theft of your content.
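If you are not on WordPress (or just want to see what the plugin is doing), the idea is easy to reproduce. Here is a minimal Python sketch that post-processes a feed before serving it; the site name and URL are placeholders, so treat it as an illustration rather than a drop-in solution:

```python
import xml.etree.ElementTree as ET

# Placeholder site details -- swap in your own.
SITE_NAME = "Example Site"
SITE_URL = "https://www.example.com"

def add_citation(rss_xml: str) -> str:
    """Append a source citation, linking back to the original post, to every
    <item> description so scrapers that republish the feed verbatim also
    republish a link to you."""
    root = ET.fromstring(rss_xml)
    for item in root.iter("item"):
        link = item.findtext("link", default=SITE_URL)
        desc = item.find("description")
        if desc is None:
            desc = ET.SubElement(item, "description")
        citation = f'<p>This article first appeared on <a href="{link}">{SITE_NAME}</a>.</p>'
        desc.text = (desc.text or "") + citation
    return ET.tostring(root, encoding="unicode")
```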
Link Explorer shows FeedBurner and a couple of other RSS aggregator sites as high-value referring sites.
-
Why would anyone need this service? I believe the original question was about getting the site owner's own RSS feeds indexed. RSS feeds should be submitted to Google Webmaster Tools to be indexed by Google, and Bing offers a similar service to webmasters. After the initial submission, the webmaster never has to submit again.
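Beyond that one-time submission, you can also notify the engines whenever the feed changes. A rough Python sketch, assuming the usual sitemap-ping endpoints (they accept feed URLs too); the feed URL below is a placeholder:

```python
import urllib.parse
import urllib.request

FEED_URL = "https://www.example.com/feed.xml"  # placeholder feed/sitemap URL

# Sitemap-ping endpoints Google and Bing have offered to webmasters.
PING_ENDPOINTS = [
    "https://www.google.com/ping?sitemap={}",
    "https://www.bing.com/ping?sitemap={}",
]

def ping_search_engines(feed_url: str) -> None:
    encoded = urllib.parse.quote(feed_url, safe="")
    for endpoint in PING_ENDPOINTS:
        with urllib.request.urlopen(endpoint.format(encoded), timeout=10) as resp:
            # A 200 response only means the ping was received, not that the
            # feed (or anything in it) will actually be indexed.
            print(endpoint.split("/")[2], resp.status)

ping_search_engines(FEED_URL)
```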
If I wanted to push my content using RSS feeds, I would use Ping.fm to push my content and links to third-party sites and social media.
I am at a loss as to why a webmaster would use the Linklicious site.
-
Really detailed overview. Nice job touching on everything.
-
If I understand the question correctly, you would like your content to be spread to other sites through RSS feeds and then be indexed there with a backlink to your site?
Number 1: there must be a reason for the other site to publish your content and create a backlink to your site.
Number 2: these links are almost always nofollow, and you would therefore need a very high number of them to have any real effect on the SERPs.
E.g.: you submit your site to several "ping" sites of your choosing that index certain content; then, when you publish a new story, these sites get pinged by your CMS and a nofollow backlink is created for you on each of those sites.
Just make certain that the sites you ping actually have good content and fill a purpose for their visitors.
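If your CMS doesn't already send these pings on publish, the ping itself is just a small XML-RPC call using the weblogUpdates protocol. A minimal Python sketch; the endpoint and site details below are placeholders:

```python
import xmlrpc.client

# Placeholder details -- this mimics the weblogUpdates ping most CMSs send on publish.
PING_ENDPOINT = "http://rpc.pingomatic.com/"
BLOG_NAME = "Example Blog"
BLOG_URL = "https://www.example.com/"
FEED_URL = "https://www.example.com/feed.xml"

def ping_on_publish():
    """Notify a ping service that the blog has new content."""
    server = xmlrpc.client.ServerProxy(PING_ENDPOINT)
    try:
        # extendedPing also passes the changed URL and the feed URL.
        return server.weblogUpdates.extendedPing(BLOG_NAME, BLOG_URL, BLOG_URL, FEED_URL)
    except xmlrpc.client.Fault:
        # Fall back to the basic ping if the service doesn't support extendedPing.
        return server.weblogUpdates.ping(BLOG_NAME, BLOG_URL)

print(ping_on_publish())
```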
A better way to keep control over the material, though, is to create your own site running WordPress where you write about your primary site as a blog. Just put a news section in a sidebar and place your RSS feed there. WordPress sites are indexed extremely fast, and when you own the site you can choose to use followed links in that section of the blog.
This should lead to faster indexing, you create backlinks that serve a function, and furthermore you own the site linking to your primary site.
A short summary:
RSS feeds are good for spreading content and attracting visitors. They're not a quick way to get backlinks.
-
We use an RSS feed for new product lists. There may be some lag time before a new product gets put into a category and becomes browsable on our site, so the RSS feed gives these new products a few days' head start getting into the search engines. We redirect all RSS links back to the main site URLs, which include canonical tags for the main product pages.
-
RSS should be designed primarily for your users and secondarily to syndicate parts of your content (headlines and URLs) out through RSS aggregators.
Be careful about how much of the article content you include within the RSS feed itself. While it is good for the user to include the full article within the feed, by doing so you are also giving scrapers an easy way to reproduce your content, and you might end up being penalised for duplicate content even though you are the original source (I've seen this happen).
I've used two techniques in the past. The first was to publish a short article stub with a call to action to follow the link to the original article. I then switched to publishing the full content within the feed just for my users, but I am thinking about changing it again: publishing part of the content within the feed with a call to action for the reader to visit my site for the full article, which will hopefully increase CTR on the feed whilst reducing the content-duplication issue.
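For what it's worth, here is a rough Python sketch of that partial-content approach; the excerpt length, site name, and call-to-action wording are placeholders you'd adjust:

```python
import html

READ_MORE = '<p><a href="{url}">Read the full article on Example Site »</a></p>'
EXCERPT_WORDS = 60  # how many words of the article to expose in the feed

def feed_description(article_text: str, article_url: str) -> str:
    """Build an item <description>: a short excerpt plus a call to action that
    sends readers (and any scraper's visitors) back to the original URL."""
    words = article_text.split()
    excerpt = " ".join(words[:EXCERPT_WORDS])
    if len(words) > EXCERPT_WORDS:
        excerpt += "…"
    return "<p>" + html.escape(excerpt) + "</p>" + READ_MORE.format(url=article_url)
```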
-
The link-building power of RSS feeds is simply in getting other sites to feature and link to your content via RSS. There is little utility for a bot to crawl your feed on its own; it would rather just look at the content itself. Try submitting your feed to RSS directories or having other webmasters feature your feed on their sites. I believe several Web 2.0 sites like Squidoo allow for feed publishing as well. Hope that helps.
-
Sorry, I'm a little confused as well. Why would you want people linking to your RSS feed instead of your original posts? Why would you even want the RSS feed to be indexed and returned in search results rather than the original posts? Wouldn't Google rather send people to the original post than to the RSS feed? Aren't RSS feeds just a feed of content that is already on your site? I don't see why Google would have much incentive to spider one or return it in search results.
-
You load links into it, and it then creates an RSS feed on their end that gets pinged. You can load any kind of link into it and it will ping them.
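Mechanically there is nothing exotic about it. Here is a rough Python sketch of what the links-to-feed step might look like (the URLs and titles are placeholders, and this is only a guess at how such a service works); the resulting feed would then be pinged like any other:

```python
import xml.etree.ElementTree as ET
from email.utils import formatdate

def links_to_rss(urls, feed_title="Link feed", feed_link="https://www.example.com/"):
    """Wrap a plain list of URLs in a minimal RSS 2.0 document, which is roughly
    what a links-to-feed service produces before pinging aggregators."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = feed_title
    ET.SubElement(channel, "link").text = feed_link
    ET.SubElement(channel, "description").text = feed_title
    for url in urls:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = url
        ET.SubElement(item, "link").text = url
        ET.SubElement(item, "pubDate").text = formatdate()
    return ET.tostring(rss, encoding="unicode")

print(links_to_rss(["https://www.example.com/page-1", "https://www.example.com/page-2"]))
```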
-
Thanks, but Linklicious turns links into RSS feeds; as far as I know, it doesn't help the RSS feeds, or their internal links, get indexed. Am I misunderstanding the service?
-
This service works well; I've personally tested it: http://linklicious.me/
Try that or another pinging service; there are a ton of them out there.
Good luck!