Do RSS feeds help SEO?
-
If we put relevant RSS feeds on a site, will it help the SEO value?
Years ago, I shied away from RSS feeds because they slowed the site down and I didn't like relying on them. However, over the past couple of years, Internet connectivity has gotten better, especially in Alaska.
-
Ahh, I see. I think there can be cases where it's valuable to pull in a feed. Perhaps an index of relevant headlines around a topic, or maybe headlines from a user's blog posts. I wouldn't use it as the primary basis of an SEO strategy, though.
Also, one tip if you're planning on doing something like this: it's much better to cache the results of a feed on your server rather than pulling in a live RSS feed every time a user loads a page. Given the weight Google places on page load speed, adding live RSS feeds to your site would probably do more harm than good unless you have a caching system in place.
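To make that concrete, here's a minimal sketch of server-side feed caching, written in Python with only the standard library; the feed URL, cache path, and refresh interval are placeholder values, not anything tied to a particular CMS.

```python
import os
import time
import urllib.request

# Placeholder values: swap in your own feed URL and cache location.
FEED_URL = "https://example.com/feed.xml"
CACHE_FILE = "/tmp/feed-cache.xml"
CACHE_TTL_SECONDS = 15 * 60  # refetch at most every 15 minutes


def get_feed_xml() -> str:
    """Return the feed XML, serving the cached copy while it is still fresh."""
    if os.path.exists(CACHE_FILE):
        age = time.time() - os.path.getmtime(CACHE_FILE)
        if age < CACHE_TTL_SECONDS:
            with open(CACHE_FILE, "r", encoding="utf-8") as f:
                return f.read()

    # Cache is missing or stale: hit the live feed once, then store the result.
    with urllib.request.urlopen(FEED_URL, timeout=5) as response:
        xml = response.read().decode("utf-8")
    with open(CACHE_FILE, "w", encoding="utf-8") as f:
        f.write(xml)
    return xml


if __name__ == "__main__":
    print(get_feed_xml()[:200])
```

Page requests then read the local copy, so a slow or unreachable feed never drags down your own page load times.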
-
I would strongly suggest staying away from it. We'll sometimes use it for ORM (online reputation management) purposes when the keyword difficulty is really low, but for the most part we stay away from it, as it will most likely get you dinged for duplicate content... no bueno.
HOWEVER. If you still need to scrape other RSS feeds... make sure to give attribution and add a robots noindex meta tag to each page where the content is republished. That way you can hopefully sidestep any penalties.
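As an illustration of that last point, here's a rough sketch (plain Python, standard library only; the function name and markup are hypothetical, not from any particular platform) of wrapping a republished feed item in a page that carries both the robots noindex meta tag and an attribution link back to the source.

```python
from html import escape


def render_syndicated_page(title: str, body_html: str, source_url: str) -> str:
    """Wrap a republished feed item with a noindex directive and attribution."""
    return f"""<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <!-- keep this republished copy out of search engine indexes -->
  <meta name="robots" content="noindex, follow">
  <title>{escape(title)}</title>
</head>
<body>
  <article>{body_html}</article>
  <!-- attribution back to the original publisher -->
  <p>Originally published at <a href="{escape(source_url, quote=True)}">{escape(source_url)}</a></p>
</body>
</html>"""


if __name__ == "__main__":
    print(render_syndicated_page(
        "Example headline",
        "<p>Excerpt pulled from the source feed.</p>",
        "https://example.com/original-post",
    ))
```

The same noindex directive can also be sent as an X-Robots-Tag HTTP header if you'd rather keep it out of the markup.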
-
I think the question was not phrased properly.
I don't want to push feeds, but PULL feeds. Will the pulled feeds add SEO value?
-
In general, I think it's good to have RSS feeds for a site. It's a good way of reaching new viewers who might only come across your content in a reader or in a syndicated version. Also, lots of social platforms have an option for including an RSS feed, so if you're active on a social platform, the people you interact with can see some of the most recent content from your site.
Google doesn't count RSS feeds as duplicate content, and is generally smart enough to figure out the original source of the content if it was republished in part somewhere.
Related Questions
-
Htaccess file help
Hi, thanks for looking. I am trying (and failing) to write an .htaccess rule for the following scenario: http://www.gardening-services-edinburgh.com/index.html, http://www.gardening-services-edinburgh.com and http://www.gardening-services-edinburgh.com/ should all go to the one resource. Any ideas? Thanks, Andy
Technical SEO | McSEO
-
JavaScript redirects -- what are the SEO pitfalls?
Q: Is it dangerous (SEO fallout) to use JavaScript redirects? The tech team built a browser-side tool for me to easily redirect old/broken links. It's essentially a glorified 404 page: it pops a quick message that the requested page no longer exists and that we're automatically sending you to a page that has the content you are looking for. The tech team does not have the bandwidth to handle this via Apache, and this tool is what they came up with so I can deliver a better customer experience. Back story: very large site, and I'm dealing with thousands of pages that could/should/need to be redirected. My issue is incredibly similar to what Rand mentioned way back in a post from 2009: Are 404 Pages Always Bad for SEO? We've also decided to let these pages 404 and monitor for anything that needs an Apache redirect. The tool mentioned above was the tech team's idea to give me "the power" to manage redirects. What do you think?
Technical SEO | FR123
-
Site Launching, not SEO Ready
Hi, so we have a site going up on Monday that in many ways hasn't been made ready for search. The focus has been on functionality and UX rather than search, which is fair enough. As a result, I have a big list of things for the developer to complete after launch (like sorting out duplicate pages and adding titles that aren't "undefined", etc.). So, my question is whether it would be better to noindex the site until the main issues are sorted, and only then present search engines with the best version we can, or to have the site indexed (duplicate pages and all) and sort these issues "live", as it were? Would either method be advisable over the other, or are there any other solutions? I just want to ensure we start ranking as well as possible as quickly as possible and don't know which way to go. Thanks so much!
Technical SEO | LeahHutcheon
-
When Should You Start SEO?
I am launching a new website (related to IT services) on Monday, 6th May 2013. What should my SEO/SMO/PPC strategy be for a brand new website on a new domain? I have a blog within the website as well. Is it better to promote the internal blog, or should I focus on external blogs like WordPress?
Technical SEO | afycon
-
Multiple domain SEO strategy
Hi Mozzers, I'm an AM at a web dev firm. We're building a new site for a client who sells paint to different markets: paint for boats, paint for the construction industry, paint for, well, you get the idea! Would we be better off setting up separate domains (boatpaintxxx.com, housepaintxxx.com, etc.) and treating each as a separate microsite for standalone SEO activity, or having them as individual pages/subdomains on a single domain like paints4all.com or something? From what I've read today, including the excellent Beginners Guide, I'm guessing there's no definitive answer! Feedback appreciated! Thanks.
Technical SEO | rikmon
-
Duplicate Content on SEO Pages
I'm trying to create a bunch of content pages, and I want to know if the shortcut I took is going to penalize me for duplicate content. Some background: we are an airport ground transportation search engine (www.mozio.com), and we constructed several airport transportation pages listing the providers in a particular area. However, the problem is that the same providers sometimes serve several airports in a region. For instance, NYAS serves both JFK and LGA, and obviously SuperShuttle serves ~200 airports. So this means every one of those airports' pages has the SuperShuttle box. All the provider info is stored in a database with tags for the airports they serve, and then we dynamically create the page. A good example follows: http://www.mozio.com/lga_airport_transportation/ http://www.mozio.com/jfk_airport_transportation/ http://www.mozio.com/ewr_airport_transportation/ All 3 of those pages have a lot in common. Now, I'm not sure, but they started out working decently; as I added more and more pages, their overall effectiveness went down. Does what I've done qualify as "duplicate content", and would I be better off getting rid of some of the pages or somehow consolidating the info into a master page? Thanks!
Technical SEO | moziodavid
-
Does Bitly hurt your SEO?
I often use bit.ly or the Google URL shortener in links when other websites post my articles so I can track clicks. However, I am thinking this may HURT my SEO, given that it takes away a backlink to my website. Is that logic correct? If so, what is a good way to track clicks when a website posts your article, without jeopardizing the SEO value?
Technical SEO | StreetwiseReports
-
301 Redirect Help
Hello! I am getting ready to launch my freshly coded site in the next week or so. My product URLs are changing SLIGHTLY, and I want to confirm I am going about things the right way:
A. My LIVE site store URLs look like http://hiphound.com/shop/dog-collars. My DEV site store URLs look like http://hiphound.com/dog-collars. No /shop directory.
B. The dev firm installed the rewrite rule below:
############################################
# enable rewrites
Options +FollowSymLinks
RewriteEngine on
#RedirectMatch 301 ^/shop?/$ http://hiphound.com/
RedirectMatch 301 ^/shop?/$ http://hiphound.com
###########################################
C. When I manually enter a URL with /shop in the address, the website redirects to the correct page, which is good.
QUESTIONS I HAVE
1. Is the above redirect correct? I need them to be permanent. Don't think the above is right...
2. Will links in the Google index be redirected as well? I am assuming yes, but just want to confirm.
3. For each page indexed in Google, will its PageRank, etc. be passed to the new page using just the 301 above?
4. Do I need to create additional 301s for each page, i.e. mapping each old page to its new page?
Please advise. The goal here is, of course, to preserve the rankings of the pages already in the Google index. THANK YOU!!! Lynn
Technical SEO | hiphound