If other websites implement our RSS feed sitewide on their website, can that hurt our own website?
-
Think about the varying anchor text from the backlinks and the hundreds of sitewide inlinks... I guess Google will understand that it's just an RSS feed, right?
-
I am more of a layperson, and joined this site to research certain topics.
I wrote a blog post on the topic of RSS feed scraping and content theft, which, if I understand it, was not exactly the OP's question.
Having read the other answers here, however, I am actually wondering if my blog post is inaccurate or incomplete and needs correction.
Are you all saying that there is no harm in your RSS feed being scraped, and that it might actually be helpful due to the backlinks you might get? Or are you saying that Google ignores those links because it is clear they come from an RSS feed?
Or am I misunderstanding your point entirely?
Thanks for clarifying. Here is the post if anyone wants to scan it and respond.
PS: If I have misunderstood the protocol and am not supposed to piggyback on someone else's topic, or add a link, please advise. This is my first foray into this forum.
-
Hi David,
Thanks for the clear explanation!
The ownership implementation is a logical and good idea! It hadn't crossed my mind until now.
-
I do need to give some recognition to David's answer below. While most of the time you don't need to worry about RSS links, I've heard of webmasters who've been stung by this. It seems likely to hit lower-authority sites harder.
If it is a concern, at least you have the power to do something about it.
1. Always place a rel=canonical tag on every page, with an absolute URL (full path). That way, if the scrapers take the whole page, the canonical link pointing back to you might stay intact.
2. If you suspect over-optimization filters, you can de-optimize your anchor text or add greater variety.
3. In extreme cases you can file a DMCA takedown request for your copyrighted content, but at volume this solution doesn't scale, and it is a messy business regardless.
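For point 1, a canonical tag with an absolute URL might look like this (the domain and path here are placeholders):

```html
<!-- Absolute (full-path) canonical URL, so it survives being copied wholesale -->
<link rel="canonical" href="https://www.example.com/blog/original-article/" />
```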
Regardless, these are for the minority of cases. Most of the time you shouldn't have to worry about it.
-
Hi Cyrus,
Thanks for your reply! As I thought, Google will understand it and ignore RSS-generated links.
I explained my question above, because I guess I was a little too brief... I'll just copy-paste the addition from above:
I don't mean that I put hundreds of links in the RSS feed. Rather, when our RSS feed is shown on another website, we receive backlinks from that website (through the RSS feed) with varying anchor text. In addition, nine times out of ten the RSS feed is implemented in the sidebar, so they are sitewide links. The question is whether this situation can hurt the website... Does that clarify my question?
But your answer is clear... Google will understand and ignore them, just as I expected.
So we don't have to worry about this issue, I guess...
-
Hi David, thanks for your insights...
Maybe I didn't write down the second question as clearly as possible... "I am not sure exactly what the question is in the second part. Are you asking if you should put hundreds of site-wide links in your RSS feed? Either way, here are good measures to take."
I don't mean that I put hundreds of links in the RSS feed. Rather, when our RSS feed is shown on another website, we receive backlinks from that website (through the RSS feed) with varying anchor text. In addition, nine times out of ten the RSS feed is implemented in the sidebar, so they are sitewide links. The question is whether this situation can hurt the website... Does that clarify my question?
-
Normally, I would bow out, as I must be mistaken compared to any of the SEO staff at SEOmoz; each of them has forgotten more than I will ever know. However, I have spent a lot of time on this issue; I learned jQuery for the express purpose of combating it. I am 100% certain, at least for how things were between June 2012 and Feb. 2013.
I realize that Google has stated they can tell the difference, but Google has a policy of misinformation as part of its strategy to protect search integrity. I give misinformation its due credit (it ended the Cold War), but it also means you cannot trust anything Google says until it is proven true.
You can parse an RSS feed in a manner that will not retain anything to identify it as coming from an RSS feed. Google will only know it is your content by chronology; in other words, they always find it on your site first.
How would Google know that a given link came from an RSS feed? It is just an anchor tag, identical to the billions of others out there.
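As a rough sketch of what I mean (the helper names are made up, and the regex extraction is for illustration only, not robust XML parsing), a feed item can be re-rendered as a perfectly ordinary link:

```javascript
// Sketch: extract items from an RSS feed and re-render them as plain HTML.
// Naive regex extraction for illustration; real code should use an XML parser.
function parseItems(rssXml) {
  var items = [];
  var itemRe = /<item>([\s\S]*?)<\/item>/g;
  var m;
  while ((m = itemRe.exec(rssXml)) !== null) {
    var block = m[1];
    var title = (block.match(/<title>([\s\S]*?)<\/title>/) || [])[1] || "";
    var link = (block.match(/<link>([\s\S]*?)<\/link>/) || [])[1] || "";
    items.push({ title: title.trim(), link: link.trim() });
  }
  return items;
}

// Emit ordinary anchor tags, indistinguishable from hand-written links.
function renderAsHtml(items) {
  return items
    .map(function (i) {
      return '<a href="' + i.link + '">' + i.title + "</a>";
    })
    .join("\n");
}
```

The output is a plain anchor tag with no trace of the feed it came from.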
I have tested this with two WordPress installs. Both had Google Analytics and were verified with Google Webmaster Tools.
On the first I would post articles of 100% original content. On the second I would pick them up, parse the feed, publish each item as a post, and mention it on social media. 100% of the articles stayed indexed on the non-original domain, and 0% stayed indexed on the original domain.
Then we switched, and 100% stayed indexed on the original domain. We tested it again on two more domains that were not new, one a PR3 and one a PR1, with the exact same outcome.
The single best thing you can do is post it on Google+. In my experience, within just a few minutes of posting on Google+, Googlebot is on the page.
Establish ownership on each of your pages with meta author or meta publisher tags too; it will help a lot.
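Those ownership tags might look something like this (the names and values are placeholders, reflecting the tags mentioned above rather than a guaranteed ranking signal):

```html
<!-- Hypothetical ownership/attribution tags -->
<meta name="author" content="Jane Doe" />
<meta name="publisher" content="Example Publishing" />
```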
-
I'm hesitant to give a definitive answer on this. Short answer: yes, Google should understand these are RSS-generated links and typically ignore them. But I've also heard grumblings from webmasters who swear this isn't true.
A better question might be: why hundreds of sitewide links? And why are they included in the RSS feed? An RSS feed typically includes an article body without much additional navigation. If you address these issues, consolidating your links and cleaning up your feed, I'd say you likely have little to worry about from scrapers.
-
To answer your first question: yes, it can hurt you. Are they doing this without your permission? You can always send them a takedown notice. You will find a great article here by SEOmoz's Sarah Bird titled "4 Ways to Protect Your Copyrights."
I am not sure exactly what the question is in the second part. Are you asking if you should put hundreds of site-wide links in your RSS feed? Either way, here are good measures to take.
-
Make sure you are pinging your posts to more than just one service.
-
Make sure you are linking to your posts on Google+ as soon as you publish them, and on Facebook, Twitter, etc., too.
-
Make sure you have SOME good links to establish clear ownership of content.
-
I would not put a bunch of links in these feeds. Google may see this site, which is probably involved in other not-so-great activity, as a desired backlink for you, and you could end up with an undesired association, especially if the link count is very high.
-
There is nothing about an RSS feed, once parsed and restructured as a web page, that will make it known to Google that it was an RSS feed.
-
Why do you have an RSS feed, and does anyone follow it?
If no one is following it, you may want to just shut it down for a while. If you have a good following for it, then that is not an option. I had an issue with copy-and-paste content hijacking, and we put in a small bit of JavaScript that would insert whatever you wanted into the clipboard as they copied; they would then paste it into their site. We noticed a big drop in activity after they noticed alerts on their websites like:
"ALERT: To all readers of this content, I have a confession. I have stolen this content, and I am involved in blackhat SEO. For those of you who do not know what that is, you will probably know my tools. Ever had a virus or gotten spam mail? You got that from visiting sites like mine. Please click here to report me, because this message was actually inserted by the guy I stole this from, and I do not even realize it is here yet."
Not only does it reduce the theft, it is also fun for the whole family.
I am sure by now there are similar tools for RSS.
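The copy-to-clipboard trick described above can be sketched like this (the function name and attribution wording are made up for illustration; the browser wiring is guarded so the helper can run anywhere):

```javascript
// Append an attribution notice to whatever a visitor copies from the page.
function buildClipboardPayload(selectedText, sourceUrl) {
  return selectedText + "\n\nRead the original at: " + sourceUrl;
}

// Browser-only wiring: override the clipboard contents on every copy event.
if (typeof document !== "undefined") {
  document.addEventListener("copy", function (event) {
    var selection = String(document.getSelection());
    event.clipboardData.setData(
      "text/plain",
      buildClipboardPayload(selection, window.location.href)
    );
    event.preventDefault(); // use our payload instead of the raw selection
  });
}
```

When the thief pastes the copied text into their own site, the attribution line comes along with it.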