Yoast SEO plugin and Weak Links
-
The plugin has what I thought was a great feature. My main site is often scraped, and I figured 'well, at least we're getting a link out of it' thanks to the RSS feature of Yoast's WordPress SEO plugin (you can add a link back to your site at the bottom of your RSS feeds).
Now Google is talking about links from weak/low-quality sites and how they may impact your rankings. So, with this in mind:
Do we want links from scrapers? Are we now better off discontinuing the use of this feature?
I imagine there may be varying opinions on this, so I'll open it as a discussion...
Thanks
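For anyone unfamiliar with the feature being discussed: it simply appends an attribution footer, with a link back to the source, to each post as it goes out in the RSS feed, so scrapers that republish the feed verbatim carry the link along. A minimal sketch of the idea (the plugin itself is PHP; the function name and footer wording here are illustrative, not Yoast's actual code):

```python
def append_feed_footer(content_html: str, post_title: str,
                       post_url: str, site_name: str, site_url: str) -> str:
    """Append an attribution footer (with a backlink) to a feed item's HTML.

    Mimics the idea behind the RSS footer option: any scraper that
    republishes the feed verbatim carries a link back to the source.
    """
    footer = (
        f'<p>The post <a href="{post_url}">{post_title}</a> '
        f'appeared first on <a href="{site_url}">{site_name}</a>.</p>'
    )
    return content_html + footer

# Example: what a scraped copy of the feed item would then contain
item = append_feed_footer(
    "<p>How to rank for local search terms...</p>",
    "Local SEO Basics",
    "https://example.com/local-seo-basics",
    "Example Site",
    "https://example.com",
)
```

The anchor text and target URL of that footer link are whatever you configure, which is why the question of whether scraper links help or hurt matters here.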
-
I believe the benefit of the link outweighs the threat of a penalty from low-quality sites scraping your content, provided you're not already seeing warnings in your Google Webmaster Tools.
If your content is being scraped, there will likely be other links in the copied content anyway (it's normally good practice to interlink your own content). Removing the link added by the plugin wouldn't remove those in-content links, so you'd face much the same predicament whether or not you include the footer link.
If your content isn't interlinked, then the decision rests solely on the quality of the domains scraping it. You'd need to do a little research to find the sites that commonly scrape your content and decide based on the reputation of those sites.
I don't think the answer here will be black and white, as it's highly situation-dependent. If the links were doing your rankings major harm, you'd see a warning in your Google Webmaster Tools and you'd know to disable the footer link in the RSS feed. But if you're getting no warnings, your initial impressions of the scraping sites aren't too bad, and you've seen no other issues related to the links, then I'd keep the link and hopefully benefit from the free backlink with my chosen anchor text.
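As a starting point for that research, you can check whether a given scraper copy actually retained any links back to your domain (the in-content links and/or the RSS footer link). A rough stdlib-only sketch; the domain and HTML here are placeholders, and in practice you'd fetch the scraper's page first:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse


class LinkCollector(HTMLParser):
    """Collect all href values from <a> tags in an HTML document."""

    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)


def backlinks_to(html: str, my_domain: str) -> list[str]:
    """Return the links in `html` that point back to `my_domain`."""
    parser = LinkCollector()
    parser.feed(html)

    def matches(href: str) -> bool:
        netloc = urlparse(href).netloc.lower()
        domain = my_domain.lower()
        return netloc == domain or netloc.endswith("." + domain)

    return [h for h in parser.hrefs if matches(h)]


# Example: a scraped copy that kept one in-content link and the feed footer link
scraped = """
<p>Some copied article text with an <a href="https://example.com/guide">internal link</a>.</p>
<p>The post appeared first on <a href="https://example.com">Example</a>.</p>
<p><a href="https://other-site.net/page">unrelated link</a></p>
"""
print(backlinks_to(scraped, "example.com"))
# → ['https://example.com/guide', 'https://example.com']
```

If the scraper strips your links entirely, the footer setting is moot for that site; if the links survive, the decision comes down to that site's reputation, as described above.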