Too many "nofollow" outgoing links are Okay?
-
Hi all,
Our forum has many discussions and topics where users leave links to their websites and other URLs, which are marked "nofollow" by default. Setting aside outright spammy websites, is it okay to have so many "nofollow" outgoing links?
Thanks
-
Well, having a lot of spammy content is a difficult battle to fight.
Google will eventually filter bad, spammy, duplicate, and keyword-stuffed content out of its rankings.
Since your website is a forum, I'd advise setting stricter rules for the community and sanctioning spammy content. Also, educate the community so they don't violate Google's content guidelines. Hope it helps.
Best luck.
GR
-
Hi GR,
I agree about too many nofollow outgoing spammy links. But what about having so much spammy content in terms of text, keywords, and hyperlinks?
Thanks
-
Hello vtmoz,
No, there is no harm in having tons, or even thousands, of outgoing links marked as nofollow.
I haven't yet found a reason not to apply nofollow where it's needed.
There is no limit to the protection it gives you against the many spammy outgoing links you have. Also take into consideration that the message you are sending to the robots is "do not follow this link, because I don't think it carries any valuable information." You might even be helping Google, since you discourage its bot from crawling spammy websites.
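For reference, here is roughly what a nofollowed user link looks like in a forum's rendered HTML (just a sketch; the exact markup varies by forum platform):

    <!-- A user-submitted link that the forum marks nofollow by default;
         rel="nofollow" asks crawlers not to pass link equity through it. -->
    <p>Check out my site:
      <a href="https://example.com" rel="nofollow">example.com</a>
    </p>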
Hope it helps.
Best luck.
GR
-
Related Questions
-
Is "Author Rank," User Comments Driving Losses for YMYL Sites?
Hi, folks! So, our company publishes 50+ active, disease-specific news and perspectives websites -- mostly for rare diseases. We are also tenacious content creators: between news, columns, resource pages, and other content, we produce 1K+ pieces of original content across our network. Authors are either PhD scientists or patients/caregivers. All of our sites use the same design. We were big winners with the August Medic update in 2018 and the subsequent update in September/October. However, the Medic update in March and the de-indexing bug in April were huge losers for us across our monetized sites (about 10 in total). We've seen some recovery with this early June update, but also some further losses. It's a mixed bag. Take a look at the attached Moz chart, which shows the jumps and falls around the various Medic updates. The pattern is very similar on many of our sites. As per JT Williamson's stellar article on EAT, I feel like we've done a good job of meeting those criteria, which has left us wondering what isn't jibing with the new core updates. I have two theories I wanted to run past you all:

1. Are user comments on YMYL sites problematic for Google now? I was thinking that user comments underneath health news and perspectives articles might be concerning on YMYL sites now. On one hand, a healthy commenting community indicates an engaged user base and speaks to the trust and authority of the content. On the other hand, while the AUTHOR of the article might be a PhD researcher or a patient advocate, how qualified are the people commenting? What if they are spouting off crazy ideas? Could Google's new update see user comments such as these as degrading the trust/authority/expertise of the page? The examples I linked to above have a good number of user comments. Could these now be problematic?

2. Is Google "Author Rank" finally happening, sort of? From what I've read about EAT -- particularly for YMYL sites -- it's important that authors have "formal expertise" and are, according to Williamson, "an expert in the field or topic." He continues that the author's expertise and authority "is informed by relevant credentials, reviews, testimonials, etc." Well -- how is Google substantiating this? We no longer have the authorship markup, but is the algorithm doing its due diligence on authors in some more sophisticated way? It makes me wonder if we're doing enough to present our authors' credentials on our articles, for example. Take a look -- Magdalena is a PhD researcher, but her user profile doesn't appear at the bottom of the article, and if you click on her name, it just takes you to her author category page (how WordPress'ish). Even worse -- our resource pages don't even list the author.

Anyhow, I'd love to get some feedback from the community on these ideas. I know that Google has said there's nothing to do to "fix" these downturns, but it'd sure be nice to get some of this traffic back! Thanks!
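For illustration, the kind of author markup I'm wondering about would look something like this (a hypothetical sketch using standard schema.org vocabulary; the name and URL are placeholders):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Example article headline",
      "author": {
        "@type": "Person",
        "name": "Magdalena Example",
        "jobTitle": "PhD Researcher",
        "url": "https://example.com/authors/magdalena-example"
      }
    }
    </script>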
Algorithm Updates | Michael_Nace -
Is it okay to have "No Response" pages?
Hi all, I can see some "No Response" pages which give the error message "Site cannot be reached" or keep loading but never finish. I got this list from the Screaming Frog spider tool. Do we need to fix these, or can we ignore them? Thanks
Algorithm Updates | vtmoz -
Footer menu links: Header tags or list items?
Hi, I would like to know whether header tags (h5 or h6) or list items (<li>) work better for footer menu links and the best linking structure. Thanks
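For illustration, the list-item version I have in mind would look something like this (a sketch; the class name and URLs are placeholders):

    <footer>
      <nav aria-label="Footer">
        <ul class="footer-menu">
          <!-- Plain list items mark these as navigation links,
               leaving h5/h6 free for actual document structure. -->
          <li><a href="/about">About Us</a></li>
          <li><a href="/services">Services</a></li>
          <li><a href="/contact">Contact</a></li>
        </ul>
      </nav>
    </footer>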
Algorithm Updates | vtmoz -
Can we ignore "broken links" without redirecting to "new pages"?
Let's say we have replaced www.website.com/page1 with www.website.com/page2. Do we need to redirect page1 to page2 even if page1 doesn't have any back-links? If it's not a replacement, can we ignore a "lost page"? Many websites lose hundreds of pages periodically. What's Google's stance on this? If a website has replaced or lost hundreds of links without reclaiming the old links by redirection, will that hurt?
Algorithm Updates | vtmoz -
Do I need to track my rankings on the keywords "dog" and "dogs" separately? Or does Google group them together?
I'm creating an SEO content plan for my website; for simplicity's sake, let's say it is about dogs. Keeping SEO in mind, I want to strategically phrase my content and monitor my SERP rankings for each of my strategic keywords. I'm only given 150 keywords to track in Moz, so do I need to treat singular and plural keywords separately? When I tried to find estimated monthly searches in Google's Keyword Planner, it grouped together "dog" and "dogs" under "dogs"... and similarly "dog company" and "dog companies" under "dog companies". But when I use Moz to track my rankings for these keywords, they are separate, and my rankings vary between the plural and singular versions of these words. Do I need to track and treat these keywords separately? Or are they grouped together for SEO's sake?
Algorithm Updates | Fairstone -
How could Google define "low quality experience merchants"?
Matt Cutts mentioned at SXSW that Google wants to take into consideration the quality of the experience ecommerce merchants provide and work this into how they rank in SERPs. Here's what he said if you missed it: "We have a potential launch later this year, maybe a little bit sooner, looking at the quality of merchants and whether we can do a better job on that, because we don't want low quality experience merchants to be ranking in the search results." My question: how exactly could Google decide whether a merchant provides a low or high quality experience? I would imagine it would be very easy for Google to decide this for merchants in their Trusted Store program. I wonder what other data sets Google could realistically rely upon to make such a judgment. Any ideas or thoughts are appreciated.
Algorithm Updates | BrianSaxon -
Rel="alternate" hreflang="x" or Unique Content?
Hi All, I have 3 sites: brand.com, brand.co.uk and brand.ca. They all have the same content with very minor changes. What's best practice: to use rel="alternate" hreflang="x", or to have unique content written for each of them? Just wondering, after Panda, Penguin, and the rest of the zoo, what is the best way to run multinational sites and achieve top positions for all of them in their individual countries. If you think it would be better to have unique content for each of them, please let us know your reasons. Thanks!
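For example, the hreflang route would mean each page on all three sites carries the same set of annotations in its <head>, listing itself and its alternates (a sketch; the /page path is a placeholder):

    <link rel="alternate" hreflang="en-us" href="https://brand.com/page" />
    <link rel="alternate" hreflang="en-gb" href="https://brand.co.uk/page" />
    <link rel="alternate" hreflang="en-ca" href="https://brand.ca/page" />
    <!-- x-default suggests which version to show users who match none of the above. -->
    <link rel="alternate" hreflang="x-default" href="https://brand.com/page" />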
Algorithm Updates | Tug-Agency -
Rel="author" - This could be KickAss!
Google is now encouraging webmasters to attribute content to authors with rel="author". You can read what Google has to say about it here and here. A quote from one of Google's articles: When Google has information about who wrote a piece of content on the web, we may look at it as a signal to help us determine the relevance of that page to a user's query. This is just one of many signals Google may use to determine a page's relevance and ranking, though, and we're constantly tweaking and improving our algorithm to improve overall search quality. I am guessing that Google might use it like this: if you have several highly successful articles about "widgets", your author link on each of them will let Google know that you are a widget expert. Then, when you write future articles about widgets, Google will rank them much higher than normal -- because Google knows you are an authority on that topic. If it works this way, the rel="author" attribute could be the equivalent of a big load of backlinks for highly qualified authors. What do you think about this? Valuable? Also, do you think there is any way that Google could be using this as a "content registry" that will foil some attempts at content theft and content spinning? Any ideas welcome! Thanks!
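For anyone who hasn't seen it, the markup itself is just a rel attribute on a link from the article to the author's profile page (a minimal sketch; the URL and name are placeholders):

    <!-- On the article page: the byline links to the author's profile with rel="author". -->
    <a href="https://example.com/authors/jane-doe" rel="author">Jane Doe</a>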
Algorithm Updates | EGOL