38% of SEOs Never Disavow Links: Are you one of them, or part of the other 62%?
-
Hi all,
Disavowing links is an advanced SEO task with a decent amount of risk involved. I assumed most people wouldn't use it, since Google has said it tries to ignore bad links, that there is no penalty for such links, and that negative SEO is genuinely rare. So I was surprised to see that only 38% of SEOs have never disavowed, while the other 62% are disavowing links monthly, quarterly, or yearly. Do we still need to disavow links at all? It's easy to say a bad link should be disavowed, but hard to tell whether those links are already hurting us, or whether we will get hurt once they have been disavowed.
Thanks
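For anyone weighing this up, the mechanics are the easy part; it's the judgment that's hard. Google's disavow file is a plain UTF-8 text file uploaded through the Disavow Links tool in Search Console, one rule per line. A minimal sketch (the domains here are placeholders, not real examples):

```text
# Disavow file sketch, uploaded via Search Console's Disavow Links tool.
# Lines starting with "#" are comments and are ignored.

# Disavow a single spammy page:
http://spam.example.com/cheap-links.html

# Disavow every link from an entire domain (the usual, safer-scoped choice
# when a whole site is junk):
domain:spam.example.com
domain:link-farm.example.net
```

Note that the tool only tells Google to ignore those links when assessing your site; it does not remove them from the web or from link reports.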
-
Hi VTmoz,
We didn't use to disavow much, since most of our clients' backlinks were built by us and we always stayed clear of the "dark side of the moon". We stopped altogether after last year's Penguin 4.0 update.
-
I believe a lot of it has to do with your overall link profile. If you are building a website with little/no links, each one plays a much more important role.
I disavow when it's clearly spam, and there has been a dip in rankings when the link appeared.
-
We have not disavowed any links since late last year, when Google announced they would be "ignoring" low-quality links rather than "penalizing" sites because of them. We have not seen any drop in our clients' rankings since we stopped disavowing. There is a great Moz post by Marie Haynes that you may want to check out: https://moz.com/blog/do-we-still-need-to-disavow-penguin. It was published in April 2017, so as I write this it is still relevant. Hope this helps, and best of success!
Related Questions
-
Did anyone else notice ranking drops between 14th and 21st Feb this year?
Hi everyone, I'm looking for some guidance please. Our website saw a drop in rankings around 14th-21st Feb 2019 and the rankings haven't really recovered. I can't see anything significant in Search Console; there was a possible unconfirmed algorithm update in early Feb, but I can't find anything factual to determine whether that was the cause. The ranking drop seems to be across different keywords. What else should I check? Did anyone else experience such a drop around this time and, if so, did you find out why it happened and how to recover? Thanks for your help 🙂 Caroline
Algorithm Updates | Teepee_Digital
-
How to take down a subdomain that is receiving many spammy backlinks?
Hi all, We have a subdomain that has had little engagement for the last few years, and over time many spammy backlinks have come to point at it. There are relevant backlinks too. We have deleted most of the pages that carried spammy content or attracted spammy backlinks. Still, I'm torn between keeping the subdomain and taking it down: the relevant backlinks might be helping our website, but the spammy backlinks may be contributing to a drop in rankings. Thanks
Algorithm Updates | vtmoz
-
Fewer or more pages linked from the homepage? Linking 3rd-hierarchy-level pages from the homepage.
Hi Moz community, With the idea of preserving link juice, many websites stopped linking too many pages from the homepage. We even removed our 3rd-hierarchy-level pages from our homepage, and didn't notice much change in rankings. Recently I have read some SEO articles in which experts suggest linking low-level pages from the homepage, signalling to Google that we respect and prioritise those pages, not just the homepage and the very next level of pages. Apparently this helps with internal linking as well. Is this true? Should we link such low-level pages from the homepage? Which approach actually works? Thanks
Algorithm Updates | vtmoz
-
Is parallax scrolling, when used with the "hash bang" technique, good for SEO or not?
Hello friends, One of my clients' websites, http://chakracentral.com/, uses parallax scrolling, with most of its URLs containing a hash "#" fragment. A few sample URLs:

http://chakracentral.com/#panelBlock4 (service page)
http://chakracentral.com/#panelBlock3 (about-us page)

I am planning to use the "hash bang" technique on this website so that Google can read all the internal pages (those containing the hash "#" fragment) with the current site architecture, as the client is not comfortable changing it. Reference: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started#2-set-up-your-server-to-handle-requests-for-urls-that-contain-escaped_fragment

The problem I am facing is that many industry experts do not consider parallax websites (even with the hash bang technique) good for SEO, especially on mobile devices. Some references:

http://searchengineland.com/the-perils-of-parallax-design-for-seo-164919
https://moz.com/blog/parallax-scrolling-websites-and-seo-a-collection-of-solutions-and-examples

So here are my questions:

1. Would it be good to use the "hash bang" technique on this website and do SEO to improve rankings on desktop as well as mobile devices?
2. Is the "hash bang" technique for a parallax scrolling website good only for desktop and not recommended for mobile, so that we should have a separate mobile version (without parallax scrolling) for mobile SEO?
3. Or is the parallax scrolling technique (even with the "hash bang") not good for SEO on either desktop or mobile, and to be avoided if we want an SEO-friendly website?
4. Any issues with Google Analytics tracking for the same website?

Regards,
Sarmad Javed
Algorithm Updates | chakraseo
-
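For context on what the (now-deprecated) hash-bang scheme actually did: a crawler that saw a URL containing "#!" would request a rewritten URL containing "_escaped_fragment_" from the server, which then had to return an HTML snapshot of that state. Plain "#" fragments, like the sample URLs above, are never sent to the server at all, which is exactly why the scheme required switching to "#!". A minimal sketch of that URL mapping in Python:

```python
def escaped_fragment_url(url):
    """Map a hash-bang URL to the URL Google's (deprecated) AJAX
    crawling scheme would have requested from the server."""
    if "#!" in url:
        base, frag = url.split("#!", 1)
        # Append the fragment as a query parameter; use "&" if a
        # query string already exists on the base URL.
        sep = "&" if "?" in base else "?"
        return base + sep + "_escaped_fragment_=" + frag
    # Plain "#" fragments never reach the server, so nothing changes.
    return url


print(escaped_fragment_url("http://chakracentral.com/#!panelBlock4"))
# → http://chakracentral.com/?_escaped_fragment_=panelBlock4
```

Google formally deprecated this scheme in 2015, which is part of why the experts cited above advise against building new sites around it.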
Content Caching & Removal of a 301 Redirect to Relieve a Links Penalty
Hi, A client site has a very poor link legacy stretching back over 5 years. I started the campaign a year ago, providing valuable, good-quality links. Link removals and a disavow file submitted to Google have been done, but after months and months of waiting nothing has happened. If anything, after the recent Penguin update, results have been further affected.

A 301 redirect was undertaken last year, consequently associating those bad links with the new site structure. I have since removed the 301 redirect in an attempt to detach this legacy, with little success. I have read up on this, and not many people agree on whether it will work.

My new plan is therefore to start afresh on a new domain, switching from the .com to the .co.uk version, to shed the legacy and all association with the spam-ridden .com. My main concern is whether Google will remember the content it cached from the spammy .com, because the content on the new .co.uk site will be exactly the same (content of great quality, receiving hundreds of visitors each month from the blog section alone).

The problem is definitely link-related and NOT content-related, as I imagine people may first suspect. This could cause a duplicate content issue, given that the content pre-existed on another domain. I will implement a robots.txt file blocking all of the .com site, as well as a noindex, nofollow tag, and I understand you can submit a site removal request to Google within Webmaster Tools to help fast-track the deindexation of the spammy .com. Once it has been deindexed, the new .co.uk site will go live with the exact same content.

So my question is whether Google will then completely forget that this content ever existed, allowing me to use exactly the same content on the new .co.uk domain without the threat of a duplicate content issue? Any insights or experience with removing a 301 redirect to detach link legacy, and its success, would also be very helpful!
Thank you, Denver
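One caveat on the deindexation plan described above (a general point about how these directives interact, not specific to this site): robots.txt blocks crawling but does not remove pages that are already indexed, and once crawling is blocked Google can never see a noindex meta tag on those pages. A common sequence is therefore to serve noindex first and block crawling only afterwards; a sketch:

```text
# Step 1: on every page of the old .com, serve this tag and leave the
# pages crawlable, so Google can actually see the directive:
#   <meta name="robots" content="noindex, nofollow">
#
# Step 2: only after the pages have dropped out of the index,
# block further crawling via robots.txt:
User-agent: *
Disallow: /
```

Doing both at once is a frequent mistake: the Disallow rule hides the noindex tag, and the URLs can linger in the index as URL-only results.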
Algorithm Updates | ProdoDigital
-
External Linking Best Practices Question
Is it frowned upon to use generic anchor text such as "click here" within a blog article when linking externally? I understand that, ideally, you want descriptive anchor text, especially when linking internally, but can it negatively affect your own website if you don't use descriptive anchor text when linking externally?
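For illustration, the difference is purely in the link text, not the markup itself (the URL below is a hypothetical placeholder):

```html
<!-- Generic anchor text: tells readers and search engines nothing
     about the destination -->
<a href="https://example.com/disavow-guide">click here</a>

<!-- Descriptive anchor text: describes what the linked page is about -->
<a href="https://example.com/disavow-guide">guide to disavowing links</a>
```

The descriptive version mainly helps the *target* page's relevance and your readers' orientation; generic anchors on outbound links are an accessibility and usability weakness more than a ranking risk to your own site.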
Algorithm Updates | RezStream8
-
SinglePlatform's Restaurant Menu Across Web Properties vs "SEO-Optimized"
Surprised I wasn't able to find an existing answer, given that SinglePlatform apparently serves 500,000 SMBs with menus that appear on over 150 publisher websites. Given Panda's razor-sharp intolerance for duplicate content, am I safe to assume that any claim that SinglePlatform's menu on a local restaurant site benefits your SEO is now spurious? If so, what's the best way to handle this potential SEO liability while still having one of their nicely formatted restaurant menus on your site? For reference: http://www.openforum.com/articles/using-singleplatform-to-build-a-digital-presence

Update May 7, 2012: Connected directly with the folks at SinglePlatform, and the answer here is a lot simpler than my over-thinking of it. The menu usually sits within an iframe or widget, so that's that. But the ability to truthfully show an up-to-date menu for any given establishment is a legitimate way to address the healthy amount of local search intent that seems to be directed at exactly that. Overall a pretty slick platform; looking forward to seeing how they grow into SMB, local and mobile in the coming months. I think the space is ripe to benefit from products/services that take advantage of these sorts of economies of scale.
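On the iframe point: content loaded in an iframe is a separate document at its own URL, so the host page's own HTML contains only the frame element, never the menu text itself. That mechanism is what keeps the widely syndicated menu out of duplicate-content territory for the embedding site. A hypothetical embed (the URL is a placeholder, not SinglePlatform's actual embed code):

```html
<!-- The menu lives at its own URL inside the frame; the restaurant
     page's own HTML never contains the menu text itself -->
<iframe src="https://menus.example.com/your-restaurant"
        title="Restaurant menu"
        width="600" height="800"></iframe>
```

The flip side is that framed content generally doesn't count as the host page's content for ranking purposes either, so the menu's value is for visitors rather than on-page SEO.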
Algorithm Updates | mgalica
-
Question about Local / Regional SEO
Good morning Moz community, I have a local/regional SEO question. I apologize if this question duplicates another on this forum, but a query for the term "regional SEO" showed no results, as did similar queries. Please preface this entire question with "knowing what we know about the most recent changes to local search": I know what has worked in the past; my concern is now.

I'm working with a heavily regulated client that is regional, mostly East Coast US. They are in financial services, where state licensing is a requirement, and they are licensed in 15 states. Obviously, it would look foolish in this day and age to title-tag individual pages with local modifiers and have numerous pages covering a similar topic, differing only in the localized modifier in front of the keyword. I've never found that search engines understand broad regional terms such as New England, Mid-Atlantic, Southeast, or Northeast, aside from an exact-match search; if someone knows different, please share.

The client does have 7 offices in various states. Perfectly matching, consistent listings in Google Places, Bing Local, and Yahoo Local were step one; all their locations are now in those services, and many more smaller local citation listings are in the works. We have also successfully implemented a plan to generate great reviews from actual customers for each location; they're receiving a few a day right now. Their local Places listings, where they have physical locations, are doing very well, but:

1. What would the community suggest for generating more targeted traffic in the 8 states where they have no physical location?

2. The client wants to begin creating smaller blogs that are highly localized to the states and major population centers in which they have no physical location. There is an open checkbook dedicated to this effort; however, I do a lot of work in this industry, so I want to offer the best possible, most up-to-date advice. My concern is that these efforts will have two results: a. being obscured in the "7-pack" by companies with local brick and mortar; b. detracting from the equity built in their existing blog by generating content on other domains (I would prefer to continue growing the main blog).

3. As a follow-up, it has been documented that Google now uses the same algorithm for local, personal, and personalized search. That being the case, is there any value in building links to your Places page? Can you optimize your Places page using the same off-site techniques as you would traditionally?

Sorry to kill you with such a long question on a Sunday 🙂
Algorithm Updates | dogflog