If I get spammy backlinks removed, is it still necessary to disavow?
-
Now there are some conflicting beliefs here, and I want to know what you think.
If I got a high-spam website to remove my backlink, is a disavow through Search Console still necessary?
Keep in mind: if it helps even in the slightest to improve rankings, I'm for it!
-
You generally don't need to take any action on these types of links (you don't need to remove or disavow them). Google can see they are just scraped duplicates of a real article and will ignore them.
But let's say they were harmful links (maybe paid links, or irrelevant links placed sneakily by you, i.e. a link to iPhones from a page about dogs). In that case, when you remove links it's always a good stop-gap to also disavow, because Google might not crawl the URLs carrying the bad links right away, but it will, in theory, pick up on the disavow file more quickly.
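For reference, if you do go the disavow route, the file Google expects is a plain UTF-8 .txt upload with one entry per line: a full URL to disavow a single page, or a domain: prefix to cover an entire site, with lines starting with # treated as comments. A minimal sketch, using made-up placeholder domains:

    # Links we asked to have removed; disavowing as a stop-gap
    domain:spammy-example-site.com
    http://another-example.net/scraped-copy-of-the-article.html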
-
All spam links must be disavowed, especially at the root-domain level of the spam link.
-
The situation is this: I was featured on a high-quality website. Immediately, 8 other high-quality sites copied the exact article, which linked to me. Now I have these backlinks.
-
If those links have a spam score above 70, then you don't need to disavow them; Google already considers them spam.
-
Hello,
Disavow is no longer necessary - https://www.searchenginejournal.com/google-disavow-tool/289871/
-
Related Questions
-
Help with Getting Googlebot to See Google Charts
We received a message from Google saying we have an extremely high number of URLs linking to pages with similar or duplicate content. The main difference between these pages is the Google charts we use. It looks like Google isn't able to see these charts; most of the text is very similar, and the charts (there are lots of them) are the main difference between these pages. So my question is: what is the best approach to allowing Google to see the data that exists in these charts? I read here http://webmasters.stackexchange.com/questions/69818/how-can-i-get-google-to-index-content-that-is-written-into-the-page-with-javascr that a solution would be to have the text displayed in the charts coded into the HTML and hidden by CSS. I'm not sure, but it seems like a bad idea to have it seen by Google but hidden from the user by CSS. It just sounds like a cloaking hack. Can someone clarify if this is even a solution, or is there a better solution?
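A commonly suggested alternative to CSS-hidden text is to render the chart's underlying data as an ordinary, visible HTML table next to (or beneath) the chart, so users and crawlers see the same thing. A rough sketch, with a made-up element ID and purely illustrative figures:

    <div id="compensation_chart"></div>
    <!-- Visible fallback: the same data as a plain table, not hidden with CSS -->
    <table>
      <caption>Median salary by role (illustrative figures only)</caption>
      <tr><th>Role</th><th>Median salary</th></tr>
      <tr><td>Analyst</td><td>$62,000</td></tr>
      <tr><td>Manager</td><td>$85,000</td></tr>
    </table>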
Technical SEO | ERICompensationAnalytics -
Can you have an SSL cert but still have http?
I was under the impression that if you got an SSL cert for your site, the site would change to https. I ran this site: http://thekinigroup.com/ through an SSL checker and it said it had one... but it's http.
1. Why didn't it change to https? Is there an extra step there that needs to be done?
2. Is there a reason someone would choose to get an SSL cert, but not have https?
Thanks, Ruben
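Worth noting: installing a certificate doesn't change anything by itself; the server still has to be configured to serve the site over https and to redirect http requests there. A minimal sketch for Apache via .htaccess, assuming mod_rewrite is available (other servers use different directives):

    RewriteEngine On
    RewriteCond %{HTTPS} off
    RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]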
Technical SEO | KempRugeLawGroup -
Site removed from Google Index
Hi mozers,
Two months ago we published http://aquacion.com. We registered it in Google Webmaster Tools and after a few days the website was in the index, no problem. But now Webmaster Tools tells us the URLs were manually removed. I've looked everywhere in Webmaster Tools in search of more clues but haven't found anything that would help me. I sent the access to the client, who might have been stupid enough to remove his own site from the Google index, but now, even though I delete and add the sitemap again, the website won't show in Google SERPs. What's weird is that Google Webmaster Tools tells us all the pages are indexed. I'm totally clueless here...
P.S.: Added screenshots from Google Webmaster Tools.
Update: Turns out it was my mistake after all. When my client developed his website a few months ago, he published it, and I removed the website from the Google index. When the website was finished I submitted the sitemap, thinking it would void the removal request, but it doesn't.
How to solve: In Webmaster Tools, on the [Google Index => Remove URLs] page, you can reinclude pages there.
Technical SEO | RichardPicard -
Backlinks that we have if they are 404?
Hi All,
What about backlinks we have if they are 404? Open Site Explorer shows 1,000s of links, and when I check, many are 404; those are spammy links which we had, but now the sites are 404. I am doing a link profile check which is cleaning up all spammy links. Should I take any action on them, as Open Site Explorer and Google still show these links in searches? Should we mention these URLs in the disavow file in Google Webmaster Tools?
Thanks
Technical SEO | mtthompsons -
Does 301 redirect of old filenames still work?
I have gone through several revisions of my site. We used to have only static pages in HTML. I had search-engine-optimization.html changed to seo-philippines.html, then changed to /seo-philippines/. I 301 redirected all of them whenever I changed the filenames. This is over the course of 6 years' worth of link building, and I'm wondering if this has an effect, because our rankings go down every time we do this.
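One thing worth checking in a setup like this is that every retired filename redirects straight to the current URL rather than through a chain (oldest -> old -> current). A rough sketch using Apache's Redirect directive in .htaccess, assuming the filenames mentioned in the question:

    # Point each retired filename directly at the live URL to avoid chained 301s
    Redirect 301 /search-engine-optimization.html /seo-philippines/
    Redirect 301 /seo-philippines.html /seo-philippines/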
Technical SEO | optimind -
301 & backlinks
Apologies if my question sounds like a school Maths lesson 😉 If you have 2 sites:
Site 1) is linked to by sites A, B & C
Site 2) is linked to by sites X, Y & Z
You then 301 redirect site 2 to site 1. Most of the juice from site 2 (obtained from links X, Y, Z) should be passed over to site 1. But what if site 2 is linked to by the same sites A, B, C as site 1 instead of X, Y, Z? Since both sites have exactly the same links, will the same, less, or any weight be passed over by the 301 redirect? Many thanks.
Technical SEO | martyc -
Struggling to get my lyrics website fully indexed
Hey guys, been a longtime SEOmoz user, only just getting heavily into SEO now and this is my first query, apologies if it's simple to answer but I have been doing my research!
My website is http://www.lyricstatus.com - basically it's a lyrics website. Rightly or wrongly, I'm using Google Custom Search Engine on my website for search, as well as jQuery auto-suggest - please ignore the latter for now.
My problem is that when I launched the site I had a complex AJAX Browse page, so Google couldn't see static links to all my pages, thus it only indexed certain pages that did have static links. This led to searches on my site using the Google CSE being useless, as very few pages were indexed. I've since dropped the complex AJAX links and replaced them with easy static links. However, this was a few weeks ago now and still Google won't fully index my site. Try doing a search for "Justin Timberlake" (don't use the auto-suggest, just click the "Search" button) and it's clear that the site still hasn't been fully indexed!
I'm really not too sure what else to do, other than wait and hope, which doesn't seem like a very proactive thing to do! My only other suspicion is that Google sees my site as more duplicate content, but surely it must be ok with indexing multiple lyrics sites since there are plenty of different ones ranking in Google. Any help or advice greatly appreciated guys!
Technical SEO | SEOed -
How to use overlays without getting a Google penalty
One of my clients is an email subscriber-led business offering deals that are time-sensitive and which expire after a limited, but varied, time period. Each deal is published on its own URL, and in order to drive subscriptions to the email, an overlay was implemented that would appear over the individual deal page so that the user was forced to subscribe if they wished to view the details of the deal. Needless to say, this led to the threat of a Google penalty, which appears (fingers crossed) to have been narrowly avoided as a result of a quick response on our part to remove the offending overlay. What I would like to ask you is whether you have any safe and approved methods for capturing email subscribers without revealing the premium content to users before they subscribe? We are considering the following approaches:
First Click Free for Web Search - This is an opt-in service by Google which is widely used for this sort of approach and which stipulates that you have to let the user see the first item they click on from the listings, but can put up the subscriber-only overlay afterwards.
No Index, No Follow - If we simply noindex, nofollow the individual deal pages where the overlay is situated, will this remove the "cloaking offense" and therefore the risk of a penalty? (See the sketch below.)
Partial View - If we show one or two paragraphs of text from the deal page with the rest being covered up by the subscribe-now lock-up, will this still be cloaking?
I will write up my first SEOMoz post on this once we have decided on the way forward and monitored the effects, but in the meantime, I welcome any input from you guys.
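For reference, the noindex, nofollow option above would normally be implemented with a robots meta tag in the head of each deal page (a one-line sketch; whether it actually removes the cloaking risk is exactly the open question, since noindex controls indexing rather than how content is shown to crawlers versus users):

    <meta name="robots" content="noindex, nofollow">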
Technical SEO | Red_Mud_Rookie