Proactively Use GWT Removal Tool?
-
I have a bunch of links to my site from sexualproblems.net (not a porn site; it's a legitimate doctor's site, and I've talked to him on the phone in America). The problem is that his site got hacked and his homepage now has tons of links out to other pages, and mine is one of them. I have asked him multiple times to take the link down, but his webmaster is his teenage son, who basically just doesn't feel like it. My question is, since I don't think they will take the link down, should I proactively remove it or just wait until I get a message from Google?
I'd rather not tell Google I have spam links pointing to my site, even if I am trying to get them removed. However, I have no idea whether that's a legitimate fear or not. I could see the link being removed and everything continuing fine, or I could see the removal request raising a giant red flag and getting my site audited.
Any advice?
- Ruben
-
*Disavow, that's what I meant to say. Alright, thanks for your insights! I appreciate it.
- Ruben
-
If you feel the link is negatively affecting your site, then you can disavow links from that domain using the Disavow Links tool in Webmaster Tools: https://support.google.com/webmasters/answer/2648487?hl=en
But your best bet is always to try to get the link taken down manually. If it is not affecting you (one bad link in a herd of many links), then I wouldn't worry about it.
-
Hi,
If you have low-quality / hacked links pointing to your site, I would just go ahead and disavow that domain. Don't worry about Google coming along to audit you for this; that isn't what they are trying to do. You need to distance yourself from the site, and this is the way to do it.
-Andy
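For reference, the disavow file both answers refer to is just a plain-text list uploaded through the Disavow Links tool: lines beginning with # are comments, a `domain:` line disavows a whole domain, and any other line is a single URL. A minimal sketch for the situation above (using the domain named in the question) might look like:

```text
# Hacked homepage linking to my site; owner contacted multiple times, link not removed
domain:sexualproblems.net
```

Using `domain:` rather than a single URL covers any other hacked pages on that site that may link out as well.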
Related Questions
-
Tens of duplicate homepages indexed and blocked later: How to remove from Google cache?
Hi community, Due to a WP plugin issue, many duplicate homepages were indexed in Google under anonymous URLs. We blocked them later, but they are still in the SERPs. I wonder whether these are causing trouble for our website, especially since exact duplicates of our homepage are indexed. How do we remove these pages from the Google cache? Is that the right approach? Thanks
Algorithm Updates | | vtmoz0 -
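One caveat on the "blocked them later" step above: a URL blocked in robots.txt can linger in the index, because Google can no longer crawl it to see any removal signal. The usual approach is to let the duplicates be crawled and serve a noindex directive instead. A sketch, assuming the duplicate URLs can be edited or matched server-side:

```html
<!-- In the <head> of each duplicate page -->
<meta name="robots" content="noindex">

<!-- Or, equivalently, as an HTTP response header:
     X-Robots-Tag: noindex -->
```

Once the pages drop out of the index, the robots.txt block can be reinstated if desired.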
Link Removal
Hi - We have been trying to remove bad links for about 12 months. QUESTION: How can we eliminate backlinks from sites that are impossible to contact? Background: We contacted as many domain administrators as we could. Not a big change. Some want $$$. We submitted 3 disavow lists (3 months apart from each other). The last list was to remove all links. We still have a large number of Japanese and Chinese link directories pointing to us that we cannot contact or don't know how to ask for removal. One key thing to keep in mind is that we don't want to change the URL. Thanks for the help.
Algorithm Updates | | highlandadventures0 -
Using a stop word when optimizing pages
I have a page (for a spa) I am trying to fully optimize and, using AdWords, have run every conceivable configuration (using Exact Match) to ascertain the optimal phrase to use. Unfortunately, the term which has come up as the 'best' phrase is "spas in XXX" [xxx represents a location]. When reviewing the data, phrases such as "spas XXX" or "spa XXX" don't give me an appropriate search volume to warrant optimizing. So, with that said, do I optimize the page without the word "in" and 'hope' we get the search volume for searches using the word "in", or optimize using the stop word? Any thoughts? Thank you!
Algorithm Updates | | MarketingAgencyFlorida0 -
Google webmaster tool content keywords Top URLs
GWT->Optimization->Content Keywords section... If we click on a keyword, it further shows the variants and the Top URLs containing those variants. My problem is that none of the important pages, like product detail pages, the homepage, or category pages, are present in that Top URLs list. All the news and guides section URLs are listed in the Top URLs section for the most important keyword, which is also present in my domain name. How do I make Google recognize the important pages for the important keyword?
Algorithm Updates | | BipSum0 -
Should I remove my keyword meta?
So it's safe to assume keywords are no longer used by search engines in the old-fashioned sense to rank sites, but should we keep them as indicators of site content? It's been suggested by some that they're detrimental for two reasons: 1. Your competitors can snoop on the keywords you're targeting, but mainly... 2. Over-optimisation is the enemy these days! Thanks for your input 🙂
Algorithm Updates | | underscorelive0 -
Don't use an h1 and just use h2's?
We just overhauled our site, and as I was auditing the overhaul I noticed that there were no h1's on any of the pages. I asked the company that does our programming why, and he responded that h1's are spammed so much that he doesn't want to put them in. Instead he put in h2's. I can't find anything to back this up. I can find that h1's are over-optimized, but nothing that says to skip them altogether. I think he's crazy. Anyone have anything to back him up?
Algorithm Updates | | Dave_Whitty0 -
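For what it's worth, the conventional structure being argued over is a single h1 describing the page topic with h2's nested beneath it; a minimal sketch:

```html
<h1>Day Spa Services in Springfield</h1>

<h2>Massage Therapy</h2>
<p>...</p>

<h2>Facials and Skin Care</h2>
<p>...</p>
```

Skipping the h1 and starting at h2 doesn't make the page invalid, but it discards the one heading level explicitly meant to summarize the page.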
Is using WPML (WordPress Multilingual Plugin) ok for On-Page SEO?
Hi Mozzers, I'm investigating multilingual site setup and content translation for a small website of 15-20 pages and came across WPML (WordPress Multilingual Plugin), which looks like it could help, but I am curious as to whether it has any major international SEO limitations before trialing/buying. It seems to offer the option to automatically set up language folder structures such as www.domain.com/it/ or www.domain.com/es/ etc., which is great, and it seems to offer an easy way of linking out to translators (for an extra fee), which could be convenient. However, what about the on-page optimization - URL names, title tags and other on-page elements? I wonder if anyone has any experience with using this plugin or any alternatives to it. Hoping for your valued advice!
Algorithm Updates | | emerald0 -
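Whichever plugin generates the /it/ and /es/ folders mentioned above, the on-page piece search engines look for is a set of hreflang annotations tying the language versions of each page together. A sketch using the question's placeholder URLs:

```html
<!-- In the <head> of every language version of the page -->
<link rel="alternate" hreflang="en" href="https://www.domain.com/" />
<link rel="alternate" hreflang="it" href="https://www.domain.com/it/" />
<link rel="alternate" hreflang="es" href="https://www.domain.com/es/" />
```

Each version must list all versions, including itself, and the references must be reciprocal for Google to honor them; this is one concrete thing to verify in any plugin trial.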
Removing secure subdomain from google index
We've noticed over the last few months that Google is not honoring our main website's robots.txt file. We have added rules to disallow secure pages such as: Disallow: /login.cgis Disallow: /logout.cgis Disallow: /password.cgis Disallow: /customer/* We have noticed that Google is crawling these secure pages and duplicating our complete ecommerce website across our secure subdomain in the Google index (duplicate content): https://secure.domain.com/etc. Our webmaster recently implemented a specific robots.txt file for the secure subdomain to disallow everything: User-agent: * Disallow: / My question is: should I request that Google remove these secure URLs through Google Webmaster Tools? If so, is there any potential risk to my main ecommerce website? We have 8,700 pages currently indexed in Google and would not want to risk any ill effects to our website. How would I submit this request in the URL Removal tool specifically? Would inputting https://secure.domain.com/ cover all of the URLs? We do not want any secure pages in the index, and all secure pages are served on the secure.domain subdomain. Please private message me for specific details if you'd like to see an example. Thank you,
Algorithm Updates | | marketing_zoovy.com0
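The subdomain-wide block described in the question, assembled as the standalone file it would need to be (served from the secure host, which is the question's placeholder domain):

```text
# https://secure.domain.com/robots.txt
# Blocks all crawling of the secure subdomain for every user agent.
User-agent: *
Disallow: /
```

Note that each host gets its own robots.txt, so this file on secure.domain.com has no effect on the main site's crawling; however, it only stops future crawling, and URLs already indexed can persist until they are removed or allowed to be recrawled with a noindex directive.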