Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How can I block incoming links from a bad website?
-
Hello all,
We recently took on a new client who had received a warning in Google Webmaster Tools for a partial ("soft") manual penalty. After a lot of research, I found one particular site sending roughly 100k links to a single page, and it looks like a high-risk site.
I want to block those links from coming in, but their webmaster is nowhere to be found and I do not want to use the disavow tool.
Is there a way to do this with code in our .htaccess file, or any other method?
Would appreciate anyone's immediate response.
Kind Regards
-
Hi Yiannis,
As far as I'm aware, there isn't really a way to "block" a link. The link lives on the other site. Returning a 404 for the page being linked to doesn't change the fact that there are 100K links from one site pointing at your site. The only options I'm aware of are to 1) contact the owner of the linking site and ask them to remove the links, and 2) if that doesn't work, disavow the links.
I understand your hesitancy to use the disavow tool, but quite frankly, this is exactly what it is intended for.
If you feel comfortable with the links being there and think Google has already dealt with them, then do nothing. But if you want to do something about the links, you either have to get them removed or disavow them.
BTW - my understanding of partial manual actions is that Google often not only deals with the suspicious links (devaluing them) but also penalizes the pages/keywords it thinks you were attempting to manipulate. So just because it was a partial action and not a site-wide action doesn't mean it isn't affecting some of your rankings. It's just not going to affect all your rankings for all your pages.
Kurt Steinbrueck
OurChurch.Com -
Hi eyepaq and thanks for your reply, much appreciated
The reasons I do not want to use the disavow tool are:
1. Google's message said they "took targeted action on the unnatural links instead of on the site's ranking as a whole", meaning they took care of the problem themselves.
2. Rankings and traffic are looking solid.
3. I have seen a lot of cases where people used it and lost rankings (some never recovered).
My thought was to block the spammy links and monitor whether traffic is affected (which I doubt, as most of it seems to come from branded searches). If I then see de-indexing or ranking drops, I'll use the disavow tool and file a reconsideration request.
What do you think?
-
Hi,
If you are talking about one or only a few sites then it's easy.
Just build a disavow file, as there is no downside to that. Disavow with domain:domainname.com (not individual pages) and upload the file via Webmaster Tools. After submitting it, send a reconsideration request explaining the situation and mentioning the disavow file. You should be safe after that.
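For illustration, a minimal disavow file is just a plain UTF-8 text file with one directive per line (domainname.com here stands in for the offending domain):

```text
# Disavow file uploaded via Google Webmaster Tools.
# Lines starting with # are comments.
# Disavow every link from the spammy domain, not individual URLs:
domain:domainname.com
```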
Alternatively - though to be honest I don't see a reason not to go with the first option - you can return a 404 when the referral comes from that domain.
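As a sketch of that alternative, assuming an Apache server with mod_rewrite enabled (spammysite.example is a hypothetical stand-in for the offending domain), the .htaccess rules might look like this. Keep in mind this only affects visitors clicking through; Google still sees the links on the other site:

```apache
# Hypothetical .htaccess snippet: return a 404 when the Referer header
# matches the spammy domain. Requires mod_rewrite.
RewriteEngine On
RewriteCond %{HTTP_REFERER} ^https?://(www\.)?spammysite\.example/ [NC]
RewriteRule ^ - [R=404,L]
```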
Hope it helps.