SPAMMY links to my search results page
-
Hi,
I have a big problem: somehow someone has built THOUSANDS of SPAMMY links to my site's search result pages, with keywords that don't even make sense or align with the site.
The links are dofollow and point to /?s=TERM. See for yourself: https://analytics.moz.com/pro/link-explorer/inbound-links?site=https%3A%2F%2Fhealthtian.com
See image https://imgur.com/a/fahetyZ
Please, is there a way to BLOCK them all at once? I feel these links are harming my site.
Thanks
-
Yes, I know. While compiling the list I used GSC, SEMrush, Ahrefs, etc.
-
Don't rely on Search Console link data. Google doesn't show you all of the links it sees pointing to your site, only a very small sample, which is usually also a bad sample. By all means integrate Search Console links into your wider project, but don't treat them as 'the best data', because they absolutely are not.
-
Then I guess just use the disavow tool, where you're not limited to what Search Console will let you input? It's a free-form text file; you choose for yourself which domains you want to disavow. You don't have to select them from URLs that 'show up' in Search Console.
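Since the disavow file is free-form text, a long URL export can be collapsed into domain-level entries with a short script. A minimal sketch (the domains and the URL list below are invented for illustration):

```python
# Sketch: collapse a list of spammy backlink URLs (exported from GSC,
# SEMrush, Ahrefs, etc.) into domain-level disavow entries.
from urllib.parse import urlparse

def build_disavow(urls):
    """Return sorted, de-duplicated 'domain:' lines for a disavow file."""
    domains = set()
    for url in urls:
        host = urlparse(url.strip()).netloc
        # Treat www.example.com and example.com as the same domain.
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains.add(host)
    return ["domain:" + d for d in sorted(domains)]

# Placeholder input; in practice, read these from your exported list.
urls = [
    "http://spam-one.example/page-1",
    "http://spam-one.example/page-2",
    "https://www.spam-two.example/?s=term",
]
for line in build_disavow(urls):
    print(line)
```

Thousands of URLs typically boil down to a much shorter list of domains, which is what the disavow tool works best with anyway.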
-
I do the management myself.
The URLs showing in Moz, SEMrush, and other similar tools are different from what is showing in Search Console.
-
Hi,
Have you used the new Google Search Console lately? I mean, added new domains? Because they don't show up in the disavow tool.
I added my domain in the new console, and it has changed from the previous method.
Thanks
-
The disavow tool is here:
https://www.google.com/webmasters/tools/disavow-links-main
(in Search Console)
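For reference, the file you upload there is plain text, one entry per line; it accepts whole domains or individual URLs, and lines starting with # are comments. All names below are placeholders:

```
# Spammy referrers found via GSC/SEMrush/Ahrefs exports
domain:spam-domain-one.example
domain:spam-domain-two.example
http://another-site.example/specific-page.html
```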
The 5th tip is for a senior developer to look at, not an SEO person/marketer. I also don't recommend it.
-
So you have spammy links from 7K unique URLs? Ask whoever manages your GSC account to grant you higher access, as this shouldn't be an issue.
-
Hi,
I have a list of about 7K domains and URLs, but I can't disavow them because the new Google console doesn't give an option for that.
My search results are noindexed and I have used **Disallow: /?s=** in my robots.txt file. So how do I do the 5th tip?
-
1. Isolate the domains which the links are coming from
2. Disavow all of those domains
3. Meta noindex your search results URLs, as Google doesn't like to index search results anyway
4. Robots.txt-block all your search results URLs
5. If you are feeling hardcore, when someone visits the search result URLs specifically from the offending domain(s), serve them 410s (but this is a bit thermonuclear)
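On an Apache/WordPress setup, the fifth step could be sketched with mod_rewrite rules like the following. The referrer domain is a placeholder, and (as noted elsewhere in this thread) a developer should review something like this before it goes live:

```apache
# .htaccess sketch: answer search-result hits (/?s=TERM) referred from a
# known spam domain (placeholder name) with 410 Gone. Requires mod_rewrite.
RewriteEngine On
RewriteCond %{HTTP_REFERER} spam-domain\.example [NC]
RewriteCond %{QUERY_STRING} (^|&)s= [NC]
RewriteRule ^ - [G]
```

The `[G]` flag returns 410 Gone; limiting the rule to the `s=` query string keeps normal pages unaffected even when the referrer matches.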
-