Getting spam links pointing to the wrong URL: what to do?
-
Hey Mozzers,
Looking in my Google Search Console (Webmaster Tools), I'm seeing links pointing to bogus pages on my website that result in a 404. What does one do to tell Google that it has been "fixed"?
Do I just 301 it to another page?
If I add it to my disavow list, does Google remove the error in my Webmaster Tools?
Thank you!
-
Hey Shawn,
1. If GWT is showing a 404 without a "Linked from," I wouldn't worry about it too much. Sometimes Google tries to recrawl URLs it crawled once in the past, but that doesn't mean it's causing a problem. I would double-check that you don't have any internal or external links to "blog/blog," then just mark it as fixed. There's no point in disallowing Google from crawling a page that isn't there in the first place.
2. Nope. The only danger of a long robots.txt is that you may get lost while checking your rules and accidentally disallow a page you want in the index. Whenever you add a rule to robots.txt, go to Google Search Console and use the robots.txt Tester to check whether you've disallowed any important pages.
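If you want to sanity-check a draft rule before relying on the Tester, Python's standard-library robots.txt parser can evaluate it locally. A minimal sketch; the "blog/blog" path is from this thread, and the domain is a placeholder:

```python
import urllib.robotparser

# Hypothetical robots.txt rules; /blog/blog/ is the bogus path from the thread.
rules = """\
User-agent: *
Disallow: /blog/blog/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# The bogus path is blocked for all crawlers...
print(rp.can_fetch("*", "https://example.com/blog/blog/some-post"))  # False
# ...but the real blog category is still crawlable.
print(rp.can_fetch("*", "https://example.com/blog/some-post"))  # True
```

This is exactly the mistake the Tester guards against: a rule like `Disallow: /blog` (no trailing segment) would have blocked the real category too.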
Good luck!
Kristina
-
Thank you, Kristina. I went ahead and disavowed some of those links. Other bogus links are still showing in GWT, such as "blog/blog/"; we only have one /blog category, but Google keeps finding bogus variations. When I select "Linked from," it doesn't show any sources. So my additional questions are:
1. If I add "blog/blog" to my robots.txt, should I then mark the /blog/blog links in GWT as fixed?
2. Does it matter whether my robots.txt is long or short?
Thank you
-
Hi Shawn,
If these are from crappy scraper sites (which it looks like they are), then yes, I'd add them to your disavow list. I don't think that will get Google to remove the error in GWT, but remember: what GWT reports is separate from the actual ranking algorithm, because Google doesn't want us to see how sophisticated it is. If GWT shows "errors" in links but you've properly disavowed them, I'm fairly confident the actual Google algorithm won't penalize you.
For anyone else finding this thread: make sure you check the sites linking to your broken pages before disavowing! If they're good links that are just out of date or mistyped, update them instead.
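For reference, the disavow file is a plain text file uploaded through Search Console, one entry per line; a `domain:` line disavows every link from that site, while a bare URL disavows only that page. The domains below are placeholders, not sites from this thread:

```text
# Scraper sites linking to bogus URLs - disavow the whole domain
domain:spam-scraper-example.com
domain:another-scraper-example.net

# A single bad URL can also be disavowed on its own
http://bad-directory-example.com/some-spammy-page.html
```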
Best,
Kristina
-
Joe,
Thanks for the update. Our website doesn't have any malware, but Sucuri.net seems ideal for those who need it. In your article, you wrote:
- "Someone else from another site links to you but has a typo in their link"
In our case, these are crappy scraper sites with bogus links pointing to us. Here's an example: let's say you're nba.com and have a page on Kobe Bryant at nba.com/kobe-bryant.
The links that Google is picking up from these crappy sites point to "nba.com/kobe-bry". The link is incorrect, and I don't want to see more 404s piling up in Webmaster Tools.
What would you do?
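For anyone trying to work out which real page a truncated link was meant for (say, to decide whether it's a fixable typo or scraper junk), a quick fuzzy match does the job. A sketch using Python's standard library, with this thread's example URL and a hypothetical list of real slugs:

```python
import difflib

# Real slugs on the site (hypothetical list for illustration)
real_pages = ["/kobe-bryant", "/lebron-james", "/blog/"]

# The mangled path a scraper site linked to
broken = "/kobe-bry"

# Find the closest real page, if any is similar enough (cutoff is a 0-1 ratio)
match = difflib.get_close_matches(broken, real_pages, n=1, cutoff=0.6)
print(match)  # ['/kobe-bryant']
```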
-
I've seen this happen several times, and the first thing to do is really clean up your site. I'd recommend Sucuri.net for a thorough malware cleanup; their free plugin isn't enough, you need the paid version. Are you on WordPress? I've seen this happen to four sites I've helped already.
Then let all the errors 404. If you 301 them, the bad pages and links will stay alive longer. It's even better to let them return a 410, but a 404 should do the trick. I wrote about some of the reasoning behind this in a post here on Moz a few years ago that still applies.
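To make the 404-vs-410 distinction concrete, here is a minimal sketch of the routing decision, assuming you keep a set of the known-bogus paths (the paths are this thread's examples; the function name is made up for illustration):

```python
# Known-bogus paths that scraper sites invented (examples from the thread)
GONE_PATHS = {"/blog/blog/", "/kobe-bry"}

def status_for_path(path, real_paths, gone_paths=GONE_PATHS):
    """Pick a status code: 200 for real pages, 410 for URLs we know
    are permanently dead, 404 for anything else we can't serve."""
    if path in real_paths:
        return 200
    if path in gone_paths:
        return 410  # "Gone" - a stronger signal than a plain 404
    return 404

print(status_for_path("/kobe-bryant", {"/kobe-bryant"}))  # 200
print(status_for_path("/kobe-bry", {"/kobe-bryant"}))     # 410
```

The 410 tells crawlers the URL is intentionally dead, so they tend to drop it faster than a 404, which only says "not found right now."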
Matt Cutts has a good reply on disavowing spam links in his video here, and says:
"If you're at all stressed, if you're worried, if you're not able to sleep at night because you think Google might have something, or might see it, or we might get a spam report about you, or there might be some misunderstanding or an algorithm might rank your site lower, I would feel free to just go ahead and disavow those links as well,"
I went ahead and disavowed the DOMAINS of the most recent site, just to be extra sure. Let me know if you have follow-up questions!