My site was hacked and injected with spammy URLs pointing to external sites. The issue was fixed, but GWT is still reporting more of these links.
-
Excuse me for posting this here; I wasn't having much luck going through GWT support.
We recently moved our eCommerce site to a new server, and in the process the site was hacked. Spammy URLs were injected, all of them pointing outward to spammy eCommerce retail stores. I removed ~4,000 of these links, but more continue to pile in; as you can see, there are now over 20,000 of them. Note that our server support team does not see these links anywhere.
I understand that Google doesn't generally view this as a problem, but is that true given my circumstances? I cannot imagine that 20,000 new, senseless 404s can be healthy for my website.
If I can't get a good response here, would anyone know of a direct Google support email or number I can use for this issue?
-
Hi
Yeah, let's say they use XRumer.
They hack your site, insert pages of their own, and add links to your existing pages.
They put those injected URLs into text files organized by their keyword targets/groups.
They then run the software against their link sources, using those lists with an auto-insert random-URL template.
Every hit on a now-removed page returns a 404, so the 404 shows up in GWT.
If these are pros, they already know the pages are dead by now, since they verify links after each run. It just takes GWT a bit longer to be notified, so you'll see the errors trickle in.
That's why you'll see those 404 pages picking up links on different dates.
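If you want to double-check that cleanup on your end, here's a rough sketch that bulk-tests a list of the injected URLs and reports which ones now return 404. It's Python with just the standard library, and the urls.txt filename is only a placeholder for wherever you keep the list:

```python
# Sketch: confirm that previously injected URLs now return 404.
# Assumes a plain-text file "urls.txt" (placeholder name) with one URL per line.
import urllib.request
import urllib.error

def check_status(url):
    """Return the HTTP status code for a HEAD request to url."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code   # 404s and other error statuses arrive as exceptions
    except urllib.error.URLError:
        return None     # DNS or connection failure

with open("urls.txt") as f:
    for url in (line.strip() for line in f if line.strip()):
        status = check_status(url)
        note = "gone" if status == 404 else "check this one"
        print(f"{status}\t{note}\t{url}")
```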
Hope that helps
-
I don't understand why more spam links would be coming in though. Is it because the spam network doesn't realize that I've removed the injected pages? In other words, are they unknowingly linking to 404s?
-
Since those URLs are already gone after your cleanup, you can just mark them as fixed; GWT is usually pretty late picking these up. I've handled my share of hacked sites, some with invisible links.
If they appear again, you'll need to find where the attackers are getting through. It's a pain, but you have to fully check your files for malicious scripts and obfuscated code.
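If it helps, here's a minimal sketch of that kind of file sweep. It assumes a typical PHP/WordPress install under a hypothetical web root, and the marker patterns are illustrative examples, not a complete signature set:

```python
# Sketch: flag files containing common obfuscation/injection markers.
# The pattern list is illustrative, not a complete malware signature set.
import os
import re

SUSPICIOUS = re.compile(
    rb"base64_decode|eval\s*\(|gzinflate|str_rot13|preg_replace\s*\(.*/e"
)

def scan(root):
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith((".php", ".js", ".htaccess")):
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    data = f.read()
            except OSError:
                continue
            if SUSPICIOUS.search(data):
                print("suspicious:", path)

scan("/var/www/html")  # hypothetical web root; adjust to your install
```

Expect some false positives (plenty of legitimate plugins call eval or base64_decode), so treat hits as a shortlist to inspect by hand, not proof of compromise.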
Aside from those, it's just time. Google will eventually stop showing them.
Good luck Andrew!
PS. You might want to look at some of your pages via Google's cached results; you can spot invisible links that way. Just in case you haven't done this part.
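For a quicker first pass on your own pages, here's a rough sketch that flags links hidden with inline styles. It assumes the requests and beautifulsoup4 packages are installed; the heuristics are simple examples, and links hidden via external CSS or JavaScript won't be caught:

```python
# Sketch: flag anchor tags hidden with common inline-style tricks.
# Only inline styles are checked; external-CSS or JS hiding is not detected.
import requests
from bs4 import BeautifulSoup

HIDDEN_HINTS = ("display:none", "visibility:hidden", "font-size:0", "text-indent:-")

def find_hidden_links(url):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for a in soup.find_all("a", href=True):
        # Gather inline styles from the link itself and every ancestor element.
        styles = [a.get("style") or ""] + [p.get("style") or "" for p in a.parents]
        blob = " ".join(styles).replace(" ", "").lower()
        if any(hint in blob for hint in HIDDEN_HINTS):
            print("possibly hidden link:", a["href"])

find_hidden_links("https://www.example.com/")  # placeholder; use one of your pages
```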
-
Thank you for your response.
I also believe I was hacked through my WordPress install. What exactly did you do once you realized the .htaccess file was changed? Did you restore it to whatever code was there before?
I already submitted a reconsideration request to Google and it was successful. I no longer have "This site may be hacked" in the SERPs, but I still have thousands of URLs pointing to 404 pages.
-
The same thing happened to me last month due to a security hole in a plugin that was part of my WordPress theme.
After hacking the site, they injected URLs and also altered the .htaccess file (check yours). The modified .htaccess meant that if you entered the URL directly you saw the correct version of the site, but visitors arriving from a Google search were sent to spammy Viagra pages.
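For anyone checking their own file: this kind of hack typically adds RewriteCond lines keyed off the search-engine referrer or crawler user-agent. Here's a minimal sketch of flagging those lines; the patterns are illustrative and may need adjusting for your setup:

```python
# Sketch: flag .htaccess lines that rewrite based on search-engine
# referrers or crawler user-agents (the classic cloaking pattern).
import re

SUSPECT = re.compile(
    r"RewriteCond\s+%\{(HTTP_REFERER|HTTP_USER_AGENT)\}.*(google|bing|yahoo|bot)",
    re.IGNORECASE,
)

with open(".htaccess") as f:  # path relative to your web root
    for lineno, line in enumerate(f, start=1):
        if SUSPECT.search(line):
            print(f"line {lineno}: {line.rstrip()}")
```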
I also received a manual action on my site.
What I did:
1- Removed the injected files that were creating the spammy URLs
2- Edited the .htaccess file, locating the code they had changed
3- Submitted a reconsideration request explaining what had happened
4- Used the removal tool in Webmaster Tools on all the spammy URLs created on my site, to get them out of Google's index (see the sketch below)
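For step 4, if you have thousands of URLs to deal with, something like this can distill a crawl-errors CSV export into a deduplicated removal list. The crawl_errors.csv filename, the "URL" column header, and the keyword list are all assumptions to adjust to your own export and spam pattern:

```python
# Sketch: filter a GWT crawl-errors CSV export down to the spammy URLs
# worth submitting to the removal tool.
import csv

SPAM_HINTS = ("viagra", "cialis", "replica")  # example keywords only

seen = set()
with open("crawl_errors.csv", newline="") as f:   # hypothetical export name
    for row in csv.DictReader(f):
        url = (row.get("URL") or "").strip()      # "URL" column is an assumption
        if url and url not in seen and any(h in url.lower() for h in SPAM_HINTS):
            seen.add(url)
            print(url)
```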
After 10 days the manual action was removed, but even now I still have spammy links pointing to 404s on my site. This happens because they also hacked other sites and created spammy linking networks; as people start recovering their sites, the number of links to these pages will drop.
In my experience, this big volume of 404s cost me about 30% of my traffic. That traffic has now recovered almost completely, and the number of 404s is shrinking over time.
So my conclusion is that these 404s are not healthy, but they will be gone with time and your site will recover.
-
Ha, sorry about the initial test post. It wasn't publishing on my main computer at first.
-
Can you please be more specific?