My site was hacked and spammy URLs were injected that pointed outward. The issue was fixed, but GWT is still reporting more of these links.
-
Excuse me for posting this here, I wasn't having much luck going through GWT support.
We recently moved our eCommerce site to a new server, and in the process the site was hacked. Spammy URLs were injected, all of which pointed outward to spammy eCommerce retail stores. I removed ~4,000 of these links, but more continue to pile in. As you can see, there are now over 20,000 of these links. Note that our server support team does not see these links anywhere.
I understand that Google doesn't generally view this as a problem. But is that true given my circumstances? I can't imagine that 20,000 new, senseless 404s are healthy for my website.
If I can't get a good response here, would anyone know of a direct Google support email or number I can use for this issue?
-
Hi
Yeah, let's say they use Xrumer.
They hack your site, insert pages of their own, and add links on your pages.
They put those URLs in text files organized by their keyword targets/groups.
They run the software against those lists with their link sources, using an auto-insert random-URL template.
So every hit on a removed page pings a 404, and that 404 shows up in GWT.
If these are pros, they already know the pages are dead by now, since they confirm links after each run. It just takes a bit more time for GWT to get notified, so you'll see them trickle in.
That's why you'll see those 404 pages getting links from different dates.
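If you want to confirm the injected URLs really are returning 404 now, a quick script will do it. A rough sketch; urls.txt is a hypothetical file of the spammy URLs (e.g. exported from GWT's crawl errors), one per line:

```python
# Rough check that the injected URLs now return 404.
import urllib.error
import urllib.request

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        resp = urllib.request.urlopen(url, timeout=10)
        # A 200 here means the injected page is still live -- investigate.
        print(f"{resp.status}  {url}  <-- still resolving!")
    except urllib.error.HTTPError as e:
        print(f"{e.code}  {url}")  # 404/410 is what you want to see
    except urllib.error.URLError as e:
        print(f"ERR  {url}  ({e.reason})")
```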
Hope that helps
-
I don't understand why more spam links would be coming in though. Is it because the spam network doesn't realize that I've removed the injected pages? In other words, are they unknowingly linking to 404s?
-
Since those URLs are already gone after you cleaned it up, you can just mark those as fixed. GWT is usually pretty late in picking those up. I've handled my share of hacked sites, some with invisible links.
If they appear again, then you'll need to find where they are getting through. It's a pain, but you have to fully check your files for injected scripts and obfuscated code. A rough scan like the sketch below can be a starting point.
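A minimal sketch; DOCROOT and the signature list are assumptions to adjust — it covers a few classic injection patterns (eval of base64, gzinflate, etc.) and is a first filter, not a malware scanner:

```python
# Rough scan of a WordPress install for common injection signatures.
import os
import re

DOCROOT = "/var/www/html"  # hypothetical path to your WordPress root
SUSPICIOUS = re.compile(
    rb"eval\s*\(\s*base64_decode|gzinflate\s*\(|str_rot13\s*\("
)

for root, _dirs, files in os.walk(DOCROOT):
    for name in files:
        if not (name.endswith((".php", ".js")) or name == ".htaccess"):
            continue
        path = os.path.join(root, name)
        try:
            with open(path, "rb") as f:
                data = f.read()
        except OSError:
            continue
        if SUSPICIOUS.search(data):
            print("suspect:", path)
```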
Aside from those, it's just time. Google will eventually stop showing them.
Good luck Andrew!
PS: You might want to look at some of your pages using Google's cached result; invisible links show up there. Just in case you haven't done this part already.
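If you'd rather script a first pass, here's a crude heuristic sketch (the URL is hypothetical, and a regex over HTML only catches inline display:none hiding, so treat hits as leads to verify against the cached copy):

```python
# Crude first pass for hidden links: fetch the page as a browser would
# and print hrefs that appear shortly after an inline display:none.
import re
import urllib.request

url = "https://www.example.com/"  # page to inspect (hypothetical)
req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
html = urllib.request.urlopen(req, timeout=10).read().decode("utf-8", "replace")

for m in re.finditer(r"display\s*:\s*none", html, re.I):
    window = html[m.start(): m.start() + 1500]  # text just after the hidden style
    for link in re.findall(r'href="([^"]+)"', window):
        print("possible hidden link:", link)
```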
-
Thank you for your response.
I also believe I was hacked through my WordPress install. What exactly did you do once you realized the .htaccess file was changed? Did you change it back to whatever code was there before?
I already submitted a reconsideration request to Google and it was successful. I no longer have "this site may be hacked" in the SERPs, but I still have thousands of URLs pointing to 404 pages.
-
The same thing happened to me last month due to a security hole in a plugin that was part of my WordPress theme.
After hacking the site, they injected URLs and also altered the .htaccess file (check that on your site). They changed .htaccess so that if you entered your URL directly you saw the correct version of your site, but if you reached it from a Google search, traffic went to spammy Viagra pages.
I also received a manual action on my site. (You can test for that kind of cloak yourself; see the sketch below.)
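A rough way to test for a referrer-based cloak like this: fetch the same page directly and as if arriving from a Google search, then compare where you end up. A sketch; the URL is hypothetical:

```python
import urllib.request

URL = "https://www.example.com/some-page/"

def fetch(headers):
    req = urllib.request.Request(URL, headers=headers)
    resp = urllib.request.urlopen(req, timeout=10)
    # urlopen follows redirects, so geturl() is the final landing URL
    return resp.geturl(), resp.read()

direct_url, direct_body = fetch({"User-Agent": "Mozilla/5.0"})
referred_url, referred_body = fetch({
    "User-Agent": "Mozilla/5.0",
    "Referer": "https://www.google.com/search?q=example",
})

if direct_url != referred_url or direct_body != referred_body:
    print("Responses differ -- possible referrer-based cloak")
    print("direct:  ", direct_url)
    print("referred:", referred_url)
else:
    print("Same response either way")
```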
What I did:
1- Removed the injected files that were creating the spammy URLs
2- Edited the .htaccess file, locating the code they had changed (a diff sketch follows this list)
3- Submitted a reconsideration request explaining what was happening
4- Used the removal tool in Webmaster Tools on all the spammy URLs created on my site to get them out of Google's index
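For step 2, a simple way to locate what was changed is to diff the live .htaccess against a known-clean copy (a backup, or the file from a fresh download of your theme/CMS). A minimal sketch, with hypothetical paths:

```python
import difflib

with open("backup/.htaccess") as f:
    clean = f.readlines()
with open("public_html/.htaccess") as f:
    live = f.readlines()

# Lines prefixed with + were added by the attacker (or a later edit).
for line in difflib.unified_diff(clean, live,
                                 fromfile="clean .htaccess",
                                 tofile="live .htaccess"):
    print(line, end="")
```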
After 10 days the manual action was removed. But I still have spammy links pointing to 404s on my site. This happens because they also hacked other sites and created spammy linking networks. As people recover their sites, the number of links to these pages will drop.
In my experience, this big volume of 404s cost me about 30% of my traffic. That traffic has now recovered almost completely, and the number of 404s is dropping over time.
So my conclusion is that these 404s are not healthy, but they will be gone with time and your site will recover.
-
Ha, sorry about the initial test post. It wasn't publishing on my main computer at first.
-
Can you please be more specific?