My site was hacked and spammy outbound URLs were injected. The issue was fixed, but GWT is still reporting more of these links.
-
Excuse me for posting this here; I wasn't having much luck going through GWT support.
We recently moved our eCommerce site to a new server, and in the process the site was hacked. Spammy URLs were injected, all of them pointing outward to spammy eCommerce retail stores. I removed ~4,000 of these links, but more continue to pile in; there are now over 20,000 of them. Note that our server support team does not see these links anywhere.
I understand that Google doesn't generally view this as a problem. But is that true given my circumstances? I cannot imagine that 20,000 new, senseless 404s can be healthy for my website.
If I can't get a good response here, would anyone know of a direct Google support email or number I can use for this issue?
-
Hi
Yeah, let's say they use Xrumer.
They hack your site, insert pages of their own, and add links on your pages.
They put those URLs in text files based on their keyword targets/groups.
They run the software against those lists with their link sources, using an auto-insert random-URL template.
Each of those links then pings a 404 on your site, so the 404s show up in GWT.
If these are pros, they already know the pages are dead by now, since they confirm links after each run. It just takes a bit more time for GWT to get notified, so you'll see them trickle in.
That's why you'll see those 404 pages getting links on different dates.
Hope that helps
-
I don't understand why more spam links would be coming in though. Is it because the spam network doesn't realize that I've removed the injected pages? In other words, are they unknowingly linking to 404s?
-
Since those URLs are already gone after your cleanup, you can just mark them as fixed. GWT is usually pretty slow to pick those up. I've handled my share of hacked sites, some with invisible links.
If they appear again, then you'll need to find where they're getting through. It's a pain, but you have to fully check your files for injected scripts and obfuscated/encoded code.
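If you have shell access (or a downloaded copy of the site), an automated pass helps. Here's a minimal sketch in Python; the site path is a placeholder and the patterns are just common obfuscation signatures, so treat matches as leads to review, not proof of infection:

```python
import os
import re

# Common obfuscation signatures seen in injected PHP. Some legitimate
# plugins use the same functions, so review every match by hand.
SUSPICIOUS = re.compile(rb'eval\s*\(\s*base64_decode|gzinflate\s*\(|str_rot13\s*\(')

SITE_ROOT = '/path/to/your/site'  # placeholder: point this at your install

for root, _, files in os.walk(SITE_ROOT):
    for name in files:
        if name.endswith(('.php', '.htaccess', '.js')):
            path = os.path.join(root, name)
            with open(path, 'rb') as f:
                if SUSPICIOUS.search(f.read()):
                    print('Suspicious pattern in:', path)
```

Checking file modification times against the date of the hack is another quick filter.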
Aside from those, it's just time. Google will eventually stop showing them.
Good luck Andrew!
PS. You might want to look at some of your pages using Google's cached result; invisible links show up in it. Just in case you haven't done this part.
-
Thank you for your response.
I also believe I was hacked through my WordPress install. What exactly did you do once you realized the .htaccess file was changed? Did you change it back to whatever code was there before?
I already submitted a reconsideration request to Google and it was successful. I no longer have "this site may be hacked" in the SERPs, but I still have thousands of URLs pointing to 404 pages.
-
The same thing happened to me last month due to a security breach in a plugin that was part of my WordPress theme.
After hacking the site with injected URLs, they also altered the .htaccess file (check that out). They changed it so that if you entered your URL directly you saw the correct version of your site, but if you reached it from a Google search, traffic went to spammy Viagra pages.
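For anyone checking their own .htaccess: this kind of cloaking is typically done with mod_rewrite conditions on the referer. A hypothetical sketch of the sort of rules such a hack adds (the actual domains, conditions, and flags in a real infected file will differ):

```
RewriteEngine On
# Redirect only visitors arriving from a search engine, so the owner
# typing the URL in directly still sees a normal-looking site.
RewriteCond %{HTTP_REFERER} (google|yahoo|bing)\. [NC]
RewriteRule ^(.*)$ http://spammy-store.example/$1 [R=302,L]
```

If you find anything like this, save a copy for reference, then restore a clean file.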
I also received a manual action on my site.
What I did:
1- Removed the injected files that were creating the spammy URLs
2- Edited the .htaccess file, locating the code they had changed (a stock WordPress .htaccess is shown below for comparison)
3- Submitted a reconsideration request explaining what had happened
4- Removed in Webmaster Tools all the spammy URLs created on my site, to get them out of Google's index
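As a reference point for step 2: on a stock WordPress install with permalinks enabled and no caching or security plugins, the .htaccess file contains only the block below. Plugins can legitimately add their own sections, so treat this as a baseline, not a rule; anything outside the markers that you didn't add yourself deserves scrutiny.

```
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
```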
After 10 days the manual action was removed. But even now I still have spammy links pointing to 404s on my site. This happens because they also hacked other sites and built spammy linking networks between them. As other people recover their sites, the number of links to these pages will drop.
In my experience this large number of 404s cost about 30% of my traffic. That traffic has now recovered almost completely, and the number of 404s is shrinking with time.
So my conclusion is that these 404s are not healthy, but they will be gone with time and your site will recover.
-
Ha, sorry about the initial test post. It wasn't publishing on my main computer at first.
-
Can you please be more specific?