My site was hacked and spammy URLs were injected that pointed to external spam sites. The issue was fixed, but GWT is still reporting more of these links.
-
Excuse me for posting this here, I wasn't having much luck going through GWT support.
We recently moved our eCommerce site to a new server and in the process the site was hacked. Spammy URLs were injected, all of which pointed outward to spammy eCommerce retail stores. I removed ~4,000 of these links, but more continue to pile in. As you can see, there are now over 20,000 of them. Note that our server support team does not see these links anywhere.
I understand that Google doesn't generally view this as a problem. But is that true given my circumstances? I can't imagine that 20,000 new, senseless 404s can be healthy for my website.
If I can't get a good response here, would anyone know of a direct Google support email or number I can use for this issue?
-
Hi
Yeah, let's say they use Xrumer.
They hack your site, insert pages of their own, and links on your pages.
They put those URLs in text files based on their keyword targets/groups.
They run the software using those lists with their link sources and their auto-insert random URL template.
Each of those links hits a dead URL on your site, so a 404 shows up in GWT.
If these are pros, they already know the pages are dead by now, as they confirm links after each run. It just takes a bit more time for GWT to get notified, so you'll see them trickle in.
So you'll see those 404 pages getting links from different dates.
Hope that helps
-
I don't understand why more spam links would be coming in though. Is it because the spam network doesn't realize that I've removed the injected pages? In other words, are they unknowingly linking to 404s?
-
Since those URLs are already gone after you cleaned it up, you can just mark those as fixed. GWT usually is pretty late with picking those up. I've handled my share of hacked sites, some with invisible links.
If they appear again, then you'll need to find where they are getting through. It's a pain, but you have to fully check your files for injected scripts and obfuscated code.
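If you have shell access, a quick first pass for that kind of injected code is to grep for the obfuscation functions attackers commonly use. A minimal sketch, with a placeholder docroot and a demo file created just so the scan has something to find; the pattern list is illustrative, not exhaustive:

```shell
# Sketch: scan a docroot's PHP files for common obfuscation calls.
# DOCROOT and the demo "injected.php" file are placeholders for illustration.
DOCROOT="${DOCROOT:-./demo-site}"
mkdir -p "$DOCROOT"
# Demo payload standing in for a real injected file:
printf '<?php eval(base64_decode("cGhwaW5mbygpOw=="));\n' > "$DOCROOT/injected.php"
# List matching files with line numbers:
grep -rn --include='*.php' -E 'eval\(|base64_decode\(|gzinflate\(|str_rot13\(' "$DOCROOT"
```

Hits aren't automatically malicious (some plugins legitimately use these functions), but they tell you which files to read closely.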
Aside from those, it's just time. Google will eventually stop showing them.
Good luck Andrew!
PS. You might want to look at some of your pages using Google's cache result. You can see invisible links using that. Just in case you haven't done this part.
-
Thank you for your response.
I also believe I was hacked through my WordPress install. What exactly did you do once you realized the .htaccess file was changed? Did you change it back to whatever code was there before?
I already submitted a reconsideration request to Google and it was successful. I no longer have "this site may be hacked" in the SERPs, but I still have thousands of URLs pointing to 404 pages.
-
The same thing happened to me last month due to a security hole in a plugin that was part of my WordPress theme.
After hacking the site with injected URLs, they also altered the .htaccess file (check that out). They changed .htaccess so that if you typed in your URL directly you saw the correct version of your site, but if you arrived from a Google search, traffic went to spammy Viagra pages.
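That referer-based cloaking is usually done with a conditional rewrite. This is a sketch of the *pattern to look for* when auditing your .htaccess, not the poster's actual file; the payload filename is hypothetical:

```apache
# Illustrative only -- the kind of rule attackers inject, NOT something to install.
RewriteEngine On
# Visitors arriving from a Google search result...
RewriteCond %{HTTP_REFERER} google\. [NC]
# ...get silently sent to the spam payload (hypothetical filename):
RewriteRule ^(.*)$ spam-doorway.php [L]
```

Direct visits (no Google referer) skip the rule, which is why the site looks fine when you type in the URL yourself.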
I also received a manual action on my site.
What I did:
1- Removed the injected files that were creating the spammy URLs
2- Edited the .htaccess file, locating the code they had changed
3- Submitted a reconsideration request explaining what had happened
4- Removed in Webmaster Tools all the spammy URLs created on my site, to get them out of Google's index
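For step 1, one quick way to locate injected files is by modification time, since they usually date from the break-in. A minimal sketch; the 14-day window is just an assumption about when the hack happened, so adjust it to bracket your own incident:

```shell
# Sketch: list PHP files changed in the last 14 days, newest first.
# Run from the site root; tune -mtime to the date of the hack (GNU find assumed).
find . -name '*.php' -mtime -14 -printf '%TY-%Tm-%Td %p\n' | sort -r
```

Legitimate updates will show up too, so cross-check the list against your own deploy and plugin-update dates.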
After 10 days the manual action was removed. But as of now I still have spammy links pointing to 404s on my site. This happens because they also hacked other sites and built spammy link networks with them. As people recover their sites, the number of links to these pages will drop.
In my experience, this big pile of 404s cost me about 30% of my traffic. That traffic has now recovered almost completely, and the number of 404s is shrinking over time.
So my conclusion is that these 404s are not healthy, but they will be gone with time and your site will recover.
-
Ha, sorry about the initial test post. It wasn't publishing from my main computer at first.
-
Can you please be more specific?