Solved PayDay hack - but SERPs show URLs - what should I do?
-
We had the PayDay hack and have solved it completely. The problem is that the SERPs still show over 3,000 URLs pointing to 404s on our website, all with URLs like this:
www.onssi.com/2012/2/post1639/payday-loan-companies-us
What should I do? Should I disavow every one of the 3,000? Nofollow them?
-
Oh, they're still indexed - got it. Yeah, that's a lot tougher. Ultimately, Google has to re-crawl these URLs, and since they're bad URLs with no internal links and only spammy inbound links, that can take a while.
You can remove the URLs in Google Webmaster Tools, but that's a one-by-one process, so it's mostly for the worst culprits. Another option would be to make an XML sitemap containing just these bad URLs, to encourage Google to recrawl them and process the 404s. The sitemap would also tell you how many of the URLs are indexed and let you track that number (more reliably than "site:" will). Unfortunately, you may have to build that list manually.
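For reference, a sitemap like that is just a standard urlset file with one entry per dead URL; a minimal sketch, using the example URL from the question (the remaining entries would follow the same pattern):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per hacked URL you want Google to recrawl -->
  <url>
    <loc>http://www.onssi.com/2012/2/post1639/payday-loan-companies-us</loc>
  </url>
  <!-- ...and so on for the rest of the ~3,000 bad URLs -->
</urlset>
```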
-
Thanks, Dr. Meyers!
So I implemented what Tom suggested a few weeks ago, and it still hasn't resolved.
The page those URLs go to does throw a 404, so I don't know when the listings should go away, but it's pretty frustrating to see that they haven't yet.
Do you have any other suggestions on how to fix this?
-
I'm not sure you'll see a big difference here between a 404 and a 410 (I've heard some mixed data recently), but I definitely agree with Tom that, once Google honors either one, you've essentially cut the inbound link at that point. Making a note of the link sources seems smart, but I'd also hesitate to disavow all these sites for now. Google is going to have to reprocess this, and it may take a few days (or a couple of weeks) for the 404s to sink in. A link to a page that doesn't exist generally shouldn't harm you, though.
-
Thanks, this is really helpful!
How would I serve the 410 instead of the 404? Should I use a regex match for "payday" and other words that wouldn't appear in regular onssi.com URLs, and then serve the 410 based on that regex?
Also, speaking of useful tools, is there a tool for pulling all 3,000 results (and that's only for "payday", not even counting other keywords) without having to page through them ten at a time?
Thanks so much for the help!
-
Hi there Itamar,
If I'm reading this correctly, the (now removed) URLs appearing in the SERPs are on your own site?
If so, in order to tell Google that these pages are well and truly gone, I would serve a 410 response on those pages rather than a 404. This response code tells Google the page is gone permanently, which encourages the crawler not to revisit the URL in the future.
That means any external links pointing to those pages become obsolete and shouldn't be counted in your link profile. That's the theory, anyway. For that reason, I would hold off on disavowing those links for the time being. I'd make a note of them, but if everything goes OK with the 410 response, Google probably won't count those external links towards your site anyway. Just in case, make sure you collect all of the external links and save them in an Excel sheet, so that if you do need to disavow them in the future, you have them to hand.
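A minimal sketch of serving that 410 via a regex match on the URL (as asked about above), assuming the site runs on Apache with mod_rewrite enabled (an assumption on my part; nginx and IIS have equivalent mechanisms):

```apache
# Hypothetical .htaccess rules: any request whose path contains "payday"
# (case-insensitive) returns 410 Gone. Extend the pattern with other spam
# keywords that never appear in legitimate onssi.com URLs, e.g. (payday|loans).
RewriteEngine On
RewriteRule payday - [NC,G]
```

The [G] flag forces the 410 (Gone) response and implies [L], so processing stops for any matching request.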
I've double-checked the onssi.com site in the safe browsing tool (see for yourself here) and it looks as though Google thinks the site is safe, i.e. it doesn't think it's hacked, which is great. In any case, you may want to run the site through the malware review process just to be absolutely sure.
Hope this helps.