Solved PayDay hack - but SERPs show URLs - what should I do?
-
We were hit by the PayDay loan hack and have cleaned it up completely. The problem is that the SERPs still show over 3,000 URLs pointing to 404s on our website, all with URLs like this:
<cite>www.onssi.com/2012/2/post1639/payday-loan-companies-us</cite>
What should I do? Should I disavow every one of the 3,000? Nofollow them?
-
Oh, they're still indexed - got it. Yeah, that's a lot tougher. Ultimately, Google has to re-crawl these URLs, and since they're bad URLs with no internal links and only spammy inbound links, that can take a while.
You can remove the URLs in Google Webmaster Tools, but that's a one-by-one process, so it's mostly for the worst culprits. Another option would be to make an XML sitemap with just these bad URLs. Encourage Google to recrawl them and process the 404s. The sitemap would also help tell you how many of the URLs were indexed and to track that number (more reliably than "site:" will). Unfortunately, you may have to make that list manually.
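A one-off sitemap like that can be generated with a short script. Here's a minimal sketch, assuming you've gathered the bad URLs into a list (the two URLs below are hypothetical examples of the hacked pattern):

```python
from xml.etree import ElementTree as ET

# Hypothetical examples of the hacked URLs - replace with the real list.
bad_urls = [
    "http://www.onssi.com/2012/2/post1639/payday-loan-companies-us",
    "http://www.onssi.com/2012/2/post1640/payday-loan-lenders-online",
]

def build_sitemap(urls):
    """Build a sitemap XML document containing only the given URLs."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for u in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = u
    return ET.tostring(urlset, encoding="unicode")

with open("bad-urls-sitemap.xml", "w") as f:
    f.write(build_sitemap(bad_urls))
```

Submit the resulting file in Webmaster Tools as its own sitemap; the "indexed" count it reports then becomes your progress tracker for the cleanup.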
-
Thanks, Dr. Meyers!
I implemented what Tom suggested a few weeks ago, and the issue still hasn't resolved:
The pages those URLs point to do return a 404, so I don't know when the listings should drop out, but it's pretty frustrating to see that they haven't yet.
Do you have any other suggestions on how to fix this?
-
I'm not sure you'll see a big difference here between the 404/410 (I've heard some mixed data recently), but definitely agree with Tom that, once Google honors either one, you've essentially cut the inbound link at that point. Making note of the link sources seems smart, but I'd also hesitate to disavow all these sites for now. Google is going to have to reprocess this and it may take a few days (or a couple of weeks) for the 404s to sink in. A link to a page that doesn't exist generally shouldn't harm you, though.
-
Thanks, this is really helpful!
How would I serve a 410 instead of a 404? Should I use a regex match on "payday" and other words that wouldn't appear in legitimate onssi.com URLs, and serve the 410 based on that?
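A keyword regex is a reasonable approach, provided you first confirm the keywords never appear in legitimate paths. A sketch of the matching logic in Python (the keyword list is an assumption - audit it against your real URL structure before deploying anything like this):

```python
import re

# Hypothetical spam keywords - verify none of these ever appear in
# legitimate onssi.com paths before serving 410s based on them.
SPAM_RE = re.compile(r"payday|loan", re.IGNORECASE)

def status_for_missing_path(path):
    """410 Gone for injected spam URLs, plain 404 for everything else."""
    return 410 if SPAM_RE.search(path) else 404

print(status_for_missing_path("/2012/2/post1639/payday-loan-companies-us"))  # 410
print(status_for_missing_path("/products/overview"))  # 404
```

If the site runs on Apache, the same effect comes from a mod_rewrite rule such as `RewriteRule payday - [G]` in .htaccess, where the `[G]` flag sends a 410 Gone response.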
Also, speaking of useful tools: is there a tool for pulling all 3,000 results (and that's only for "payday" - not counting other keywords) without having to page through them ten at a time?
Thanks so much for the help!
-
Hi there Itamar
If I'm reading this correctly, the (now removed) URLs appearing in the SERPs are on your own site?
If so, in order to tell Google that these pages are well and truly gone, I would serve a 410 response on those pages, rather than a 404. This is a response code that tells Google the page has gone permanently, and so it will encourage the crawler not to try to revisit the URL in the future.
That means that any external links pointing to those pages become obsolete and shouldn't be counted in your link profile. That's the theory, anyway. For that reason, I would hold off on disavowing those links for the time being. I'd make a note of them, but if everything goes OK with the 410 response, Google probably won't count those external links towards your site anyway. Just in case, gather all of the external links and save them in an Excel sheet, so that if you do need to disavow them in the future, you have them to hand.
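If it does come to a disavow later, Google's file format is plain text: one URL or `domain:` entry per line, with `#` lines as comments. A minimal sketch for turning that saved list into the format (the link sources here are hypothetical placeholders):

```python
# Hypothetical link sources recorded from the spam reports.
spam_sources = [
    "http://spam-blog.example/payday-post.html",
    "spam-directory.example",  # bare domain -> disavow the whole domain
]

def to_disavow_file(entries):
    """Render link sources in Google's disavow text format."""
    lines = ["# Links pointing at the hacked payday URLs - kept on file"]
    for entry in entries:
        if entry.startswith(("http://", "https://")):
            lines.append(entry)              # disavow a single URL
        else:
            lines.append("domain:" + entry)  # disavow an entire domain
    return "\n".join(lines) + "\n"

print(to_disavow_file(spam_sources))
```

Keeping the list in this format from the start means there's nothing to reformat if you ever do need to upload it.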
I've double-checked the onssi.com site in Google's Safe Browsing tool and it looks as though Google considers the site safe - i.e. it doesn't think it's hacked, which is great. In any case, you may want to run the site through the malware review process just to be absolutely sure.
Hope this helps.