Solved PayDay hack - but SERPs show URLs - what should I do?
-
We were hit by the PayDay loan hack and have cleaned it up completely. The problem is that the SERPs still show over 3,000 URLs pointing to 404s on our website, all with URLs like this:
<cite>www.onssi.com/2012/2/post1639/payday-loan-companies-us</cite>
What should I do? Should I disavow every one of the 3,000? Nofollow them?
-
Oh, they're still indexed - got it. Yeah, that's a lot tougher. Ultimately, Google has to re-crawl these URLs, and since they're bad URLs with no internal links and only spammy inbound links, that can take a while.
You can remove the URLs in Google Webmaster Tools, but that's a one-by-one process, so it's mostly for the worst culprits. Another option would be to make an XML sitemap containing just these bad URLs, to encourage Google to recrawl them and process the 404s. The sitemap would also tell you how many of the URLs are indexed and let you track that number (more reliably than "site:" will). Unfortunately, you may have to build that list manually.
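A rough sketch of how that sitemap could be generated, assuming you can export the bad URLs into a plain text list (the function and file names here are just illustrative):

```python
from xml.sax.saxutils import escape

def bad_url_sitemap(urls):
    """Build a minimal XML sitemap listing only the hacked URLs,
    so Google is nudged to recrawl them and see the 404/410."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

# Example usage with one of the hacked URLs from this thread:
sitemap = bad_url_sitemap(
    ["http://www.onssi.com/2012/2/post1639/payday-loan-companies-us"]
)
```

Submit the resulting file in Webmaster Tools as a separate sitemap, and the "indexed" count it reports becomes your progress tracker.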
-
Thanks, Dr. Meyers!
So I implemented what Tom said a few weeks ago, and it still hasn't been resolved:
The pages those URLs go to do throw a 404, so I don't know when the listings should drop out, but it's pretty frustrating to see that they haven't yet.
Do you have any other suggestions on how to fix this?
-
I'm not sure you'll see a big difference here between the 404/410 (I've heard some mixed data recently), but definitely agree with Tom that, once Google honors either one, you've essentially cut the inbound link at that point. Making note of the link sources seems smart, but I'd also hesitate to disavow all these sites for now. Google is going to have to reprocess this and it may take a few days (or a couple of weeks) for the 404s to sink in. A link to a page that doesn't exist generally shouldn't harm you, though.
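One way to keep an eye on whether the 404s/410s are actually being served across all 3,000 URLs is a small status-checking script. This is just a sketch (it assumes Python, and `bad_urls.txt` is a hypothetical exported list, one URL per line):

```python
import urllib.error
import urllib.request

def url_status(url, timeout=10):
    """Return the HTTP status code for a URL (redirects are followed),
    or None if the request fails at the network level."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.getcode()
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses arrive as exceptions; the code is what we want.
        return e.code
    except urllib.error.URLError:
        return None

def summarize(status_codes):
    """Count URLs that are properly gone (404/410) vs. still resolving."""
    gone = sum(1 for s in status_codes if s in (404, 410))
    live = sum(1 for s in status_codes if s == 200)
    return {"gone": gone, "live": live, "other": len(status_codes) - gone - live}

# Example (makes live network requests, so run it deliberately):
# statuses = [url_status(u) for u in open("bad_urls.txt").read().split()]
# print(summarize(statuses))
```

Any URL that lands in the "live" bucket is one the cleanup missed.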
-
Thanks, this is really helpful!
How would I serve the 410 instead of the 404? Should I use a regex match for "payday" and other words that wouldn't appear in regular onssi.com URLs, and serve the 410 based on that match?
Also - speaking of useful tools, is there a tool for pulling all 3,000 results (and that's only for "payday" - not even counting other keywords) without having to page through them ten at a time?
Thanks so much for the help!
-
Hi there Itamar,
If I am reading this correctly, are the (now removed) URLs appearing on your own site?
If so, in order to tell Google that these pages are well and truly gone, I would serve a 410 response on those pages rather than a 404. That response code tells Google the page is gone permanently, which encourages the crawler not to revisit the URL in the future.
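If the site runs on Apache (an assumption on my part), the regex approach you describe can be done with a mod_rewrite rule - the `[G]` flag makes Apache answer "410 Gone". A sketch, assuming "payday" and "loan" genuinely never appear in legitimate onssi.com paths (verify that before deploying):

```apache
# .htaccess sketch - serve 410 Gone for any URL containing hack keywords.
# NC = case-insensitive match. Test against real URLs before going live.
RewriteEngine On
RewriteRule (payday|loan) - [G,NC]
```

On nginx or another server the equivalent would be a location/regex block returning 410.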
That means that any external links pointing to those pages become obsolete and shouldn't be counted in your link profile. That's the theory, anyway. For that reason, I would hold off on disavowing those links for the time being. I'd make a note of them, but if everything goes OK with the 410 response, Google probably won't count those external links towards your site anyway. Just in case, make sure you collect all of the external links and save them in an Excel sheet, so that if you do need to disavow them in the future, you have them to hand.
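If it ever does come to disavowing, the file Google expects is plain text with one `domain:` or URL line per entry. A sketch for collapsing your saved backlink list into per-domain lines (the function name is just illustrative):

```python
from urllib.parse import urlparse

def disavow_file(link_urls):
    """Collapse a list of spammy backlink URLs into Google's disavow-file
    format: one "domain:" line per linking domain, deduplicated and sorted."""
    domains = sorted({urlparse(u).netloc for u in link_urls if urlparse(u).netloc})
    return "\n".join(f"domain:{d}" for d in domains)
```

Disavowing at the domain level is usually the right call for hack-spam sources, since the same site tends to link to many of the bad URLs.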
I've double-checked the onssi.com site in the safe browsing tool (see for yourself here) and it looks as though Google thinks the site is safe - i.e. it doesn't think it's hacked, which is great. In any case, you may want to run the site through the malware review process just to be absolutely sure.
Hope this helps.