Sudden influx of 404s affecting SERPs?
-
Hi Mozzers,
We've recently updated a site of ours that really should be doing much better than it currently is. It has a good backlink profile (some spammy links were recently removed), has age on its side, and has had a tremendous amount of SEO work done (deep-level optimisation, schema.org markup, site-speed improvements and much, much more).
Because of this, we assumed thin, spammy content was the issue, so we removed those pages and created new, content-rich pages in their place.
For example:
-
We removed a link-wheel page, <a>https://www.google.co.uk/search?q=site%3Asuperted.com%2Fpopular-searches</a>, which as you can see had a **lot** of results (circa 138,000).
-
And added relevant pages for each of our entertainment 'categories'.
<a>http://www.superted.com/category.php/bands-musicians</a> - this page has some historical value, so the Mozbar shows some Page Authority here.
<a>http://www.superted.com/profiles.php/wedding-bands</a> - this is an example of a page linking from the above page. These are brand new URLs and are designed to provide relevant content.
The old link-wheel pages contained nothing but links (usually 50+ on every page) and no textual content, yet were still driving small amounts of traffic to our site.
The new pages contain quality, relevant content (e.g. our list of Wedding Bands: what else would a searcher be looking for?) but some haven't been indexed/ranked yet.

So with this in mind I have a few questions:
- How do we drive traffic to these new pages? We've started to create industry-relevant links through our own members to the top-level pages (http://www.superted.com/category.php/bands-musicians). The link profile here _should_ flow to some degree to the lower-level pages, right? We've got almost 500 'sub-categories'; getting quality links to all of these is just unrealistic in the short term.
- How long until we should be re-indexed? We've seen an 800% drop in Organic Search traffic since removing our spammy link-wheel page. This is to be expected to a degree, as these were the only real pages driving traffic. However, we saw this drop (and got rid of the pages) almost exactly a month ago; surely we should have been re-indexed and re-assessed by the algorithm by now?!
- **Are we still being algorithmically penalised?** The old spammy pages are still indexed in Google (138,000 of them!) despite returning 404s for a month. When will these drop out of the rankings? If Google believes they still exist and we were indeed being punished for them, it makes sense that we're still not ranking, but how do we get rid of them? I've tried submitting a manual URL removal via WMT, but to no avail. Should I serve a 410 for these pages instead?
- Have I been too hasty? I removed the spammy pages in case they were affecting us via a penalty. There would also have been some potential of duplicate content with the old and the new pages.
_popular-searches.php/event-services/videographer_ may have clashed with _profiles.php/videographer_, for example.
Should I have kept these pages whilst we waited for the new pages to re-index?
Any help would be extremely appreciated; I'm pulling my hair out that after following 'guidelines', we seem to have been punished in some way for it. I assumed we just needed to give Google time to re-index, but a month should surely be enough for a site with historical SEO value such as ours?
If anyone has any clues about what might be happening here, I'd be more than happy to pay for a genuine expert to take a look. If anyone has any potential ideas, I'd love to reward you with a 'good answer'.

Many, many thanks in advance.
Ryan.
-
-
Hi Monica,
Thanks for the fast response.
I'm a bit wary of 301 redirecting the old pages. It would be extremely easy to do, but if we were being penalised on those old pages, wouldn't a redirect just pass the penalties on to our new, squeaky-clean pages?
I submitted a new sitemap to Google the day we made the changes, probably three to four weeks ago, including the new URLs (or as many as we could include with the 500 URL limit) and removing the old spammy ones.
Penalty-wise, we've never had a manual warning or anything in WMT. However, this doesn't rule out the idea that we may have been suffering an algorithmic penalty, right?
It'd be great to hear from anyone about their experience with 301s and the likelihood of passing on 'bad' link juice from old pages (does this even happen?).
Also, would a 410 help? It would stop the 404 errors from continuously occurring and stop Google assuming there's something bizarre going on.

Thank you again.
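For what it's worth, if the site runs on Apache, a 410 can be served with a single rewrite rule. This is only a sketch under that assumption; the exact path pattern would need checking against the real URL scheme:

```apache
# Hypothetical .htaccess rules -- assumes Apache with mod_rewrite enabled.
# The [G] flag answers matching requests with "410 Gone" instead of 404,
# which signals a deliberate removal rather than a broken link.
RewriteEngine On
RewriteRule ^popular-searches\.php - [G,L]
```

Googlers have suggested that 410s are treated as a slightly stronger removal signal than 404s, though both eventually cause pages to drop out of the index.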
-
How do we drive traffic: Given that you removed these old pages and are seeing a tremendous influx of 404 errors, I am assuming the removed pages were not 301 redirected. If that is the case, I would strongly encourage you to redirect those old URLs to your new pages. This will help get traffic to the new pages, which will eventually help them rank on their own. It sounds like your old pages just dropped off the face of the planet, and because the new pages are so new, you are losing all of your organic rankings and, subsequently, organic traffic.
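To make that concrete, here is a hedged sketch of what such redirects might look like in an Apache .htaccess file. The URL patterns are guesses based on the paths mentioned in the thread and would need adapting to the site's real structure:

```apache
# Hypothetical .htaccess sketch -- assumes Apache with mod_rewrite enabled.
RewriteEngine On

# Where an old link-wheel URL has an obvious new equivalent, map it directly:
RewriteRule ^popular-searches\.php/event-services/(.+)$ /profiles.php/$1 [R=301,L]

# Old URLs with no close match can fall back to the relevant category page:
RewriteRule ^popular-searches\.php/.*$ /category.php/bands-musicians [R=301,L]
```

One-to-one redirects to the closest matching new page pass the most relevance; bulk-redirecting everything to the homepage is generally treated as a soft 404.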
How long should you wait to be indexed: Did you submit a new sitemap to Google? I would make sure you have done that. After that, it shouldn't take very long; two weeks is the longest I have waited to be indexed after a sitemap submission.
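For reference, a minimal sitemap file looks like this. The URLs below are taken from the thread, and the lastmod date is purely illustrative:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.superted.com/category.php/bands-musicians</loc>
    <lastmod>2014-06-01</lastmod>
  </url>
  <url>
    <loc>http://www.superted.com/profiles.php/wedding-bands</loc>
  </url>
</urlset>
```

The sitemaps.org protocol allows up to 50,000 URLs per file, and multiple files can be tied together with a sitemap index, so all 500 sub-categories can be submitted at once.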
As far as a penalty goes, check WMT. If you see nothing in there from Google, I think you are safe on the penalty side. However, the sudden changes, the large number of changes, and the influx of 404 pages might have moved your site back in the rankings while Google takes a look to make sure there isn't any nefarious activity. I wouldn't worry about a penalty unless you actually receive one.
If you are worried about duplicate content, try researching rel=canonical tags to see if they will be helpful to you. It sounds like you made a lot of changes quickly, and Google needs time to investigate. Unfortunately, you have to wait a little bit. Try the things I listed here; I hope it helps!
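As an illustration only (assuming an old and a new URL really did serve overlapping content), the duplicate page could declare the preferred URL in its <head>:

```html
<!-- Hypothetical example: placed on the old/duplicate page so that the two
     URLs don't compete; Google consolidates signals onto the canonical URL. -->
<link rel="canonical" href="http://www.superted.com/profiles.php/videographer" />
```

This only applies while both pages still resolve; once the old page returns a 404 or 410, the tag is moot.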
Related Questions
-
UKBF 'forex' clones appearing
Hi all, I've just been looking at my referring domains and it seems someone is taking the pleasure of cloning the UK Business Forums website and adding 'forex'-based links to all the external anchors. This includes everyone who is listed in their directory. I've put the domains I know of below, but if anyone else knows of more, please add them so we can all get them disavowed.

domain:redwood96.ru
domain:zanier.it
domain:selskie-zori.ru
domain:gabrielloni.it
domain:reserva-ideal.com
domain:imexaf.com
domain:rassemblementpourjouy.com
domain:windsorlegion.ca
domain:powerconector.com
domain:eltallerdelorfebrewd.com
domain:aepedome.net
domain:spkvarc.ru
domain:mtdnk.ru
domain:koning.rs

White Hat / Black Hat SEO | | phero0 -
'SEO Footers'
We have an internal debate going on right now about the use of a link list of SEO pages in the footer. My stance is that they serve no purpose for people (heatmaps consistently show near-zero activity), therefore they shouldn't be used. I believe that if something on a website is user-facing, then it should also be beneficial to the user, not solely there for bots. There are much better ways to get bots to those pages, and for people who didn't enter through an SEO page, internal linking where appropriate will be much more effective at getting them there. However, I have some opposition to this theory and wanted to get some community feedback on the topic. Anyone have thoughts, experience, or data to share on this subject?
White Hat / Black Hat SEO | | LoganRay1 -
Pages mirrored on unknown websites (not just content, all the HTML)... blackhat I've never seen before.
Someone more expert than me could help... I am not a pro, just doing research on a website... Google Search Console shows many backlinks from pages under unknown domains... these pages are mirroring the pages of the linked website... clicking a link on the mirror page leads to a spam page full of spammy links... The homepages of these unknown domains appear just fine... it looks like the domains are partially hijacked... WTF?! Have you ever seen something like this? Can it be an outcome of a previous blackhat activity?
White Hat / Black Hat SEO | | 2mlab0 -
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance.

The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic.

I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS has a solution to centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (and not other bots) and is administered per site. No solution to all three of my problems.

Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically at runtime from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I'll invent), some requests will be answered with a 503 while others will get content and a 200.

The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your experts' opinions...
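As a sketch of the idea described above (the names, thresholds, and User-Agent tokens here are my own assumptions, not the poster's actual code), the runtime rule could look something like this:

```python
import random

# Minimal sketch of load-based bot throttling -- thresholds are assumptions.
LOAD_SOFT_LIMIT = 4.0   # start throttling bots above this 1-minute load average
LOAD_HARD_LIMIT = 8.0   # throttle almost all bot requests above this

BOT_TOKENS = ("bot", "crawler", "spider")

def is_bot(user_agent: str) -> bool:
    """Crude User-Agent check; real code would use a maintained bot list."""
    ua = user_agent.lower()
    return any(token in ua for token in BOT_TOKENS)

def throttle_probability(load_avg: float) -> float:
    """Fraction of bot requests to answer with 503 at a given load average."""
    if load_avg <= LOAD_SOFT_LIMIT:
        return 0.0
    if load_avg >= LOAD_HARD_LIMIT:
        return 0.95  # always let a trickle through so crawlers keep retrying
    # Scale linearly between the soft and hard limits.
    return 0.95 * (load_avg - LOAD_SOFT_LIMIT) / (LOAD_HARD_LIMIT - LOAD_SOFT_LIMIT)

def handle_request(user_agent: str, load_avg: float) -> int:
    """Return the HTTP status to serve: 503 for throttled bots, else 200."""
    if is_bot(user_agent) and random.random() < throttle_probability(load_avg):
        return 503  # the real response should also send a Retry-After header
    return 200
```

Sending a Retry-After header with the 503 tells well-behaved crawlers when to come back; Google has described 503 as the correct "temporarily unavailable" signal, so occasional use shouldn't be treated as an error, though serving 503s for extended periods can lead to de-indexing.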
White Hat / Black Hat SEO | | internetwerkNU1 -
Creating pages as exact match URL's - good or over-optimization indicator?
We all know that exact match domains are not getting the same results in the SERPs with the algo changes Google's been pushing through. Does anyone have any experience, or know if that also applies to having an exact-match URL page (not domain)?

Example keyword: cars that start with A

Which way is better when creating your pages on a non-exact-match domain site:

www.sample.com/cars-that-start-with-a/ - has "cars that start with A" as the

or

www.sample.com/starts-with-a/ - again has "cars that start with A" as the

Keep in mind that you'll add more pages that start the exact same way, as you want to cover all the letters in the alphabet. So:

www.sample.com/cars-that-start-with-a/
www.sample.com/cars-that-start-with-b/
www.sample.com/cars-that-start-with-C/

or

www.sample.com/starts-with-a/
www.sample.com/starts-with-b/
www.sample.com/starts-with-c/

Hope someone here at the MOZ community can help out. Thanks so much.

White Hat / Black Hat SEO | | lidush0 -
Sudden Drop in Keyword Ranking - No Idea Why
Hi Mozzers,

I am in charge of everything Web Optimization for the company I work for. I keep active track of our SEO/SEM practices, especially our keyword rankings. Prior to my arrival at the company in January of this year, we had a consultant handling the SEO work, and though they did a decent job of maintaining our rankings for a hefty set of keywords, they were unable to get a particular competitive keyword ranking. This is odd because other derivations of that keyword, which are equally competitive, are all still ranking on page one. Also, full disclosure: they were not engaging in any questionable linking. In fact, they didn't do much of any link building whatsoever. I also haven't been engaging in any questionable content creation or spammy linking. We put out content regularly as we are a publicly traded company; nothing spammy at all.

Anyway, one thing I tried starting in February was a social media sharing campaign among friends and coworkers to share the respective page and keyword on their Facebook and Google+ pages. To my surprise, this tactic worked just like natural search usually does: slowly. Through the months I saw the keyword rank from completely invisible, to page 6, to page 3, to page 2, and finally to position 6 on page one as of just last week.

Today, unfortunately, the keyword is invisible again :(. I am perplexed. It's tough to build links for our company as we are in the public eye and everything we do has to be approved by someone higher up. I also checked our Webmaster Tools and haven't seen any notifications that could give me a clue as to what's going on. I am aware that there was a Penguin update recently and there are monthly Panda updates, but I'm skeptical as to whether those updates are correlated with this because, at initial glance, our traffic and rankings for other keywords and pages don't seem to be affected.

Suggestions? Advice? Answers? Thanks!
White Hat / Black Hat SEO | | CSawatzky0 -
Next Step to Improve SERPS?
Hello... Just looking for some opinions on what my next steps should be in regards to improving my rankings in the SERPs. Most of my category and subcategory pages (as well as the home page) have received grades of "A" in SEOmoz's on-page grader... None of my competitors has as much, or as unique, content as I do. My question to all of you SEO geniuses and experts (I mean that as a great compliment, not sarcasm) is: what would your next steps be in terms of moving up in the search results? My URL is: http://goo.gl/XUH3f Thanks in advance!
White Hat / Black Hat SEO | | Prime850 -
Is it negative to put a backlink into the footer's website of our clients ?
Hello there! Everything is in the subject of this post, but here is the context: we are a web agency and we, among other things, build websites for our clients (most of them are shops). Until now, we have put a link in their footer, like "developed by MyWebShop". But we don't know whether this is bad or not. With only one website we can gain hundreds of backlinks at once, but is it good for SEO or not? Will Google penalize us, thinking it is a blackhat practice? Is it better to put our link in the "legal notices" or "disclaimer" part of the websites? What is the best practice for lasting SEO? I hope you understand my question. Thank you in advance!
White Hat / Black Hat SEO | | mywebshop0