Recovering from disaster
-
Short Question: What's the best way to get Google to re-index duplicate URLs?
Long Story:
We have a long-established (1997) website running a proprietary CMS. We never paid much attention to SEO (other than creating a sitemap) until four months ago. After learning the basics, we started modifying the engine to present a better site to Google (proper HTTP status codes, consistent URLs to eliminate the roughly 15,000 duplicates we had, etc.).
Things went great for three and a half months, and we reached the first page on Google for our main keyword (a very, very competitive one). Before the SEO work we were getting around 25,000 daily impressions and 3,000 daily clicks from Google; after our SEO efforts, we reached 70,000 daily impressions and more than 7,000 daily clicks.
On Aug 30th, 2014, one of our programmers committed a change to the live server by mistake. This small change effectively altered every article's URL, appending either a trailing dash or a dash plus the keyword '-test-keyword' (literally).
Nobody noticed anything for two days because the site worked perfectly for human visitors. The result of this small code change is that within five days our site had practically disappeared from Google's results pages except when someone searched for our site's name. Our rankings dropped from positions 8 and 10 to 80 and 100 for our main keywords.
We reverted the change as soon as we noticed the problem, but during those two days Google's bots went on a binge, crawling five times the usual number of pages per day.
We've been trying to recover, but nothing seems to be working so far. Google's bots aren't re-crawling the bad URLs to pick up the 301 redirects we now serve back to the original URLs, and Webmaster Tools still reports over 2,300 duplicates.
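As a sanity check, something like the following can confirm the redirects are actually being served; a minimal sketch, assuming the affected URLs are listed one per line in a hypothetical bad_urls.txt (requests is the only dependency):

```python
# Minimal sketch: verify that each affected URL answers with a 301
# and report where it points. bad_urls.txt is a hypothetical file
# holding one affected URL per line.
import requests

with open("bad_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    # Don't follow the redirect; we want to inspect the first response.
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    status = "OK" if resp.status_code == 301 else "CHECK"
    print(f"{status}  {resp.status_code}  {url} -> {location}")
```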
Our Google impressions and clicks have dropped to well below what we had before we did any SEO: down to 5,000 impressions and 1,200 clicks (including searches for our own domain name).
During the last 15 days (after we fixed the problem), our duplicate count went from a maximum of 3,200 down to 1,200, then back up to 2,300 without any changes on our end.
We redid our sitemap and resubmitted it on day 3.
So, what do we do? Do we go through the URLs with the 'Fetch as Google' function (a bit tedious for 2,300 URLs), or do we wait for the bots to come around whenever they feel like it? If we use Fetch as Google, should we submit the bad URL, have Google fetch it, get the redirect, follow it, and then submit the followed URL to the index?
Or is there a better solution that I'm unaware of?
Second question: Is this something to be expected when an incident like this happens, given that our inbound links rarely point to the actual articles?
-
Well, after submitting multiple temporary sitemaps and having Google index them, our duplicate counts dropped back to pre-event levels.
However, our rankings haven't improved at all. Actually, if anything, they dropped even further.
At this point it's really starting to look like this is a hit from Panda 4.1 and that our URL change was merely a coincidence. From the looks of it, Google is now marking our site as a low-quality site. Now that we know about such a thing, we can see that we definitely experienced a 'sinister surge' in traffic prior to disaster striking.
Since we've never engaged in any bad behavior on the site and we've always followed Google's best-practice advice, we're currently at a loss as to why we were hit this way. Our content is fresh and high quality (arguably the highest quality in our niche), and we have a very decent link profile according to MajesticSEO, so for now we really have no clue what's going on.
Attached is the site's impressions and clicks graph from Webmaster Tools.
-
["knowing that our inbound link rarely link to the actual articles" --> not sure I follow.]
I asked whether it's normal for all rankings to drop, even for unaffected pages, when pages with no inbound links have issues. For example, our top-ranked page for our main keyword didn't change in any way (not its URL, its description, nor its title), yet its rank tanked after this event.
I like the temporary sitemap idea. Thanks.
-
Once you have all the 301 redirects set up, create a sitemap with all of the old URLs and submit that. Google will crawl them, see that they are now 301 redirects, and process the change faster. Then delete the sitemap.
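If it helps, that temporary sitemap is easy to script. A minimal sketch (old_urls.txt and the output file name are just placeholders):

```python
# Minimal sketch: build a temporary sitemap listing the OLD
# (now-redirecting) URLs so Google re-crawls them and sees the 301s.
# old_urls.txt is a hypothetical file with one old URL per line.
from xml.sax.saxutils import escape

with open("old_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)

sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>\n"
)

with open("sitemap-old-urls.xml", "w", encoding="utf-8") as f:
    f.write(sitemap)
```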
You should also have a canonical tag on the article pages pointing at the new/current URL that should be indexed.
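A rough way to audit that the canonical tags are actually present across the article pages, using only the standard library (article_urls.txt is a placeholder, and the regex-based extraction is a simplification, not a real HTML parse):

```python
# Minimal sketch: report the rel="canonical" target of each article
# page, or flag pages missing the tag. article_urls.txt is hypothetical.
import re
import urllib.request

def find_canonical(html):
    # Inspect every <link ...> tag; attribute order varies in real markup.
    for tag in re.findall(r"<link[^>]*>", html, re.IGNORECASE):
        if re.search(r'rel=["\']canonical["\']', tag, re.IGNORECASE):
            m = re.search(r'href=["\']([^"\']+)["\']', tag, re.IGNORECASE)
            if m:
                return m.group(1)
    return None

with open("article_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    with urllib.request.urlopen(url, timeout=10) as resp:
        html = resp.read().decode("utf-8", errors="replace")
    print(f"{url} -> {find_canonical(html) or 'NO CANONICAL TAG'}")
```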
"knowing that our inbound link rarely link to the actual articles" --> not sure I follow.
In general, your rankings should bounce back once Google picks up on all of the fixes.
Related Questions
-
Recovering from Sitemap Issues with Bing
Hi all, I recently took over SEO efforts for a large e-commerce site (I would prefer not to disclose). About a month ago, I began to notice a significant drop in traffic from Bing and uncovered in Bing Webmaster Tools that three different versions of the sitemap were submitted and Bing was crawling all three. I removed the two out-of-date sitemaps and re-submitted the up-to-date version. Since then, I have yet to see Bing traffic rebound, and the number of pages indexed by Bing is still dropping daily. During this time there has been no issue with traffic from Google. Currently I have 1.3 million pages indexed by Google while Bing has dropped to 715K (it was at 755K last week and was on par with Google several months ago). I know that no major changes have been made to the site in the past year, so I can't point to anything other than the sitemap issue to explain this. If this is indeed the only issue, how long should I expect to wait for Bing to re-index the pages? In the interim I have been manually submitting important pages that aren't currently in the index. Any insights or suggestions would be very much appreciated!
Technical SEO | tdawson09
-
Site went down and traffic hasn't recovered
Very curious situation. We have a network of sites. Sunday night one (and only one) of our sites went down, and since then we've seen a loss in traffic across all our sites! Not only have we seen a loss of traffic, we've also seen a loss of indexed pages: a complete drop-off from 1.8 million to 1.3 million pages indexed. Does anyone know why one site outage would affect the rest of them? And the indexed pages? Very confused. Thanks,
Technical SEO | TMI.com
-
Slowly recovering from algorithm penalty
Hi, over the years a website we took over was hit by an algorithm penalty (a combination of Penguin and Panda). We managed to bring rankings back (after 6 months) from page 5/6 to page 2 after we used the Google disavow tool. For the past 9 months we have been stuck on page 2. Is there anything you think can be done to bring it back to page 1? We are building quality links now and have moved away from the low-quality links other link builders were making. We are managing the process much more closely and ensuring we maintain a good standard of links. We are also making the pages flatter, merging short-content pages into larger content pages, and now looking at site structure and creating a structured internal link flow. Is there anything we should be aware of, and any recommendations to get back on page 1? This is a tailor-made travel website with a small selection of destinations.
Technical SEO | Direct_Ram
-
Need 301 Advice with a Recovered URL from a Domain Typosquatter
I am new to an SMB, and someone bought the plural version of our domain back in 2005 and has yet to let it expire. The domain was just renewed for another year, so we finally decided to contact a lawyer and go through the domain name dispute process. This seems like a pretty cut-and-dry case, and the lawyer is very confident that we'll have the domain within 30-40 days. Currently the plural-version domain 303s to spammy web pages, shows shady ads, and is just a malicious-looking page in general. I am not savvy enough to know the exact complexities of what's happening on the backend, but it's spammy. Knowing the history of the plural-version domain, how would you treat it after we acquire it? Obviously, I wouldn't want to put our site in jeopardy by 301ing the plural version of our URL to our current healthy site, but at the same time many customers might go to that domain by accident, so eventually I'd like to 301 it. If it's any help, the plural version has a robots.txt that prevents Google from crawling it. Thank you in advance for your guidance!
Technical SEO | ssimarketing
-
Internal linking disaster
Can someone help me understand what my devs have done? The site has thousands of pages, but if there's an internal homepage link on all of the pages (click on the logo), shouldn't that count toward internal links? Could it be because they are nofollow? http://goo.gl/0pK5kn I've attached my competitors' Open Site Explorer rankings (I'm the 2nd column), so despite the fact the site is new you can see where I'm getting my ass kicked. Thanks! psRsQtH.png
Technical SEO | bradmoz
-
Duplicate content - Quickest way to recover?
We've recently been approached by a new client who's had a 60%+ drop in organic traffic. One of the major issues we found was 60k+ pages of content duplicated across 3 separate domains. After much discussion and negotiation with them, we 301'd all the pages across to the best domain, but traffic is increasing very slowly. Given that the old sites are 60k+ pages each and don't get crawled very often, is it best to notify Google of the domain change through Google Webmaster Tools to give Google a 'nudge' to deindex the old pages and hopefully recover from the traffic loss as quickly and as fully as possible?
Technical SEO | Nathan.Smith
-
How do I clean up this 301 disaster?
I launched my site, InternetCE.com, and blog, www.continuingeducationjournal.com, a few years ago. I then learned I should probably merge the content, and foolishly created a subdomain, http://blog.internetce.com, and 301 redirected the blog to it. As an aside, my site is on a Microsoft server and thus cannot host my WordPress blog. After a bit more study, I realized that my blog wasn't helping me nearly as much as it could be, so I 301'd it again to http://internetce.com/blog. In just becoming a Pro member (long overdue), I realize that my entire site needs to be 301'd to merge the non-www and www versions. I read somewhere that Mr. Cutts says not to 301 more than twice for fear of mistakenly being construed as something a bit too spammy. So, here I sit, not sure what to do. Does anyone have any advice on how to most efficiently correct this spaghetti bowl? Many thanks!
Technical SEO | adell50
-
Redesign an SEO-Disaster | Help with Redirects of Gray Hat Pages
Hi gang. I'm a new SEO and I'm currently working on the redesign of a website. I have just discovered a ton of hidden pages that are filled with duplicate content, basically reiterating the main keyword in a number of slight variations. Each page is titled with a variation on the keyword phrase and then has one paragraph of text very similar to the previous page, etc. Here is an example of one of the offensive pages (nice lookin' site, eh?): http://www.vasectomy-reversals.com/vasectomy_reversal_surgery.html The new site will not have any of these pages. I'm writing the 301 redirects now and want to redirect these offensive pages to the most relevant page on the new site. But I'm afraid to redirect the offensive pages. Should I leave them alone, or can I have the former developer remove them? Help. I don't know how to handle these pages and their redirects. Thanks for your help! ~ Mills
Technical SEO | Mills