Recovering from disaster
-
Short Question: What's the best way to get Google to re-index duplicate URLs?
Long Story:
We have a long-established (1997) website running a proprietary CMS. We never paid much attention to SEO (other than creating a sitemap) until four months ago. After learning some basics, we started modifying the engine to present a better site to Google: proper HTTP status codes, consistent URLs to eliminate duplicates (we had something like 15,000 duplicates), and so on.
Things went great for three and a half months, and we reached the first page on Google for our main keyword (a very, very competitive one). Before the SEO work we were getting around 25,000 impressions and 3,000 clicks from Google. After our SEO efforts, we reached 70,000 daily impressions and more than 7,000 daily clicks.
On Aug 30th, 2014, one of our programmers committed a change to the live server by mistake. This small change effectively altered every article's URL, appending either a trailing dash or a dash plus the keyword '-test-keyword' (literally).
Nobody noticed anything for two days, as the site worked perfectly for humans. The result of this small code change is that within five days our site had practically disappeared from Google's results pages, except when someone searched for our site's name. Our rankings dropped from positions 8 and 10 to 80 and 100 for our main keywords.
We reverted the change as soon as we noticed the problem, but during those two days Google's bots went on a binge, crawling five times the usual number of pages per day.
We've been trying to recover, and nothing seems to be working so far. Google's bots aren't recrawling the repaired URLs to pick up the 301 redirects back to the original URLs, and we still have over 2,300 duplicates reported in Webmaster Tools.
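For context, the repair in question amounts to recognizing the accidentally mangled URLs and 301-redirecting them back to their originals. A minimal sketch of that normalization logic, assuming the suffix was always a trailing dash or the literal '-test-keyword' (the function and path names here are illustrative, not our actual CMS code):

```python
import re

# The buggy commit appended either a trailing dash or "-test-keyword"
# to every article URL. Try the longer suffix first so "-test-keyword"
# isn't reduced to "-test-keywor" by the lone-dash rule.
MANGLED_SUFFIX = re.compile(r'(?:-test-keyword|-)$')

def canonical_path(path: str) -> str:
    """Strip the accidental suffix, returning the original article path."""
    return MANGLED_SUFFIX.sub('', path)

def handle_request(path: str):
    """Return (status, headers): 301 to the clean URL if mangled, else 200."""
    clean = canonical_path(path)
    if clean != path:
        # A permanent redirect tells Google the mangled URL is gone for good.
        return 301, {'Location': clean}
    return 200, {}

print(handle_request('/articles/seo-basics-test-keyword'))
# (301, {'Location': '/articles/seo-basics'})
```

The key point is that the mangled URL must answer with a 301 (not a 404 or a 200 duplicate), so that when Googlebot does recrawl it, the duplicate is consolidated onto the original.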
Our Google impressions and clicks dropped to way below what we had before we did any SEO: down to 5,000 impressions and 1,200 clicks (including searches for our domain name).
During the last 15 days (after we fixed the problem), our duplicate count went from a maximum of 3,200 down to 1,200, then back up to 2,300 without any changes on our end.
We've redone our sitemap and resubmitted it on day 3.
So, what do we do? Do we go through the URLs with the 'Fetch as Google' function (a bit tedious for 2,300 URLs), or do we wait for the bots to come around whenever they feel like it? If we fetch, should we submit the bad URL, have Google fetch it, get the redirect, follow it, and then submit the followed URL to the index?
Or is there a better solution that I'm unaware of?
Second question: Is this to be expected when something like this happens, given that our inbound links rarely point to the actual articles?
-
Well, after submitting multiple temporary sitemaps and having Google index them, our duplicate counts dropped back to pre-event levels.
However, our rankings haven't improved at all. Actually, if anything, they dropped even further.
At this point it's really starting to look like this is a hit from Panda 4.1 and that our URL change was merely a coincidence. From the looks of it, Google is now marking ours as a low-quality site. Now that we know about such a thing, we definitely experienced a 'sinister surge' prior to disaster striking.
Since we've never engaged in any bad behavior on the site and we've always followed Google's best-practice advice, we're currently at a loss as to why we were hit that way. Our content is fresh and high quality (arguably the highest quality in our domain), and we have a very decent link profile according to MajesticSEO, so for now, no clue what's going on, really.
Attached is the site's impressions and clicks graph from Webmaster Tools.
-
["knowing that our inbound link rarely link to the actual articles" --> not sure I follow.]
I asked whether it's normal for all rankings to drop, even for unaffected pages, when pages with no inbound links have issues. For example, our top-ranked page for our main keyword didn't change in any way (not its URL, its description, nor its title), yet its rank tanked after this event.
I like the temporary sitemap idea. Thanks.
-
Once you have all the 301 redirects set up, create a sitemap with all of the old URLs and submit that. Google will crawl them, see that they are now 301 redirects, and process the data faster. Then delete the sitemap.
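A minimal sketch of building such a temporary sitemap: list the old (now-redirecting) URLs so Google recrawls them, discovers the 301s, and drops the duplicates faster. The URLs below are placeholders, not the real site's.

```python
from xml.sax.saxutils import escape

def build_sitemap(old_urls):
    """Build a sitemap XML string listing the OLD, 301-redirecting URLs."""
    entries = '\n'.join(
        '  <url><loc>{}</loc></url>'.format(escape(u)) for u in old_urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        '{}\n'
        '</urlset>'.format(entries)
    )

# Hypothetical mangled URLs from the incident described above.
old_urls = [
    'http://www.example.com/articles/seo-basics-test-keyword',
    'http://www.example.com/articles/link-building-',
]
print(build_sitemap(old_urls))
```

Submit the generated file in Webmaster Tools as a separate, temporary sitemap; once the duplicate count drops back down, remove it so the regular sitemap (which should list only canonical URLs) is the sole one on record.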
You should also have a canonical tag on the article pages pointing to the new/current link that should be indexed.
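For illustration (the URL is a placeholder, not the actual site's), the tag goes in the `<head>` of each article page and points at the one clean URL that should be indexed:

```html
<!-- Hypothetical example: any stray duplicate variants of this article
     consolidate onto the canonical URL declared here. -->
<link rel="canonical" href="http://www.example.com/articles/seo-basics" />
```

With this in place, even if a mangled variant slips through and returns a 200, Google is told which URL is the real one.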
"knowing that our inbound link rarely link to the actual articles" --> not sure I follow.
In general, your rankings should bounce back once Google picks up all of the fixes.