Awful ranking after site redesign
-
Hello everyone,
I have a situation here and I’d like your opinion on it.
I am working on a site that was recently redesigned from scratch. In a nutshell: as soon as the new site went live, their rankings dropped and, of course, so did their visitors.
The guys who redesigned the site didn’t set up any 301 redirects whatsoever, so the old pages are now 404s and blocked by robots.txt. My question is: if they 301 redirect now, do you think they could get their rankings back?
One more thing: when they launched the new site, the number of indexed pages basically doubled overnight, from 700 to 1,400. Do you think this could be affecting their rankings as well?
Thank you for your insights
Elio
-
Hello everyone and thank you for your answers. I sincerely appreciate it!
I didn’t follow the redesign phase; I’ve just come on board, so I actually have no idea why they didn’t go for the 301 solution.
As Monica pointed out, the 404ed pages were genuinely valuable pages, and in my opinion this is borne out by the fact that their traffic is now close to zero. It literally dropped in a matter of days (kind of scary to see such a steep fall). I agree with Travis that only the valuable pages should be 301ed, but the thing is they sell their products online, so hypothetically every product page is equally important. The pages were neither old nor poor quality… I guess they just skipped the 301 step. I will do some more research, but as you all suggest, the best way forward is probably to 301 all those pages and see what happens.
I have no idea if they did anything on the social side but that’s worth investigating some more.
Thank you very much for now! I will keep you updated
Cheers
-
I would imagine that if the pages were previously ranking, they had value. The rule of thumb is to discard pages that aren't ranking on pages 1-3. Since there has been such a decrease in traffic, it is reasonable to assume that valuable pages were 404ed when they should have been 301ed.
I have migrated 7 sites over the past 5 years, so I feel reasonably comfortable saying duplicated pages are causing the influx of indexed pages. Redirecting the 404 pages is the strongest strategy right now. They basically created 700 valueless pages that won't rank until they are fully indexed and gain some value in the engine's eyes, which could take months. It is starting over from zero, which is why 301 redirects are "normally" best practice.
Any 301 will lose a little bit of link juice: instead of a page keeping its PageRank to itself, that value is passed along to another URL with some dilution. While redirects will help salvage some of the site's juice, they won't put the new pages on page 1 by themselves.
You could wait for the new pages to start ranking on their own, but that could take months depending on the level of on-page optimization and whether any good links point at those pages currently. I am not a fan of the wait-and-see game, so I try to do everything I can up front. 301 redirecting the old pages would be best practice in this situation.
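For example, on an Apache server the redirects could live in .htaccess (this is a sketch, not the site's actual setup; the paths are made-up placeholders, and nginx or IIS would use their own syntax):

    # Map each old URL to its closest new equivalent; 301 = permanent move
    Redirect 301 /old-category/blue-widget /products/blue-widget
    Redirect 301 /old-category/red-widget  /products/red-widget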
-
You can recover page authority from a 404 page for a surprisingly long time. I once 301 redirected some pages that had been 404ing for a couple of years, and they quickly regained their rankings. What was the thinking behind not redirecting the old pages? Were they poor quality? You don't have to redirect all of them at once; you can start with the best pages (and at least some of them must have been good, since you had traffic to lose).
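If you want to triage which pages to redirect first, one rough approach is to pull the old URLs and their pre-redesign traffic out of analytics and sort. A minimal Python sketch, assuming a hypothetical CSV export with "url" and "sessions" columns (the file name and columns are assumptions, not a real analytics format):

    import csv

    # Hypothetical analytics export: one row per 404ed URL with its
    # pre-redesign traffic.
    with open("old_page_traffic.csv", newline="") as f:
        rows = [(row["url"], int(row["sessions"])) for row in csv.DictReader(f)]

    # Redirect the highest-traffic pages first.
    rows.sort(key=lambda r: r[1], reverse=True)
    for url, sessions in rows[:50]:
        print(f"{url} ({sessions} sessions)")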
-
When a page 404s, The Googles will come back to it after an undisclosed period of time to make sure the page is really gone. Now, if the missing pages used to receive referral traffic, it would be super handy to get those pages back up soon, search engines aside. That way, you're recovering links and pages for the right reasons.
Your first order of questioning should be whether those pages were worth anything to begin with. I can rank a site for 'left handed profession city st' overnight; it doesn't mean any of that is going to work for the client.
But if they didn't redirect any of the old pages to their new, relevant equivalents, I highly doubt they took the time to block those pages via robots.txt. If they did, wow. I'll leave it at that.
The increase of indexed pages could be due to any number of things. Perhaps a site search function is misconfigured? Perhaps the site uses tags in a way I wouldn't recommend? Perhaps the CMS, if there is one, is prone to duplicate content.
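If duplicate URLs do turn out to be the cause of the doubled index, the usual patch is a canonical tag on each duplicate variant pointing at the one URL that should be indexed (a minimal sketch; the domain and path here are placeholders):

    <link rel="canonical" href="https://www.example.com/products/blue-widget" />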
That's pretty much the best I can do without a specific example. Anyone with more 'skeelz' than I would be guessing as well. But thanks much for your question.
-
Ugh... I hate when this happens. It is such a pain in the butt to fix.
First, you absolutely need those 301 redirects. Don't wait any longer to get them done. Those 404s are affecting your rankings considerably at this point, and you have 700 of them. Whoa.
Secondly, the index doubled because you have 700 new pages added on top of the 700 old pages. You can wait it out if you want to, but I don't recommend it. Remove the robots.txt block on those old URLs and 301 them so the rankings might be salvaged. Once the new pages start ranking on their own you can retire the 301s, but for now, get them going.
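As a sketch of what that robots.txt change might look like (the /old-section/ path is a placeholder for wherever the old URLs actually live):

    # Before: crawlers can't reach the old URLs, so they never see the 301s
    User-agent: *
    Disallow: /old-section/

    # After: drop the Disallow so Googlebot can recrawl the old URLs
    # and follow their 301s through to the new pages
    User-agent: *
    Disallow: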
The 301s pass a little bit of juice to the new pages, and that is a good thing. The reason they matter is that the old URLs are still ranking and bringing traffic to your site. The new pages will start to get some traffic, which in turn will help their rankings.
Did you do anything on social with the site redesign? If you send out a post, you might be able to salvage some traffic from your followers. Social signals will also help the rankings of the new pages.