Post Site Migration - thousands of indexed pages, 4 months after
-
Hi all,
Believe me, I think I've already tried and googled every possible answer to this question. It's very frustrating. I have an old domain – fancydiamonds dot net.
We built a new site – Leibish dot com – and did everything by the book:
- Individual 301 redirects for all the pages.
- Change of address via the GWT.
- Trying to maintain and improve the old optimization and hierarchy.
Four months after the site migration, we have yet to regain more than 50% of our original organic traffic (17,000 visits vs. 35,500-50,000 before the move).
What strikes me most is that you can still find 2,400 indexed pages from the old site on Google (all of which have 301 redirects).
Even worse – if you search Google for the old domain name, fancydiamonds dot net, you'll still find the old domain!
Something is not right here, but I have no explanation why these pages still exist.
Any help will be highly appreciated. Thanks!
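For reference, this is roughly how we spot-check the redirect mapping – a minimal Python sketch that audits rows exported from a crawler (or a curl script) against the intended old-to-new map. The function name and the example.* URLs are placeholders, not our actual setup:

```python
def audit_redirects(crawl_rows, redirect_map):
    """Check exported crawl rows against the intended 301 mapping.

    crawl_rows:   list of (old_url, status_code, location) tuples,
                  e.g. exported from a site crawler or a curl script.
    redirect_map: dict of old URL -> intended new URL.
    Returns a list of (old_url, problem) pairs for anything suspect.
    """
    problems = []
    for old_url, status, location in crawl_rows:
        expected = redirect_map.get(old_url)
        if status != 301:
            # 302s and 404s won't pass authority to the new pages
            problems.append((old_url, "status %s, expected 301" % status))
        elif location != expected:
            # 301 exists but points somewhere we didn't intend
            problems.append((old_url, "redirects to %s, expected %s"
                             % (location, expected)))
    return problems
```

Anything the function returns is a URL worth re-checking by hand before blaming Google.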
-
Thanks Dana. Honestly, we have a lot of experience dealing with site migrations - I've read dozens of posts and we've implemented our own step-by-step guidelines for a successful site migration.
As you can see, sometimes even when you do everything by the book you can encounter some unexpected issues.
-
Yes, I have. The 301 redirects showed up correctly and without any issues.
-
Yes, it sounds like perhaps there is a technical issue here. I like Keri's suggestion below. Also, have you grepped your server logs to see if Googlebot is having issues?
It can take Google a long, long time to drop search results for old pages that either no longer exist or that 301 to a new page. You may have to resort to using the removal tool. I realize that for 2,400 URLs doing these one at a time is inconvenient, but it may just be what you have to do.
I have some old notes on domain migration that I'll try to dig up, but unfortunately I don't think there's much there that's helpful after the fact. But I'll see what I can find.
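On the server-log point: grepping can be scripted. Here's a rough Python sketch that tallies the HTTP status codes your server returned to Googlebot from a combined-format access log, so a pile of 404s or 500s on old URLs jumps out. The regex assumes the common Apache/Nginx combined log format – adjust it for your server:

```python
import re
from collections import Counter

# Matches the request and status portion of a combined-format log entry,
# e.g. ... "GET /old-page HTTP/1.1" 301 ...  (an assumption -- adapt it)
LINE_RE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def googlebot_status_counts(log_lines):
    """Tally HTTP status codes served to requests identifying as Googlebot."""
    counts = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue  # only interested in Google's crawler here
        match = LINE_RE.search(line)
        if match:
            counts[match.group("status")] += 1
    return counts
```

If the old URLs are mostly showing 301s to Googlebot, the redirects themselves are fine and it's a matter of waiting; 404s or 5xx codes would point to a real technical problem.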
-
The URL removal tool would be one of my last options, since I, too, would be afraid of the authority vanishing along with the old links.
Google must have some reason to keep these pages indexed, and I wouldn't want them removed until I'm positive I've regained all the authority I can from those old pages.
-
Are you certain the 301 redirects are active and working?
-
Can you add canonical tags to the 301'ed pages?
-
Make sure that none of the URLs in the 301 redirect chain are disallowed by a robots.txt file. If any URL in the chain is blocked, Google can't crawl through the redirect to reach and properly index the new page.
That last point may be what's preventing a portion of the old URLs from dropping out of the index: they may be blocked by robots.txt.
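One quick way to test this is with Python's built-in robots.txt parser. A minimal sketch, using hypothetical rules and placeholder URLs (substitute your real robots.txt contents and redirect chain):

```python
from urllib import robotparser

# Hypothetical robots.txt rules -- paste in your actual file's lines.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /old-category/",
])

# A placeholder redirect chain: old URL first, new URL last.
redirect_chain = [
    "http://fancydiamonds.example/old-category/blue-diamond",  # old URL
    "https://leibish.example/blue-diamond",                    # new URL
]

# Any URL in the chain that Googlebot can't fetch stops the redirect
# from being crawled through, so the old URL never drops out.
blocked = [url for url in redirect_chain
           if not rp.can_fetch("Googlebot", url)]
```

Here the old URL would be flagged as blocked, which is exactly the situation that keeps stale pages stuck in the index.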
-
What happens when you go into GWT and fetch fancydiamonds.net as googlebot? Is there some reason that perhaps googlebot isn't seeing the redirects correctly?
-
Hi David,
See my answer to RaymondPP.
Also, what did you mean by "you are linking out to your other site"?
Did you see anything?
-
There is a perfect correlation between the organic drop and revenue, which has decreased dramatically. Of course I checked for an Analytics issue, but all the other traffic sources have stayed the same. We run big PPC campaigns and that traffic data is correct.
As for managing expectations: we usually say to expect 3-4 months of depressed traffic, but this one took us a bit by surprise.
-
Thanks for the answer.
That's always a possibility. The problem is that these URLs have quite a few links pointing at them (the old homepage is still indexed!).
If I use the URL remover, won't that mean losing all the link juice those URLs carry?
-
I agree with both of the previous suggestions and thought I would add a comment and a question too.
Seeing a decline of 50% or even more in traffic after a site migration is not uncommon. Hopefully your clients went into the migration with eyes open, knowing that they could see significantly lower traffic for anywhere from 6 weeks to a year, and maybe never fully recover. This sometimes happens. That's why the planning process is so important (and management of expectations).
That being said, when you installed Google Analytics on the new site, did anything change in your GA tracking code? Sometimes this happens and can lead to old analytics reports and new analytics reports not being an "apples to apples" comparison. It's just a thought. It could be that the traffic isn't actually 50% lower, but has changed much less than that.
Has revenue (or whatever your conversion goal is) dropped, increased or stayed the same?
-
I don't understand why your traffic is lower. If you have 301 redirects in place, then even if your old pages show up and someone clicks a result, it will take them to the new site.
Another option you could pursue is returning a 410 (Gone) for your old pages. This tells Google that the page has been removed and should no longer be indexed or linked to.
But beware, you are linking out to your other site.
The 410 error is primarily intended to assist the task of web maintenance by notifying the client system that the resource is intentionally unavailable and that the Web server wants remote links to the URL to be removed. Such an event is common for URLs which are effectively dead i.e. were deliberately time-limited or simply orphaned. The Web server has complete discretion as to how long it provides the 410 error before switching to another error such as 404.
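If you do go the 410 route for retired pages (while keeping 301s for pages that have a real equivalent on the new site), a minimal Apache mod_alias sketch might look like this. The paths and domain are placeholders, not actual URLs from the site:

```apache
# Keep 301s for old pages that have an equivalent on the new site:
Redirect 301 /old-ring-page https://leibish.example/rings/new-ring-page

# Return 410 Gone for pages that were deliberately retired:
Redirect gone /discontinued-collection
```

The same split applies on any server: redirect what has a home, 410 what doesn't.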
-
Hi skifr - Have you tried using the URL remover tool in GWT? And if you really want those pages out of the search engine, how about a noindex tag on the old domain?