Post Site Migration - Thousands of Pages Still Indexed, 4 Months Later
-
Hi all,
Believe me, I think I've already tried and googled every possible question I have. This one is very frustrating – I have the following old domain: fancydiamonds dot net.
We built a new site – Leibish dot com – and did everything by the book:
- Individual 301 redirects for all the pages.
- Change of address via the GWT.
- Trying to maintain and improve the old optimization and hierarchy.
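For reference, the redirect mapping is strictly one-to-one, old URL to new URL. On Apache with mod_alias it looks something like this (the paths below are made-up examples, not our real URLs):

```apache
# Illustrative only: each old URL gets its own permanent (301) redirect
# pointing at its exact counterpart on the new domain.
Redirect 301 /fancy-yellow-diamonds.html https://www.leibish.com/fancy-yellow-diamonds.html
Redirect 301 /education/diamond-color.html https://www.leibish.com/education/diamond-color.html
```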
4 months after the site migration, we still have to gain back more than 50% of our original organic traffic (17,000 visits vs. 35,500-50,000 before).
The thing that strikes me the most is that you can still find 2,400 indexed pages on Google (they all have 301 redirects).
And more than this – if you search for the old domain name on Google – fancydiamonds dot net – you'll find the old domain!
Something is not right here, but I have no explanation why these pages still exist.
Any help will be highly appreciated. Thanks!
-
Thanks Dana. Honestly, we have a lot of experience dealing with site migrations – I've read dozens of posts and we've implemented our own step-by-step guidelines for successful site migrations.
As you can see, sometimes even when you do everything by the book you can encounter some unexpected issues.
-
Yes, I have. I could see the 301 redirects working correctly and without any further issues.
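For anyone who wants to double-check a handful of URLs themselves, here's a quick sketch that requests a URL without following redirects and reports the status code and Location header (standard-library Python only; `check_redirect` and the sample URLs are just for illustration):

```python
# Rough sketch: fetch a URL without following redirects and report the
# HTTP status code plus the Location header (if any).
import http.client
from urllib.parse import urlparse

def check_redirect(url):
    """Return (status_code, location_header) for a single HEAD request."""
    parsed = urlparse(url)
    conn_cls = (http.client.HTTPSConnection
                if parsed.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parsed.netloc)
    # http.client never follows redirects on its own, so a redirect
    # chain shows up one hop at a time.
    conn.request("HEAD", parsed.path or "/")
    response = conn.getresponse()
    location = response.getheader("Location")
    conn.close()
    return response.status, location
```

A healthy migration shows a single 301 hop from every old URL straight to the final new page; 302s or multi-hop chains are worth fixing.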
-
Yes, it sounds like perhaps there is a technical issue here. I like Keri's suggestion below. Also, have you grepped your server logs to see if Googlebot is having issues?
It can take Google a long, long time to take down search results for old pages that either don't exist any more or that 301 to a new page. You may have to resort to using the removal tool. I realize that for 2,000 URLs doing these one at a time is inconvenient, but it may just be what you have to do.
I have some old notes on domain migration that I'll try to dig up, but unfortunately I don't think there's much there that's helpful after the fact. But I'll see what I can find.
-
The URL remover tool would be one of my last options, since I too would be afraid of any authority vanishing with the old links.
Google must have some reason to continue indexing the pages, and I wouldn't want them removed until I'm positive I've gained back all the authority I could from these old pages.
-
Are you certain the 301 redirects are active and working?
-
Can you add canonical tags to the 301'ed pages?
-
Make sure that none of the URLs in the 301 redirect chain are disallowed by a robots.txt file. If any are, Google will not be able to properly crawl the chain and properly index the new page.
That last point may be what's preventing a portion of the old URLs from dropping, if they are being blocked in the robots.txt file.
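A quick way to sanity-check this is Python's built-in robotparser – feed it your robots.txt rules and ask whether Googlebot may fetch each URL in the chain (the rules and URLs below are hypothetical):

```python
# Minimal sketch: ask whether Googlebot is allowed to fetch a URL under
# a given set of robots.txt rules (the rules below are hypothetical).
from urllib import robotparser

rules = [
    "User-agent: *",
    "Disallow: /old-category/",
]

parser = robotparser.RobotFileParser()
parser.parse(rules)

# If an old URL in the redirect chain is disallowed, Googlebot can never
# recrawl it, never sees the 301, and the stale listing lingers.
can_crawl = parser.can_fetch("Googlebot", "http://fancydiamonds.net/old-category/ring.html")
```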
-
What happens when you go into GWT and fetch fancydiamonds.net as googlebot? Is there some reason that perhaps googlebot isn't seeing the redirects correctly?
-
Hi David,
See my answer to RaymondPP.
Also, what do you mean by saying "you are linking out to your other site"?
Did you see anything?
-
There is a perfect correlation between the organic drop and the revenue – it has decreased dramatically. Of course I checked for an Analytics issue, but all the other traffic sources have stayed the same. We have big PPC campaigns and the traffic data is correct.
About managing expectations – we usually say to expect 3-4 months of traffic drops, but this has taken us a bit by surprise.
-
Thanks for the answer.
That's always a possibility – the problem is that these URLs have quite a few links pointing to them (the old homepage is still indexed!).
If I use the URL remover, won't that result in losing all the link juice from those URLs?
-
I agree with both of the previous suggestions and thought I would add a comment and a question too.
Seeing a decline of 50% or even more in traffic after a site migration is not uncommon. Hopefully your clients went into the migration with eyes open, knowing that they could see significantly lower traffic for anywhere from 6 weeks to a year, and maybe never fully recover. This sometimes happens. That's why the planning process is so important (and management of expectations).
That being said, when you installed Google Analytics on the new site, did anything change in your GA tracking code? Sometimes this happens and can lead to old analytics reports and new analytics reports not being an "apples to apples" comparison. It's just a thought. It could be that the traffic isn't actually 50% lower, but has changed much less than that.
Has revenue (or whatever your conversion goal is) dropped, increased or stayed the same?
-
I'm not understanding why your traffic is lower. If you have 301 redirects in place, even if your old pages show up and someone clicks the link, it will take them to the new site.
Another option you could pursue is a 410 (Gone) for your old pages. This tells Google that the page has been removed and should no longer be indexed or linked.
But beware, you are linking out to your other site.
The 410 error is primarily intended to assist the task of web maintenance by notifying the client system that the resource is intentionally unavailable and that the Web server wants remote links to the URL to be removed. Such an event is common for URLs which are effectively dead i.e. were deliberately time-limited or simply orphaned. The Web server has complete discretion as to how long it provides the 410 error before switching to another error such as 404.
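On Apache, for instance, mod_alias can mark a retired URL as gone (the path below is just an example):

```apache
# Illustrative only: respond 410 Gone for a permanently retired URL.
Redirect gone /discontinued/old-product.html
```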
-
Hi skifr - Have you tried using the URL remover tool in GWT? And if you really want those pages out of the search engine, how about a noindex tag on the old domain?