Post site migration – thousands of pages still indexed, 4 months later
-
Hi all,
Believe me, I think I've already tried and googled every possible answer to my question. This one is very frustrating. Our old domain is fancydiamonds dot net.
We built a new site – Leibish dot com – and did everything by the book:
- Individual 301 redirects for all the pages.
- Change of address via GWT.
- Maintaining and improving the old optimization and hierarchy.
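For anyone wanting to sanity-check redirects like these themselves, here is a minimal Python sketch of the kind of audit that helps; the hop data (status code plus Location header per response, e.g. gathered with `curl -sI`) and the example trail below are purely illustrative:

```python
from urllib.parse import urlsplit

def audit_hops(hops, new_host):
    """Classify the recorded redirect trail for one old URL.

    `hops` is a list of (status_code, location) pairs, one per redirect
    response. Returns a list of problems found (empty means healthy).
    """
    if not hops:
        return ["no response recorded"]
    problems = []
    if len(hops) > 1:
        problems.append("redirect chain of %d hops" % len(hops))
    status = hops[0][0]
    if status != 301:
        problems.append("first hop is %d, not 301" % status)
    final_target = hops[-1][1]
    if urlsplit(final_target).netloc != new_host:
        problems.append("does not land on %s" % new_host)
    return problems

# Hypothetical trail: an old page that 302s through an interstitial
trail = [(302, "https://fancydiamonds.net/go"),
         (301, "https://leibish.com/page-a")]
print(audit_hops(trail, "leibish.com"))  # flags the chain and the 302
```

A healthy old URL should produce exactly one hop, a 301, landing on the new host.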
Four months after the site migration, we still haven't regained more than 50% of our original organic traffic (17,000 vs. 35,500-50,000 before).
What strikes me most is that you can still find 2,400 indexed pages from the old domain on Google (all of them have 301 redirects).
More than that – if you search Google for the old domain name, fancydiamonds dot net, you'll still find the old domain!
Something is not right here, but I have no explanation why these pages still exist.
Any help will be highly appreciated. Thanks!
-
Thanks Dana. Honestly, we have a lot of experience dealing with site migrations - I read dozens of posts and we've implemented our own step-by-step guidelines for a successful site migration.
As you can see, sometimes even when you do everything by the book you can encounter some unexpected issues.
-
Yes I have. The 301 redirects resolved correctly and without any further issues.
-
Yes, it sounds like perhaps there is a technical issue here. I like Keri's suggestion below. Also, have you grepped your server logs to see if Googlebot is having issues?
It can take Google a long, long time to drop search results for old pages that either don't exist any more or that 301 to a new page. You may have to resort to using the removal tool. I realize that for 2,000 URLs doing these one at a time is inconvenient, but it may just be what you have to do.
I have some old notes on domain migration that I'll try to dig up, but unfortunately I don't think there's much there that's helpful after the fact. But I'll see what I can find.
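Grepping the server logs as suggested above could be scripted roughly like this; the log lines and the "acceptable" status whitelist are illustrative assumptions about a common/combined log format:

```python
import re

# Matches the request path and status code in a common/combined log line
LOG_RE = re.compile(r'"[A-Z]+ (?P<url>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def googlebot_errors(lines):
    """Yield (status, url) for Googlebot requests that got neither 200 nor 301."""
    for line in lines:
        if "Googlebot" not in line:
            continue
        m = LOG_RE.search(line)
        if m and m.group("status") not in ("200", "301"):
            yield m.group("status"), m.group("url")

# Made-up sample lines for illustration
sample = [
    '66.249.66.1 - - [01/Jan/2014] "GET /old-page HTTP/1.1" 301 0 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [01/Jan/2014] "GET /ring-guide HTTP/1.1" 404 0 "-" "Googlebot/2.1"',
]
print(list(googlebot_errors(sample)))  # only the 404 shows up
```

Feeding it the real access log (`googlebot_errors(open("access.log"))`) would surface any 404s, 5xx errors, or redirect loops Googlebot is hitting.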
-
The URL remover tool would be one of my last options, since I too would be afraid of any authority vanishing with the old links.
Google must have some reason to continue indexing the pages, and I wouldn't want them removed until I'm positive I've regained all the authority I could from these old pages.
-
Are you certain the 301 redirects are active and working?
-
Can you add canonical tags to the 301'ed pages?
-
Make sure that none of the URLs in the 301 redirect chain are disallowed by a robots.txt file. If any URL in the chain is blocked, Google won't be able to properly crawl and index the new page.
That last point may be what's preventing a portion of the old URLs from dropping, if they're being blocked by the robots.txt file.
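A rough sketch of that robots.txt check using Python's stdlib robotparser; the rules and chain URLs here are hypothetical, and in practice you'd load each host's own robots.txt rather than one shared set of rules:

```python
from urllib.robotparser import RobotFileParser

def blocked_urls(robots_txt, urls, agent="Googlebot"):
    """Return the subset of `urls` that the given robots.txt rules disallow."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [u for u in urls if not parser.can_fetch(agent, u)]

# Hypothetical rules and redirect-chain URLs for illustration
robots_txt = "User-agent: *\nDisallow: /old-shop/"
chain = [
    "http://fancydiamonds.net/old-shop/diamond-1",  # first hop: blocked!
    "http://leibish.com/diamonds/diamond-1",
]
print(blocked_urls(robots_txt, chain))
```

Any URL this flags is one Googlebot can't follow, which would leave the old page stuck in the index.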
-
What happens when you go into GWT and fetch fancydiamonds.net as googlebot? Is there some reason that perhaps googlebot isn't seeing the redirects correctly?
-
Hi David,
See my answer to RaymondPP.
Also, what do you mean by "you are linking out to your other site"?
Did you see anything?
-
There is a perfect correlation between the organic drop and the revenue – it has decreased dramatically. Of course I checked for an Analytics issue, but all the other traffic sources have stayed the same. We run big PPC campaigns and that traffic data is correct.
About managing expectations – we usually say to expect 3-4 months of traffic drops, but this one has taken us a bit by surprise.
-
Thanks for the answer.
That's always a possibility - the problem is that these URLs have quite a few links pointing at them (the old homepage is still indexed!).
If I use the URL remover, won't that result in losing all the link juice for those URLs?
-
I agree with both of the previous suggestions and thought I would add a comment and a question too.
Seeing a decline of 50% or even more in traffic after a site migration is not uncommon. Hopefully your clients went into the migration with eyes open, knowing that they could see significantly lower traffic for anywhere from 6 weeks to a year, and maybe never fully recover. This sometimes happens. That's why the planning process is so important (and management of expectations).
That being said, when you installed Google Analytics on the new site, did anything change in your GA tracking code? Sometimes this happens and can lead to old analytics reports and new analytics reports not being an "apples to apples" comparison. It's just a thought. It could be that the traffic isn't actually 50% lower, but has changed much less than that.
Has revenue (or whatever your conversion goal is) dropped, increased or stayed the same?
-
I'm not understanding why your traffic is lower. If you have 301 redirects in place, even if your old pages show up and someone clicks the link, it will take them to the new site.
Another option you could pursue is a 410 (gone) for your old pages. This states to Google that the page has been removed and should no longer be indexed or linked.
But beware, you are linking out to your other site.
The 410 error is primarily intended to assist the task of web maintenance by notifying the client system that the resource is intentionally unavailable and that the Web server wants remote links to the URL to be removed. Such an event is common for URLs which are effectively dead i.e. were deliberately time-limited or simply orphaned. The Web server has complete discretion as to how long it provides the 410 error before switching to another error such as 404.
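A minimal sketch of the 301-vs-410 decision being described here (the redirect map and paths are made up for illustration): anything with a new-site equivalent gets a permanent redirect, and genuinely dead pages return 410 rather than lingering as soft 404s.

```python
def plan_response(old_path, redirect_map):
    """Decide what an old URL should return after migration.

    `redirect_map` maps old paths to their new-site equivalents; anything
    without an equivalent is served 410 Gone instead of a vague 404.
    """
    if old_path in redirect_map:
        return 301, redirect_map[old_path]
    return 410, None

# Hypothetical mapping for illustration
redirect_map = {"/loose-diamonds": "https://leibish.com/loose-diamonds"}
print(plan_response("/loose-diamonds", redirect_map))  # 301 to the new URL
print(plan_response("/retired-promo", redirect_map))   # 410, nothing to redirect to
```

The actual responses would be configured in the web server, but auditing the URL list this way first makes the server rules mechanical to write.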
-
Hi skifr - Have you tried using the URL remover tool in GWT? And if you really want those pages out of the search engine, how about a noindex tag on the old domain?