Post-Site Migration - thousands of indexed pages, 4 months later
-
Hi all,
Believe me, I think I've already tried and googled every possible question I have. This one is very frustrating. Our old domain is fancydiamonds dot net.
We built a new site – Leibish dot com – and did everything by the book:
- Individual 301 redirects for all the pages.
- Change of address via the GWT.
- Trying to maintain and improve the old optimization and hierarchy.
4 months after the site migration, we still have more than 50% of our original organic traffic left to regain (17,000 visits vs. 35,500-50,000 previously).
What strikes me most is that you can still find 2,400 indexed pages from the old domain on Google (all of them have 301 redirects).
Even more telling – if you search Google for the old domain name, fancydiamonds dot net, you'll find the old domain itself!
Something is not right here, but I have no explanation why these pages still exist.
Any help will be highly appreciated. Thanks!
-
Thanks Dana. Honestly, we have a lot of experience dealing with site migrations - I've read dozens of posts and we've implemented our own step-by-step guidelines for a successful site migration.
As you can see, sometimes even when you do everything by the book you can encounter some unexpected issues.
-
Yes I have. The 301 redirects resolved correctly, without any further issues.
-
Yes, it sounds like perhaps there is a technical issue here. I like Keri's suggestion below. Also, have you grepped your server logs to see if Googlebot is having issues?
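To make the log check concrete, here's a small sketch that tallies which HTTP status codes Googlebot is receiving. It assumes a combined-format access log (status code is field 9); the log path in the usage comment is an assumption, so adjust for your server:

```shell
# Tally HTTP status codes served to Googlebot, from a combined-format
# access log (field 9 is the status code). Log path/format are assumptions.
googlebot_status_counts() {
  grep -i 'googlebot' "$1" | awk '{print $9}' | sort | uniq -c | sort -rn
}
# Usage: googlebot_status_counts /var/log/apache2/access.log
```

A pile of 404s or 5xx codes here would point to a crawl problem; you'd want to see mostly 301s on the old URLs and 200s on the new ones.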
It can take Google a long, long time to drop search results for old pages that either don't exist any more or that 301 to a new page. You may have to resort to using the removal tool. I realize that for 2,000 URLs, doing these one at a time is inconvenient, but it may just be what you have to do.
I have some old notes on domain migration that I'll try to dig up, but unfortunately I don't think there's much there that's helpful after the fact. But I'll see what I can find.
-
The URL remover tool would be one of my last options, since I too would be afraid of any authority vanishing with the old links.
Google must have some reason to continue to index the pages and I wouldn't want them removed until I'm positive I gained back all the authority I could, from these old pages.
-
Are you certain the 301 redirects are active and working?
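One quick way to verify is to fetch the headers for a handful of old URLs (e.g. `curl -sI http://fancydiamonds.net/some-old-page` – that path is hypothetical, substitute your real old URLs) and pipe them through a tiny parser that prints the status code and redirect target:

```shell
# Print "status-code redirect-target" from raw HTTP response headers.
# Feed it the output of: curl -sI http://fancydiamonds.net/some-old-page
# (that path is hypothetical -- use your real old URLs)
check_redirect() {
  awk 'NR==1 {code=$2} tolower($1)=="location:" {loc=$2} END {print code, loc}' | tr -d '\r'
}
```

For a healthy migration you'd expect a `301` followed by the matching leibish.com URL on every old page, with no 302s and no redirect chains.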
-
Can you add canonical tags to the 301'ed pages?
-
Make sure that none of the URLs in the 301 redirect chain are disallowed by the robots.txt file. If any were, Google would not be able to properly crawl the chain and index the new page.
That last point may be what's preventing a portion of the old URLs from dropping out, if they are being blocked by the robots.txt file.
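As a quick (and admittedly naive) sketch of that robots.txt check – a simple prefix match against the Disallow rules, ignoring user-agent grouping and wildcards, so treat it as a rough first pass only:

```shell
# Naive robots.txt check: exits 0 if the given path matches any Disallow
# prefix. Ignores User-agent grouping and wildcards -- a rough first pass.
is_disallowed() {  # usage: is_disallowed robots.txt /some/path
  awk -v p="$2" 'tolower($1) == "disallow:" && index(p, $2) == 1 {found=1}
                 END {exit !found}' "$1"
}
```

Run it for each hop in a redirect chain; any hop that comes back disallowed is a URL Google can't follow through to the new site.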
-
What happens when you go into GWT and fetch fancydiamonds.net as Googlebot? Is there some reason that perhaps Googlebot isn't seeing the redirects correctly?
-
Hi David,
see my answer to RaymondPP.
Also, what do you mean by "you are linking out to your other site"?
Did you see anything?
-
There is a perfect correlation between the organic drop and revenue – it has decreased dramatically. Of course I checked for an Analytics issue, but all the other traffic sources have stayed the same. We have big PPC campaigns and the traffic data is correct.
About managing expectations – we usually say to expect 3-4 months of traffic drops, but this has taken us a bit by surprise.
-
Thanks for the answer.
That's always a possibility - the problem is that these URLs have quite a few links pointing at them (the old homepage is still indexed!).
If I use the URL remover, won't that result in losing all the link juice from those URLs?
-
I agree with both of the previous suggestions and thought I would add a comment and a question too.
Seeing a decline of 50% or even more in traffic after a site migration is not uncommon. Hopefully your clients went into the migration with eyes open, knowing that they could see significantly lower traffic for anywhere from 6 weeks to a year, and maybe never fully recover. This sometimes happens. That's why the planning process is so important (and management of expectations).
That being said, when you installed Google Analytics on the new site, did anything change in your GA tracking code? Sometimes this happens and can lead to old analytics reports and new analytics reports not being an "apples to apples" comparison. It's just a thought. It could be that the traffic isn't actually 50% lower, but has changed much less than that.
Has revenue (or whatever your conversion goal is) dropped, increased or stayed the same?
-
I'm not understanding why your traffic is lower. If you have 301 redirects in place, even if your old pages show up and someone clicks the link, it will take them to the new site.
Another option you could pursue is a 410 (Gone) for your old pages. This tells Google that the page has been removed and should no longer be indexed or linked.
But beware, you are linking out to your other site.
The 410 error is primarily intended to assist the task of web maintenance by notifying the client system that the resource is intentionally unavailable and that the Web server wants remote links to the URL to be removed. Such an event is common for URLs which are effectively dead i.e. were deliberately time-limited or simply orphaned. The Web server has complete discretion as to how long it provides the 410 error before switching to another error such as 404.
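If you do decide some old URLs should return 410 rather than redirect, a minimal Apache sketch (the path below is hypothetical – list only the URLs you genuinely want dropped):

```apache
# .htaccess / vhost sketch: serve 410 Gone for a retired page (mod_alias).
# The path is hypothetical -- substitute the URLs you want Google to drop.
Redirect gone /discontinued-page.html
```

Just remember a 410 and a 301 are mutually exclusive for a given URL: a 410 drops the page faster but forfeits whatever link equity a 301 would have passed.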
-
Hi skifr - Have you tried using the URL remover tool in GWT? And if you really want those pages out of the search engine, how about a noindex tag on the old domain?