What is the best practice to re-index pages de-indexed due to a bad migration?
-
Dear Mozers,
We have a Drupal site with more than 200K indexed URLs. Six months ago, a bad website migration happened without proper SEO guidance. All of the high-authority URLs were rewritten by the client, and most of them have been returning 404s and 302s for the last six months.
Because of this, site traffic dropped by more than 80%. I found today that around 40K old URLs with good PR and authority have been de-indexed from Google (most of them are 404s and 302s).
I need to pass all the value from old URLs to new URLs.
Example URL Structure
Before Migration (Old)
http://www.domain.com/2536987
(Page Authority: 65, HTTP Status: 404, de-indexed from Google)
After Migration (Current)
http://www.domain.com/new-indexed-and-live-url-version
Does creating mass 301 redirects help here without re-indexing the old URLs? Please share your thoughts.
- Riyas
-
OK Gary. Thanks
-
Actually we just had this conversation in another Question
Try some of the ideas spoken about there?
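On the mass-redirect part of the question: 301s from each old URL to its new equivalent are the standard way to pass equity, and Google will re-crawl and consolidate them on its own; you don't need to re-index the old URLs first. As a rough sketch (the file format, paths, and helper name are assumptions for illustration — the real old-to-new mapping has to come from the migration records), a small script can turn a CSV mapping into Apache `Redirect` directives:

```python
# Sketch: generate one 301 redirect rule per old URL from a CSV mapping.
# The CSV format ("old_path,new_path" per row) is a hypothetical convention.
import csv
import io

def build_redirect_rules(csv_text):
    """Turn 'old_path,new_path' rows into Apache Redirect directives."""
    rules = []
    for old_path, new_path in csv.reader(io.StringIO(csv_text)):
        rules.append(f"Redirect 301 {old_path} {new_path}")
    return "\n".join(rules)

# Example using the URL structure from the question above:
mapping = "/2536987,/new-indexed-and-live-url-version\n"
print(build_redirect_rules(mapping))
```

The generated lines can then be dropped into the vhost config or .htaccess; for 40K rules, a RewriteMap or a lookup at the application layer scales better than 40K literal directives.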
Related Questions
-
Only fraction of the AMP pages are indexed
Back in June, we saw a sharp drop in traffic on our website. We initially assumed it was due to the Core Update rolled out in early June. We had switched from http to https in May, but thought that should have helped rather than caused a problem. Until early June the traffic was trending upwards. While investigating the issue, I noticed that only a fraction (25%) of the AMP pages have been indexed. The pages don't seem to be getting indexed even though they are valid. According to Google Analytics, too, the percentage of AMP traffic has dropped from 67-70% to 40-45%. I wonder if it is due to the indexing issue. In terms of implementation it seems fine: we are pointing the canonical to the AMP page from the desktop version and to the desktop version from the AMP page. Any tips on how to fix the AMP indexing issue? Should I be concerned that only a fraction of the AMP pages are indexed? I really hope you can help in resolving this issue.
Technical SEO | | Gautam1 -
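One detail worth double-checking against the setup described above: in the standard AMP pairing, the desktop page links to the AMP version with rel="amphtml" (not rel="canonical"), and only the AMP page carries a canonical back to the desktop URL. A sketch with placeholder URLs:

```html
<!-- On the desktop page (URLs are placeholders) -->
<link rel="amphtml" href="https://www.example.com/article/amp/">

<!-- On the AMP page: canonical points back to the desktop version -->
<link rel="canonical" href="https://www.example.com/article/">
```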
Should you use the Google URL remover if older indexed pages are still being kept?
Hello, a client did a redesign a few months ago, resulting in 700 pages being reduced to 60, mostly due to a Panda penalty and low interest in the products on those pages. Google is still indexing a good number of them (around 650) when we only have 70 on our sitemap. The thing is, Google indexes our site on average for 115 URLs when we only have 60 URLs that need indexing and only 70 on our sitemap. I would have thought these URLs would be crawled and not found, but it is taking a very long time. Our rankings haven't recovered as much as we'd hoped, and we believe the indexed older pages are causing this. Would you agree, and do you think removing those old URLs via the remover tool would be the best option? It would mean using the URL remover tool for 650 pages. Thank you in advance
Technical SEO | | Deacyde0 -
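One way to triage 650 legacy URLs before reaching for the removal tool is to classify each one first. The rule of thumb sketched below (301 where a replacement exists, 410 for deliberately retired pages) is a common convention, not an official Google recommendation, and the function name is hypothetical:

```python
# Sketch: suggest an action for each legacy URL after a redesign.
# Assumption: pages with a replacement get a 301, deliberately retired
# pages get a 410, and anything still returning 200 is left alone.
def legacy_url_action(status_code, has_replacement):
    """Suggest how to handle one legacy URL."""
    if status_code == 200:
        return "keep"                # still a live page
    if has_replacement:
        return "301 to replacement"  # preserve link equity where possible
    return "410 Gone"                # signal permanent removal to crawlers

print(legacy_url_action(404, False))
```

A 410 tends to get stale URLs dropped from the index on the normal recrawl cycle; the removal tool only hides URLs temporarily and doesn't replace fixing the responses.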
Blog page won't get indexed
Hi guys, I'm currently asked to work on a website. I noticed that the blog posts won't get indexed in Google. www.domain.com/blog does get indexed, but the blog posts themselves won't. They have been online for over 2 months now. I found this in the robots.txt file:

Allow: /
Disallow: /kitchenhandle/
Disallow: /blog/comments/
Disallow: /blog/author/
Disallow: /blog/homepage/feed/

I'm guessing that the last line causes this issue. Does anyone have an idea if this is the case and why they would include this in the robots.txt? Cheers!
Technical SEO | | Happy-SEO2 -
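On the robots.txt question above: Disallow rules are prefix matches, so `Disallow: /blog/homepage/feed/` blocks only URLs under the feed path, not the posts themselves. For comparison (the second rule is hypothetical and not in the quoted file):

```
# Blocks only the feed URLs under /blog/homepage/feed/ -- posts are unaffected:
Disallow: /blog/homepage/feed/

# A rule like this WOULD block every blog post (not present in the file above):
Disallow: /blog/
```

If the posts are blocked by neither rule, a noindex meta tag on the post template is the more likely culprit and worth checking next.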
Best Practices for Image Optimisation
Hi guys, I would love some recommendations from you all. A potential client of mine is currently hosting all their website image galleries (of which there are many) on a Flickr account, and they realise they could gain more leverage in Google Images (currently none of their images cover any of the basics of optimisation, e.g. filename, alt text, etc.). I did say that these basics would at least need to be covered, and that image hosting is supposedly an important factor when it comes to driving traffic from Google Image Search (images hosted on the same domain as the text are potentially given more value than images hosted on another domain, such as Flickr). The client has now come back saying they have done some 'reading' which suggests a subdomain could be the way to go, e.g. images.mydomain.com. I would love feedback on this before I go back to them, as it would be a huge undertaking. Cheers
Technical SEO | | musthavemarketing0 -
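Whatever hosting decision the client lands on, the on-page basics mentioned above look the same; a sketch with a placeholder domain, path, and product:

```html
<!-- Descriptive filename, alt text, and explicit dimensions.
     Domain, path, and product are placeholders. -->
<img src="https://www.example.com/images/red-leather-office-chair.jpg"
     alt="Red leather office chair with chrome base"
     width="800" height="600">
```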
My site's "pages indexed by Google" have gone up more than ten-fold.
Prior to doing a little work cleaning up broken links and keyword stuffing, Google only indexed 23/333 pages. I realize it may not be because of the work, but now we have around 300/333. My question is: is this a big deal? Cheers,
Technical SEO | | Billboard20120 -
Should I index my search result pages?
I have a job site and I am planning to introduce a search feature. The question I have is, is it a good idea to index search results even if the query parameters are not there? Example: A user searches for "marketing jobs in New York that pay more than 50000$". A random page will be generated like example.com/job-result/marketing-jobs-in-new-york-that-pay-more-than-50000/ For any search that gets executed, the same procedure would be followed. This would result in a large number of search result pages automatically set up for long tail keywords. Do you think this is a good idea? Or is it a bad idea based on all the recent Google algorithm updates?
Technical SEO | | jombay0 -
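If the conclusion for auto-generated search-result pages like the ones described above is to keep them out of the index (the usual advice for thin, parameter-driven results), the standard mechanism is a robots meta tag in the result-page template:

```html
<!-- Placed in the <head> of generated search-result pages
     that should stay out of the index -->
<meta name="robots" content="noindex, follow">
```

The `follow` keeps the job listings linked from those pages crawlable; dedicated, hand-curated landing pages for high-value queries are a safer route to long-tail rankings than indexing every generated result.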
What's the best way to stop search results from being indexed?
I have a WordPress site, and I just realized that the search results are being indexed on Google, creating duplicate content. What's the best way for me to stop these search result pages from being indexed without stopping the regular and important pages and posts from being indexed as well? The typical search query looks like this: http://xxx.com/?s=Milnerton&search=search&srch_type and this also includes results that are linked to the "view more", such as:
Technical SEO | | stefanok
http://xxx.com/index.php?s=viewmore Your help would be much appreciated. Regards, Stef0 -
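For the WordPress case above, crawling of the ?s= search results can be blocked in robots.txt; note that this stops crawling but does not by itself remove URLs that are already indexed, so a noindex meta tag on the search template is the more direct fix for existing duplicates:

```
User-agent: *
# Both search-result URL patterns from the question:
Disallow: /?s=
Disallow: /index.php?s=
```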
Rel canonical or 301 the Index Page?
Still a bit confused on best practice for /index.php showing up as a duplicate of www.mysite.com. What do I need to do, and how?
Technical SEO | | bozzie3110
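When the /index.php duplicate is reachable from external links, a 301 to the root URL is the usual choice (a canonical also works, but a redirect consolidates signals more decisively). A mod_rewrite sketch, with a placeholder domain; the THE_REQUEST condition matches only external requests, which avoids a loop when the server internally maps / to /index.php:

```apache
# Sketch: 301 any external request for /index.php to the root URL.
# www.example.com is a placeholder for the real domain.
RewriteEngine On
RewriteCond %{THE_REQUEST} \s/index\.php[\s?] [NC]
RewriteRule ^index\.php$ https://www.example.com/ [R=301,L]
```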