Best way to migrate to a new URL structure
-
Hello everyone,
We’re changing our URL structure from something like this: example.com/index.php?language=English
to something like this: example.com**/english/**index.php
The change is implemented with mod_rewrite, so all the old URLs still work.
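(For illustration, the rewrite works roughly like the sketch below; this is a simplified example rather than our actual rules, and it assumes Apache .htaccess with a one-to-one mapping between the new language path segment and the old language parameter.)

```apache
RewriteEngine On

# Serve the new path-style URL from the existing script, without redirecting,
# e.g. /english/index.php is handled internally by /index.php?language=English
RewriteRule ^english/index\.php$ /index.php?language=English [L]
```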
We have hundreds of thousands of pages currently indexed with the old URL structure.
What’s the best way to get Google to rapidly update its index and to maintain as much ranking as possible?
-
301 redirect all the old URLs to the new equivalent format?
-
If we detect that the URL is in an old format, render the page with a canonical tag pointing to the new equivalent format as well as adding a noindex, nofollow tag?
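(Concretely, that would mean an old-format URL renders something like the following in its head; illustrative markup only, reusing the example URLs above.)

```html
<!-- Output only when the request arrived in the old query-string format -->
<link rel="canonical" href="http://example.com/english/index.php">
<meta name="robots" content="noindex, nofollow">
```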
-
Something else?
Thanks for your input!
-
-
Alan's response is great, but if by any chance you are using WordPress and want to migrate to a new permalink structure, I would recommend this plugin.
It will let you change your URL structure and put the redirects from the old URLs to the new ones in place.
-
1. REWRITE the new URL to the old URL, unless you have reworked your code to read the new URLs.
2. REDIRECT the old URL to the new one, in case you already have links to the old URLs; you don't want duplicate content (a sketch of this rule follows below the list).
3. Make sure that all internal links point to the new URL; you don't want unnecessary redirects, as they leak link juice.
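On Apache, point 2 can sit in the same .htaccess as the internal rewrite; here is a rough sketch, assuming mod_rewrite and the single English/english mapping from the question (repeat or generalise the pattern for each language):

```apache
# 301 old query-string URLs to the new path, but only when the client actually
# requested the old form. Matching THE_REQUEST (the original request line)
# stops the internal new-to-old rewrite from looping back through this rule.
RewriteCond %{THE_REQUEST} \?language=English [NC]
RewriteRule ^index\.php$ /english/index.php? [R=301,L]
# The trailing "?" drops the old query string from the redirect target.
```

Keep the redirect above the internal rewrite for readability, and once point 3 is done (internal links updated) very little traffic should hit this rule at all.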
If you are using an IIS server, this is so easy to do, just a few clicks; see halfway down this page:
http://www.seomoz.org/ugc/microsoft-technologies-and-seo-web-development