Changing URL Structure
-
We are going to be relaunching our website with a new URL structure. My question is: what is the best way to handle the migration while the old URLs are still live alongside the new ones?
How should we best launch the new structure, considering we have in the region of 10,000 pages currently indexed in Google?
-
Another thought might be to place a noindex on the new pages to start with, then remove the noindex from each new page as we migrate and 301 redirect its old counterpart to it.
Thoughts?
-
Bump - can anyone else offer some input?
-
Yes, that is the problem we are going to have. Can anyone else help or offer some advice?
-
Hmm, I'd release the new pages gradually. 10,000 pages hitting the index in short order may get your site noticed and possibly treated as spam.
Maybe someone else on here can give you better guidance on the release schedule, but I'd go with maybe 100 to 150 pages a day. That works out to roughly a two-to-three-month release plan.
Steve
-
On another point, the new pages will all have new meta data, content, etc., so if we left the old pages online until we had the chance to go through them, I don't think we'd have any problem with duplicate content.
-
Hi Steve,
I fully understand the workings of 301 redirects etc. However, my concern is that we have over 10,000 pages. What's the best way to change from one URL structure to another?
Shall I release the new homepage, with links to the newly created pages, and then go through the old content and redirect it all?
-
OK, for each new page you implement, 301 redirect the old one to the new (once the new page is complete). That way you won't lose any traction with Google.
A 301 permanent redirect tells the search engines that your content has MOVED PERMANENTLY.
Make the 301 redirect another action on your go-live checklist for each page.
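To make that concrete, here is a minimal sketch of what one of those redirects could look like if the site runs on Apache and you can use a .htaccess file (the paths are invented examples, not your real URLs):

```apache
# Hypothetical example: permanently redirect one retired page to its new address
Redirect 301 /old-services-page.html /seo-friendly-services
```

One line like that per migrated page is all the server needs; the 301 status code is what tells the search engines the move is permanent.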
HTH
Steve
-
Hi, the old URLs will still remain whilst the new ones are being uploaded. There is no real pattern between the old and new URLs. All the new URLs are more SEO-friendly and keyword-targeted.
Our old pages use ASP, so I can easily implement a way of updating the old pages with a redirect.
My concern is what process we should follow when changing the URLs during the transition period.
-
Hi,
Presumably you'll be getting 404 errors as you migrate / remove the old pages. My view is that you'll need to carefully construct 301 redirects in a .htaccess file.
This may be a real bind if there is no real pattern in your old structure that you could "translate" to the new URLs.
I'm not good at redirecting using GREP/regex, so maybe someone else here could advise. Could you post some URL examples (old => new)?
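In the meantime, if there genuinely is no pattern, the redirects don't need any regex at all; the file can simply list one explicit mapping per page. A rough sketch, assuming an Apache server with .htaccess enabled and using invented URLs (on an IIS/ASP setup the same one-to-one mapping idea applies, just expressed in that server's own redirect configuration):

```apache
# With no pattern between old and new URLs, list each 301 explicitly
Redirect 301 /products/item123.asp /keyword-rich-product-name
Redirect 301 /catalogue/page456.asp /another-keyword-friendly-page
Redirect 301 /info/company.asp /about-us
```

Tedious for 10,000 pages, but the list can be generated from whatever spreadsheet or database holds your old-to-new URL mapping.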
Steve
-
You should create 301 redirects for all old pages. By the way, you can use the mod_rewrite module for Apache to do this.
If it's hard to determine rewrite rules, you can redirect the individual object pages to their new category page instead, but of course that is a less effective approach.
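A rough sketch of what that could look like with mod_rewrite enabled in a .htaccess file; the patterns and paths below are invented examples, not rules for your actual site:

```apache
RewriteEngine On

# Where a group of old URLs does follow a pattern, one rule can translate them all,
# e.g. /articles/123.asp -> /blog/post-123
RewriteRule ^articles/([0-9]+)\.asp$ /blog/post-$1 [R=301,L]

# Fallback: anything left in an old section is sent to the new category page
RewriteRule ^products/.*$ /new-product-category [R=301,L]
```

Per-URL 301s are still preferable wherever you can map an old page to its exact new equivalent; the category fallback just stops leftover URLs from ending up as 404s.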