URL Change Best Practice
-
I'm changing the URLs of some old pages to see if I can get a little more organic traffic out of them.
After changing the URL, and maybe the title/description tags as well, I plan to have Google fetch the pages.
How does Google know that the old URL is 301'd to the new URL, and that the new URL is not just a page of duplicate content?
Thanks... Darcy
-
Yes, and this is especially problematic if you change all of your internal links to point to the new page, thereby leaving Google little reason to recrawl the old page. There are a couple of quick, simple solutions to this...
1. Update your XML sitemap to include the OLD URLs and set their priority to 1, update frequency to daily, and last updated date to today. This will tell Google that the old URLs are important and updated, so you may be able to coax Google to recrawl them quickly.
2. Use "Fetch as Googlebot" on the old URLs to show Google the 301 redirects.
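The first tactic above can be sketched as a sitemap entry. This is a hedged sketch, not a guaranteed fix: the URL and date are placeholders, and Google treats `priority` and `changefreq` as advisory hints rather than commands.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- The OLD URL, listed deliberately so Google recrawls it
       and discovers the 301 redirect pointing at the new URL -->
  <url>
    <loc>http://www.example.com/old-page.html</loc>
    <lastmod>2013-06-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Once Google has seen and processed the redirect, you can drop the old URL from the sitemap and list only the new one.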
These are, admittedly, speculative, but Google hasn't given us a clear solution to this very common problem. Good luck!
-
Hi Bryan,
Wouldn't it have to re-crawl the old URL to see that it forwards to the new URL?
-
So long as you set your 301 redirect up correctly, it's not an issue. A 301 tells Google that Page-A has permanently moved to Page-B. Because this is usually done to replace or update a page, Google and other search engines will understand that any similarity or duplication between the two pages is most likely the result of that very move.
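For reference, a correctly set-up 301 commonly looks something like this in an Apache `.htaccess` file (a sketch with placeholder paths; your site may use nginx or application-level redirects instead, and the equivalent nginx line is shown as a comment):

```apache
# Permanently redirect the old URL to the new one (HTTP 301).
# Googlebot will follow this on its next crawl of /old-page.html.
Redirect 301 /old-page.html http://www.example.com/new-page.html

# nginx equivalent (in the server block):
#   location = /old-page.html { return 301 http://www.example.com/new-page.html; }
```

You can verify it is working by requesting the old URL with a tool like `curl -I` and checking that the response status is `301 Moved Permanently` with a `Location` header pointing at the new URL.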