Redirect questions
-
Hi!
A client of mine has created a new site with a new URL structure, which they launched the other day. They have done a 301 redirect from every page on the old site to the start page of the new site, e.g.:
www.olddomain.com/subfolder1/index.html -> www.newdomain.com
www.olddomain.com/subfolder2/index.html -> www.newdomain.com

I'm thinking of fixing this now so the redirects instead look something like this:
www.olddomain.com/subfolder1/index.html -> www.newdomain.com/newsubfolder1/index.html
www.olddomain.com/subfolder2/index.html -> www.newdomain.com/newsubfolder2/index.html

Two questions:
1. Is it worth doing the latter kind of redirect in all cases (after all, it involves quite a lot more work compared to the first solution)? Or do you recommend the first solution for all redirect projects?
2. Now that they have already done the first solution, is it at all worth amending it to the latter, or is everything spoiled now that they have already gone ahead with the first solution?

Many thanks in advance!
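In other words, for an Apache server the two setups might look something like this (a hypothetical sketch using mod_alias directives; the actual server and paths may differ):

```apache
# Option 1 (what they did): blanket redirect - every old URL 301s to the new homepage
RedirectMatch 301 .* http://www.newdomain.com/

# Option 2 (what I'm considering): page-level redirects - each old URL 301s to its new equivalent
Redirect 301 /subfolder1/index.html http://www.newdomain.com/newsubfolder1/index.html
Redirect 301 /subfolder2/index.html http://www.newdomain.com/newsubfolder2/index.html
```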
-
Thanks guys!
So my interpretation of your feedback and the short answers to my questions are:
1. Yes, it's worth doing.
2. Yes, it's worth doing.

Cheers!
-
You should only need one redirect rule if the link structure is the same:
point both domains at the new site,
then create a rule that redirects to the new domain if HTTP_HOST is not newdomain.com.
Here is the rule for IIS:

<rule name="CanonicalHostNameRule1">
  <match url="(.*)" />
  <conditions>
    <add input="{HTTP_HOST}" pattern="^thatsit.com.au$" negate="true" />
  </conditions>
  <action type="Redirect" url="http://thatsit.com.au/{R:1}" />
</rule>

http://thatsit.com.au/seo/tutorials/how-to-fix-canonical-domain-name-issues
-
From a visitor's perspective, if you've got links to deep pages, then it would be worth creating redirects to the relevant content on the new site.
If someone follows a link from another site with anchor text along the lines of "see this great article about x" and it just goes to the homepage, the visitor is going to find it rather jarring...
-
Agree with Rasmus on the whole.
I'd still go with the updated subfolder version even if Google has crawled the pages... these things take a while to settle down.
If there are lots of pages, check out the tool Russ Jones shared for using Levenshtein distance to automate creating redirects: http://www.seomoz.org/blog/set-it-and-forget-it-seo-chasing-the-elusive-passive-seo-dream
There's another great post for larger sites that could help: http://www.seomoz.org/blog/scripting-seo-5-pandafighting-tricks-for-large-sites-14455
And here's another version that creates the .htaccess redirects for you (though I've not tried it, so I don't know how well it works): http://www.conversationmarketing.com/2010/10/levinshtein_link_fixer_aka_the.htm
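The core idea behind those tools can be sketched in a few lines of standard-library Python: pair each old URL with whichever new URL has the smallest edit distance, then print the redirect rules. The URL lists below are made-up examples, and the real tools linked above are far more robust:

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution (free on match)
        prev = curr
    return prev[-1]

def match_redirects(old_paths, new_paths):
    """Pair each old path with the closest new path by edit distance."""
    return {old: min(new_paths, key=lambda new: levenshtein(old, new))
            for old in old_paths}

old = ["/subfolder1/index.html", "/subfolder2/index.html"]
new = ["/newsubfolder1/index.html", "/newsubfolder2/index.html"]

# Emit Apache-style redirect lines for the best-match pairing
for src, dst in match_redirects(old, new).items():
    print(f"Redirect 301 {src} http://www.newdomain.com{dst}")
```

Worth sanity-checking the output by hand before deploying it, of course — nearest-by-edit-distance can mispair URLs when the old and new structures differ a lot.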
-
In cPanel there is an option to redirect everything at once while keeping the structure of the site. Try cpanel.yourdomain.com.
-
I would recommend making the redirects from subfolder to subfolder. If you redirect all pages to the new front page, Google needs to crawl the new site from scratch in order to index all the pages.
If you make the 301 redirects from old pages to the corresponding new pages, I would say it is worth the effort. Otherwise www.newdomain.com/newsubfolder1/index.html needs to build up its own page rank, since it is a new URL that Google does not know.
The question is whether Google has already crawled a lot of the old URLs, but if it were me I would get on with making the correct redirects before Google crawls too many of the old URLs. This should give the new site better rankings from the start, and it will save time for the Google crawlers. One should always anticipate a drop when changing domains, but it is a good idea to take precautions to ensure a quick bounce back.
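As a quick way to confirm the current state before redoing the work, one could collect where each old URL actually redirects to and check whether everything lands on the bare homepage. A rough sketch, assuming the redirect targets have already been gathered into a dict (the URLs here are made-up placeholders; fetching them live would need urllib or similar):

```python
from urllib.parse import urlparse

def is_blanket_redirect(redirect_map: dict) -> bool:
    """True if every old URL redirects to the bare homepage (empty path)."""
    targets = {urlparse(dst).path.rstrip("/") for dst in redirect_map.values()}
    return targets == {""}

# Hypothetical mapping matching the situation in the question: everything -> homepage
current = {
    "http://www.olddomain.com/subfolder1/index.html": "http://www.newdomain.com/",
    "http://www.olddomain.com/subfolder2/index.html": "http://www.newdomain.com/",
}
print(is_blanket_redirect(current))  # True -> worth replacing with page-level 301s
```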