Best approach to launch a new site with new URLs - same domain
-
We have a high-volume e-commerce website with over 15K items, an average of 150K visits per day, and 12.6 pages per visit. We are launching a new website this spring, which is currently on a beta subdomain, and we are looking for the best strategy that preserves our current search rankings while throttling traffic to the new site (possibly 25% per week) so we can measure results.
The new site will be soft-launched, as we plan to slowly migrate traffic to it via a load balancer. This way we can monitor the new site's performance while still having the old site as a backup. Only when we are fully comfortable with the new site will we put the 301 redirects in place and migrate everyone over. We will have a month or so of running both sites.
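To illustrate, here is roughly the kind of weighted split we have in mind - a hypothetical sketch in Apache mod_proxy_balancer syntax (we haven't settled on the actual load balancer product, and the backend hostnames and weights are placeholders):

```apache
# Hypothetical weighted split between the two sites; the backend hostnames
# are placeholders, and the loadfactor values would shift roughly 25% per week.
<Proxy "balancer://storefront">
    BalancerMember "http://old-site.internal" loadfactor=75
    BalancerMember "http://new-site.internal" loadfactor=25
    # Sticky sessions would likely be needed so a shopper stays on one version:
    # ProxySet stickysession=ROUTEID
</Proxy>
ProxyPass        "/" "balancer://storefront/"
ProxyPassReverse "/" "balancer://storefront/"
```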
Except for the homepage, the URL structure of the new site is different from that of the old site.
What is our best strategy so we don’t lose ranking on the old site and start earning ranking on the new site, while avoiding duplicate content and cloaking issues?
Here is what we got back from a Google forum post, which may highlight our concerns better:
Thank you,
Sincerely,
Stephan Woo Cude
SEO Specialist
-
Hi there,
I was just reading this old thread to get some info, but I'd love it if you could share your actual results from the launch. What did you do, and how much did traffic change? How long before you were back to normal?
I usually find that with a new website and all-new URLs, I end up seeing a dip in traffic of up to 10% for a month or so. But that seems to be less and less as time goes on. The search engines are usually on top of it, though; they recrawl and re-catalog quite quickly.
Would love to hear from you.
Thanks!
Leslie
-
Just to chime in on this, albeit maybe a little late now... I had the same thought as I was reading through this: use rel=canonical to point the old pages to the new ones for now, so the search engines don't have any duplicate content issues until a 301 redirect can take over when the new site is fully launched.
However, depending on your rollout schedule, this would mean the engines would soon be indexing only the new pages. You'd need to ensure that the traffic diverter you are using can handle this. Otherwise, you could put the rel=canonical on the new pages for now, pointing back at the old URLs, which would avoid the duplicate content until you are fully launched. Then you'd remove it and 301 redirect the old pages to the new - see the sketch below.
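For illustration, a hypothetical example of that canonical hint in the `<head>` of a new-URL page while both versions are live (the URLs are the men's clothing pages quoted elsewhere in this thread):

```html
<!-- On http://www.sierratradingpost.com/mens-clothing~d~15/ (the new page),
     pointing at the old URL that should keep ranking for now;
     removed once the 301s go live -->
<link rel="canonical" href="http://www.sierratradingpost.com/Mens-Clothing.html" />
```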
Just something you maybe want to think about! Hopefully your traffic diverter can handle this though.
-
Thank you very much for the insight!
-
Ah ok. I understand now. I wasn't picking up on what you were saying before.
If, with the soft launch, you are already putting the "new" version of the site on its intended final URLs, then yes, you can let the engines start crawling those URLs. For each new URL you let the search engines crawl, make sure to 301 its corresponding old URL (on the old site) to the new version to minimize any duplicate content issues.
If for whatever reason you can't quite 301 the old URLs yet (like if you still need instant access to reroute traffic back to them), you could try using rel=canonical on the old pages, pointing each to its new counterpart - but only if the main content on each pair of pages is almost exactly the same. You don't want Google to think you're manipulating them with rel=canonical. A sketch of the 301 route is below.
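To illustrate the 301 route, a minimal sketch using the two URLs quoted in this thread (Apache mod_alias in the old site's .htaccess; in practice every old URL would get a mapped line like this):

```apache
# Hypothetical per-URL 301 from the old structure to the new one
Redirect 301 /Mens-Clothing.html /mens-clothing~d~15/
```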
-
Sorry this is so confusing and thank you so much for your responses... there would be no subdomain when we do the soft launch... it would be http://www.sierratradingpost.com/Mens-Clothing.html (old site) vs http://www.sierratradingpost.com/mens-clothing~d~15/ (new site)...
-
As I'd said, there really isn't a reason to let them get a head start. The URLs will be changing when you transition the new site out of the subdomain (i.e., beta.sierratradingpost.com/mens vs. sierratradingpost.com/mens - those are considered two completely different URLs), and the engines will have to recrawl all of the new pages at that point anyway.
-
We do plan to do that... it is just that, since we plan a soft launch, we will essentially have two sites out there. We are wondering when to remove the noindex from the new site. We will have two sites for about a month... should we let the bots crawl the new site (new URLs, same domain) only when we take down the old site and have the 301s in place, or let Google crawl earlier to give the new site a head start on indexing?
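For reference, this is the tag we'd be removing from the beta pages (assuming our noindex is the meta-tag form rather than an X-Robots-Tag response header):

```html
<!-- Currently in the <head> of every beta page; deleting it is what
     opens the new site up to crawling and indexing -->
<meta name="robots" content="noindex">
```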
-
And when you drop the subdomain, you definitely want to 301 all of the old site structure's URLs to their corresponding new pages' URLs. That way nothing gets lost in the transition. With a catalog your size, a map-driven approach like the sketch below would scale better than hand-written rules.
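A hypothetical sketch of what that could look like at 15K+ URLs, driving the redirects from a generated lookup file (Apache mod_rewrite; RewriteMap has to live in the server/vhost config rather than .htaccess, and the file path and map name here are made up):

```apache
# old-to-new.map would be generated from the catalog export, one pair per line:
#   Mens-Clothing.html  /mens-clothing~d~15/
RewriteEngine On
RewriteMap oldtonew "txt:/etc/apache2/old-to-new.map"
# Redirect only when the old path has an entry; everything else falls through
RewriteCond "${oldtonew:$1|NONE}" "!=NONE"
RewriteRule "^/(.+)$" "${oldtonew:$1}" [R=301,L]
```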
-
We would drop the subdomain - so we would have 2 "Men's Clothing" department pages - different URLs, slightly different content...
-
Yeah, just refer to our conversation above as I think it will pertain better to your situation.
-
The only issue is that you have to keep in mind that Google/Bing define pages on the internet by their URLs, not their content. The content only describes the pages.
So if you let the engines pre-crawl the pages before dropping the subdomain - simply for the reason of letting them have a "sneak peek" - you won't really be doing yourself much of a favor, as the engines will just recrawl the content on the non-subdomain URLs as if it were brand new anyway.
The reason to go the pre-crawl route would be if you're already building backlinks to the new beta pages. Then it could make sense to let the engines index those pages and 301 them to their new non-subdomain versions later. In my opinion, the benefit from this route would outweigh any potential duplicate content issues.
-
But the URL structure is different... does that matter?
-
What YesBaby is talking about is something like Google's Website Optimizer. When someone goes to sierratradingpost.com/mens-stuff, for example, it will give 50% of the people the old version of that page and the other 50% the new version. It eliminates any duplicate content issues because the two page variations are still attached to the same exact URL.
Definitely a viable option if it fits with your game plan of how you want to do things.
-
Since all of the URLs except for the homepage are changing - what do you think about letting the new site get crawled maybe two weeks before it is 100% launched? We would have some duplicate content issues, but I am hoping this would give us a head start with the new site... then when we go 100% we add the 301s and a new sitemap. It is my understanding that we will be dropping the subdomain for the soft launch.
Thank you so much!
-
First of all - I love the new design. It looks great!
The absolute best way to go about it, in my opinion, would be to simply have the new site ready, and then launch it fully under the base domain (no subdomain) while 301-redirecting important old pages to their related new versions. That way the search engines will have the easiest time discovering and indexing the new site, while proper 301s make sure you don't lose anything in the transition.
I can't say it would provide you with a massive benefit to set up a way for the search engines to start crawling the new site now, as you're just going to be moving all of those URLs off of the subdomain in the near future anyway - where they will then need to be recrawled on the parent domain as if they were brand new.
If the traffic diverter you have set up automatically 301s requests for old site pages to their new beta URL versions, then you might as well let those new versions be indexed for the time being. Just make sure that when you transfer the beta site to the parent domain, you 301 the old beta URLs to their new permanent home - something like the sketch below.
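A minimal sketch of that host-level redirect, assuming the beta and final URLs share the same paths (as in beta.sierratradingpost.com/mens vs. sierratradingpost.com/mens mentioned earlier; this would sit in the beta docroot's .htaccess):

```apache
# Send every request on the beta host to the same path on the parent domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^beta\.sierratradingpost\.com$ [NC]
RewriteRule ^(.*)$ http://www.sierratradingpost.com/$1 [R=301,L]
```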
-
So with the service - the new site is not crawled until we launch it?
-
The new site is beta.sierratradingpost.com, where we will be dropping the beta. The old one has catalog departments - i.e., Men's Classics - which, at this time, are not being carried over to the new site. I guess we are wondering when we should allow the robots to crawl the new site?
-
Hey Stephan,
I'm assuming you want to measure how the traffic converts on the new site, hence the strategy of sending small portions of traffic to the new pages?
If so, the easiest way might be to just straight-up A/B split test the new pages with a service like Adobe/Omniture Test&Target. This doesn't cause any cloaking/dupe issues. When you are happy with the results, you can release the site with all the 301s in place.
-
Let me make sure I have this straight... you're not going to be directing the new site format to a subdomain permanently, right? You were only using the subdomain for beta purposes?
The way I see it, when I go to Sierra Trading Post's site now, I can make out what looks like two different architecture structures. You have one link on the page pointing to Men's Clothing which resolves to a single defined .htm file. Then you have "Men's Classics" (still general men's clothing?) which points to a directory, which I'm guessing is your new site. Correct me if I'm wrong on this, or if I'm right but have the old vs. new reversed.
If that is the case, your best bet to minimize any ranking impact would be to 301 redirect pages from the old catalog architecture to the new. That way you could remove the old site files completely and let the server take care of the redirection.
If you need to leave the old site up for throttling purposes like you said, you could use canonicalization tags to point the old pages to the new ones. That, along with employing 301 redirects where possible, would help train the search engines to understand what you're doing.
I'm sorry if I didn't answer your question as you needed. I'm still not sure if I understood your issue as intended. =P