Best approach to launch a new site with new URLs - same domain
-
We have a high-volume e-commerce website with over 15K items, an average of 150K visits per day, and 12.6 pages per visit. We are launching a new website this spring, which is currently on a beta subdomain, and we are looking for the best strategy that preserves our current search rankings while throttling traffic to the new site (possibly 25% per week) to measure results.
The new site will be soft-launched, as we plan to slowly migrate traffic to it via a load balancer. This way we can monitor the performance of the new site while still having the old site as a backup. Only when we are fully comfortable with the new site will we put the 301 redirects in place and migrate everyone over. We will have a month or so of running both sites.
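For illustration, here is a minimal sketch of the kind of weighted split we have in mind, assuming an nginx load balancer (the backend addresses are hypothetical, and any load balancer with weighted routing would work the same way):

```nginx
# Sits in the http {} context of nginx.conf
# Hypothetical backends for the two versions of the site
upstream old_site { server 10.0.0.1; }
upstream new_site { server 10.0.0.2; }

# split_clients hashes the visitor's IP, so each visitor is
# consistently routed to the same variant across requests
split_clients "${remote_addr}" $backend {
    25%     new_site;   # raise this percentage each week
    *       old_site;
}

server {
    listen 80;
    location / {
        proxy_pass http://$backend;
    }
}
```

Note this sketch assumes both backends can answer the same requested URL, which for us only holds on the homepage; for pages whose URLs differ between the two sites, the diverter would have to redirect rather than silently proxy.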
Except for the homepage, the URL structure of the new site is different from that of the old site.
What is our best strategy so we don’t lose ranking on the old site and start earning ranking on the new site, while avoiding duplicate content and cloaking issues?
Here is what we got back from a Google post which may highlight our concerns better:
Thank you,
Sincerely,
Stephan Woo Cude
SEO Specialist
-
Hi there,
I was just reading this old thread to get some info, but I'd love it if you could share your actual results from the launch. What did you do, and how much did traffic change? How long before you were back to normal?
I usually find that with a new website and all-new URLs, I end up seeing maybe a month or so of a dip in traffic that can be up to 10%. But that seems to be less and less as time goes on. The search engines are usually on top of it, though; they recrawl and recatalog quite quickly.
Would love to hear from you.
Thanks!
Leslie
-
Just to chime in on this, albeit maybe a little late now... I had the same thought as I was reading through this: use rel=canonical to point the old pages to the new ones for now, so the search engines don't have any duplicate content issues until a 301 redirect can take over once the new site is fully launched.
However, depending on your rollout schedule, this would mean that the search engines would soon be indexing only the new pages. You'd need to ensure that the traffic diverter you are using can handle this. Otherwise, you could put the rel=canonical on the new pages for now, which would avoid the duplicate content until you are fully launched. Then you'd remove it and 301 redirect the old pages to the new ones.
Just something you may want to think about! Hopefully your traffic diverter can handle this, though.
-
Thank you very much for the insight!
-
Ah ok. I understand now. I wasn't picking up on what you were saying before.
If with the soft launch you are already putting the "new" version of the site on its intended final URLs, then yes, you can let the engines start crawling those URLs. For each new URL you let the search engines crawl, make sure to 301 its corresponding old URL (on the old site) to the new version to minimize any duplicate content issues.
If for whatever reason you can't quite 301 the old URLs yet (for example, if you still need instant access to reroute traffic back to them), you could try using rel=canonical on the old pages and point them to their new counterparts, but only if the main content on each pair of pages is almost exactly the same. You don't want Google to think you're manipulating them with rel=canonical.
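To make that concrete, the tag is a single line in the head of each old page; a sketch using the old/new URL pair mentioned below:

```html
<!-- In the <head> of the OLD page (/Mens-Clothing.html), -->
<!-- pointing search engines at its new counterpart: -->
<link rel="canonical" href="http://www.sierratradingpost.com/mens-clothing~d~15/" />
```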
-
Sorry this is so confusing and thank you so much for your responses... there would be no subdomain when we do the soft launch... it would be http://www.sierratradingpost.com/Mens-Clothing.html (old site) vs http://www.sierratradingpost.com/mens-clothing~d~15/ (new site)...
-
As I'd said, there really isn't a reason to let them get a head start. The URLs will change when you transition the new site out of the subdomain (i.e., beta.sierratradingpost.com/mens vs. sierratradingpost.com/mens; those are considered two completely different URLs), and the engines will have to recrawl all of the new pages at that point anyway.
-
We do plan to do that... it is just that, since we plan a soft launch, we will essentially have two sites out there. We are wondering when to remove the noindex from the new site. We will have two sites for about a month... should we let the bots crawl the new site (new URLs, same domain) only when we take down the old site and have the 301s in place, or let Google crawl earlier to give the new site a head start on indexing?
-
And when you drop the subdomain, you definitely want to 301 all of the old site structure's URLs to their corresponding new pages' URLs. That way nothing gets lost in the transition.
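A minimal sketch of what that could look like, assuming an nginx front end (the lookup table is hypothetical; in practice you'd generate one entry per page from your old/new URL inventory):

```nginx
# Exact-match lookup from old request URIs to their new homes;
# unmatched URIs map to an empty string and fall through untouched
map $request_uri $new_uri {
    default                "";
    /Mens-Clothing.html    /mens-clothing~d~15/;
    # ...one entry per old URL...
}

server {
    listen 80;
    server_name www.sierratradingpost.com;

    # Permanently redirect any mapped old URL to its new version
    if ($new_uri) {
        return 301 $new_uri;
    }
}
```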
-
We would drop the subdomain - so we would have 2 "Men's Clothing" department pages - different URLs, slightly different content...
-
Yeah, just refer to our conversation above as I think it will pertain better to your situation.
-
The only issue is that you have to keep in mind that Google/Bing define pages on the internet through their URLs, not their content. The content only describes the pages.
So if you let the engines pre-crawl the pages before dropping the subdomain - simply for the reason of letting them have a "sneak peek" - you won't really be doing yourself much of a favor, as the engines will just recrawl the content on the non-subdomain URLs as if it were brand new anyway.
The reason to do the pre-crawl would be if you're already building backlinks to the new beta pages. Then it could make sense to let the engines index those pages and 301 them to their new non-subdomain versions later. In my opinion, the benefit from this route would outweigh any potential duplicate content issues.
-
But the URL structure is different... does that matter?
-
What YesBaby is talking about is something like Google's Website Optimizer. When someone goes to sierratradingpost.com/mens-stuff, for example, it will give 50% of the people the old version of that page and the other 50% the new version. It eliminates any duplicate content issues, as the two page variations are still attached to the same exact URL.
Definitely a viable option if it fits with your game plan of how you want to do things.
-
Since all of the URLs except for the homepage are changing - what do you think about letting the new site get crawled maybe two weeks before it is 100% launched? We would have some duplicate content issues, but I am hoping this would give us a head start with the new site... then when we go 100% we add the 301s and a new sitemap. It is my understanding that we will be dropping the subdomain for the soft launch.
Thank you so much!
-
First of all - I love the new design. It looks great!
The absolute best way to go about it, in my opinion, would be to simply have the new site ready and then launch it fully under the base domain (no subdomain) while 301 redirecting important old pages on the site to their related new versions. That way the search engines will have the easiest time discovering and indexing the new site, while you make sure you don't lose anything in the transition via proper 301s.
I can't say it would provide you with a massive benefit to set up a way for the search engines to start crawling the new site for now, as you're just going to be moving all of those URLs off of the subdomain in the near future anyway, where they will then need to be recrawled on the parent domain as if they were brand new.
If the traffic diverter you have set up automatically 301s requests for old site pages to their new beta URL versions, then you might as well let those new versions be indexed for the time being. Just make sure that when you transfer the beta site to the parent domain you 301 the old beta URLs to their new permanent home.
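Assuming the beta already uses the final URL paths, that last step can be a single catch-all rule; a sketch for nginx:

```nginx
# Once the beta subdomain is retired, permanently redirect every
# beta URL to the same path on the parent domain
server {
    listen 80;
    server_name beta.sierratradingpost.com;
    return 301 http://www.sierratradingpost.com$request_uri;
}
```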
-
So with the service - the new site is not crawled until we launch it?
-
The new site is beta.sierratradingpost.com, where we will be dropping the beta. The old one has catalog departments... i.e., Men's Classics, which, at this time, are not being carried over to the new site. I guess we are wondering when we should allow the robots to crawl the new site?
-
Hey Stephan,
I'm assuming you want to measure how the traffic is converting on the new site, hence the strategy to send small portions of traffic to new pages?
If so, the easiest way might be to just straight-up A/B split test the new pages with a service like Adobe/Omniture Test&Target. This doesn't cause any cloaking/duplicate content issues. When you are happy with the results, you can release the site with all the 301s in place.
-
Let me make sure I have this straight... you're not going to be directing the new site format to a subdomain permanently, right? You were only using the subdomain for beta purposes?
The way I see it, when I go to Sierra Trading Post's site now, I can make out what look like two different types of architecture. You have one link on the page pointing to Men's Clothing, which executes at a single defined .htm file. Then you can see that you have the "Men's Classics" (still general men's clothing?) which points to a directory, which I'm guessing is your new site. Correct me if I'm wrong on this, or if I'm right but have the old vs. new reversed.
If that is the case, your best bet to try and minimize any ranking impact would be to 301 redirect pages from the old catalog architecture to the new one. That way you could remove the old site files completely and let the server take care of the redirection.
If you need to leave the old site up for throttling purposes like you said, you could use canonicalization tags to refer the old pages to the new ones. That, along with employing 301 redirects, would help train the search engines into understanding what you're doing.
I'm sorry if I didn't answer your question as you needed. I'm still not sure if I understood your issue as intended. =P