Best approach to launch a new site with new URLs - same domain
-
We have a high-volume e-commerce website with over 15K items, an average of 150K visits per day, and 12.6 pages per visit. We are launching a new website this spring, which is currently on a beta subdomain, and we are looking for the best strategy that preserves our current search rankings while throttling traffic to the new site (possibly 25% per week) to measure results.
The new site will be soft launched, as we plan to slowly migrate traffic to it via a load balancer. This way we can monitor the performance of the new site while still having the old site as a backup. Only when we are fully comfortable with the new site will we put the 301 redirects in place and migrate everyone over to it. We will have a month or so of running both sites.
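To illustrate the kind of throttling we have in mind, here is a rough sketch of a deterministic, hash-based split (Python, with placeholder backend names and the 25% figure from above - not our actual load balancer configuration). Hashing a visitor identifier keeps each visitor on the same version of the site between page views:

import hashlib

NEW_SITE_SHARE = 0.25  # placeholder first-week share; would be raised week by week

def pick_backend(visitor_id: str) -> str:
    """Deterministically route a fixed share of visitors to the new site."""
    # visitor_id could be a session cookie value or the client IP; hashing it
    # means the same visitor keeps seeing the same version across page views.
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform value in [0, 1]
    return "new-site-backend" if bucket < NEW_SITE_SHARE else "old-site-backend"

# Example: roughly a quarter of these visitors would land on the new backend.
for visitor in ("visitor-1001", "visitor-1002", "visitor-1003", "visitor-1004"):
    print(visitor, "->", pick_backend(visitor))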
Except for the homepage, the URL structure of the new site is different from that of the old site.
What is our best strategy so we don’t lose ranking on the old site and start earning ranking on the new site, while avoiding duplicate content and cloaking issues?
Here is what we got back from a Google post which may highlight our concerns better:
Thank you,
Sincerely,
Stephan Woo Cude
SEO Specialist
-
Hi there,
I was just reading this old thread to get some info, but I'd love it if you could share your actual results from the launch. What did you do, and how much did traffic change? How long before you were back to normal?
I usually find that with a new website and all new URLs, I end up seeing maybe a month or so of a dip in traffic that can be up to 10%. But that seems to be less and less as time goes on. The search engines are usually on top of it, though; they recrawl and recatalog quite quickly.
Would love to hear from you.
Thanks!
Leslie
-
Just to chime in on this, albeit maybe a little late now... I had the same thought as I was reading through this about using rel=canonical to point the old pages to the new for now, so the search engines don't have any duplicate content issues until a 301 redirect can take over once the new site is fully launched.
However, depending on your rollout schedule, this would mean that the search engines would soon be indexing only the new pages. You'd need to ensure that the traffic diverter you are using can handle this. Otherwise you could put the rel=canonical on the new pages for now, which would avoid the duplicate content until you are fully launched. Then you'd remove it and 301 redirect the old pages to the new.
Just something you maybe want to think about! Hopefully your traffic diverter can handle this though.
-
Thank you very much for the insight!
-
Ah ok. I understand now. I wasn't picking up on what you were saying before.
If, with the soft launch, you are already putting the "new" version of the site on its intended final URLs, then yes, you can let the engines start crawling those URLs. For each new URL you let the search engines crawl, make sure to 301 its corresponding old URL (on the old site) to the new version to minimize any duplicate content issues.
If for whatever reason you can't quite 301 the old URLs yet (for example, if you still need instant access to reroute traffic back to them), you could try using rel=canonical on the old pages and point them to their new counterparts, but only if the main content on each pair of pages is almost exactly the same. You don't want Google to think you're manipulating them with rel=canonical.
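To make that concrete, here is a minimal sketch of that per-URL decision - Python with Flask purely as an illustrative framework, the URL pair borrowed from the Men's Clothing example in this thread, and the mapping and "not ready" list entirely hypothetical:

from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical mapping of old catalog URLs to their new counterparts.
OLD_TO_NEW = {
    "/Mens-Clothing.html": "/mens-clothing~d~15/",
}

# Old paths that can't be 301'd yet (e.g. traffic may still need to be routed
# back to them); these are served as-is with a rel=canonical pointing at the
# new version instead - appropriate only when the two pages are nearly identical.
NOT_READY_FOR_301 = set()

@app.route("/Mens-Clothing.html")
def old_mens_clothing():
    old_path = "/Mens-Clothing.html"
    new_path = OLD_TO_NEW[old_path]
    if old_path not in NOT_READY_FOR_301:
        # Permanent redirect: tells the engines this page has moved for good.
        return redirect(new_path, code=301)
    # Otherwise serve the old page, but declare the new URL as canonical.
    canonical = (
        '<link rel="canonical" '
        f'href="https://www.sierratradingpost.com{new_path}">'
    )
    return f"<html><head>{canonical}</head><body>old category page content</body></html>"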
-
Sorry this is so confusing and thank you so much for your responses... there would be no subdomain when we do the soft launch... it would be http://www.sierratradingpost.com/Mens-Clothing.html (old site) vs http://www.sierratradingpost.com/mens-clothing~d~15/ (new site)...
-
As I'd said, there really isn't a reason to let them get a head start. The URLs will be changing when you transition the new site out of the subdomain (i.e. beta.sierratradingpost.com/mens vs. sierratradingpost.com/mens - those are considered two completely different URLs), and the engines will have to recrawl all of the new pages at that point anyway.
-
We do plan to do that... it is just that since we plan a soft launch, we will essentially have two sites out there. We are wondering when to remove the noindex from the new site. We will have two sites for about a month... should we let the bots crawl the new site (new URLs, same domain) only once we take down the old site and have the 301s in place, or let Google crawl earlier to give the new site a head start on indexing?
-
And when you drop the subdomain, you definitely want to 301 all of the old site structure's URLs to their corresponding new pages' URLs. That way nothing gets lost in the transition.
-
We would drop the subdomain - so we would have two "Men's Clothing" department pages - different URLs, slightly different content...
-
Yeah, just refer to our conversation above as I think it will pertain better to your situation.
-
The one thing to keep in mind is that Google and Bing define pages on the internet by their URLs, not by their content. The content only describes the pages.
So if you let the engines pre-crawl the pages before dropping the subdomain - simply for the sake of giving them a "sneak peek" - you won't really be doing yourself much of a favor, as the engines will just recrawl the content on the non-subdomain URLs as if it were brand new anyway.
The reason to go the pre-crawl route would be if you're already building backlinks to the new beta pages. Then it could make sense to let the engines index those pages and 301 them to their new non-subdomain versions later. In my opinion, the benefit from this route would outweigh any potential duplicate content issues.
-
But the URL structure is different... does that matter?
-
What YesBaby is talking about is something like Google's Website Optimizer. When someone goes to sierratradingpost.com/mens-stuff, for example, it will give 50% of the people the old version of that page and the other 50% the new version. It eliminates any duplicate content issues because the two page variations are still attached to the same exact URL.
Definitely a viable option if it fits with your game plan of how you want to do things.
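Purely to illustrate the mechanics (a tool like Website Optimizer or Test&Target handles this for you), here is a sketch of same-URL variant serving - again Python/Flask as an assumed framework, with the /mens-stuff path from the example above and placeholder page markup:

import random
from flask import Flask, request, make_response

app = Flask(__name__)

# Placeholder markup for the two variations of the same page.
OLD_TEMPLATE = "<html><body>old category page layout</body></html>"
NEW_TEMPLATE = "<html><body>new category page layout</body></html>"

@app.route("/mens-stuff")
def mens_stuff():
    # Both variations live at the same URL, so there is no duplicate-content risk.
    # The chosen variant is stored in a cookie so a visitor sees a consistent version.
    variant = request.cookies.get("site_variant")
    if variant not in ("old", "new"):
        variant = random.choice(["old", "new"])  # 50/50 split
    response = make_response(NEW_TEMPLATE if variant == "new" else OLD_TEMPLATE)
    response.set_cookie("site_variant", variant)
    return response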
-
Since all of the URLs except for the homepage will be different - what do you think about letting the new site get crawled maybe two weeks before it is 100% launched? We would have some duplicate content issues, but I am hoping this would give us a head start with the new site... then when we go 100% we add the 301s and a new sitemap. It is my understanding that we will be dropping the subdomain for the soft launch.
Thank you so much!
-
First of all - I love the new design. It looks great!
The absolute best way to go about it, in my opinion, would be to simply have the new site ready and then launch it fully under the base domain (no subdomain), while 301 redirecting important old pages on the site to their related new versions. That way the search engines will have the easiest time discovering and indexing the new site, while you make sure you don't lose anything in the transition via proper 301 redirects.
I can't say it would provide you with a massive benefit to set up a way for the search engines to start crawling the new site now, as you're just going to be moving all of those URLs off of the subdomain in the near future anyway - where they will then need to be recrawled on the parent domain as if they were brand new.
If the traffic diverter you have set up automatically 301s requests for old site pages to their new beta URL versions, then you might as well let those new versions be indexed for the time being. Just make sure that when you transfer the beta site to the parent domain, you 301 the old beta URLs to their new permanent home.
-
So with the service - the new site is not crawled until we launch it?
-
The new site is beta.sierratradingpost.com, where we will be dropping the beta. The old one has catalog departments... i.e. Men's Classics, which, at this time, are not being carried over to the new site. I guess we are wondering when we should allow the robots to crawl the new site?
-
Hey Stephan,
I'm assuming you want to measure how the traffic is converting on the new site, hence the strategy to send small portions of traffic to new pages?
If so, the easiest way might be to just straight up A/B split test the new pages with a service like Adobe/Omniture Test&Target. This doesn't cause any cloaking/dupe issues. When you are happy with the results you can release the site with all the 301s in place.
-
Let me make sure I have this straight... you're not going to be directing the new site format to a subdomain permanently, right? You were only using the subdomain for beta purposes?
The way I see it, when I go to Sierra Trading Post's site now I can make out what looks like two different types of architecture. You have one link on the page pointing to Men's Clothing, which resolves to a single defined .htm file. Then you can see that you have the "Men's Classics" (still general men's clothing?), which points to a directory that I'm guessing is your new site. Correct me if I'm wrong on this, or if I'm right but have the old vs. new reversed.
If that is the case, your best bet to minimize any ranking impact would be to 301 redirect pages from the old catalog architecture to the new one. That way you could remove the old site files completely and let the server take care of the redirection.
If you need to leave the old site up for throttling purposes like you said, you could use canonical tags to point the old pages to the new ones. That, along with employing 301 redirects, would help the search engines understand what you're doing.
I'm sorry if I didn't answer your question as you needed. I'm still not sure if I understood your issue as intended. =P