SEO Risks for redirecting sites
-
Hey Everyone,
I've tried searching for this question, but am not exactly sure what keywords to search for so I'm probably missing the resources if they already exist...
My client has had duplicated sites for years, and after multiple penalizations of those sites I was finally able to convince him to consolidate them into a "mega-site". Currently, he has a main domain, a geo-subdomain for each office location under the main domain, and a geo-domain for each office location.
We plan on redirecting each geo-domain to the corresponding geo-subdomain. So, the final result will be one main domain, and a sub-domain for each office location.
I'm looking for any information regarding tracking SEO data after the redirects are in place, how to guard against potential drops in SERPs, what's the smartest strategy to implement, etc... My client is very sensitive to his sites' SEO data, so if anyone has any SEO-related advice regarding redirecting sites it would be greatly appreciated!
Thank you!
-
Her general example: she, Matt Cutts, and another Google employee each have fitness websites, and she then buys theirs. How would she use canonicals and 301s to show users and search engines that she is now the most relevant authority? One of the pluses with canonicals involved branded searches: a search would still pull up the Matt Cutts site instead of the Maile Ohye site, but link equity would be directed to her site, and users would be less likely to bounce since the canonical would still serve the Matt page. A 301 may be jarring at first - if you're looking for Matt and get Maile, you may be more likely to bounce even though it's the content you were looking for.
(God, I swear I can explain this better in my own head than I can once I attempt to write it out... which is bad considering I actually make a living writing)
-
I have found that folders perform better than subdomains.
I have done 301 redirects of subdomains into folders and the rankings went up strongly. I attribute this to combining the link popularity.
-
Hey Mike,
That sounds like an interesting discussion, and I'm sorry I couldn't hear it in person. I'm not sure it's relevant to my specific project, though - our ultimate goal is to merge two existing sites into one. They have largely the same content and subjects, so the user base is probably very similar too. The talk you're referring to sounds more suitable for a site that is going through a redesign and moving to a new domain name. Your thoughts?
-
I was leaning towards doing all the redirects at once, but didn't have the experience to back that theory up. Thanks for the advice!
-
Hi EGOL,
The sites that we are merging were penalized about 2 years ago. Their value has since begun to increase again, but they still haven't reached the level they were at before the penalization. After all the research we did to answer "why" they were penalized, we could only assume that it was because of our client's massive network of sites.
This was all implemented before I came onto the project. A mistake was made somewhere along the way that set off a "red flag" to Google that these sites were in cahoots (for lack of a better phrase) and penalized them.
We're hoping that the penalization occurred long enough ago that it will not harm the sites we are consolidating the penalized sites into.
My understanding of subdomains vs. folders is that for instances where the topics could virtually be separated (as is the case with multiple geographical locations), then subdomains are the optimal way to go. My client is also interested in ranking well in Google Local, and wants to have separate listings for each business location - which will also be beneficial if we take the subdomain route. I'll admit though, I'm a bit rusty on this topic and would welcome any advice or further resources! I'll do some more research, but if you have any further thoughts on the matter please do send them my way!
Thanks again!
-
Thanks, TommyTan, that's very reassuring!
-
I generally advocate for consolidating a group of sites.... but... are some of these sites penalized? Might be a good idea to understand why and the impact of merging them with unpenalized sites.
Also, if you are going to consolidate, combining the sites into folders instead of subdomains can merge the domain authority. Do some research on this before moving forward.
-
This reminded me of something Maile Ohye from Google was talking about at SMX NY.
In her talk about 301 vs Rel=Canonical she brought up the idea of putting notices on a site first that Site A is now a part of the Site B Family and will be consolidated in the future and giving people (and the spiders) a link to the new site to start building relevancy. Then later on implementing canonicals across Site A pointing to the relevant places on Site B. Gradually moving towards 301 redirects once Site B becomes the place people now recognize as the main site and the original is no longer needed.
[This of course is all paraphrased and from memory as I don't have my voice recordings of the session handy nor did anyone from Google provide PDFs of their slideshows]
-
Hi Daniel,
From what I know, I don't believe there is any risk in doing all the domains at once. Furthermore, you are not redirecting everything to the same page, but from geo-domains to their corresponding geo-subdomains. I've read comments from people who redirected 50,000 pages, and I don't recall anyone mentioning any risk.
The only risk I can think of is that it might take longer for search engines to recrawl all the pages. Maybe someone can shed some light on this.
You can tell your client that a small drop in rankings in the short run is the price of success in the long run.
-
As TommyTan says:
For large redirect projects, redirect every page whose value you want to keep. Don't mass-redirect to a single page, as most of those redirects won't end up passing value; keep each redirect pointed to a page that is relevant to the original.
You will have to wait a little while for search engines to recrawl and update their indexes. For a clean switchover that pushes all existing link juice to the new pages, I'd do them all at once.
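Once the switchover is live, it's worth spot-checking it. Here's a rough sketch (hypothetical URLs; `fetch` stands in for whatever HTTP client you prefer) of verifying that each old URL answers with a 301 to the mapped target:

```python
# Sketch of a post-switchover check: every old URL should answer 301
# with a Location header pointing at its mapped new URL.
# `fetch` is injected so this works with any HTTP client (urllib, requests, ...):
# it takes a URL and returns (status_code, location_header).

def check_redirects(mapping, fetch):
    """Return a list of (old_url, problem) pairs for redirects that look wrong."""
    problems = []
    for old, expected in mapping.items():
        status, location = fetch(old)
        if status != 301:
            problems.append((old, "status %d, expected 301" % status))
        elif location != expected:
            problems.append((old, "redirects to %s, expected %s" % (location, expected)))
    return problems
```

Running this over the full redirect map right after flipping the switch catches accidental 302s and wrong targets before the search engines see them.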
-
Hi TommyTan,
Thanks for this info and the article - it's a great resource! We plan on mapping out both sites entirely and matching the topics A to A and B to B as you mentioned. This reminded me of a question I should have included in my last post:
Are there any potential risks to doing these redirects all at once? We have at least 3 domains that we want to redirect to the corresponding subdomains, and we don't want to make such a major change to the site that we risk major SERP drops. I'm wondering if we should redirect one domain at a time over a few months, or just do it all in one major sweep...
We're expecting the delay in seeing the effects, and while I know it's unavoidable, convincing my client of this is the big challenge! Like I said, he's very sensitive to his site's SEO data, and any drop in stats after we make a major change instantly makes him second-guess the move.
Thanks again!
-
Hi Daniel,
Using 301 redirects is definitely the best bet. A 301 redirect usually transfers 99% of the SEO value to your new page, so I don't believe there is any risk other than making a mistake while setting it up.
When you are redirecting, make sure each specific page redirects to another specific page that is similar. You don't want a visitor who is looking for something specific to be redirected to the home page - you're losing opportunities there. So redirect home page to home page, topic A to topic A, topic B to topic B.
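As a rough illustration (the URLs are hypothetical), you could keep the redirect map in one place and sanity-check that no deep page lazily falls through to the home page:

```python
# Hypothetical one-to-one redirect map: old page -> matching new page.
# The goal is topic A -> topic A, topic B -> topic B,
# never "everything -> home page".
redirect_map = {
    "/": "/",
    "/services/": "/services/",
    "/topic-a/": "/topic-a/",
    "/topic-b/": "/topic-b/",
}

def find_lazy_redirects(mapping):
    """Flag any non-home page that redirects to the home page."""
    return [old for old, new in mapping.items() if new == "/" and old != "/"]

# A clean map passes; a lazy one gets flagged.
assert find_lazy_redirects(redirect_map) == []
assert find_lazy_redirects({"/": "/", "/topic-a/": "/"}) == ["/topic-a/"]
```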
After setting up all the redirects, it might take a while for search engines to recrawl, so don't fret if you still don't see any effect after a week or so.
SEOmoz wrote an SEO Redirection Best Practices article - hopefully it will help!