International IP redirection - help please!
-
Hi,
We have a new client who has built a brand in the UK on a xyz.com domain. The "xyz.com" is now a brand and features on all marketing. Lots of SEO work has taken place and the UK site has good rankings and traffic.
They have now expanded to the US and with offline marketing leading the way, xyz.com is the brand being pushed in the US.
So with the launch of the offline marketing, US IPs are now redirected to a US version of the site (a subfolder) with relevant pricing and messaging. This is great for users, but since Googlebot crawls from a US IP it is also being redirected, and the UK pages have now dropped out of the index.
Ideally, both UK and US users searching for xyz.com would land on their respective static pages with the correct prices, and no link authority would be lost via the redirection of users.
We have considered the following solutions:
-
Move the UK site to a /uk subfolder and redirect UK IPs to this subfolder (and so not Googlebot).
-
The downside is that it will massively impact the UK rankings, which are the core driver of the business. Also, would this be deemed illegal cloaking?
-
Natural links will always point to the xyz.com homepage, so longer term the US homepage will gain authority and the UK homepage will become more reliant on artificial link building.
-
Use an overlay that detects the IP address and asks users to select the relevant country (with cookies to redirect on the second visit).
-
This has been rejected by the ecommerce team as it will increase bounce rate, and we don't want users to be able to see other countries' versions due to product and price differences.
-
Use a homepage with country selection (and cookies to redirect on second visit)
-
This has been rejected by the ecommerce team for the same reasons: it will increase bounce rate, and we don't want users to be able to see other countries' versions due to product and price differences.
Is there an easy solution to this problem that we're overlooking?
Is there another legitimate form of "cloaking" we could use here?
Many thanks in advance for any help here.
-
-
You can use hreflang and rel="alternate" annotations to solve this country/duplication issue: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
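For illustration, assuming the UK and US versions lived in /uk/ and /us/ subfolders on xyz.com (hypothetical URLs, not from the thread), each version of a page would carry annotations like:

```html
<!-- In the <head> of both the UK and US versions of the page -->
<link rel="alternate" hreflang="en-GB" href="https://xyz.com/uk/" />
<link rel="alternate" hreflang="en-US" href="https://xyz.com/us/" />
<!-- Fallback for visitors who match neither locale -->
<link rel="alternate" hreflang="x-default" href="https://xyz.com/" />
```

This tells Google that the two pages are deliberate regional alternates rather than duplicates, so it can serve the right one per locale without either dropping out of the index.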
-
Hannah,
We are currently in the requirements phase for an international.xyz.com ecommerce web site. Your overlay recommendation is appreciated and makes sense, but I'm just wondering about the duplicate content issue. Xyz.com and international.xyz.com will essentially be the same sites and both in English, with slight variances in the brand selection and some customer service-oriented messaging. Any insights would be appreciated.
-
Hi Ralph,
Sounds good to me.
Also, great shout re First Click Free!
Hannah
-
Thanks Hannah. I think we have managed to convince them to go with a separate site on .co.uk which has always been my preferred approach to international SEO.
I'm always worried that even with the right intentions, there is still a risk of Google misinterpreting.
Separate domains means no confusion.
As for overlays, they work very well for another client of ours and make no difference to bounce rate. One lesson we did learn was to ensure we had First Click Free activated!
-
Hi Ralph,
Using an IP redirect to serve country-specific content to users is fine (i.e. it isn't considered cloaking, as your intent isn't manipulative). However, there are issues with doing so - as you've highlighted, you're also redirecting the bots, so you've seen the UK site suffer.
Because of these issues I don't normally recommend the IP-redirect approach. I also think it can be bad for users: just because someone is in the UK right now doesn't necessarily mean they want to see UK content - they may be here temporarily on holiday or on business.
Personally I would prefer the JavaScript overlay (your option two), which allows users to pick the relevant country rather than being hard-redirected. This will also allow the bots to index both versions of your site.
I do understand the ecommerce team's concerns about this increasing bounce rate, but I'd suggest that if it's implemented well - check out amazon.co.uk from the US - it shouldn't cause you problems with bounce rate.
Likewise, I understand their concerns about US/UK customers seeing the wrong content and therefore exposing the price differences - however, again I'd suggest that it's probably more important to have both versions of the site indexed.
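A first-visit overlay along those lines can be sketched as follows. All names here are illustrative assumptions, not from any actual implementation; the two points that matter are that crawlers are never shown the overlay or redirected, and that a cookie remembers the user's choice so the overlay only appears once.

```javascript
// Sketch of the overlay decision logic for a country-selection overlay.
// Inputs: the country the server/CDN detected from the IP, the country
// stored in a preference cookie (if any), and the visitor's user agent.

function decideOverlay({ detectedCountry, cookieCountry, userAgent }) {
  // Never show the overlay (or redirect) for crawlers: both the UK and
  // US versions of the site stay crawlable and indexable.
  const isBot = /googlebot|bingbot/i.test(userAgent || "");
  if (isBot) {
    return { showOverlay: false, suggest: null };
  }

  // Returning visitor who already picked a country: honour the cookie,
  // no overlay on the second visit.
  if (cookieCountry) {
    return { showOverlay: false, suggest: cookieCountry };
  }

  // First visit: show the overlay, pre-selecting the detected country,
  // but leave the final choice to the user.
  return {
    showOverlay: true,
    suggest: detectedCountry === "US" ? "US" : "UK",
  };
}
```

On the client, the overlay's "confirm" button would then set the preference cookie and navigate to the chosen country's section, while the underlying page stays fully rendered for bots and for users who dismiss it.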
I hope this helps,
Hannah
-
Thanks for the reply - yes, it's definitely a hole. The more I research, the deeper it seems to get. It wasn't our recommendation for the current solution (it was decided in-house), so luckily it's not a hole I've dug, even though I'm now in it!
-
Looks like you have dug yourselves a hole.
You could, with a bit of work, detect where the visits have come from and then show prices relevant to the visitor - you will not be penalised for cloaking in this case. There are geolocation solutions out there for detection, and you could write a function that works out the right price and then replace all the prices on the site with a call to it, something like @GetCorrectPrice(19.99).
As far as I can tell about cloaking, by the way, you don't get penalised for it unless you are trying to deceive; if Google detects cloaking they will have a human look at why and what you are doing.
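A minimal sketch of that @GetCorrectPrice idea in JavaScript (the currency symbols and the GBP-to-USD rate below are placeholders for illustration, not real values):

```javascript
// Hypothetical price localiser: every template price goes through one
// function that formats the stored base price for the visitor's country.

const PRICING = {
  UK: { symbol: "£", rate: 1.0 },   // base prices are stored in GBP
  US: { symbol: "$", rate: 1.55 },  // placeholder GBP→USD rate
};

function getCorrectPrice(basePriceGBP, country) {
  // Fall back to the UK version for any unrecognised country.
  const { symbol, rate } = PRICING[country] || PRICING.UK;
  return symbol + (basePriceGBP * rate).toFixed(2);
}
```

Because both country variants are rendered from the same URL for everyone (including bots), with only the displayed price changing per visitor, this avoids the redirect problem described above - though it also means Google indexes only one price per page.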