SEO issues with IP-based content delivery
-
Hi,
I have two websites, say website A and website B. Website A is set up for the UK audience and website B is set up for the US audience. Both websites sell the same products, though some products and offers are only available in one of the two countries. Website A can't be accessed if you are in the US, and similarly website B can't be accessed if you are in the UK. This was a decision the client made a long time ago: they don't want to offer promotions etc. in the US, and therefore don't want the US audience to be able to purchase items from the UK site.
Now the problem is that both websites use the same descriptions for the products they have in common. Search engine spiders tend to enter a site from a variety of different IP addresses/locations, so while a UK visitor will not be able to access the US version of the site and vice versa, a crawler can. I have the following options:
1. Write different product descriptions for the US website to keep both the US and UK versions of the site in the Google index for the foreseeable future. But this is going to be a time-consuming and expensive option, as several hundred products are common to both sites.
2. Use a single website to target both the US and UK audiences and make the promotions available only to the UK audience. There is one issue here: website A's address ends with '.co.uk' and website B has a different name ending in '.com', so website A can't be used for the US audience. Website A is also older and more authoritative than the newer website B, and it is popular with the UK audience under its .co.uk address, so website B can't be used to target the UK audience either.
3. You tell me
-
Just a thought to add to what everyone else has said: make sure you go into Google Webmaster Tools and tell Google which country you want each site to rank for. I have had odd instances where a site with a .co.uk extension still ranked in the US for terms even though I didn't want it to, so I advise setting this for both sites.
Have a nice day.
-
Normally I would have said keep only one site, but given what you've described, you need to differentiate the sites substantially, not just the wording. Brand them differently; the whole reason the client wants them kept separate is that the audiences are not the same. I'm from Germany, so I understand the difference between being pitched something in Germany and in the United States, where I am now, and I notice it in my own behaviour: in Germany I'm far more likely to buy from a .de site, because I know there will be no issues, and in the United States I'm far more likely to purchase from a .com TLD for the same reason. Differentiate the sites as much as you can; from what you've said, the US site is the one that should be rewritten.
I hope I've helped you,
Thomas Von Zickell
-
You have the option to show different content depending on the user's location. If your site runs on PHP, you can use the PHP GeoIP functions.
You can get your site personalised by country here: http://www.maxmind.com/en/geolocation_landing
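As a rough sketch of what that looks like with the PECL geoip extension (the extension and its country database need to be installed on the server; the include paths below are just placeholders for your own templates):

```php
<?php
// Look up the visitor's two-letter country code from their IP address.
// geoip_country_code_by_name() returns false if the IP can't be resolved.
$country = @geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);

if ($country === 'GB') {
    include 'content/uk-offers.php';   // UK pricing and promotions (placeholder path)
} elseif ($country === 'US') {
    include 'content/us-offers.php';   // US pricing, no promotions (placeholder path)
} else {
    include 'content/default.php';     // fallback for unknown locations and crawlers
}
```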
-
Great response, and I agree with Big. Large e-commerce sites with thousands of SKUs that often have only tiny differences from one product to the next are bound to have many similar product descriptions; Amazon is a perfect example. Google is smart enough to know whether what you are doing with your two sites is "an attempt to get 2 bites of the cherry." It's pretty clear that you are trying to serve the most appropriate content to the most appropriate audience. Content management would be easier with everything on one site, but given the history of these sites, it's probably best to keep them as they are. Now, if you had two domains in either the US or the UK with all the same identical product pages, that would be an issue.
-
Well, what I would do is very simple: have it all on one site and block IPs so US users can't see the UK content and vice versa.
So you have two flags on the site, US and UK, and if a UK user tries to view the US products you show a message such as "This option is not available in your location" (I am not a copywriter, so use your own words), or just hide the prices for IPs from the other country.
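If you go that route, the price-hiding bit is only a few lines; something along these lines (a sketch only, with the price and country code as placeholders):

```php
<?php
$price   = 19.99; // placeholder product price
$country = @geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);

// Only show the price to visitors whose IP geolocates to the site's own country.
if ($country === 'GB') {
    echo '£' . number_format($price, 2);
} else {
    echo 'This option is not available in your location.';
}
```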
Hope that helps.
-
Oh yeah!
Thanks Keri, I really didn't check the date.
-
Hi Khem,
This question is over a year old. Your best bet is to start a new thread with just this question. Thanks!
-
I would suggest running only one website and using the visitor's IP address to insert relevant content into the site, so the content changes according to where the visitor comes from.
Creating multiple domains in the same language with identical content might attract a penalty.
Or else, do whatever you're thinking, but make sure to keep the content unique, even if you're using IP delivery.
-
So you mean to say that, being in the same industry, I can copy the whole content of any UK website and then restrict UK people from accessing my website, since my target audience is in the US?
Please advise.
-
If you keep the two sites separate, will you be penalized by Google for having duplicate content? If so, how should you deal with this?
-
Personally I would simply redirect your visitors to the proper website based on their IP address. There are a few server-side tools, or plugins if you're using a blog, that can change the entire site's title and body content to reflect the differences between sites at the click of a button.
affportal.com has many such tools and insights to help you with this.
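If you'd rather hand-roll it, the core of an IP-based redirect in PHP might look something like this (a sketch only; the domain names are placeholders, not your actual sites):

```php
<?php
// Send US visitors to the .com and UK visitors to the .co.uk, preserving the path.
$country = @geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);

if ($country === 'US' && $_SERVER['HTTP_HOST'] !== 'www.example.com') {
    header('Location: http://www.example.com' . $_SERVER['REQUEST_URI'], true, 302);
    exit;
}
if ($country === 'GB' && $_SERVER['HTTP_HOST'] !== 'www.example.co.uk') {
    header('Location: http://www.example.co.uk' . $_SERVER['REQUEST_URI'], true, 302);
    exit;
}
// Everyone else stays on the site they landed on.
```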
hope this helps.
-
I say keep them separate. The ccTLD (.co.uk) clearly shows geographic relevance, and the .com, which is a global TLD, can be targeted to the USA using the geographic target setting in Google Webmaster Tools (the .co.uk is already set to the UK and cannot be pointed at another country, only 'unlisted').
Furthermore, I wouldn't want to lose any benefits of the aged and established .co.uk by merging it with the .com. You will never get 100% of the link juice back with 301 redirects (maybe 90% at best, and only after some weeks have passed), and when you consider that you are already well established with your .co.uk site, you would be mad to mess with it without a VERY GOOD REASON!
Couldn't you simply restrict shipping to the US from the UK site, rather than blocking access entirely? I have two e-commerce sites set up this way (one .co.uk and one .com, which operates on a dropship basis only, as we are UK based).
With regard to the duplicate content issue, consider that Amazon.co.uk and Amazon.com have hundreds of thousands of product pages with the same or VERY similar content (descriptions etc.), and last time I checked they were ranking pretty well ;o) without the need to block users from certain locations. They do, of course, pick up your IP address and SUGGEST (with a big flag and arrows) that you visit the UK site when you open the Amazon.com site from the UK, and they still restrict shipping of certain items should you persist and try to order anyway.
Your prices will also differ between £ and $, as will the converted price, which is another clear indication that this isn't an attempt to get 2 bites of the cherry.
I would also move the .com site to a US-based server, as this helps with ranking anyway (server/website speed and location are factors).
Maybe bung a flag in your header graphics to further denote your geo-targeting?
Changing the spelling to UK/US variants is sensible anyway, though difficult to research initially; I spent some time battling between -ize and -ise!
Keep the .co.uk and .com separate, state that you do not ship to the US from the UK site, and restrict purchases accordingly (by shipping address). That should make it clear enough. Hope that helps!
-
Hi Devaki,
Are you still deciding what to do here, or have you gone ahead and made a decision? Let us know if we can help you out anymore, or tell us what your decision was -- we'd be interested to hear what choice was made and how it's worked out.
Thanks!
-
Duplicate content shouldn't be an issue with regard to maintaining a US and a UK site; the search engines will decide which version to show. Admittedly I'm not 100% convinced they are perfect at doing this at the moment, but I'm confident enough that I would do it myself in this case, so don't worry about rewriting all your content (though remember the UK and US are two nations divided by a common language).
As a precaution you could geo-route customers by IP to one site or the other, but remember Googlebot will probably crawl from a US address, so you might want to let it pass through.
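In practice that "let Googlebot through" caveat is just an extra check before the redirect. A rough sketch in PHP (user-agent sniffing can be spoofed, so a production version would also verify crawlers via reverse DNS; the domain is a placeholder):

```php
<?php
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

// Skip the geo-redirect entirely for known crawlers so both sites stay crawlable.
$isCrawler = (bool) preg_match('/googlebot|bingbot|slurp/i', $ua);

if (!$isCrawler) {
    $country = @geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);
    if ($country === 'US') {
        header('Location: http://www.example.com' . $_SERVER['REQUEST_URI'], true, 302);
        exit;
    }
}
// Crawlers, and anyone whose location can't be determined, see the page as-is.
```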
To be honest, unless you have a reason to do it I wouldn't change the setup you currently have. Bringing them under one domain would make it awkward (though possible) to show geo-specific content, and as I mentioned, I don't believe you even need to rewrite the content.
Just build UK links to the .co.uk site and you should be alright for the most part.
Is there a particular reason you feel you need to change the setup?
-
Sorry, I must not be awake. What is the problem? You have two sites with common products, and you only offer certain promotions in the UK on the UK ccTLD (country code top-level domain). Are these promos showing up on the US site?
-
You have a few options.
You could build out one website and, whenever a visitor comes from a specific country, show that visitor different sections of the site, different products, etc., but this is not easily done and is a nightmare to manage.
You can keep the two separate websites and focus on rewriting the content. This would be my first option if it were me. You have two websites in separate countries selling the same products but with different offers, discounts, currencies, etc., so it makes the most sense to keep a clear line of separation. However, it shouldn't be too difficult to hire a freelance writer to go through one of the websites and rewrite the content: make it more relevant to that country's users, add videos, helpful information, and so on. You only have to rewrite the content for one of the sites to make sure they are not full of duplicate content. Then, down the road, you could hire the same writer to optimize the content for the other website, approaching it with different content that is just as relevant, and you should have a win-win-win situation.
Does that make sense?