SEO issues with IP-based content delivery
-
Hi,
I have two websites, say website A and website B. Website A is set up for the UK audience and website B is set up for the US audience. Both websites sell the same products, with some products and offers not available in one country or the other. Website A can't be accessed if you are in the US, and website B can't be accessed if you are in the UK. This was a decision the client made a long time ago: they don't want to offer promotions etc. in the US, and therefore don't want the US audience to be able to purchase items from the UK site.
Now the problem is that both websites have the same descriptions for the common products they sell. Search engine spiders tend to enter a site from a variety of different IP addresses/locations, so while a UK visitor will not be able to access the US version of the site and vice versa, a crawler can. I have the following options:
1. Write different product descriptions for the US website to keep both the US and UK versions of the site in the Google index for the foreseeable future. This is going to be a time-consuming and expensive option, though, as there are several hundred products common to both sites.
2. Use a single website to target both the US and UK audiences and make the promotions available only to the UK audience. There is one issue here: website A's address ends in '.co.uk' and website B has a different name ending in '.com', so website A can't be used for the US audience. Website A is also older and more authoritative than the newer website B, and it is pretty popular with the UK audience under its .co.uk address, so website B can't be used to target the UK audience either.
3. You tell me
-
Just a thought to add to what everyone else has said: make sure you go into Google Webmaster Tools and tell Google which country you want each site to rank for. I have had odd instances where a site with a .co.uk extension still ranks in the US for terms even though I don't want it to, so I advise you to set the geographic target for both sites.
Have a nice day.
-
Normally I would have said keep only one site, but given what you've said, you need to differentiate the sites substantially, not just in wording. Brand them differently, because the whole reason the client wants them not to be the same is that the audiences are not the same. I'm from Germany, and I understand the difference between being pitched something in Germany and in the United States, where I am now, and I notice it in my own behaviour: in Germany I'm far more likely to buy from a .de site, as I know there will be no issues, and in the United States I'm far more likely to purchase from a .com site, as I know (hopefully) I will not have problems. Differentiate the sites as much as you can; it sounds like the US site is the one that should be rewritten.
I hope I've helped you,
Thomas Von Zickell
-
You have the option to show different content depending on the user's location. If you use PHP on your site, you can use PHP's GeoIP functions.
You can get your site personalised by country here: http://www.maxmind.com/en/geolocation_landing
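For instance, here's a minimal sketch of that approach (it assumes the PECL geoip extension and its country database are installed; the template file names are just placeholders):

```php
<?php
// Rough sketch: choose a template based on the visitor's country.
// Assumes the PECL geoip extension is installed; template paths are placeholders.
$ip      = $_SERVER['REMOTE_ADDR'];
$country = @geoip_country_code_by_name($ip); // two-letter ISO code, e.g. "GB" or "US"; false on failure

if ($country === 'US') {
    include 'templates/product-us.php';      // US pricing, no UK-only promotions
} elseif ($country === 'GB') {
    include 'templates/product-uk.php';      // UK pricing and promotions
} else {
    include 'templates/product-default.php'; // unknown or other locations
}
```

If the extension isn't available on your host, MaxMind's downloadable databases can be queried from PHP as well.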
-
Great response, and I agree with Big. Large e-commerce sites with thousands of SKUs that often have only tiny differences from one product to the next are bound to have many similar product descriptions; yes, Amazon is a perfect example. Google is smart enough to know whether what you are doing with your two sites is "an attempt to get 2 bites of the cherry." It's pretty clear that you are trying to serve the most appropriate content to the most appropriate audience. Content management would be easier with everything on one site, but given the history of these sites, it's probably best to keep them as they are. Now, if you had two domains in either the US or the UK that had all the same identical product pages, that would be an issue.
-
What I would do is very simple: have it all on one site and block IPs from US users for the UK content and the other way around.
So you have two flags on your site, US and UK, and if a user from the UK tries to view the US version you show a message like "this option is not available from your location" (I am not a copywriter, so use your own words), or just hide the prices for IPs from a different country.
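As a rough sketch of the hiding-prices idea (this assumes a GeoIP lookup like the one mentioned above; the price variable, the wording and the country code are placeholders):

```php
<?php
// Rough sketch: hide the price and show a notice for visitors outside the site's region.
// Assumes the PECL geoip extension; $price stands in for whatever your template already provides.
$price          = 19.99;  // placeholder
$siteCountry    = 'GB';   // the country this site serves
$visitorCountry = @geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);

if ($visitorCountry !== false && $visitorCountry !== $siteCountry) {
    echo '<p class="geo-notice">Sorry, this option is not available from your location.</p>';
} else {
    echo '<p class="price">£' . number_format($price, 2) . '</p>';
}
```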
Hope that helps.
-
Oh yeah!
Thanks Keri, I really didn't check the date.
-
Hi Khem,
This question is over a year old. Your best bet is to start a new thread with just this question. Thanks!
-
I would suggest running only one website and then using the visitor's IP address to insert relevant content into the site, so the website content changes according to where the visitor comes from.
Creating multiple domains in the same language with identical content might attract a penalty.
Or else, do whatever you're thinking, but make sure you keep the content unique, even if you're using IP delivery.
-
So you mean to say that, being in the same industry, I can copy the whole content of any UK website and then restrict UK people from accessing my website, as my target audience is in the US?
Please advise.
-
If you keep the two sites separate, will you be penalized by Google for having duplicate content? If that is the case, how should you deal with it?
-
Personally, I would simply redirect visitors to the proper website based on their IP address. There are a few server-side tools, or plugins if you're using a blog, that can change the entire site's title and body content to reflect the differences between sites at the click of a button.
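A rough sketch of that server-side redirect in PHP (assuming the PECL geoip extension; the host names here are placeholders for the real .co.uk and .com sites):

```php
<?php
// Rough sketch: send visitors to the site that matches their country.
// Assumes the PECL geoip extension; the host names are placeholders.
$country = @geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);
$host    = $_SERVER['HTTP_HOST'];

if ($country === 'US' && $host === 'www.example.co.uk') {
    header('Location: https://www.example.com' . $_SERVER['REQUEST_URI'], true, 302);
    exit;
}
if ($country === 'GB' && $host === 'www.example.com') {
    header('Location: https://www.example.co.uk' . $_SERVER['REQUEST_URI'], true, 302);
    exit;
}
// Otherwise fall through and serve the requested page as normal.
```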
affportal.com has many such tools and insights to help you with this.
Hope this helps.
-
I say keep them separate. The ccTLD (.co.uk) clearly shows geographic relevance, and the .com, which is a global TLD, can be targeted/biased towards the USA using the geographic target setting in Google Webmaster Tools (the .co.uk is already set to the UK and cannot be pointed at another country, only to 'unlisted').
Furthermore, I wouldn't want to lose any of the benefit of the aged and established .co.uk by merging it with the .com. You will never get 100% of the link juice back with 301 redirects - maybe 90% at best, and only after some weeks have passed - and when you consider you are already well established with your .co.uk site, you would be mad to mess with it without VERY GOOD REASON!
Couldn't you simply restrict shipping to the US from the UK site, rather than blocking access altogether? I have 2 ecommerce sites set up this way (one .co.uk and one .com, which operates on a dropship basis only, as we are UK based).
With regard to the duplicate content issue, consider that Amazon.co.uk and Amazon.com have hundreds of thousands of product pages with the same or VERY similar content (descriptions etc.), and last time I checked they were ranking pretty well ;o) - without any need to block users from certain locations. They do, of course, pick up your IP address and SUGGEST (with a big flag and arrows) that you visit the UK site when you open Amazon.com from the UK, and they still restrict shipping of certain items should you persist and try to order anyway.
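A rough sketch of that softer, suggest-rather-than-block approach (again assuming the PECL geoip extension; the domain and the wording are placeholders):

```php
<?php
// Rough sketch: instead of blocking, show UK visitors on the .com a banner
// suggesting the .co.uk site. Assumes the PECL geoip extension; names are placeholders.
$country = @geoip_country_code_by_name($_SERVER['REMOTE_ADDR']);

if ($country === 'GB' && $_SERVER['HTTP_HOST'] === 'www.example.com') {
    echo '<div class="geo-banner">It looks like you are in the UK: '
       . '<a href="https://www.example.co.uk' . htmlspecialchars($_SERVER['REQUEST_URI']) . '">'
       . 'visit our UK site</a> for local pricing and delivery.</div>';
}
// The page itself still renders as normal below the banner.
```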
Your prices will also differ between £ and $ - as will the converted price - another clear indication that this isn't an attempt to get 2 bites of the cherry.
I would also move the .com site to a US-based server, as this helps with ranking anyway (server/website speed and location are factors).
Maybe bung a flag in your header graphics to further denote your geo-targeting?
Using the appropriate UK/US spelling variants on each site is sensible anyway, though tricky to research initially; I spent some time battling between 'ize' and 'ise'!
Keep the .co.uk and .com separate, state that you do not ship to the US from the UK site, and restrict purchases accordingly (by shipping address). That should make it clear enough. Hope that helps!
-
Hi Devaki,
Are you still deciding what to do here, or have you gone ahead and made a decision? Let us know if we can help you out anymore, or tell us what your decision was -- we'd be interested to hear what choice was made and how it's worked out.
Thanks!
-
Duplicate content shouldn't be an issue with regard to maintaining a US and a UK site; the search engines will decide which version to show. Admittedly I'm not 100% convinced they are perfect at doing this at the moment, but I'm confident enough that I would do it myself in this case, so don't worry about rewriting all your content (though remember, the UK and US are two nations divided by a common language).
As a precaution you could geo-route customers by IP to one site or the other, but remember Googlebot will probably crawl from a US IP address, so you might want to let its requests pass through; a rough sketch of one way to do that is below.
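Here's that sketch (the reverse-then-forward DNS check is Google's documented way of verifying Googlebot; everything around it is a placeholder):

```php
<?php
// Rough sketch: skip geo-routing for verified Googlebot requests.
// Verification: user agent contains "Googlebot", reverse DNS ends in
// googlebot.com or google.com, and forward DNS resolves back to the same IP.
function isVerifiedGooglebot(string $ip, string $userAgent): bool
{
    if (stripos($userAgent, 'Googlebot') === false) {
        return false;
    }
    $host = gethostbyaddr($ip); // reverse DNS lookup
    if ($host === false || !preg_match('/\.(googlebot|google)\.com$/i', $host)) {
        return false;
    }
    return gethostbyname($host) === $ip; // forward-confirm the host name
}

$ip = $_SERVER['REMOTE_ADDR'];
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

if (!isVerifiedGooglebot($ip, $ua)) {
    // ...run the geo-routing / redirect logic for ordinary visitors here...
}
```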
To be honest, unless you have a reason to do it, I wouldn't change the set-up you currently have. Bringing them under one domain and showing geo-specific content is going to be awkward (though possible), and as I mentioned, I don't believe you even need to rewrite the content.
Just build UK links to the .co.uk site and you should be alright for the most part.
Is there a particular reason you feel you need to change the set-up?
-
Sorry, I must not be awake. What is the problem? You have two sites with common products and you only offer certain promotions in the UK on the UK ccTLD (country code top level domain). Are these promos showing up on the US site?
-
You have a few options.
You could build out one website and, whenever a visitor comes from a specific country, show that visitor different pieces of the site, different products, etc., but this is not easily done and is a nightmare to manage.
You could keep the two separate websites and focus on rewriting the content. This would be my first option if it were me. You have two websites in separate countries selling the same products but with different offers, discounts, currencies, etc., so it makes the most sense to have a clear line of separation. It shouldn't be too difficult to hire a freelance writer to go through one of the websites and rewrite the content: make it more relevant to that country's users, add videos, helpful information, and so on. You only have to rewrite the content for one of the sites to make sure they are not full of duplicate content. Then, down the road, you could hire the same writer to optimise the content for the other website, approaching it with different content that is just as relevant, and you should have a win-win-win situation.
Does that make sense?