Clone TLD Problems
-
I have an online services website, www.geekwik.com, which I started three months ago. I also recently made a clone on the .in TLD, geekwik.in, which has the same content; only the pricing is in INR, and it is targeted at Indian users, while geekwik.com is targeted at global users with pricing in USD. How do I manage these two sites so that I do not face a duplicate content penalty from Google and the sites do not cannibalize each other? Is there anything specific I need to do in robots.txt, .htaccess, sitemaps, hreflang, etc.? I personally feel that after putting up geekwik.in a couple of weeks ago, the ranking of geekwik.com went down and I started getting fewer search queries. I will shortly be putting an IP-based switch on both sites so that Indian users are redirected to the .in TLD and non-Indian users are redirected to the .com TLD. From an SEO standpoint, what do I need to do to counter the problems mentioned above? Putting the Indian version in a subdirectory is also an option.
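For concreteness, the kind of hreflang markup I am guessing would go in the <head> of every page on both sites is something like the lines below (just my rough sketch, treating geekwik.com as the global/x-default version and geekwik.in as the English-India alternate; please correct me if the values are wrong):
<!-- Rough sketch only: these two lines would go on the equivalent page of BOTH sites -->
<link rel="alternate" hreflang="x-default" href="http://www.geekwik.com/" />
<link rel="alternate" hreflang="en-in" href="http://www.geekwik.in/" />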
-
Thanks for taking the time to reply. I have decided to stick with geekwik.com only and will drop geekwik.in. The pricing will continue to be in USD. Only on the checkout page will I add an Indian payment gateway option, which will collect payments in INR (after applying the exchange rate), and the PayPal gateway will remain as it is now. Invoicing will still be in USD, but there will be an option to pay in INR, which is what I wanted.
I guess this solves the duplicate content issue to quite an extent. What do you think?
-
Hi Alankar,
We have recently discussed your topic here:
http://www.seomoz.org/q/same-content-pages-in-different-versions-of-google-is-it-duplicate
Hope this helps you out.
-
Hello,
You can use the hreflang attribute to specify the alternative language and regional URL versions of a page.
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-gb" href="http://en-gb.example.com/page.html" />
<link rel="alternate" hreflang="en-us" href="http://en-us.example.com/page.html" />
<link rel="alternate" hreflang="en" href="http://en.example.com/page.html" />
<link rel="alternate" hreflang="de" href="http://de.example.com/seite.html" />To learn more about this http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077