Non-.com or .co versus .ca or .fm sites, in terms of SEO value
-
We are launching a new site with a non-traditional top-level domain. We were looking at either .ca or .in, as we are not able to get the traditional .com, .co, .net, etc.
I was wondering if this has any SEO effect. Do Google/Bing treat these domains differently? Will the site be penalized?
Note: My site is a US-based site targeting a US audience.
-
Hmm... try to use an extension that is not a country-level one. .CA and .IN automatically target their respective countries, and you can't avoid the inconveniences of that geotargeting unless you do huge link building in your real target country.
Try to check out other generic extensions (avoiding .cc, which is banned by Google).
-
Chait
I think there are really two questions you should ask: the one regarding the SEO effect, Google/Bing, etc., and how these domains are perceived.
As to Google, others have stated correctly that there is no penalty. Still, I am not sure that using a .ca/.in is wise even if you are US-based, use a US server, and make the correct geotargeting selections in Google Webmaster Tools (GWMT). The reason is not Google/SEO per se, but perception.
It is a given that in some countries, having a foreign ccTLD is not wise due to bias. A documented one is the French bias in Europe (no hate mail please, my son was born in Paris - yes, France, not Texas). In the US, as open-minded as half of us seem to be, there will always be a bias if someone knows that .in is India, and there may well be one with .ca, especially if you are in the more northern states that are more likely to recognize it. So, you have to factor all of that in and then ask: "Why am I going this way?" Is this particular domain name so critical that you are willing to sacrifice for it? Have you considered using hyphens?
We do a lot with EMDs (exact-match domains), both with hyphens and without. At the same time, I believe many people over-emphasize EMDs and could spend their time and energy better elsewhere. Is there a second-best EMD you could settle on, so you can spend all that energy on something else in SEO?
I cannot imagine a situation, short of a known and likely very well-known brand, where I would risk a .ca or .in in the US for the sake of that EMD.
Good luck; let us know what direction you take and how it works out,
Robert
-
Thank you for the answers. To extend this discussion further (to help me and others who are interested):
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1347922&topic=2371325&ctx=topic
Google apparently treats some ccTLDs (such as .tv, .me, etc.) as gTLDs:
.as .bz .cc .cd .co .dj .fm .la .me .ms .nu .sc .sr .tv .tk .ws
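For anyone who wants to sanity-check a candidate domain against that list, here is a minimal Python sketch. The list is copied from the Google help page above and may change over time, so treat it as illustrative rather than authoritative:

```python
# ccTLDs that Google reportedly treats as generic (gTLD-like),
# per the Webmaster Tools help page linked above. Illustrative only;
# Google can change this list at any time.
GENERIC_CCTLDS = {
    ".as", ".bz", ".cc", ".cd", ".co", ".dj", ".fm", ".la",
    ".me", ".ms", ".nu", ".sc", ".sr", ".tv", ".tk", ".ws",
}

def is_treated_as_generic(domain: str) -> bool:
    """Return True if the domain's ccTLD is one Google treats as generic."""
    tld = "." + domain.rstrip(".").rsplit(".", 1)[-1].lower()
    return tld in GENERIC_CCTLDS

print(is_treated_as_generic("example.fm"))  # True: .fm is on the list
print(is_treated_as_generic("example.in"))  # False: .in stays geo-targeted
print(is_treated_as_generic("example.ca"))  # False: .ca stays geo-targeted
```

Note that .in and .ca are not on the list, which is exactly why they remain geo-targeted to India and Canada.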
-
Google should not penalize you for using .in or .ca. However, as William said, Google may give top priority to the .com version of your site, especially because you are targeting the USA, and your SEO efforts may end up benefiting the .com owner more than they benefit you, especially with branded names and direct traffic. Hope this helps.
-
It won't rank as well. "Penalize" is too harsh a word. You will still rank, but your rankings will improve much more if you have a traditional gTLD: .com, .net, or .org.
Of course, these statements can be irrelevant if the site is a viral site. But even bit.ly moved to bitly.com (though that's not necessarily the exact reason they changed).
EDIT: Not to mention that a .ca or .in domain could be subject to Canadian and Indian law, respectively.
But hey, if you are looking for business in Canada or India, it would be an awesome domain!
-
So if I understand you correctly: Google will penalize my rankings (in the US) when I have a .in or .ca domain.
Is the above a fair statement?
-
.com would be best; .ca and .in will most likely not work well if your target is in the US.
The .ca and .in extensions are definitely considered when ranking. I suggest you come up with another variation of the domain name and get a .com.
One of the main problems I can see off the bat is that the current owners of the .com and .net versions might already have a huge presence, which will hinder the majority of your efforts, because you will most likely never beat out the .com (unless it's nonexistent or spam).