Factors that affect Google.com vs .ca
-
Though my company is based in Canada, we have a .com URL, we're hosted on servers in the U.S., and most of our customers are in the U.S. Our marketing efforts are focused on the U.S. Heck, we even drop the "u" in "colour" and "favour"!
Nonetheless, we rank very well on Google.ca and rather poorly on Google.com.
One hypothesis is that we have more backlinks from .ca domains than from .com, but I don't believe that to be true. Certainly, the highest-quality links we have come from .coms like NYTimes.com.
Any suggestions on how we can improve the .com rankings, other than keeping on with the link building?
-
Thanks for letting us know how things worked out, Aspirant.
Andy
-
Final verdict:
I took the plunge. Even though our product is geography-agnostic, I changed our Webmaster Tools setting to "U.S."
Sure enough, we immediately saw some improvements in the google.COM rankings. There wasn't much of an impact on .CA, and any loss there was definitely made up for by the new .COM traffic.
I'll be doing a deeper dive into the data later.
Thanks everyone.
-
Hey Rob,
I have a bit of experience with this: I had a Canadian-based site that wanted to target the States. We were ranking well on .CA and not so well on .COM. I set the geo-targeting to USA in WMT for the site, and after a week or so I started noticing a huge jump on .COM for a lot of keywords. What was great was that the rankings on .CA stayed consistent.
The only drop I noticed was in the .CA (Canada only) searches; those completely dropped off the map. But normal searches on google.ca were fine. I don't know if this will always happen, but this is my experience.
-
I had exactly the same thing with a Spanish site of mine (.es). For a long time I was first on google.com but nowhere to be found on google.es. Everybody kept telling me this was because I had a lot of .com links and none that were .es. But as time passed, without any link changes, the keywords ranked well on google.es too. So could it be that some countries are just a few months behind?
-
I have noticed that getting links from the appropriate TLD extension really influences where you rank in each country's version of the Google SERPs.
You can find sites related to yours on a specific TLD by putting inurl:.com into Google along with your keywords.
The same thing works for all other extensions.
This makes finding .edu link opportunities a breeze, for example.
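For instance, a couple of hypothetical queries using that operator ("student backpacks" is a made-up keyword; substitute your own):

    inurl:.com "student backpacks" blog
    inurl:.edu "student backpacks" resources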
Besides link building, you will want to make sure you have set your targeted country in Webmaster Tools to the country you want to rank best in. For example, I have a site about college students that I've set to target the US, since Canada mostly calls post-secondary education "university" as well as "college," so the audience there is split much more.
Hope this helps.
-
Sorry, I meant David Mihm -- oops!
-
I suspect having the settings in WMT set for the USA "might" hurt your performance in other areas; however, the small company website (that gets 90% of its business from the USA) I mentioned in my prior response has the setting set to USA, and it ranks #3 for its main search term on both .ca and .com. Having claimed a Local Places account might also be an issue. I'd suggest you contact either Todd Mihm (http://www.davidmihm.com/blog) or Mike Blumenthal (http://blumenthals.com/blog) for an answer to that question.
-
Thanks for the answer. A couple of questions come to mind:
Won't setting our Google Webmaster Tools target to the United States hurt our performance in other parts of the world? So far I've made a point of ensuring that Webmaster Tools lists us as not geo-specific ("Target users in: unlisted" on the Site Configuration > Settings screen).
Also (on the advice of another SEO advisor) we verified our Google Places location, so is there a risk of sending mixed signals to Google and getting hurt by that?
-
The competition is usually stronger in the USA (.com) arena than in Canada (.ca). I have a little company site (with little work done in the way of SEO) that ranks #3 on both .ca and .com for "wheelchair trays". You may want to adjust your settings in Google Webmaster Tools to ensure your site is set to United States rather than Canada. As David Kauzlaric has mentioned, you will definitely benefit from having more links from US-based sites; I'd focus on that as a first step.
-
Still no breakthroughs on this issue. Our performance keeps improving on .ca and .com, which is obviously good, but our ranking on .com is always very, very far behind our .ca performance.
It's still a mystery to me, given that most of the inbound links are from U.S.-based, .com websites.
The only answer that works in my mind is that .ca uses a different algorithm. But I'm still very interested in hearing other thoughts!
Thanks,
Rob
-
Hi Rob,
Have you seen any changes with your rankings on Google.ca and Google.com? Do you have any other questions or comments you can add to help others that may be in a similar situation?
Here's hoping you got to enjoy two long weekends in a row from both countries!
-
Agree.
We did a link-building campaign for a German website (.de), and most of the links were from .com websites. It started to rank very well on google.com, while google.de saw only minor impact. Clearly, the links should be from the same country zone if you want to rank in that particular area.
You should focus on links from .com domains, which should be easier than building links from .ca anyway.
You should also set up a Google Maps account with your US location, if you have one. That alone should bring up your results in the US.
-
It's a pretty well-known fact that non-US versions of Google do not use the same algorithm and are therefore "behind". This could be a case where you are employing methods that were effective a couple of years ago: they still work well on .CA, but not as well on .COM.
The biggest thing you can do is work on high-quality content and build links. Remember, by some estimates linking alone accounts for around 70% of the algorithm. Work on getting more authoritative .COM links from sites like NYT, USAToday, etc.
Also, if a good portion of your links are from .CA, that could very well affect it too!
Related Questions
-
.com vs .co.uk
Hi, we are a UK-based company and we have a lot of links from .com websites. Does the fact that they are .com rather than .co.uk affect the quality of the links for a UK website?
-
How does Google handle fractions in titles?
Which is better practice, using 1/2" or ½"? The keyword research suggests people search for "1 2" with the space being the "/". How does Google handle fractions? Would ½ be the same as 1/2?
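For reference, here is how the two candidate forms might look in a title tag; &frac12; is the standard HTML entity for the single character ½ ("copper pipe" is a hypothetical keyword, and only one title would actually be used per page):

    <title>1/2" Copper Pipe Fittings | Example Store</title>
    <title>&frac12;" Copper Pipe Fittings | Example Store</title>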
-
Can you index a Google doc?
We have updated and added completely new content to our state pages. Our old state content is sitting in our Google Drive. Can I make these docs public to get them indexed, and provide a link back to our state pages? In theory it sounds like a great link-building strategy... TIA!
-
My site shows a 503 error to Googlebot, but I can see the site fine. Not indexing in Google. Help
Hi, this site is not indexed on Google at all: http://www.thethreehorseshoespub.co.uk. Looking into it, it seems to be giving a 503 error to Googlebot, though I can see the site fine. I have checked the source code and checked robots. I did have a sitemap param. but removed it for testing. GWMT shows "unreachable" if I submit a sitemap or fetch. Any ideas on how to remove this error? Many thanks in advance.
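One quick way to compare what Googlebot gets against a normal request is curl; a sketch (some servers or security plugins vary their response by user agent, which would explain the site looking fine in a browser):

    # Headers only (-I), sent with Googlebot's user-agent string (-A):
    curl -I -A "Googlebot/2.1 (+http://www.google.com/bot.html)" http://www.thethreehorseshoespub.co.uk/
    # And a plain request for comparison:
    curl -I http://www.thethreehorseshoespub.co.uk/

If the first returns 503 and the second 200, something server-side is treating bot traffic differently.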
-
Microsites: Subdomain vs own domains
I am working on a travel site about a specific region, which includes information about lots of different topics, such as weddings, surfing, etc. I was wondering whether it's a good idea to register domains for each topic, since it would enable me to build backlinks. I would basically keep the design more or less the same and implement a nofollow navigation bar on each microsite, e.g.:
weddingsbarcelona.com
surfingbarcelona.com
Or should I rather go with one domain and subfolders?
barcelona.com/weddings
barcelona.com/surfing
I guess the second option is how I would usually do it, but I just wanted to see the pros/cons of both options. Many thanks!
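As a sketch, the nofollow navigation bar described above might look like this (domains taken from the question; rel="nofollow" asks search engines not to pass link equity through the cross-links):

    <nav>
      <a href="http://weddingsbarcelona.com" rel="nofollow">Weddings</a>
      <a href="http://surfingbarcelona.com" rel="nofollow">Surfing</a>
    </nav>
-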
How to NOT appear in Google results in other countries?
I have ecommerce sites that only serve the US and Canada. Is there a way to prevent a site from appearing in the Google results in foreign countries? The reason I ask is that we also have a lot of informational pages that folks in other countries are visiting, then leaving right after reading. This is making our overall bounce rate very high (64%). When we segment the GA data to look at just our US visitors, the bounce rate drops a lot (to 48%). Thanks!
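There is no directive that outright blocks a country from seeing a page in its local Google results, but hreflang annotations, sketched below with placeholder example.com URLs, at least tell Google which audiences a page targets and can be combined with geotargeting in Webmaster Tools:

    <link rel="alternate" hreflang="en-us" href="http://www.example.com/page/" />
    <link rel="alternate" hreflang="en-ca" href="http://www.example.com/page/" />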
-
Will Google View Using Google Translate As Duplicate?
If I have a page in English that exists on 100 other websites, we have a case where my website has duplicate content. What if I use Google Translate to translate the page from English to Japanese? As the only website doing this translation, will my page get credit for producing original content? Or will Google view my page as duplicate content, because Google can tell it is translated from an original English page that runs on 100+ different websites, since Google Translate is Google's own software?
-
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi Guys,
We have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similar to autotrader.com or cargurus.com, and there are two primary components:
1. Vehicle Listings Pages: the page where the user can use various filters to narrow the vehicle listings and find the vehicle they want.
2. Vehicle Details Pages: the page where the user actually views the details about said vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo
The Vehicle Listings pages (#1) we do want indexed and to rank. These pages have additional content besides the vehicle listings themselves, and those results are randomized or sliced/diced in different and unique ways. They're also updated twice per day.
We do not want to index #2, the Vehicle Details pages, as these pages appear and disappear all of the time based on dealer inventory, and don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results.
We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right.
Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.
Robots.txt advantages:
- Super easy to implement.
- Conserves crawl budget for large sites.
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.
Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would leave 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're pagerank sculpting?).
Noindex advantages:
- Does prevent vehicle details pages from being indexed.
- Allows ALL pages to be crawled (advantage?).
Noindex disadvantages:
- Difficult to implement: the vehicle details pages are served via Ajax, so there is no page head of their own in which to place a noindex meta tag. The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending noindex based on querystring variables, similar to this stackoverflow solution. This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "forces" because of the crawl budget required. The crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex meta tag if it's blocked by robots.txt.
Hash (#) URL advantages:
- By using hash (#) URLs for the links on Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with Javascript, the crawler won't be able to follow/crawl these links.
- Best of both worlds: crawl budget isn't overtaxed by thousands of noindexed pages, and the internal links used to index robots.txt-disallowed pages are gone.
- Accomplishes the same thing as "nofollowing" these links, but without looking like pagerank sculpting (?).
- Does not require complex Apache stuff.
Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?
Initially, we implemented robots.txt, the "sledgehammer solution." We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be like these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're pagerank sculpting or something like that.
If we implement noindex on these pages (and doing so is a difficult task itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallowal in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed, and it could easily get stuck/lost. It seems like a waste of resources, and in some shadowy way bad for SEO.
My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and it conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links like these.
Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
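On the X-Robots-Tag point: here is a minimal sketch of the Apache approach described above, assuming mod_rewrite and mod_headers are enabled and that vehicle details requests are identifiable by a querystring parameter (vehicle_id is a hypothetical name; substitute whatever the plugin actually uses):

    # Flag any request whose querystring contains the details-page parameter.
    RewriteEngine On
    RewriteCond %{QUERY_STRING} (^|&)vehicle_id= [NC]
    RewriteRule ^ - [E=NOINDEX_DETAIL:1]

    # Send the noindex header only when that flag was set (mod_headers).
    Header set X-Robots-Tag "noindex" env=NOINDEX_DETAIL

Note that this only works if the robots.txt disallow is removed, since Googlebot has to be able to fetch a URL before it can see the header.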