How can I tell Google two sites are non-competing?
-
We have two sites, both in English. One is a .ca and the other is a .com, and I am worried that they are hurting one another in the search results. Obviously, I'd like to direct google.ca towards the .ca domain and google.com towards the .com domain, and let Google know they are connected, non-competing sites.
-
The solution is the implementation of rel="alternate" hreflang, as explained by Google here:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
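For illustration, a minimal pair of annotations might look like the sketch below; example.com and example.ca are placeholders for the real domains, and the tags go in the head of each page:

```html
<!-- On the US page, e.g. https://www.example.com/product (placeholder URLs) -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/product" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.ca/product" />

<!-- The Canadian page at https://www.example.ca/product carries the same
     two tags, so the annotations confirm each other in both directions -->
```

Each page annotates both itself and its counterpart; Google may ignore hreflang annotations that are not confirmed from the other side.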
-
Do you know of a better way for a company to do this? The content essentially has to be identical, but at the same time the shipping policies, pricing, etc. have to be different, because one site is only for Canadian residents and one is only for Americans.
-
Is the content the same? If so, Google will filter one out as duplicate content. Definitely geo-target, and add language/location meta tags globally on each site. If the .ca site has a Canadian address, that would help a lot too; you can submit each site in Google Places/Google+ and verify the addresses as belonging to that country.
You don't want to tell Google you own two sites with the same content targeting the same keywords.
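For what it's worth, here is a hedged sketch of the kind of language/location meta tags that answer refers to; the values are placeholders, these geo tags are an old convention rather than a documented Google ranking signal, and hreflang (shown earlier) is the mechanism Google actually documents for language/region targeting:

```html
<!-- Illustrative placeholders only, for the .ca site -->
<meta http-equiv="content-language" content="en-CA" />
<meta name="geo.region" content="CA" />
<meta name="geo.placename" content="Toronto" />
```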
-
The problem is they are the same company, selling the same products, just with different Canadian and American pricing, slightly different information, etc. So the sites are very similar, and I'm worried that one (or both) is being penalized for this. Is there any way I can tell Google they are the same company, so it is OK that they are nearly identical? For example, I wouldn't care if the .ca domain never appeared in google.com, and I wouldn't care if the .com domain never appeared in google.ca.
Or is there a better way to handle this?
-
Geo-target the different regions so Google knows to show the .com in google.com and the .ca in google.ca (although Google already knows because of the .ca domain).
You don't want Google to know they are connected sites, though. Keep them as separate as possible.
-
Related Questions
-
Google indexing HTTPS sites by default now; where's the Moz blog post about it?!
Hello and good morning / happy Friday! Last night an article from, of all places, Venture Beat, titled "Google Search starts indexing and letting users stream Android apps without matching web content", was sent to me, and as I read it I got a bit giddy, since we had just implemented a full sitewide HTTPS certificate rather than a cart-only SSL. I quickly searched for other sources to see if this was indeed true, and the writing on the wall seems to indicate so:
- Google Webmaster Blog: http://googlewebmastercentral.blogspot.in/2015/12/indexing-https-pages-by-default.html
- http://www.searchenginejournal.com/google-to-prioritize-the-indexing-of-https-pages/147179/
- http://www.tomshardware.com/news/google-indexing-https-by-default,30781.html
- https://hacked.com/google-will-begin-indexing-httpsencrypted-pages-default/
- https://www.seroundtable.com/google-app-indexing-documentation-updated-21345.html
I found it a bit ironic to read about this on mostly unsecured sites. I wanted to go over the eight key conditions that Google will factor in when ranking/indexing HTTPS pages from now on, and see what you all think. Google will now begin to index HTTPS equivalents of HTTP web pages, even when the former don't have any links to them. However, Google will only index an HTTPS URL if it meets these conditions:
1. It doesn't contain insecure dependencies.
2. It isn't blocked from crawling by robots.txt.
3. It doesn't redirect users to or through an insecure HTTP page.
4. It doesn't have a rel="canonical" link to the HTTP page.
5. It doesn't contain a noindex robots meta tag.
6. It doesn't have on-host outlinks to HTTP URLs.
7. The sitemap lists the HTTPS URL, or doesn't list the HTTP version of the URL.
8. The server has a valid TLS certificate.
One rule that confuses me a bit is: "It doesn't redirect users to or through an insecure HTTP page." Does this mean that if you just moved over to HTTPS from HTTP, your site won't pick up the HTTPS boost, since most sites have HTTP redirects to HTTPS? Thank you!
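As an aside, here is a minimal sketch of what conditions 4 and 5 look like in a page's head section; https://www.example.com/page is a hypothetical URL used only for illustration:

```html
<head>
  <!-- Condition 4: the canonical points at the HTTPS version, not back to HTTP -->
  <link rel="canonical" href="https://www.example.com/page" />
  <!-- Condition 5: there must be no <meta name="robots" content="noindex"> here,
       or Google will skip the HTTPS URL -->
</head>
```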
Algorithm Updates | Deacyde
-
Technical Argument to Prefer non-www to www?
I've been recommending non-www vs. www as the preferable setup if a client is starting a site from scratch and there aren't any pre-existing links to consider. I'm wondering if this recommendation still holds? I've been looking on the interwebs and I'm seeing far fewer articles arguing for the non-www version. In the two camps, I'm seeing highlighted:

Pro www (ex: www.domain.com):
- Works better with CDN networks, where a domain needs to be specified (though that argument is 3 years old)
- Ability to restrict cookies to one hostname (www) or subdomain (info., blog., promo.) if using multiple subdomains
- IT people generally prefer it

Pro non-www (ex: domain.com):
- If you ever want to support or add https://, you don't have to support 2 sets of URLs/domains
- Mindset: fewer and fewer people think in terms of typing www before a site URL; the future is heading towards dropping it anyway, though that is a bit of a cosmetic argument

Is there a trend going back to www? Is there a technical argument to recommend non-www over www? Thanks!
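For context, a hedged sketch of how either choice usually gets signalled (domain.com is a placeholder): the non-preferred host 301-redirects to the preferred one, and every page on the preferred host declares its own URL as canonical, so stray duplicates consolidate:

```html
<!-- On https://domain.com/some-page, assuming the non-www host is preferred -->
<link rel="canonical" href="https://domain.com/some-page" />
```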
Algorithm Updates | Allie_Williams
-
Where has Google found the £1.00 value for the Penny Black? Is Google moving beyond the mark-ups too?
Hi guys, I am curious and wondering something about the Penny Black SERPs.
Apparently Google shows a value of £1.00 (see the "Penny Black SERP" screenshot). Where does it come from? It's not the stamp's value (see "Penny Black Value SERP"). The Wikipedia page doesn't have any mark-up for £1.00; in fact it has a price mark-up of 1 penny (see "Penny Black Wiki Markup"). Among rare stamps, the Inverted Jenny also shows a value (see "Inverted Jenny SERP"), but that one is clearly taken from USPS and is the cost of a new reissue of the stamp (see "USPS Inverted Jenny"); indeed, the mark-up matches that value (see "USPS Inverted Jenny Mark-up"). I've been looking online for a new version of the Penny Black, but couldn't find anything.
The only small piece of information I've found that correlates one pound with the Penny Black is on the Wikipedia page, but here is the point: is Google able to extract that information from the passage? It's not mark-up, it's not a bare number, and it's not a simple sentence like "The Penny Black cost £1.00"; it reads "One full sheet cost 240 pennies or one pound sterling" (see "Penny Black Wikipedia detail"). Is Google moving beyond the mark-ups too? Thanks, Pierpaolo
Algorithm Updates | madcow78
-
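For readers unfamiliar with the mark-up being discussed, here is a hypothetical sketch of the schema.org price mark-up Google reads; the values are invented purely for illustration:

```html
<!-- Hypothetical schema.org Offer mark-up; not taken from any real page -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Penny Black",
  "offers": {
    "@type": "Offer",
    "price": "1.00",
    "priceCurrency": "GBP"
  }
}
</script>
```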
With Matt telling us PageRank is gone, which factor now tells us a site is good?
Matt Cutts, in his second-to-last video, told the world: guys, turn off PageRank in your browser. If PR no longer has value, then what does an SEO professional use to judge whether a site is good or bad?
1. Domain Authority
2. Alexa
3. SEMrush rank
4. Compete
So, guys, I need your advice on it.
Algorithm Updates | csfarnsworth
-
Google and Wikipedia
OK, I love Wikipedia as much as the next guy, but the amount of weight that Google puts on this site is getting crazy. The search terms I am going after are "speakers" and "loudspeakers". Can somebody tell me why Wikipedia needs the top 8-10 spots for those terms? Is that really a good search result for users of Google? More of a rant than a question, I know; I just needed to get that off my chest!
Algorithm Updates | kevin4803
-
Google sitelinks on sub-pages
Hi all. I had a look for info on this one but couldn't find much. I know these days that if you have a decent domain, Google will often automatically show sitelinks for your homepage when someone searches for your company name, but has anyone seen these links appear for sub-pages? For example, let's say I had a .com domain with /en, /fr, and /de subfolders, each optimized for its location. If domain.com/en/ were no. 1 in Google for my company in the UK, would I be able to get sitelinks under it, or does this only work on the 'proper' homepage, domain.com/? A client of mine wants to reorganise their website so they have different location sections ranking in different markets, but they also want to keep having sitewide links because they like the look of them. Thanks, Carl
Algorithm Updates | Grumpy_Carl
-
What determines rankings in a site: search?
When I perform a "site:" search on my domains (without specifying a keyword), the top-ranked results seem to be a mixture of sensible top-level index pages plus some very random articles. Is there any significance to what Google ranks highly in a site: search? There is some really unrepresentative content returned on page 1, including articles that get virtually no traffic. Is this seriously what Google considers our best or most typical content?
Algorithm Updates | Dennis-52961
-
Did Google change their algorithm over the past week?
I did some homepage optimization with the SEOmoz on-page keyword optimization tool, and we are now back in the top three in the past week (after dropping to page 3 a month or so ago). It seems that Google has gone back to combining Google Places with organic searches. Has anyone else noticed this type of change? I did read some posts about Panda 2.2, which seems to explain some of these findings. I am wondering whether things are in flux or whether they may be more stable this way? Thanks for the insights.
Algorithm Updates | fertilityhealth