How can I tell Google two sites are non-competing?
-
We have two sites, both in English: one is a .ca and the other is a .com. I'm worried that they are hurting one another in the search results. Obviously, I'd like to direct google.ca toward the .ca domain and google.com toward the .com domain, and let Google know they are connected, non-competing sites.
-
The solution is the implementation of rel="alternate" hreflang, as explained by Google here:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
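Per that article, each page lists every language/region variant of itself (including itself) in its <head>. A minimal sketch for a product page that exists on both domains — the domains and path here are placeholders:

```html
<!-- The same pair of annotations goes on BOTH the .com and .ca versions of the page -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/widget" />
<link rel="alternate" hreflang="en-ca" href="https://www.example.ca/widget" />
```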
-
Do you know of a better way for a company to do this? The content essentially has to be identical, but at the same time the shipping policies, pricing, etc. have to be different, because one site is only for Canadian residents and the other only for Americans.
-
Is the content the same? If so, Google will filter one out as duplicate content. Definitely geo-target each site and add language/location meta tags globally on each one. If the CA site has a Canadian address, that would help a lot too; you can submit each site to Google Places / Google+ and verify each address as belonging to that country.
You don't want to tell Google you own two sites with the same content targeting the same keywords.
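In case it helps, the "language / location meta tags" mentioned above usually look like the snippet below. The region code and place name are placeholder values, and note that Google has indicated it largely ignores geo meta tags, relying instead on the ccTLD, Search Console geotargeting, and hreflang — so treat these as a minor supplementary signal at best:

```html
<!-- Sketch for the .ca site; CA-ON / Toronto are placeholder values -->
<html lang="en-CA">
<head>
  <meta name="geo.region" content="CA-ON" />
  <meta name="geo.placename" content="Toronto" />
</head>
```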
-
The problem is they are the same company, selling the same products, just with different Canadian and American pricing, slightly different information, etc. So the sites are very similar, and I'm worried that one (or both) is being penalized for this. Is there any way I can tell Google they are the same company, so that it's OK that they are nearly identical? For example, I wouldn't care if the .ca domain never appeared in google.com, and I wouldn't care if the .com domain never appeared in google.ca.
Or is there a better way to handle this?
-
Geo-target the sites to different regions so Google knows to show the .com in google.com and the .ca in google.ca (although it already knows because of the .ca domain).
You don't want Google to know they are connected sites, though. Keep them as separate as possible.
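For reference, the hreflang annotations from the Google article linked earlier can also be declared once in the XML sitemap instead of in every page's markup, which is easier to maintain across two domains. A minimal sketch, again with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.example.com/widget</loc>
    <xhtml:link rel="alternate" hreflang="en-us"
                href="https://www.example.com/widget"/>
    <xhtml:link rel="alternate" hreflang="en-ca"
                href="https://www.example.ca/widget"/>
  </url>
  <!-- The .ca URL gets its own <url> entry carrying the same two links -->
</urlset>
```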