How can I penalise my own site in an international search?
-
Perhaps penalise isn't the right word, but we have two ecommerce sites.
One at .com and one at .com.au.
For the com.au site we would like only that site to appear for our brand name search in google.com.au.
For the .com site we would like only that site to appear for our brand name search in google.com.
I've targeted each site at its respective country in Google Webmaster Tools and published the Australian and English addresses on the respective sites.
What I'm concerned about is people on Google.com.au searching our brand and clicking through to the .com site.
Is there anything I can do to lower the ranking of my .com site in Google.com.au?
-
One of the example scenarios Google gives is:
Your pages have broadly similar content within a single language, but the content has small regional variations. For example, you might have English-language content targeted at readers in the US, GB, and Ireland.
Tough call; you might have to do some research to see whether this solution helps in your particular scenario.
-
They aren't identical; they have different designs, different text, almost everything.
They are similar in that they are both book stores.
The .com.au has Australian wording and spelling; the .com has English spelling and wording.
Do we need to specify hreflang="en-au" if they are different sites?
-
Are the sites identical but just hosted on different domains to target different regions?
Is there any variation in the English used on each site, for example, do you have Australian English spelling on the .com.au and US (or other) English on the .com?
If yes, you might want to have a look into the rel="alternate" hreflang="x" annotations.
Check out: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
Especially the "Example configuration: rel="alternate" hreflang="x" in action" section.
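To make that concrete, here's a minimal sketch of what the annotations might look like. The domains are placeholders (the real store URLs aren't given in the thread); the key points are that the tags go in the head of each page and that every page lists itself as well as its alternate:

```html
<!-- In the <head> of the US/international page, e.g. https://www.example.com/books/some-title -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/books/some-title" />
<link rel="alternate" hreflang="en-au" href="https://www.example.com.au/books/some-title" />

<!-- The same pair of tags is repeated on the Australian page,
     https://www.example.com.au/books/some-title -->
```

This tells Google the two pages are language/region variants of each other, which is what helps it show the .com.au version to searchers on google.com.au.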
-
Thanks Mat, that definitely sounds wise.
Penalise was definitely the wrong word. What I meant was: what other signals can we send Google to say that this is the .com.au site and we want it to appear above the .com?
-
I'd be ever so careful about doing anything to deliberately try to lower your ranking. It just sounds like an approach that could go horribly wrong.
Your best bet might be to live with the fact that both will appear (or better still, enjoy and encourage it), but use the sites to achieve the end goal of getting users onto the correct site.
The usual way to do this would be to check the IP address of the user against a GeoIP database. I've used both the paid and free versions of the database available at maxmind.com for this. That will allow you to identify users who are in Australia and direct them towards the .com.au site.
How you direct them is important. You could just automatically redirect those users to the other site. Some people will say that this can look like cloaking and cause issues, but I don't believe that alone will. However, it is often better to intercept those users with a message along the lines of "It looks like you are connecting from Australia - would you like to view our dedicated Australian website?", then list the benefits and offer a choice there.
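As a rough sketch of that interception logic, something like the following could run once the visitor's two-letter country code has been resolved from a GeoIP database such as MaxMind's. The function name and domains are hypothetical placeholders, not anything from the thread:

```javascript
// Hypothetical helper: decide whether to show the "switch site" suggestion.
// countryCode is assumed to come from a GeoIP lookup (e.g. MaxMind's database);
// currentHost is the domain the visitor actually landed on.
function suggestionFor(countryCode, currentHost) {
  var preferredHost = countryCode === 'AU'
    ? 'www.example.com.au'   // placeholder for the .com.au store
    : 'www.example.com';     // placeholder for the .com store

  if (currentHost === preferredHost) {
    return null; // already on the right site, show nothing
  }
  return {
    host: preferredHost,
    message: 'It looks like you are connecting from ' +
      (countryCode === 'AU' ? 'Australia' : 'overseas') +
      ' - would you like to view our dedicated site at ' + preferredHost + '?'
  };
}
```

Rendering the returned message as a dismissible banner, rather than issuing an automatic redirect, keeps the choice with the user, which is the approach suggested above.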
If you do that, it would be good to set a custom variable in your analytics to record when that message has been shown. That would allow you to measure how many people follow the suggestion.
Once you are happy it is working, you will probably end up encouraging both domains to appear, as dominating the SERP for your brand is always useful.