How can I penalise my own site in an international search?
-
Perhaps penalise isn't the right word, but we have two ecommerce sites.
One at .com and one at .com.au.
For the .com.au site, we would like only that site to appear for our brand-name search on google.com.au.
For the .com site, we would like only that site to appear for our brand-name search on google.com.
I've geotargeted each site to its respective country in Google Webmaster Tools and published the Australian and English addresses on the respective sites.
What I'm concerned about is people on google.com.au searching for our brand and clicking through to the .com site.
Is there anything I can do to lower the ranking of my .com site in Google.com.au?
-
One of the example scenarios Google gives is:
Your pages have broadly similar content within a single language, but the content has small regional variations. For example, you might have English-language content targeted at readers in the US, GB, and Ireland.
Tough call; you might have to do some research to see whether this solution helps in your particular scenario.
-
They aren't identical; they have different designs, different text, almost everything.
They are similar in that they are both book stores.
The .com.au site has Australian wording and spelling; the .com site has English wording and spelling.
Do we need to specify hreflang="en-au" if they are different sites?
-
Are the sites identical but just hosted on different domains to target different regions?
Is there any variation in the English used on each site, for example, do you have Australian English spelling on the .com.au and US (or other) English on the .com?
If yes, you might want to look into the rel="alternate" hreflang="x" annotations.
Check out: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=189077
Especially the "Example configuration: rel="alternate" hreflang="x" in action" section.
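To make that concrete, here is a minimal sketch of what the annotations might look like, assuming the .com targets general English readers and the .com.au targets Australia (example.com, example.com.au, and the /books/ path are placeholders, not the actual sites):

```html
<!-- Hypothetical sketch: these <link> elements would go in the <head>
     of BOTH versions of an equivalent page. -->

<!-- Points to the Australian English version -->
<link rel="alternate" hreflang="en-au" href="http://example.com.au/books/" />

<!-- Points to the general English version -->
<link rel="alternate" hreflang="en" href="http://example.com/books/" />
```

Note that these annotations only make sense between pages that are genuine equivalents of each other, so with two sites that differ in design and text they would only apply where a page on one site has a clear counterpart on the other.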
-
Thanks Mat, that definitely sounds wise.
"Penalise" was definitely the wrong word. What I really meant was: what other signals can we send Google to say that this is the .com.au site and we want it to appear above the .com?
-
I'd be ever so careful about doing anything to deliberately lower your own ranking. It just sounds like an approach that could go horribly wrong.
Your best bet might be to live with the fact that both will appear (or, better still, enjoy and encourage it) and use the sites themselves to achieve the end goal of getting users onto the correct site.
The usual way to do this is to check the user's IP address against a geoIP database. I've used both the paid and free versions of the database available at maxmind.com for this. That will let you identify users who are in Australia and direct them towards the .com.au site.
How you direct them is important. You could automatically redirect those users to the other site. Some people will say this can look like cloaking and cause issues, but I don't believe that alone will do so. It is often better, though, to intercept those users with a message along the lines of "It looks like you are connecting from Australia. Would you like to view our dedicated Australian website?", then list the benefits and offer a choice there.
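As a rough sketch of the detection step, here is what the lookup could look like with MaxMind's free GeoLite2 country database and their Node.js reader (the library, database path, and function name are illustrative assumptions, not anything specified in this thread):

```js
// Hypothetical sketch: flag Australian visitors using a MaxMind country database.
const { Reader } = require('@maxmind/geoip2-node');

// Open the database once and reuse the reader across requests.
const readerPromise = Reader.open('/usr/local/share/GeoIP/GeoLite2-Country.mmdb');

async function isAustralianVisitor(ipAddress) {
  const reader = await readerPromise;
  try {
    // Look up the visitor's country from their IP address.
    return reader.country(ipAddress).country.isoCode === 'AU';
  } catch (err) {
    // Unknown or private IPs throw; fall back to showing the default site.
    return false;
  }
}
```

On the .com site the result would then decide whether to show the "would you like the Australian site?" message, rather than triggering a hard redirect.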
If you do that, it would be good to set a custom variable in analytics recording when that message has been shown. That will allow you to measure how many people follow the suggestion.
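A sketch of that measurement, using the classic asynchronous Google Analytics syntax that was current at the time (the slot number, variable name, and label are arbitrary choices for illustration):

```js
// Hypothetical sketch: record that the geo-suggestion message was shown.
// _setCustomVar(slot 1-5, name, value, scope); scope 2 = session-level.
_gaq.push(['_setCustomVar', 1, 'GeoSuggestion', 'shown', 2]);

// Custom variables are only sent with the next pageview or event,
// so fire an event as well to make sure the data goes out.
_gaq.push(['_trackEvent', 'GeoSuggestion', 'shown', 'com-to-com-au']);
```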
Once you are happy it is working, you will probably end up encouraging both domains to appear, as dominating the SERP for your brand is always useful.