What is the best way to rank well in two countries simultaneously with only one ccTLD?
-
I have a .co.nz website and would like to rank on .com.au without setting up a new country-specific website for .com.au. What is the best way to do this?
-
Xnumrtik pointed out a very important aspect here. Associating your website in GWT with one country will definitely boost your rankings in that country, but it will have no effect on your SERPs in another country.
-
In GWT you are only allowed to associate your website with one country.
The first question is: should you associate your website with one country at all if you want to target two countries?
If you have a primary target and a secondary one, that may be a good idea. Otherwise, I would let the .co.nz do the job for NZ and try to get links from an Australian neighbourhood, meaning you associate your website with other web properties clearly linked to Australia through their content, domain, and the links they have.
Hope that helps
-
Make sure to associate your website with the targeted locations in every possible way:
- Besides optimizing your site for the targeted keywords, include "Australia" and other geographical hints on your pages.
- Get local websites and local bloggers to link to you.
- If your website promotes a local business, also register it on Google+ Places.
These are just a few starting points...
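One further signal worth considering, as a supplement to the list above: hreflang annotations let a single .co.nz site tell Google which regional English variant each page targets. A minimal sketch, assuming the site keeps separate NZ and AU versions of a page (the example.co.nz URLs are illustrative placeholders, not the asker's real site):

```html
<!-- In the <head> of both page variants; each URL must list the full set reciprocally -->
<link rel="alternate" hreflang="en-nz" href="https://www.example.co.nz/widgets/" />
<link rel="alternate" hreflang="en-au" href="https://www.example.co.nz/au/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.co.nz/widgets/" />
```

Note that hreflang only disambiguates equivalent pages; it does not replace the link-building and on-page signals described above.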
Related Questions
-
If I have two brands and I market one in English (BrandA.com) and one in Spanish (BrandB.com), and the websites are identical but in different languages, would that have a negative impact on SEO due to duplicate content?
I have a client who wants a website in Spanish and one in English. Typically we would use a multi-language plugin for a single site (brandA.com/en or /es), but this client markets to their Spanish-speaking constituents under a different brand. So I am wondering: if we have BrandA.com in English and the exact same content in Spanish at BrandB.com, will there be negative SEO implications, and/or will it be recognized as duplicate content by search engines?
Intermediate & Advanced SEO | | Designworks-SJ1 -
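For context on the question above: hreflang also works across different domains, and cross-domain annotations are the usual way to tell search engines that two sites are language alternates rather than duplicates. A hedged sketch, using the question's brand domains with illustrative paths:

```html
<!-- Placed in the <head> of the English page on BrandA.com AND the matching
     Spanish page on BrandB.com; both pages must carry the identical pair -->
<link rel="alternate" hreflang="en" href="https://www.branda.com/products/" />
<link rel="alternate" hreflang="es" href="https://www.brandb.com/products/" />
```

Translated content is generally not treated as duplicate content, but the reciprocal tags make the relationship explicit.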
If a page ranks in the wrong country and is redirected, does that problem pass to the new page?
Hi guys, I'm having a weird problem: a new multilingual site was launched about 2 months ago. It has correct hreflang tags and geotargeting in GSC for every language version. We redirected some relevant pages (with good PA) from another website of our client's. It turned out that those pages were not ranking in the correct country markets (for example, the en-gb page ranking in the USA). The pages on our site seem to have the same problem. Do you think they inherited it through the redirects? Is it possible that Google will sort things out over time, given that the new pages have correct hreflang tags? Is there anything we could do to help them rank in the correct country markets?
Intermediate & Advanced SEO | | ParisChildress1 -
Rankings Continue To Drop
Hi there, I'm at wits' end trying to stop the slow bleed in rankings for our store URLs that started mid-March 2017 and continues through to today. I'd appreciate some pointers and hope this will throw up a challenge to someone out there. Here is the background:
1. We run an e-commerce store on Shopify with a blog. The recent ranking decline has been almost entirely on the store URLs (catalogue and product pages), while at the same time we have seen steady growth in search volumes on the blog. This makes me think we are seeing a Penguin 4 penalty of some type, because the impact is confined to the store URLs.
2. We received a link-based manual penalty back in 2014, which was successfully removed within 3 months. We have quite a large disavow file as a result.
3. Shortly after the launch of Penguin 4.0 in Sept/Oct 2016 we saw a really nice boost in traffic and ascribed this to being released from a previous Penguin algorithmic penalty.
4. Come March 2017 we saw a small but steady weekly drop in rankings for our store URLs only. This drop continues through to today and has become significant over time: approximately a 50% decline in visitor numbers to store URLs since March.
All of this despite:
a. Initially I thought this was a Panda issue (because it seemed to coincide with Panda releases in March and May), so the entire website was rewritten (during June and July) with thin content removed across the store and the blog. The remaining content has been given a serious boost, being very careful to watch for over-optimisation and keyword cannibalisation. I think I've got this right. There are also no crawl issues being highlighted by Moz Pro or SEMrush site audits.
b. Only last week I discovered that a very low-authority website (domain and trust scores of 0 and 0) had been copying our blog articles on a weekly basis, starting Oct 2016 (yes, the same time as Penguin 4) and only caught last week (my fault for missing this). These articles were copied verbatim with all links, and so generated nearly 400 spammy backlinks to our store URLs (about 30% of all the links we have). I've had all these articles removed from the spammy site via DMCA, so none of those links exist anymore (as of 8/14/17). I've also disavowed the domain with Google. Could these spam links be the issue, with Google still needing to recrawl that site to see the links are no longer there? I'm not sure, because my understanding is that Penguin 4 would have devalued these links from the start.
c. A general review of links and anchor text. I've used Moz Pro and the SEMrush backlink audit (linked to Google Search Console) and removed all toxic links by contacting webmasters and using Google disavow, including any links I think were causing over-optimised anchor text. After the disavow, according to SEMrush, we have no toxic backlinks left and only 50 out of 1,200 links with "money" anchor text. This exercise was completed two days ago when the last disavow file was uploaded. However, I don't believe there was an issue here before, as toxic links were under 1% of all links and exact-match "money" anchor text was in the region of 5%.
d. One potential problem with our backlinks is that we have quite a few high-domain/high-trust links to our scholarship page with the anchor text "official website". The net result is that our "other" anchor-text category is just over 50% of total links; these are mainly educational institutions with .edu domains.
e. A review of internal linking. We had what I would refer to as SEO links, linking all product and collection pages across the store through a tagging-type system. This was removed two days ago, as it was probably unnecessary for user experience. Beyond this I have two remaining concerns with our internal linking structure. The first is that we have quite a big static navigation in the left margin of our store collection pages (static, not faceted navigation). The second is that we've internally linked from almost every blog post to our "key" money page in the store, though with varied, non-money anchor text.
f. There is nothing in Google Search Console indicating a problem: no manual actions, no significant HTML improvements, and Google has indexed over 90% of URLs compared to the sitemap. All broken links have been fixed; there were a lot before, but all were fixed as of three weeks ago.
g. Checking site speed in GA: speed has remained constant over the period, and we have put in some fixes to improve it. Site speed has not got worse, and scores are average in Google's speed checker.
That's about it. It's possible that with the recent changes made with respect to b, c, e and f above I just need to wait a couple more weeks for Google to catch up, and I would appreciate thoughts on this. However, I'd also like some thoughts on the static navigation on our collection pages, and especially on linking from blog articles mostly to a single money page in the store; of all that remains, I think this is potentially a problem. Our website is located at www.thekewlshop.com. Many thanks for your help. Charles
Intermediate & Advanced SEO | | charlesfitz0 -
Please select one of the two
Which theme is more SEO-friendly and faster loading, on both desktop and mobile? http://demo.mythemeshop.com/blogging/2014/03/26/age-steel/ Or http://demo.tagdiv.com/newsmag/td-post-cruise-2015-swim-trend-blurred-lines/
Intermediate & Advanced SEO | | Hall.Michael0 -
Handling duplicate content, whilst making both rank well
Hey MOZperts, I run a marketplace called Zibbet.com and we have 1000s of individual stores within our marketplace. We are about to launch a new initiative giving all sellers their own stand-alone websites. URL structure:
Marketplace URL: http://www.zibbet.com/pillowlink
Stand-alone site URL: http://pillowlink.zibbet.com (doesn't work yet)
Essentially, their stand-alone website is a duplicate of their marketplace store. Same items (item title, description), same seller bios, same shop introduction content etc., but it just has a different layout. You can scroll down and see a preview of the different pages (if that helps you visualize what we're doing), here. My questions: My desire is for both the seller's marketplace store and their stand-alone website to have good rankings in the SERPs. Is this possible? Do we need to add any tags (e.g. "rel=canonical") to one of these so that we're not penalized for duplicate content? If so, which one? Can we just change the meta data structure of the stand-alone websites to skirt around the duplicate content issue? Keen to hear your thoughts and if you have any suggestions for how we can handle this best. Thanks in advance!
Intermediate & Advanced SEO | | relientmark0 -
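For reference on the rel=canonical option raised in the question: if the duplicate signals were consolidated toward the marketplace version, the tag on the stand-alone subdomain page would look like the sketch below. Bear in mind that a canonicalized page is generally dropped from the index in favour of its target, which conflicts with the stated goal of ranking both versions.

```html
<!-- In the <head> of http://pillowlink.zibbet.com, pointing at the marketplace store -->
<link rel="canonical" href="http://www.zibbet.com/pillowlink" />
```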
Best way to target multiple geographic locations
Hello Mozzers! If you are a service provider wanting to target geographic locations outside of the region where you're physically located, what's the best approach? For example, I have a service provider whose main market is not where they're located - they're based in Devon UK, yet main markets are London, Birmingham, Newcastle, Edinburgh. They have clients in all these cities, so I could definitely provide content relevant to each city - perhaps a page for each city detailing work and services (and possibly listing clients). However, does the lack of a physical presence (and local phone number) in these cities make such city pages virtually impossible to rank these days? Does Google require a physical presence/phone number? Thanks in advance, Luke
Intermediate & Advanced SEO | | McTaggart0 -
Best strategy for moving a country subdirectory to a dedicated ccTLD without losing organic search volume?
Community, We are about to move one of our most popular country subdirectories from brandname.com/de/ to brandname.de. We have just purchased the domain, so while it was registered in 2009, the URL has zero domain authority. What is the best strategy to execute the move while being cautious about losing too much of the organic search volume the subdirectory is receiving right now? Obviously it will take some time to build up DA on the ccTLD, so maybe it is a good idea to keep the country directory a little longer and start on the ccTLD with just a static landing page, place some links, wait until it builds up some DA, and then perform the move. Thoughts? /TomyPro
Intermediate & Advanced SEO | | tomypro0 -
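Whenever the move in the question above is executed, the standard mechanism is a page-to-page 301 redirect from each /de/ URL to its counterpart on the new domain, so that existing link equity and rankings transfer. A minimal sketch, assuming brandname.com runs Apache with mod_rewrite enabled (the domain names come from the question; everything else is illustrative):

```apacheconf
# Hypothetical .htaccess on brandname.com:
# permanently redirect every URL under /de/ to the same path on brandname.de,
# e.g. brandname.com/de/produkte/ -> www.brandname.de/produkte/
RewriteEngine On
RewriteRule ^de/(.*)$ https://www.brandname.de/$1 [R=301,L]
```

Pairing the redirects with a change-of-address notification in Search Console (where applicable) typically shortens the period in which the new domain has to re-earn its rankings.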
I rank well in Google but poorly in Yahoo
Why is this the case? If there is a filter or penalty on my site from Yahoo, how do I ask Yahoo to correct this?
Intermediate & Advanced SEO | | DavidS-2820610