ccTLDs vs folders
-
My company is looking at expanding internationally; we currently have subdomains for the UK and Canada. I'm making recommendations on improving our SEO, and one of the parts I'm struggling with is the benefits of ccTLDs vs using folders.
I know the basic argument about Google recognizing ccTLDs as geo-specific, so they get priority. But I'd like to know HOW much priority they get. We have unique keywords and a pretty strong domain; is having a ccTLD so much better that it would be worth going that route rather than creating folders within our current domain?
Thanks,
Jacob
-
Hi Jacob,
Use subfolders. Remember to use the hreflang tag, including the country code.
If you have the ccTLD domains, redirect them to the corresponding subfolders.
For example, if you have yoursite.co.uk, point it to yoursite.com/uk/. Also, remember to add every subfolder to Google Search Console (formerly Google Webmaster Tools) and declare, for each one, the country it is intended for.
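To make that concrete, here is a minimal sketch of what the hreflang markup could look like in the head of each page version; the /ca/ folder for Canada, the language-country codes and the x-default entry are assumptions based on the question above, not part of the original answer:

```html
<!-- Every version of the page lists all alternates, including itself -->
<link rel="alternate" hreflang="en-gb" href="https://yoursite.com/uk/" />
<link rel="alternate" hreflang="en-ca" href="https://yoursite.com/ca/" />
<link rel="alternate" hreflang="x-default" href="https://yoursite.com/" />
```

The ccTLD redirect itself can be a plain server-level 301, e.g. a rewrite rule on yoursite.co.uk sending every path to the matching path under yoursite.com/uk/.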
Hope it helps.
GR. -
There definitely is a benefit to keeping all of your content on one domain (using folders) and building up the overall Domain Authority of one domain/one site.
When it comes to deciding whether or not to go with a ccTLD, consider your users/visitors first. How will they interact with the site? Will they trust it more if it's on a ccTLD for their country? If so, then consider that it will ultimately be better for your business if users like and trust the site more.
Another consideration is that you'll be creating an entirely new site on a ccTLD. You'll be starting fresh and will need links and time to get it to rank and to get the traffic to where you need it to be. Then there's the whole issue of content: you'll need unique content for the new site. If you can afford the time and effort involved in creating a completely new site, and it makes sense for your users, then I would consider the ccTLD route.
Related Questions
-
Home page vs inner page?
Do you believe that the advantage of targeting a search term on the home page is now smaller than it used to be? As I understand it, CTR is a big factor now, and as far as I can see, if two pages are otherwise equal on-page, the better CTR will win out. The issue with the home page is that SERP stars cannot be used, so the CTR on a product page will be higher. I feel that even if you were able to get a home page ranking faster (one year instead of two), you would still lose out in the end because the product page wins on CTR. Do you think this is correct?
Intermediate & Advanced SEO | | BobAnderson0 -
What are the best page titles for sub-folders or sub-directories? Same as the website?
Hi all, We always mention "brand & keyword" in every page title along with the topic of the page, like "Topic | vertigo tiles". Let's say there is a sub-directory with hundreds of pages: what is the best page title practice for mentioning "brand & keyword" across all pages of the sub-directory to benefit in terms of SEO? Can we add "vertigo tiles" to all pages of the sub-directory, or must we not repeat the same phrase? Thanks,
Intermediate & Advanced SEO | | vtmoz0 -
TLDs vs ccTLDs?
Was trying to get this question answered in another thread, but someone marked it as "answered" and no more responses came. So the question is about best practices on TLDs vs ccTLDs. I have a .com TLD with DA 39 which redirects to the localized ccTLDs .co.id and .com.sg, which have DA 17. All link building has been done for the .com TLD. In terms of content, it sometimes overlaps, as the same content shows up on both the ccTLDs. What are best practices here? It doesn't look like my ccTLDs are getting any juice from the TLD. Should I just take my ccTLDs and combine them into my TLD as subdomains? Will I see any benefits? Thanks, V
Intermediate & Advanced SEO | | venkatraman0 -
International Image SEO - one host vs multiple hosts
Hi all, I've got 3 sites (same name) located in Australia, the US and the UK. Currently these sites all pull images (which I own) from one location. I'd like to create image XML sitemaps for each of these sites. As I see it, my options are:
1. Keep the images hosted in the one place and create image XML sitemaps for each of the 3 sites (which seems to be technically OK, because https://support.google.com/webmasters/answer/178636?hl=en&ref_topic=20986 states that if the image URL isn't on the same domain, both domains need to be verified in Webmaster Tools). However, is there a risk here that the sitemaps will conflict because they pull images from the same host?
2. Host the images locally (i.e. the same images will be hosted in 3 locations) and apply hreflang in the sitemap.
Does anyone know which of these options is best (obviously #1 would be more convenient), or whether there are any other options for attacking this issue? Thanks!
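For illustration, a minimal sketch of what a single sitemap entry could look like under option 2, combining the image extension with hreflang annotations; the example.com domains, the page path and the image file name are placeholders, not the actual sites:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.example.com.au/page/</loc>
    <!-- hreflang alternates for the AU, US and UK versions of this page -->
    <xhtml:link rel="alternate" hreflang="en-au" href="https://www.example.com.au/page/"/>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://www.example.com/page/"/>
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/page/"/>
    <!-- image hosted locally on this site (option 2) -->
    <image:image>
      <image:loc>https://www.example.com.au/images/photo.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```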
Intermediate & Advanced SEO | | oline1230 -
For a mobile website, is it better to use a 301 vs. a 302 redirect?
We are vetting a vendor for our mobile website and they are recommending using a 302 redirect with rel=canonical vs. a 301 redirect, due to 301 caching issues. All the research I've done shows that a 301 is by far the better way to go due to proper indexing, which in turn will enhance our page authority. Thoughts on why a 302 would be a better fit than a 301 on our mobile site?
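For reference, a minimal sketch of the bidirectional annotations that normally accompany a separate mobile site, whichever redirect type is chosen; the m.example.com host and the breakpoint are assumptions:

```html
<!-- On the desktop page (www.example.com/page/): point to the mobile version -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page/" />

<!-- On the mobile page (m.example.com/page/): canonical back to the desktop page -->
<link rel="canonical" href="https://www.example.com/page/" />
```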
Intermediate & Advanced SEO | | seohdsupply1 -
Frequent FAQs vs duplicate content
It would be helpful for our visitors if we were to include an expandable list of FAQs on most pages. Each section would have its own list of FAQs specific to that section, but all the pages in that section would have the same text. It occurred to me that Google might view this as a duplicate content issue. Each page does have a lot of unique text, but underneath we would have lots of text repeated throughout the site. Should I be concerned? I guess I could always load these by AJAX after page load if it might penalize us.
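If you do end up loading the shared FAQ text after page load, a minimal sketch could look like the following; the /faqs/section-faq.html fragment URL and the #faq container are assumptions, and note that Google can execute JavaScript, so this is not a guaranteed way to keep the text out of the index:

```html
<!-- Empty container; the shared FAQ markup is fetched after load rather than
     being repeated in the HTML of every page in the section -->
<div id="faq"></div>
<script>
  fetch('/faqs/section-faq.html')
    .then(function (response) { return response.text(); })
    .then(function (html) {
      document.getElementById('faq').innerHTML = html;
    });
</script>
```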
Intermediate & Advanced SEO | | boxcarpress0 -
How do I 301 Redirect a complete folder?
Hi, I want to delete a folder and all of its contents. I then need to redirect anyone who is trying to reach a file in that folder to another page on my site. Example: www.mydomain.com/folder/ (contains 50 pages). I want to delete the folder and all 50 pages, so if someone tries to reach www.mydomain.com/folder/page1.php, the redirect would take them to a specific page on my site. I'm doing this to clean up old content. How would I do this in the .htaccess file? I have redirected a page but not a folder. Thanks in advance! Force7
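A minimal sketch of the kind of .htaccess rule this usually takes (Apache's mod_alias is assumed; /folder/ is from the example above, while /new-page/ is a placeholder for the specific page visitors should land on):

```apache
# 301 /folder/ and everything inside it (all 50 pages) to one specific page
RedirectMatch 301 ^/folder(/.*)?$ https://www.mydomain.com/new-page/
```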
Intermediate & Advanced SEO | | Force70 -
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back into the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOmoz reading this morning, I came to wonder whether that approach may now be harming us:
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions
Specifically, I'm concerned that a) we're blocking the flow of link juice, and b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | | kurus0
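For anyone picturing the setup being described, a minimal sketch of that style of robots.txt restriction; the parameter names are assumptions, not the actual rules in place:

```
User-agent: *
# block sort-order and other parameter variants of result pages
Disallow: /*?sort=
Disallow: /*?order=
# block paginated result pages beyond page one
Disallow: /*?page=
```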