ccTLDs vs folders
-
My company is looking at expanding internationally; we currently have subdomains in the UK and Canada. I'm making recommendations on improving SEO, and one of the parts I'm struggling with is the benefits of ccTLDs vs. using folders.
I know the basic argument that Google recognizes ccTLDs as geo-specific, so they get priority. But I'd like to know HOW much priority they get. We have unique keywords and a pretty strong domain. Is a ccTLD so much better that it would be worth going that route rather than creating folders within our current domain?
Thanks,
Jacob
-
Hi Jacob,
Use subfolders. Remember to use the hreflang tag, including the country code.
If you have the ccTLD domains, redirect them to the corresponding subfolders.
For example: if you have yoursite.co.uk, point it to yoursite.com/uk/. Also, remember to add every subfolder to Google Search Console (formerly Google Webmaster Tools) and declare the country each one is intended for.
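A minimal sketch of the two pieces, assuming a UK/Canada setup under yoursite.com (the locale codes and paths are illustrative):

```html
<!-- In the <head> of each localized page, list every variant plus a default -->
<link rel="alternate" hreflang="en-gb" href="https://yoursite.com/uk/" />
<link rel="alternate" hreflang="en-ca" href="https://yoursite.com/ca/" />
<link rel="alternate" hreflang="x-default" href="https://yoursite.com/" />
```

And if yoursite.co.uk sits on Apache, the redirect could be a blanket 301 in .htaccess:

```apache
# Send every path on yoursite.co.uk to the matching path under /uk/
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?yoursite\.co\.uk$ [NC]
RewriteRule ^(.*)$ https://yoursite.com/uk/$1 [R=301,L]
```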
Hope it helps.
GR.
-
There is definitely a benefit to keeping all of your content on one domain (using folders) and building up the overall Domain Authority of a single domain/site.
When it comes to deciding whether or not to go with a ccTLD, consider your users/visitors first. How will they interact with the site? Will they trust it more if it's on a ccTLD for their country? If so, remember that it will ultimately be better for your business if users like and trust the site more.
Another consideration is that you'll be creating an entirely new site on the ccTLD. You'll be starting fresh and will need links and time to ultimately get it to rank and get the traffic to where you need it to be. Then there's the issue of content: you'll need unique content for the site. If you can afford the time and effort involved in creating a completely new site, and it makes sense for users, then I would consider the ccTLD route.
Related Questions
-
Crawl and Indexation Error - Googlebot can't/doesn't access specific folders on microsites
Hi, my first time posting here. I am just looking for some feedback on an indexation issue we have with a client, and on possible next steps or items I may have overlooked. To give some background, our client operates a website for the core brand and also a number of microsites based on specific business units, so you have corewebsite.com along with bu1.corewebsite.com, bu2.corewebsite.com, and so on. The content structure isn't ideal, as each microsite follows a structure of bu1.corewebsite.com/bu1/home.aspx, bu2.corewebsite.com/bu2/home.aspx and so on. In addition, each microsite has duplicate folders from the other microsites: bu1.corewebsite.com has the indexable folder bu1.corewebsite.com/bu1/home.aspx but also bu1.corewebsite.com/bu2/home.aspx, and likewise bu2.corewebsite.com has bu2.corewebsite.com/bu2/home.aspx but also bu2.corewebsite.com/bu1/home.aspx. There are 5 different business units, so you have this duplicate-content scenario across all the microsites. This situation is being addressed in the medium-term development roadmap and will be rectified in the next iteration of the site, but that is still a ways out.

The issue: about 6 weeks ago we noticed a drop-off in search rankings for two of our microsites (bu1.corewebsite.com and bu2.corewebsite.com). Over a period of 2-3 weeks pretty much all our terms dropped out of the rankings and search visibility dropped to essentially 0. I can see that pages from the websites are still indexed, but oddly it is the duplicate-content pages: bu1.corewebsite.com/bu3/home.aspx or bu1.corewebsite.com/bu4/home.aspx are still indexed; similarly, on the bu2.corewebsite.com microsite, bu2.corewebsite.com/bu3/home.aspx and bu4.corewebsite.com/bu3/home.aspx are indexed, but no pages from the BU1 or BU2 content directories seem to be indexed under their own microsites. Logging into Webmaster Tools, I can see a "Google couldn't crawl your site because we were unable to access your site's robots.txt file" error. This was a bit odd, as there was no robots.txt in the root directory, but I got some weird results when I checked the BU1/BU2 microsites in the technicalseo.com robots.txt tool. Also, given that there is a redirect from bu1.corewebsite.com/ to bu1.corewebsite.com/bu4.aspx, I thought maybe there could be something there, so we removed the redirect and added a basic robots.txt to the root directory for both microsites. After this we saw a small pickup in site visibility; a few terms popped into our Moz campaign rankings but dropped out again pretty quickly. Also, the error message in GSC persisted.

Steps taken so far after that:
1. In Google Search Console, I confirmed there are no manual actions against the microsites.
2. Confirmed there are no instances of noindex on any of the pages for BU1/BU2.
3. A number of the main links from the root domain to the BU1/BU2 microsites have a rel="noopener noreferrer" attribute, but we looked into this and found it has no impact on indexation.
4. Looking into this issue, we saw some people had similar problems when using Cloudflare, but our client doesn't use this service.
5. Using a response/redirect header checker tool, we noticed a timeout when trying to mimic Googlebot accessing the site.
6. Following on from point 5, we got hold of a week of server logs from the client. I can see Googlebot successfully pinging the site and not getting 500 response codes from the server... but I couldn't see any instance of it trying to index microsite BU1/BU2 content.

So it seems to me that the issue could be something server-side, but I'm at a bit of a loss as to next steps. Any advice at all is much appreciated!
Intermediate & Advanced SEO | ImpericMedia
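Point 5 in the question above is the strongest clue, and it can be reproduced from the command line. A sketch, assuming the hostnames from the question and a standard combined-format access log (the log filename is an assumption):

```bash
# Fetch robots.txt while presenting Googlebot's user agent; -I requests headers only
curl -I --max-time 30 \
  -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
  https://bu1.corewebsite.com/robots.txt

# Compare with a plain request; if this succeeds while the call above
# times out, something server-side is filtering on the user agent
curl -I --max-time 30 https://bu1.corewebsite.com/robots.txt

# List Googlebot requests into the BU1 content directory
# ($4 = timestamp, $7 = requested path, $9 = status in a combined-format log)
grep "Googlebot" access.log | grep "/bu1/" | awk '{print $4, $7, $9}'
```

If the Googlebot-UA request hangs while the plain one returns instantly, the usual suspects are a firewall, WAF, or rate-limiting rule matching on the user-agent string.
-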
Constructing the perfect META Title - Ranking vs CTR vs Search Volume
Hello Mozzers! I want to discuss the science behind the perfect META title in terms of three factors: ranking, CTR, and search volume.

Hypothetical scenario: a furniture company, "Boogie Beds", wants to optimise the META title tag for their "Cane Beds" ecommerce page.
1. The keyword "Cane Beds" has a search volume of 10,000
2. The keyword "Cane Beds For Sale" has a search volume of 250
3. The keyword "Buy Cane Beds" has a search volume of 25

One of Boogie Beds' SEOs suggests the META title "Buy Cane Beds For Sale Online | Boogie Beds" to target and rank for all three keywords and capture long-tail searches. The other Boogie Beds SEO says no! The META title should be "Cane Beds For Sale | Boogie Beds" to target the two most important competitive keywords and sacrifice the "Buy" keyword.

Which SEO would you agree with more, considering: 1. ranking ability, 2. click-through rates, 3. long-tail search volume, 4. keyword dilution? Much appreciated! MozAddict
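For reference, the tag being debated is just the page's HTML title element. A sketch of the second SEO's version (the meta description text is purely illustrative):

```html
<head>
  <!-- Titles beyond roughly 60 characters are often truncated in SERPs;
       front-load the primary keyword -->
  <title>Cane Beds For Sale | Boogie Beds</title>
  <meta name="description" content="Illustrative description for the Cane Beds page.">
</head>
```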
Intermediate & Advanced SEO | MozAddict
-
Subdomain or folder for a section not focused on my core business
Hello there, I'm installing your analytics tool and it seems really great. I'm going to use it for sure, but I have a question that is more strategic, and it's something the tool can't help me with 😛 I have a website, active since 2008 and really well known in my country as a service website... we're like your "advisor" for utilities and insurance. The reason people come is "savings", focused mainly on utilities (broadband, gas, electricity), checking accounts, and insurance. I've always used folders in my URLs instead of subdomains (for example www.site.com/section1 or www.site.com/section2). I'm now planning to open a new website section related to saving but not really close to what we do in the rest of the website. This section is about coupons, vouchers, and small offers. The thing is, for that section I'm going to write a lot (really a lot) of content and try to gain a lot of external links. Obviously I already have a lot of content about my core business, and I'm going to keep writing content for the original categories too. This section is nevertheless secondary for my business, and my worry is that Google could come to identify us as a website mainly focused on this new product. I'm really well indexed, so I don't want this decision to have any effect on my original situation. Finally, the question 😛 Is it better to keep the same site structure with folders for this section, or mark it as a subdomain to signal that it's going to be like a totally different site with its own dedicated news and all the rest? That's why I'm evaluating a subdomain, but I'm not really convinced, because subdomains can be treated as a different approach compared to the original structure, and of course using a folder can help the section benefit from the root site's rank. On the other hand, what might Google think about my core business? Thanks a lot for your help
Intermediate & Advanced SEO | Uby85
-
Website.com/blog/post vs website.com/post
I have clients with full WordPress sites and clients with just a WordPress blog on the back of a website. The clients with entire WordPress sites seem to be ranking better. Do you think the URL structure could have anything to do with it? Does having that extra /blog folder decrease SEO effectiveness at all? I'm setting up a few new blogs now...
Intermediate & Advanced SEO | PortlandGuy
-
Moving a database from a subfolder to a new domain
Hi Mozzers, I have a very popular property database in a subfolder of my main website. It accounts for about two thirds of my overall traffic. I'm moving the database to a new server, so my choices are now a subdomain or a completely new domain. If I move the database to a new domain, will my main site lose SEO value because of how popular the database is? Cheers, Matt
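If the database does move to a new domain, the standard way to preserve most of its equity is a one-to-one 301 map from the old subfolder. A hypothetical Apache sketch (the /database/ folder and newdomain.com are assumptions):

```apache
# .htaccess on the main site: send every old /database/ URL
# to the same path on the new domain with a permanent redirect
RewriteEngine On
RewriteRule ^database/(.*)$ https://newdomain.com/$1 [R=301,L]
```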
Intermediate & Advanced SEO | Horizon
-
How to Disallow Specific Folders and Subfolders for Crawling?
Today I checked the indexing of my website in Google and found a very interesting result (see the attached Google search result). I'm aware of the robots.txt file and could disallow the images folder to solve this issue, but that may block my images from appearing in Google Image Search. So how can I fix this issue?
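One pattern that fits this case is to block the folder for crawlers in general but leave it open to Google's image crawler via a more specific user-agent group. A sketch, where the /images/ path is an assumption:

```
User-agent: *
Disallow: /images/

# Googlebot-Image obeys the most specific matching group, so this
# record overrides the wildcard block for image crawling
User-agent: Googlebot-Image
Allow: /images/
```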
Intermediate & Advanced SEO | CommercePundit
-
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place on our search results to prevent Google from crawling through pagination links and other parameter-based variants of our results (sort order, etc.). The idea was to 'preserve crawl budget' in order to speed up the rate at which Google could get our millions of pages back in the index, by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOMoz reading this morning, I came to wonder whether that approach may now be harming us:
http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions

Specifically, I'm concerned that a) we're blocking the flow of link juice, and b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low-quality pages, etc., but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus
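For what it's worth, the usual alternative to a robots.txt block in this situation is to let Google crawl the paginated and parameter variants but keep them out of the index at page level, so link equity can still flow through them. A sketch (the canonical URL is a placeholder):

```html
<!-- On sort-order and other parameter variants of a results page:
     keep the variant out of the index, but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="https://example.com/search/widgets/">
```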