Best Practice for Inter-Linking to ccTLD Brand Domains
-
Team,
I am wondering what people recommend as best SEO practice for inter-linking language-specific brand domains, e.g.:
amazon.com
amazon.de
amazon.fr
amazon.it

Currently I have 18 ccTLDs for one brand in different languages (no DC). I am linking from each content page to each other language domain, providing a link to the equivalent content in a separate language on a different ccTLD domain. However, with Google's discouragement of site-wide links, I am reviewing this practice.
I am tending towards making the language redirects on each page JavaScript-driven and starting to link only from my home page to the other domains, with optimized link titles.
Does anyone have any thoughts/opinions on this topic that they are open to sharing?
/Thomas
-
Hi Thomas,
I think that what you're doing right now is fine -
"...linking from each content page to each other language domain, providing a link to the equivalent content in a separate language on a different CCTLD domain."
This seems sensible from a user perspective - I think the only potential downside is if you're implementing these links with lots of optimized anchor text, which could potentially be problematic.
Equally, utilising JavaScript to allow users to select language and location seems fine to me - something like the sketch below.
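A minimal sketch, purely for illustration - the domains, labels and element ID below are placeholder assumptions, not your actual setup:

<!-- Hypothetical country/language switcher: the change of page happens via JavaScript
     rather than via crawlable <a href> links, so no anchor text is involved -->
<select id="country-switcher" onchange="if (this.value) window.location.href = this.value;">
  <option value="">Choose your country</option>
  <option value="https://www.example.com/">United States</option>
  <option value="https://www.example.de/">Deutschland</option>
  <option value="https://www.example.fr/">France</option>
</select>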
I hope this helps,
Hannah
-
When you say it works - what exactly do you mean?
-
Here is my best practice regarding your issue.
I am running a similar company to yours, and these are the steps I have taken:
1. Unique hosting for each ccTLD domain.
2. I placed site-wide links to the other language versions too.
3. The anchor texts are not optimized - they are simply the country names, for example: Germany, Italy, Canada, etc. (see the sketch below).
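As an illustration only - the domains below are placeholders, not the poster's actual sites - such a site-wide block with plain country-name anchors might look like:

<!-- Site-wide footer block: one link per ccTLD, anchor text is just the country name -->
<nav class="country-links">
  <a href="https://www.example.de/">Germany</a>
  <a href="https://www.example.it/">Italy</a>
  <a href="https://www.example.ca/">Canada</a>
</nav>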
It works.
Related Questions
-
Sitemaps: Best Practice
What should and what shouldn't go in the sitemap? In particular, pages like subscribe to our newsletter / unsubscribe from our newsletter - is there really any benefit in highlighting those pages to the search engines? Thanks for any advice/anecdotes 🙂
Intermediate & Advanced SEO | Fubra
-
SEO Best Practices regarding Robots.txt disallow
I cannot find hard-and-fast direction on the following issue: the robots.txt file on my server has been set up to disallow "account" and "search" pages within my site, so I am receiving warnings from Google Search Console that URLs are being blocked by robots.txt (Disallow: /Account/ and Disallow: /?search=). Do you recommend unblocking these URLs? I'm getting a warning that over 18,000 URLs are blocked by robots.txt ("Sitemap contains urls which are blocked by robots.txt"), and it seems that I wouldn't want that many URLs blocked. Thank you!!
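For reference, the relevant part of the robots.txt described here presumably looks something like the sketch below (an assumption based on the warnings quoted, not the actual file); removing or narrowing these two rules is what unblocking would mean in practice:

# Sketch of the rules described in the question
User-agent: *
Disallow: /Account/
Disallow: /?search=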
Intermediate & Advanced SEO | jamiegriz
-
Changing Brand and Domain Name - SEO Impacts
Hi everyone, I'm hoping a few of you can help me out... We're an online-only retailer and we're currently looking at rebranding. This is for commercial reasons: our current name is difficult for customers to spell; it's not wholly representative of what we now offer; and we want to push offline and social marketing to help increase our DA. In a nutshell, our current name implies 'cheap' and we're moving more upmarket.
Our DA is only 10, and a re-brand will make our brand more marketable. A stronger brand and DA will help us climb up the rankings quickly - last year we ranked no. 1 for a relatively competitive term before dropping a few places. In terms of current traffic, 30% is via SEO (we have a low DA but rank OK for certain phrases) and 70% is via AdWords. We had our website redesigned last year and it performs well.
The idea is to have a new brand logo and colours and move to a new domain. We will keep all our existing products and content. Please could anyone let me know the implications of this move? What are the potential pitfalls, and what will we need to do to alert Google? I have read about 301 redirects - would these be required? As always, any help is very much appreciated. Many thanks, Abs
Intermediate & Advanced SEO | piazza
-
Does Disavowing Links Negate Anchor Text, or Just Negate Link Juice?
I'm not so sure that disavowing links also discounts the anchor text from those links, because nofollow links absolutely still pass anchor text values, and disavowing links is supposed to be akin to nofollowing them. I wonder because there's a potential client I'm working on an RFP for: they have tons of spammy directory links, all using keyword-rich anchor text, and they lost 98% of their traffic in Penguin 1.0 and haven't recovered. I want to know what I'm getting into, and if I just disavow those links, I'm thinking it won't help the anchor text ratio issues. Can anyone confirm?
Intermediate & Advanced SEO | MiguelSalcido
-
What is the best practice for URLs for E-commerce products in multiple categories?
Hello all! I have always worked successfully with SEO on e-commerce sites; however, we are currently revamping an older site for a client, so I thought I'd turn to the community and ask what best practices you are seeing for URL structures at the moment. Obviously we do not wish to create duplicate content, so the big question is: what would you do for the very best URL structure on an e-commerce site that has products in multiple categories? Let's imagine we are selling toy cars. I have a sports car for sale, so naturally it can go in the sports cars category, and it could also go into the convertibles category too. What is the best way you have found recently that works and increases rankings, but does not create duplicate content? Thanks in advance! 🙂 Kind regards, JDM
Intermediate & Advanced SEO | Hatfish
-
Link Juice + multiple links pointing to the same page
Scenario: The website has a menu consisting of 4 links: Home | Shoes | About Us | Contact Us. Additionally, within the body content we write about various shoe types and create a link with the anchor text "Shoes" pointing to www.mydomain.co.uk/shoes. In this simple example we have 2 instances of the same link pointing to the same URL location, 4 unique links, and 5 on-page links in total (see the sketch below).
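For clarity, the scenario above describes markup along these lines (the Home, About Us and Contact Us paths are invented placeholders):

<!-- Menu: 4 unique links -->
<nav>
  <a href="https://www.mydomain.co.uk/">Home</a>
  <a href="https://www.mydomain.co.uk/shoes">Shoes</a>
  <a href="https://www.mydomain.co.uk/about-us">About Us</a>
  <a href="https://www.mydomain.co.uk/contact-us">Contact Us</a>
</nav>
<!-- Body content: a second link with the same anchor text to the same /shoes URL,
     giving 2 instances of that link and 5 on-page links in total -->
<p>Read more about our range of <a href="https://www.mydomain.co.uk/shoes">Shoes</a>.</p>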
Questions: How many links would Google count as part of the link juice model? How would the link juice be weighted in terms of percentages? Would changing the anchor text in the body content to, say, "fashion shoes" have a different impact? Any other advice or best practice would be appreciated. Thanks, Mark
Intermediate & Advanced SEO | Mark_Ch
-
Redirect old .net domain to new .com domain
I have a quick question that I think I know the answer to, but I wanted to get some feedback to make sure, or to see if there's anything I'm missing. The long and short of it is that I'm working with a site that currently has a .net domain they've been running for 6 years. They've recently bought the .com of the same name as well. So the question is: I think it's obviously preferable to keep the .net and just direct the .com to it. However, if they would prefer to have the .com domain, is 301'ing the .net to the .com going to lose a lot of the equity they've built up in the site over the past years? And are there any steps that would make such a move easier? Also, if you have any tips or insight into a general transition of this nature, it would be much appreciated. Thanks!
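For what it's worth, a domain-wide 301 from the .net to the .com is normally a few lines of server configuration. A rough sketch, assuming Apache with mod_rewrite (the domain names are placeholders):

# .htaccess on the old .net domain: redirect every URL to the same path on the .com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.net$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]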
Intermediate & Advanced SEO | BrandLabs
-
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income lands on internal search pages, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag because of:
- Google guidelines: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
- Bad user experience
- The search pages are (probably) stealing rankings from our real landing pages
- The Webmaster notification "Googlebot found an extremely high number of URLs on your site" with links to our internal search results
I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I'm looking forward to your answer! Edit: Google has currently indexed several million of our internal search pages.
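For reference, the tag being discussed is the standard robots meta tag placed in the <head> of each internal search results page - a minimal sketch:

<!-- Keeps the page out of the index while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow">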
Intermediate & Advanced SEO | HrThomsen