How will Google deal with the crosslinks for my multiple-domain site?
-
Hi,
I can't find any good answer to this question so I thought, why not ask Moz.com ;-)!
-
I have a site, let's call it webshop.xx
-
It targets a few languages/markets: German, Dutch, Belgian, English and French.
-
I use a different TLD with a different IP for each of these languages, so I'll end up with:
-
webshop.de, webshop.nl, webshop.be, webshop.co.uk, webshop.com & webshop.fr
-
They all link to each other, and every subpage that is translated from another site gets a link from the other languages as well, so webshop.com/stuff links to webshop.de/stuff.
-
My main website, webshop.com, gets links from each of these other domains, which Open Site Explorer as well as Majestic SEO sees as external links (this is already happening).
My question: how will Google deal with the crosslinks coming from these domains in the long run? Some guesses I made:
1. I get full external-link juice (the content is translated, so it's unique?)
2. I get a bit of the juice of an external link
3. They are actually seen as internal links
4. I'll get a penalty
Thanks in advance guys!!!
-
-
Thanks, Alex, that is definitely something I'll have to look into. All pages are translated by hand.
-
It depends on how your content is translated - is it auto-translated, or a high-quality translation by a native speaker of the language? Google have said that auto-translations can make for a really bad experience - so in that case your translated content could be ignored, and your sites could even be penalised if it looks like you're generating the content to spam links. If your translations are good quality you should not have a problem - but you do need to send the correct signals to the search engine crawlers.
- Mark up your content or sitemaps with the hreflang attribute. Note the two values when targeting particular countries, e.g. hreflang="fr-fr" if your content is for French speakers in France and hreflang="fr-be" for French speakers in Belgium. The valid language and country codes are linked from Google's help page below. (See the example snippet after this list.)
- You should also have a Webmaster Tools account for each TLD, and geotarget the domains to the relevant country.
- Country-specific addresses, phone numbers and currency on each translated website can all help send the right signals about your content too.
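For illustration, here is roughly what the hreflang annotations could look like in the head of the English page webshop.com/stuff. The exact language-country pairs are assumptions on my part (e.g. whether webshop.be should carry nl-be, fr-be or both depends on which audience it actually targets), so treat the values as placeholders rather than a definitive setup:

```html
<!-- Illustrative hreflang set for webshop.com/stuff; the language-country values are assumed. -->
<!-- Every domain's version of the page needs the same full set, including a self-referencing tag, -->
<!-- and the annotations must be reciprocal across all six domains or Google will ignore them. -->
<link rel="alternate" hreflang="en" href="http://webshop.com/stuff" />
<link rel="alternate" hreflang="en-gb" href="http://webshop.co.uk/stuff" />
<link rel="alternate" hreflang="de-de" href="http://webshop.de/stuff" />
<link rel="alternate" hreflang="nl-nl" href="http://webshop.nl/stuff" />
<link rel="alternate" hreflang="nl-be" href="http://webshop.be/stuff" />
<link rel="alternate" hreflang="fr-fr" href="http://webshop.fr/stuff" />
<link rel="alternate" hreflang="x-default" href="http://webshop.com/stuff" />
```

If you'd rather not touch the page templates, the same alternate URLs can be declared per URL in your XML sitemaps instead.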
Google's advice is here: https://support.google.com/webmasters/answer/182192
In answer to your question, I'm not sure - just make sure you do everything properly to avoid potential problems. I'd say it won't be 1 (full external link juice) or 4 (a penalty).
Related Questions
-
Hiding Elements on Mobile. Will this affect SEO?
Hey guys and gals, I am hiding elements with @media sizes on the mobile experience for this site: http://prepacademyschools.org/ My question is: when hiding elements from mobile, will this have a negative effect on rankings for mobile and/or desktop? Right now it is a hero banner and a testimonial. I'm asking because I feel responsive design is now working against conversions on mobile - desktop typically has the same info several times, whereas on mobile that becomes repetitive and is only needed once. Thanks,
Site De-Indexed except for Homepage
Hi Mozzers,
Our site has suddenly been de-indexed from Google and we don't know why. All pages are de-indexed in Google Webmaster Tools (except for the homepage and sitemap), starting after 7 September. Please see the screenshot attached to show this: 7 Sept 2014 - 76 pages indexed in Google Webmaster Tools; 28 Sept until current - 3-4 pages indexed in Google Webmaster Tools, including the homepage and sitemaps. Site is: (removed) As a result, all rankings for child pages have also disappeared in the Moz Pro Rankings Tracker. Only the homepage is still indexed and ranking. It seems like a technical issue is blocking the site. I checked robots.txt, noindex, nofollow and canonical tags, and ran a site crawl for 404 errors, but can't find anything. The site is online and accessible. No warnings or errors appear in Google Webmaster Tools. One recent change: we moved from a shared to a dedicated server around 7 Sept (using the same host and location). Prior to the move our preferred domain was www.domain.com WITH the www. However, during the move they set our domain as domain.tld WITHOUT the www. Running a site:domain.tld vs site:www.domain.tld command now finds pages indexed under the non-www version, but no longer under the www version. Could this be a cause of the de-indexing? Yesterday we had our host reset the domain to use www again and we resubmitted our sitemap, but there is no change yet to the indexing. What else could be wrong? Any suggestions appreciated. Thanks.
How does Google determine if a link is paid or not?
We are currently doing some outreach to bloggers to review our products and provide us with backlinks (preferably followed). The bloggers get to keep the products (usually about $30 worth). According to Google's link schemes guidelines, this is a no-no. But my question is: how would Google ever know if the blogger was paid or given freebies for their content? This is the "best" article I could find related to the subject: http://searchenginewatch.com/article/2332787/Matt-Cutts-Shares-4-Ways-Google-Evaluates-Paid-Links The article tells us what qualifies as a paid link, but it doesn't tell us how Google identifies whether links were paid or not. It also says that "loans" are okay, but "gifts" are not. How would Google know the difference? For all Google knows (maybe everything?), the blogger returned the products to us after reviewing them. Does anyone have any ideas on this? Maybe Google watches for terms like "this is a sponsored post" or "materials provided by 'x'". Even so, I hope that wouldn't be enough to warrant a penalty.
People buying links to their profiles on my site
As we have a major Penguin update looming in the background, I am looking for expert advice on how to deal with professionals buying into link programs, whether they are doing it deliberately or not. Our site provides detailed profile information on hundreds of thousands of professionals, and some professionals apparently believed that buying into a link program would lift their profile in the SERPs. About 10 professionals have paid shady link building companies to buy links to their profiles on our site. The biggest offender bought over 1,500 links to his profile. Aside from adding the known toxic links to our disavow file, what else can we do to avoid any link penalties? I can think of three distinct options and would love to hear feedback, especially based on actual experience. Option 1: 404 the existing profile - "http://www.anysite.com/jones_smith" - and create a new URL, "http://www.anysite.com/jones_smith_1". Option 2: Keep the existing URL and fully rely on the disavow file. Option 3: Contact the professionals and kindly ask them to stop buying links and to contact their link building companies to remove the links. Any other ideas?
How The HELL Is This Site Ranking So Well In Google Places?
When I do a search for this site, it ranks number 2 on Google, just below the official Federation of Master Builders website, for the keyword phrase "builders in london". This is the site: http://bit.ly/Lypo8E - a nasty-looking blog which has nothing to do with builders, and they don't even have an address anywhere on the site. The only thing I can see is that they are sharing their address with a lot of other businesses, and all of the citations from those other businesses are causing them to rank higher in Google Places - but surely Google can't be that stupid, right?
Separate domain name for a subdomain?
I just created a subdomain to help our main TLD website. I was wondering if it's smart to register a separate domain for this subdomain, set up a forward to it, and build links to that domain. The reason I was thinking about it is that it would be easier for people to remember than typing in subdomain.maindomain.com. But I don't want the main website to suffer, since the purpose of creating this subdomain and its content is to help the main domain. Any input on this? Thank you.
Retail Site and Internal Linking Best Practices
I am in the process of recreating my company's website and, in addition to the normal retail pages, we are adding a "learn" section with user manuals, reviews, manufacturer info, etc. It's going to be a lot of content, and there will be linking to these "learn" pages from both products and other "learn" pages. I read in an SEOmoz blog post that too much internal linking with optimized anchor text can trigger down-rankings from Google as a penalty. Well, we're talking about having 6-8 links to "learn" pages from product pages and interlinking many times within the "learn" pages, like Wikipedia does. And I figured they would all have optimized text, because I think that is usually best for the end user (I personally like to know that I am clicking on "A Review of the Samsung XRK1234" rather than just "A Review of Televisions"). What is best practice for this? Is there a suggested limit to the number of links, or to how many of them should have optimized text, for a retail site with thousands of products? Any help is greatly appreciated!
Client Selling Links On One Site Hurt Their Other Site?
Hi, I have a client who is thinking about selling ads on one site they own via something like textlinkads.com. Do you think they run any risk of exposing their other sites to scrutiny, penalties or problems?