How will Google deal with the cross-links on my multiple-domain site?
-
Hi,
I can't find any good answer to this question so I thought, why not ask Moz.com ;-)!
-
I have a site, let's call it webshop.xx
-
It targets a few languages/markets: German, Dutch, Belgian, English and French.
-
I use a different TLD with a different IP for each of these languages, so I'll end up with:
-
webshop.de, webshop.nl, webshop.be, webshop.co.uk, webshop.com & webshop.fr
-
They all link to each other, and every subpage that has a translated counterpart on another site gets a link too, so webshop.com/stuff links to webshop.de/stuff.
-
My main website, webshop.com, gets links from each of these other domains, which both Open Site Explorer and Majestic SEO count as external links (this is already happening).
My question: how will Google treat the cross-links coming from these domains in the long run? Some guesses I made:
-
I get full external-link juice (the content is translated, so it's unique?)
-
I get only some of the juice of an external link
-
They are actually seen as internal links
-
I'll get a penalty
Thanks in advance, guys!
-
-
Thanks Alex, that is definitely something I'll have to look into. All pages are translated by hand.
-
It depends on how your content is translated - is it auto-translated, or a high-quality translation by a native speaker of the language? Google has said that auto-translations can make for a really bad user experience, so in that case your translated content could be ignored, and your sites could even be penalised if it looks like you're generating the content to spam links. If your translations are good quality, you should not have a problem - but you do need to send the correct signals to the search engine crawlers.
- Mark up your content or sitemaps with the hreflang attribute. Note the two-part values when targeting particular countries, e.g. hreflang="fr-fr" if your content is for French speakers in France and hreflang="fr-be" for French speakers in Belgium. The valid language and country codes are listed in Google's hreflang documentation.
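To make that concrete, here is a sketch of what the hreflang annotations could look like in the head of one translated page. The domains and the /stuff path are taken from the question above; whether webshop.be should be tagged nl-be or fr-be (or both, on separate URLs) depends on which Belgian audience it actually targets, so treat those values as assumptions:

```html
<!-- Illustrative <head> markup for https://webshop.com/stuff -->
<link rel="alternate" hreflang="en" href="https://webshop.com/stuff" />
<link rel="alternate" hreflang="de-de" href="https://webshop.de/stuff" />
<link rel="alternate" hreflang="nl-nl" href="https://webshop.nl/stuff" />
<link rel="alternate" hreflang="nl-be" href="https://webshop.be/stuff" />
<link rel="alternate" hreflang="en-gb" href="https://webshop.co.uk/stuff" />
<link rel="alternate" hreflang="fr-fr" href="https://webshop.fr/stuff" />
<!-- Fallback for visitors who match none of the above -->
<link rel="alternate" hreflang="x-default" href="https://webshop.com/stuff" />
```

Every language version should carry the same complete set of annotations, including a self-referencing entry - if the tags aren't reciprocal across all the domains, Google may ignore them. The same information can alternatively be declared in your XML sitemaps instead of the page markup.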
- You should also have a Webmaster Tools account for each TLD, and geotarget the domains to the relevant country.
- Country-specific addresses, phone numbers and currency on each translated website can all help send the right signals about your content too.
Google's advice is here: https://support.google.com/webmasters/answer/182192
In answer to your question, I'm not sure exactly how Google will weigh the links - just make sure you do everything properly to avoid potential problems. I'd say it won't be option 1 or 4.