How will Google deal with the cross-links on my multiple-domain site?
-
Hi,
I can't find any good answer to this question so I thought, why not ask Moz.com ;-)!
-
I have a site, let's call it webshop.xx
-
It serves a few languages/markets: German, Dutch, Belgian, English, and French.
-
I use a different TLD with a different IP for each of these languages, so I'll end up with:
-
webshop.de, webshop.nl, webshop.be, webshop.co.uk, webshop.com & webshop.fr
-
They all link to each other, and every subpage that has a translation on another site gets a link from the other languages as well, so webshop.com/stuff links to webshop.de/stuff.
-
My main website, webshop.com, gets links from each of the other domains, which both Open Site Explorer and Majestic SEO count as external links (this is already happening).
My question: how will Google deal with the cross-links coming from these domains in the long run? Some guesses I made:
-
1. I get full external-link juice (the content is translated, so it's unique?)
2. I get only a bit of the juice of an external link
3. They are actually seen as internal links
4. I'll get a penalty
Thanks in advance, guys!
-
-
Thanks Alex, that is definitely something I'll have to look into. All pages are translated by hand.
-
It depends on how your content is translated. Is it auto-translated, or a high-quality translation by a native speaker of the language? Google has said that auto-translations can make for a really bad user experience, so in that case your translated content could be ignored, and your sites could even be penalised if it looks like you're generating the content to spam links. If your translations are good quality, you should not have a problem, but you do need to send the correct signals to the search engine crawlers.
- Mark up your content or sitemaps with the hreflang attribute. Note the two-part values when targeting particular countries, e.g. hreflang="fr-fr" if your content is for French speakers in France and hreflang="fr-be" for French speakers in Belgium. The language and country codes are listed in Google's help page linked below.
- You should also have a Webmaster Tools account for each TLD, and geotarget the domains to the relevant country.
- Country-specific addresses, phone numbers and currency on each translated website can all help send the right signals about your content too.
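For example, the hreflang annotations in the page head might look like this. This is only a sketch using the asker's domains; the exact language-country codes depend on which audience each TLD actually targets (webshop.be could reasonably carry nl-be, fr-be, or both), and each page must also reference itself:

```html
<!-- In the <head> of http://webshop.com/stuff -->
<!-- Every language version carries the full set of alternates, itself included -->
<link rel="alternate" hreflang="en" href="http://webshop.com/stuff" />
<link rel="alternate" hreflang="en-gb" href="http://webshop.co.uk/stuff" />
<link rel="alternate" hreflang="de-de" href="http://webshop.de/stuff" />
<link rel="alternate" hreflang="nl-nl" href="http://webshop.nl/stuff" />
<link rel="alternate" hreflang="nl-be" href="http://webshop.be/stuff" />
<link rel="alternate" hreflang="fr-fr" href="http://webshop.fr/stuff" />
```

The same block should be repeated on each of the alternate URLs: hreflang annotations need to point back at each other to be honoured.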
Google's advice is here: https://support.google.com/webmasters/answer/182192
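If you'd rather not touch the page templates, the same annotations can go in the XML sitemap instead. Again a sketch using the asker's example URLs, not a drop-in file:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://webshop.com/stuff</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://webshop.com/stuff"/>
    <xhtml:link rel="alternate" hreflang="de-de" href="http://webshop.de/stuff"/>
    <xhtml:link rel="alternate" hreflang="fr-fr" href="http://webshop.fr/stuff"/>
  </url>
  <!-- Repeat a <url> entry for each alternate URL, each carrying the same set of links -->
</urlset>
```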
In answer to your main question, I'm not sure. Just make sure you do everything properly to avoid potential problems. I'd say the outcome won't be your first guess (full external-link juice) or your last (a penalty).