Multilingual website
-
My website is https://www.india-visa-gov.in and we are making it multilingual.
There are three options:
1. Country TLD, e.g. india-visa-gov.fr (French), india-visa-gov.de (German)
2. Subdomain, e.g. fr.india-visa-gov.in (French), de.india-visa-gov.in (German)
3. Folders, e.g. https://www.india-visa-gov.in/fr/ (French), https://www.india-visa-gov.in/de/ (German)
We have tried the 3rd option, but we need to know whether it is the better choice for long-term SEO health. Does Moz DA carry over better with subdomains, country TLDs, or folders? What does Moz recommend to maintain DA?
Thanks
-
Andreas
Thanks for sharing your story.
You did genuine outreach, and that works best for both human users and Google.
Insightful
-
Links from xyz.wordpress.com and abc.wordpress.com are two different links from two domains. Neither of them carries the "DA" of www.wordpress.com, because that is a different domain as well.
We can say that Google both sees it that way and doesn't: in Search Console the linking domain would show up as "wordpress.com". Nevertheless, the subdomains differ in a lot of respects, and DA is the most important one.
If you get more links from the same domain, each new link becomes less and less important (this applies to every new link, not to the old ones). In my tests (WordPress & Blogspot) the diminishing effect applied to the one linking subdomain, not to all the subdomains, which is somewhat understandable. So the answer is "it depends". The test is not 100% accurate and hard to compare, but that seems to be the way it is.
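To make the counting concrete, here is a rough Python sketch, purely an illustration I am adding, not how Moz or Google actually calculate anything: the backlink URLs are made up and the 0.5 decay factor is arbitrary. It just shows the difference between counting linking hosts (subdomains separately) and counting registrable domains, plus a toy diminishing-returns weighting.

```python
from collections import defaultdict
from urllib.parse import urlparse

# Hypothetical backlinks pointing at your site.
backlinks = [
    "https://xyz.wordpress.com/post-1",
    "https://xyz.wordpress.com/post-2",
    "https://abc.wordpress.com/review",
    "https://example.blogspot.com/mention",
]

def registrable_domain(host: str) -> str:
    """Very naive 'root domain' guess: the last two labels of the host.
    A real implementation would consult the Public Suffix List."""
    return ".".join(host.split(".")[-2:])

by_host = defaultdict(list)   # subdomains counted separately
by_root = defaultdict(list)   # subdomains collapsed into one domain
for url in backlinks:
    host = urlparse(url).hostname
    by_host[host].append(url)
    by_root[registrable_domain(host)].append(url)

print("Linking hosts:", len(by_host))                 # 3
print("Linking registrable domains:", len(by_root))   # 2

# Toy diminishing returns: each extra link from the same host is worth
# half of the previous one (the 0.5 is an arbitrary, made-up factor).
def link_value(count: int, decay: float = 0.5) -> float:
    return sum(decay ** i for i in range(count))

for host, links in by_host.items():
    print(host, "->", round(link_value(len(links)), 2))
```

Two links from xyz.wordpress.com come out worth 1.5 "units" instead of 2, while the single links keep their full weight; that is the whole point of the sketch, nothing more.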
What I have also noticed: when I see a big increase in organic traffic without having done anything, and no update was rolling out, it is usually down to one new organic link. You can build links as much as you want, but organic links beat everything, and I don't know how Google can figure that out so clearly.
So once you have a good amount of links, you should focus on your users. That is what I do, which is why I can't tell you how Moz or any other tool handles it; I simply don't care that much.
One of my former competitors has ten times more links than we have, and today I call him a "former" competitor because we reached a new level and he did not. We now have nine times more organic traffic, and he has been stuck where he is for a year. We even have a single page with more monthly organic visitors than his whole domain. We started at the same level 1.5 years ago. I did not build a single link; I just focused on users. A lot of users, too: we are talking about hundreds of thousands of organic visitors, which is a lot for Germany in this niche. OK, we are drifting into other topics, but that is not the point here.
All of these are finance topics, so maybe it is a YMYL-specific thing, but I have some more domains in non-YMYL topics that work better and better without building links. There is a lot of great content, though, which makes those domains earn links. At the very least you need an entry, a door opener, to get links coming in.
Whew, too much, hah. So you are somewhat right, but don't think too much about links. Anyway, good luck!
-
Thanks, Andreas.
Also, if my website were to get backlinks from, say, xyz.wordpress.com and abc.wordpress.com,
both pointing back to my website,
will Moz then count them as backlinks from two domains?
Does Google treat it the same way?
Meaning, if I get backlinks from WordPress and Blogspot subdomains, are they counted as individual domains?
-
1.) is, imho, the option you shouldn't choose, or at least that is what the majority would tell you, and that is still the case. The new subdomains are new domains. Google told us that they treat subdomains like other domains; of course they do, they always did, but still: new domains.
2.) For the long term they all work well, I am sure. In the short term it is easier with country TLDs, e.g. .de and .fr. In Germany we see a lot of .de domains in the SERPs and fewer .com/de or de.xyz.com pages, but you end up managing more and more domains.
3.) This is what everybody will tell you in terms of links and PageRank: content earns links in Germany and the French pages benefit too. But correctly linked content works just as well with subdomains or new TLDs.
Amazon does the 2nd, and a lot of SEO tool providers (e.g. Ryte) do it with .com/folder, and both approaches work. I mention Ryte because they also had a .org before and never used .de for Germany. That should be a hint: do what fits you and your needs best. SEO is not that important here; it matters, but not in terms of which structure you choose.
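To show what I mean by "correctly linked", here is a small sketch I am adding that prints the hreflang alternates one page would carry under each of the three setups. Only the folder, subdomain and country-TLD patterns come from the question above; the /apply/ page path and the Python around the tags are just illustrative.

```python
# hreflang alternates for one (hypothetical) page under each setup.
setups = {
    "folders": {
        "en": "https://www.india-visa-gov.in/apply/",
        "fr": "https://www.india-visa-gov.in/fr/apply/",
        "de": "https://www.india-visa-gov.in/de/apply/",
    },
    "subdomains": {
        "en": "https://www.india-visa-gov.in/apply/",
        "fr": "https://fr.india-visa-gov.in/apply/",
        "de": "https://de.india-visa-gov.in/apply/",
    },
    "country_tlds": {
        "en": "https://www.india-visa-gov.in/apply/",
        "fr": "https://india-visa-gov.fr/apply/",
        "de": "https://india-visa-gov.de/apply/",
    },
}

def hreflang_tags(alternates: dict) -> str:
    """Every language version lists all versions (including itself),
    plus an x-default; the pattern is the same for all three setups."""
    lines = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in alternates.items()
    ]
    lines.append(
        f'<link rel="alternate" hreflang="x-default" href="{alternates["en"]}" />'
    )
    return "\n".join(lines)

for name, alternates in setups.items():
    print(f"--- {name} ---")
    print(hreflang_tags(alternates))
```

Whichever structure you pick, every language version should reference all the others and itself; that reciprocity (in the HTML head or in a sitemap), plus consistent internal linking between the versions, is what I mean by correctly linked.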