Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Multilingual website
-
My website is https://www.india-visa-gov.in and we are making it multilingual.
There are three options:
1. ccTLDs, e.g. india-visa-gov.fr (French), india-visa-gov.de (German)
2. Subdomains, e.g. fr.india-visa-gov.in (French), de.india-visa-gov.in (German)
3. Folders, e.g. https://www.india-visa-gov.in/fr/ (French), https://www.india-visa-gov.in/de/ (German)
We have tried the 3rd option but need to know whether it's better for long-term SEO health. Does Moz DA carry over better with subdomains, ccTLDs, or folders? What does Moz recommend to maintain DA?
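For reference, whichever structure wins, hreflang annotations are what tell search engines which URL serves which language. A minimal sketch in Python of generating the link elements for the folder option (the locale list and helper name are illustrative, not from the site):

```python
# Sketch: emit hreflang <link> tags for the folder-based structure.
# BASE comes from the question; the locales and helper are illustrative.
BASE = "https://www.india-visa-gov.in"
LOCALES = {"en": "/", "fr": "/fr/", "de": "/de/"}

def hreflang_tags(path: str = "") -> str:
    """Return the alternate-link block to place in every page's <head>."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{BASE}{prefix}{path}" />'
        for lang, prefix in LOCALES.items()
    ]
    # x-default tells search engines which version is the fallback.
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{BASE}/{path}" />')
    return "\n".join(tags)

print(hreflang_tags())
```

Every language version needs the full set of tags, including a self-reference, for the annotations to be honoured.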
Thanks
-
Andreas, thanks for sharing your story.
You did genuine outreach, which works best for both human users and Google. Insightful!
-
Links from xyz.wordpress.com and abc.wordpress.com are two different links from two different domains. Neither of them carries the "DA" of www.wordpress.com, because that is a different domain again.
Whether Google sees it the same way is arguable both ways, because in Search Console the linking domain would show up as "wordpress.com". Nonetheless, the subdomains differ in a lot of respects, and DA is the most important one.
If you get more links from the same domain, each new link becomes less and less important (this applies to every new link, not the old ones). In my tests (WordPress & Blogspot), the devaluation applied only to the one linking subdomain, not to all the subdomains, which is somewhat understandable. So the answer is "it depends".
The tests are not 100% accurate and hard to compare, but that seems to be the way it works.
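To illustrate the separate-domains point with tooling: wordpress.com and blogspot.com have historically been listed in the private section of the Public Suffix List, so domain-extraction libraries treat each blog subdomain as its own registrable domain. A minimal sketch in Python using the tldextract library (assuming both hosts are still on the PSL private list):

```python
# pip install tldextract
import tldextract

# Honour the PSL's private entries, where hosts like wordpress.com live.
extract = tldextract.TLDExtract(include_psl_private_domains=True)

for url in ("http://xyz.wordpress.com", "http://abc.wordpress.com"):
    result = extract(url)
    # With private entries honoured, each blog is its own registrable
    # domain, e.g. domain="xyz", suffix="wordpress.com".
    print(url, "->", f"{result.domain}.{result.suffix}")
```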
What I have also noticed: when I get a big jump in organic traffic, having changed nothing and with no algorithm update running, it is usually down to one new organic link. You can build links as much as you want, but organic links beat everything, and I don't know how Google figures that out so clearly.
So once you have a decent amount of links, you should focus on your users. That's what I do, which is why I can't tell you how Moz or any other tool handles it; I simply don't care that much.
One of my former competitors has 10 times more links than we have, and today I call him a "former" competitor because we reached a new level and he didn't. We now have nine times more organic traffic, and he has been stuck where he is for a year. A single page of ours gets more monthly organic visitors than his entire domain. We started at the same level 1.5 years ago. I did not build a single link; I just focused on users. Lots of users - we're talking hundreds of thousands of organic visitors, which for Germany in this niche is a lot. OK, now we are drifting into other topics, and that's not the point here.
These are all finance topics, so maybe it's a YMYL-specific thing, but I have several more domains in non-YMYL topics that perform better and better without link building. There is a lot of great content, which is what earns those domains links. At the very least you need an entry point, a door opener, to get links coming in.
Whoops - too much, hah.
So you are somewhat right, but don't think too much about links.
Anyway - good luck!
-
Thanks Andreas.
Also, if my website were to get backlinks from, say, xyz.wordpress.com and abc.wordpress.com back to my website, will Moz then count them as backlinks from two domains?
Does Google treat them the same way?
Meaning, if I get backlinks from WordPress and Blogspot subdomains, are they counted as individual domains?
-
1.) is imho the option you shouldn't choose - or at least that's what the majority would tell you, and it's still the case. The new domains are exactly that: new domains. Google told us they treat subdomains like separate domains - of course they do, they always did - but still, they are new domains.
2.) For the long term, they all work well, I am sure. The short term is easier with new ccTLDs, e.g. .de, .fr. In Germany we see a lot of .de domains in the SERPs and fewer .com/de or de.xyz.com, but you end up having to manage more and more domains.
3.) This is what everybody recommends in terms of links and PageRank: content earns links in Germany and the French pages benefit. But correctly interlinked content works just as well with subdomains or new ccTLDs.
Amazon does the 2nd; a lot of SEO tool providers (e.g. Ryte) do it with .com/folder - and both work. I mention Ryte because they used a .org before and never used .de for Germany. And that should be a hint: do what fits you and your needs best. SEO matters, but not which of these structures you pick.
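On "correctly linked content": the usual sanity check is that every language version declares hreflang alternates and that each alternate links back, since one-way annotations are ignored. A small diagnostic sketch in Python (requests and BeautifulSoup; the URL is the asker's, everything else is illustrative):

```python
# pip install requests beautifulsoup4
import requests
from bs4 import BeautifulSoup

def hreflang_map(url: str) -> dict:
    """Fetch a page and return its {hreflang: href} alternate links."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    return {
        link["hreflang"]: link["href"]
        for link in soup.find_all("link", rel="alternate", hreflang=True)
    }

def check_reciprocity(url: str) -> None:
    """Warn when an alternate page does not declare the original back."""
    for lang, alt_url in hreflang_map(url).items():
        if url not in hreflang_map(alt_url).values():
            print(f"{alt_url} ({lang}) does not reference {url} back")

check_reciprocity("https://www.india-visa-gov.in/")
```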
Related Questions
-
My website is constantly decreasing
Technical SEO | Dan_Tala
For a few weeks now my website has been steadily dropping in search position. I am losing keywords and it keeps going down, although it is rated well by several on-page and off-page SEO verification tools that I have tried. I checked Google Search Console and Analytics and found no major problems; yet from one day to the next it keeps going down. I also checked what the main competitors are doing, and they are not doing well at all. The main competitor actually has a creepy website, totally devoid of on-page or off-page SEO but with an enormous number of backlinks, of very bad quality, which should disqualify it. Still...
A few weeks ago I changed something. The pages had an H1, four H2s, no H3, and an H4 without content: an unnatural heading structure. Now I have H1, H2, H3, and three H4s with coherent information. Theoretically Google should have been "happy", or I'm missing something. I use a SaaS platform, and I just found out that they made changes to the keywords (tags). I am selling toner cartridges for printers, so the tags are printer models, and each tag generates a URL listing the products for that model. E.g. https://www.sertit.ro/cartus-imprimanta-cilindru-color-hp-laserjet-pro-m-177fw goes to the products for that printer model. The question is: should I make the tag pages canonical? Is it possible for products to lose so much ground in Google search?
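On the canonical question above: a tag page can declare itself canonical so crawl variants consolidate onto one URL; pointing tag pages at something else only makes sense if they truly duplicate it. A minimal sketch in Python of emitting the element (the URL is from the question; the helper is illustrative):

```python
# Sketch: each printer-model (tag) page declares itself as canonical so
# that parameterised or paginated variants consolidate onto one URL.
def canonical_tag(url: str) -> str:
    return f'<link rel="canonical" href="{url}" />'

print(canonical_tag(
    "https://www.sertit.ro/cartus-imprimanta-cilindru-color-hp-laserjet-pro-m-177fw"
))
```

-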
Website ranking declined after connecting to CDN
Technical SEO | vaskrupp
Hi! We are trying to rank https://windowmart.ca for various local search terms. Our head office is in Edmonton, where we try to rank https://windowmart.ca/edmonton-windows-doors/ for terms such as "windows Edmonton", "replacement windows Edmonton", and "windows and doors Edmonton", among others. The website was the leader in its niche for around two years. Then we had some server-related issues, moved to a new server, and connected the CDN Nitropack, which really improved our Google speed test results. Recently we noticed that our rankings started to drop. Do you know if Nitropack can negatively affect local SEO rankings? Thank you!
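One thing worth checking with any optimisation proxy like Nitropack is that crawlers receive the same HTML as browsers. A rough comparison sketch in Python (the URL is the asker's; exact equality is a blunt test, since pages often embed nonces or timestamps):

```python
# pip install requests
import requests

URL = "https://windowmart.ca/edmonton-windows-doors/"
AGENTS = {
    "browser": "Mozilla/5.0",
    "crawler": "Mozilla/5.0 (compatible; Googlebot/2.1; "
               "+http://www.google.com/bot.html)",
}

# Fetch the page once per user agent and compare the bodies.
bodies = {
    name: requests.get(URL, timeout=10, headers={"User-Agent": ua}).text
    for name, ua in AGENTS.items()
}
print("identical HTML for both agents:", bodies["browser"] == bodies["crawler"])
```

-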
Will Google crawl and rank our ReactJS website content?
Technical SEO | Jane.com
We have 250+ products dynamically inserted and sorted on our site daily (more specifically on our homepage... yes, it's a long page). Our dev team would like to explore rendering the page server-side using ReactJS. We currently use a CDN to cache all the content, which of course we would like to continue using. So... will Google be able to crawl that content? We've read some articles with different ideas (including prerendering):
http://andrewhfarmer.com/react-seo/
http://www.seoskeptic.com/json-ld-big-day-at-google/
If we were to load only the schema important to the page (product title, image, price, description, etc.) from the server and then let the client render the remaining content (comments, suggested products, etc.), would that go against best practices? It seems like that might be seen as showing Googlebot one version and showing the site visitor a different (more complete) version.
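A quick way to approximate what a non-rendering crawler sees is to check whether key content appears in the raw server HTML, before any client-side JavaScript runs. A rough sketch in Python (the URL and marker string are placeholders; substitute a real product title):

```python
# pip install requests
import requests

def appears_in_raw_html(url: str, marker: str) -> bool:
    """True if `marker` occurs in the HTML the server returns,
    i.e. without executing any client-side JavaScript."""
    resp = requests.get(url, timeout=10)
    return marker in resp.text

# Placeholders: point this at the homepage and a product title from it.
print(appears_in_raw_html("https://example.com/", "Some Product Title"))
```

-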
Car Dealership website - Duplicate Page Content Issues
Technical SEO | karl62
Hi, I am currently working on a large car dealership website. I have just run a Moz crawl, and it is flagging a lot of duplicate page content issues, mostly on the used car pages. How can I get around this when the site stocks many cars of the same make, model, colour, age, mileage, etc., and the only unique thing about them is the registration plate? How do I get past this duplicate issue if all the information is essentially the same? Has anyone experienced this issue when working on a car dealership website? Thank you.
-
How To Clean Up the Google Index After a Website Has Been Hacked
Technical SEO | yoursearchteam
We have a client whose website was hacked, and some troll created thousands of viagra pages, which were all indexed by Google. The site has been cleaned up completely, but I wanted to know if anyone can weigh in on how we can clean up the Google index. Are there extra steps we should take? So far we have gone into Webmaster Tools and submitted a new sitemap.
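Beyond resubmitting the sitemap, one common follow-up is confirming that every removed spam URL now returns a hard 404 or 410, since pages that still resolve can linger in the index. A small checker sketch in Python (the URL list is a placeholder; export the real ones from Search Console or server logs):

```python
# pip install requests
import requests

# Placeholder list of cleaned-up spam URLs.
spam_urls = [
    "https://example.com/cheap-viagra-123/",
    "https://example.com/cheap-viagra-456/",
]

for url in spam_urls:
    code = requests.get(url, timeout=10, allow_redirects=False).status_code
    # Anything other than 404/410 means the URL may stay indexed longer.
    verdict = "ok" if code in (404, 410) else "needs attention"
    print(code, verdict, url)
```

-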
Fixing a website redirect situation that resulted in a drop in traffic
Technical SEO | csmm
Hi, I'm trying to help someone fix the following situation: they had a website, www.domain.com, that generated a steady amount of traffic for three years. They then redesigned the website a couple of months ago, and the developer redirected the site to domain.com but did not set up analytics on domain.com. We noticed a drop in traffic to www.domain.com but have no idea whether domain.com is generating any traffic, since analytics wasn't installed. To fix this, I was going to ask the developer whether there was a good reason to redirect the site; what would prompt a developer to do this when www.domain.com had already been in use for three years? Then, unless there was a good reason, I would change the redirect back to what it was before, with domain.com redirecting to www.domain.com. Presumably this would let us regain the traffic to www.domain.com that was lost when the redirect was put in place. Does this sound like a reasonable course of action? Is there anything I'm missing, or anything else I should do in this situation? Thanks in advance! Carolina
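One quick diagnostic for this situation: fetch both hostnames and print the redirect chain each returns, so you can see exactly which direction the developer pointed things. A small sketch in Python (domain.com is the question's placeholder):

```python
# pip install requests
import requests

def show_redirect_chain(url: str) -> None:
    """Print every hop in the redirect chain for a URL."""
    resp = requests.get(url, timeout=10, allow_redirects=True)
    for hop in resp.history:
        print(hop.status_code, hop.url, "->", hop.headers.get("Location"))
    print(resp.status_code, resp.url, "(final)")

for url in ("http://www.domain.com/", "http://domain.com/"):
    show_redirect_chain(url)
```

-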
403 forbidden error website
Technical SEO | MaartenvandenBos
Hi Mozzers, I've got a question about a new website for a new customer, http://www.eindexamensite.nl/. There is a 403 Forbidden error on it, and I can't find the cause. I checked with http://gsitecrawler.com/tools/Server-Status.aspx and the result was: URL=http://www.eindexamensite.nl/ - Result code: 403 (Forbidden / Forbidden). When I delete the .htaccess from the server I get a 200 OK :-), so the problem is in the .htaccess. The .htaccess code:

ErrorDocument 404 /error.html

RewriteEngine On
RewriteRule ^home$ / [L]
RewriteRule ^typo3$ - [L]
RewriteRule ^typo3/.*$ - [L]
RewriteRule ^uploads/.*$ - [L]
RewriteRule ^fileadmin/.*$ - [L]
RewriteRule ^typo3conf/.*$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-l
RewriteRule .* index.php

# Start rewrites for static file caching
RewriteRule ^(typo3|typo3temp|typo3conf|t3lib|tslib|fileadmin|uploads|screens|showpic.php)/ - [L]
RewriteRule ^home$ / [L]
# Don't pull *.xml, *.css etc. from the cache
RewriteCond %{REQUEST_FILENAME} !^.*\.xml$
RewriteCond %{REQUEST_FILENAME} !^.*\.css$
RewriteCond %{REQUEST_FILENAME} !^.*\.php$
# Check for Ctrl+Shift reload
RewriteCond %{HTTP:Pragma} !no-cache
RewriteCond %{HTTP:Cache-Control} !no-cache
# NO backend user is logged in
RewriteCond %{HTTP_COOKIE} !be_typo_user [NC]
# NO frontend user is logged in
RewriteCond %{HTTP_COOKIE} !nc_staticfilecache [NC]
# We only redirect GET requests
RewriteCond %{REQUEST_METHOD} GET
# We only redirect URIs without query strings
RewriteCond %{QUERY_STRING} ^$
# We only redirect if a cache file actually exists
RewriteCond %{DOCUMENT_ROOT}/typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html -f
RewriteRule .* typo3temp/tx_ncstaticfilecache/%{HTTP_HOST}/%{REQUEST_URI}/index.html [L]
# End static file caching

DirectoryIndex index.html

The CMS is TYPO3. Any ideas? Thanks!
Maarten
-
How to create a tree-like structure map of a website?
Technical SEO | djingel1
Hi all, our online marketing manager has asked for a tree-like map of the website: a graphical representation of the site and its contents, so we can see whether there are internal linking issues. The problem is that there are thousands of pages and many subdomains, so manual labour would make this a very tedious task. If you got this request, how would you try to solve it? Any software recommendations?
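For a scriptable starting point before reaching for dedicated software, a breadth-first crawl that records each page's depth can already be printed as an indented tree. A minimal sketch in Python (requests and BeautifulSoup; the depth and page limits are arbitrary, and it stays on one host, so run it per subdomain):

```python
# pip install requests beautifulsoup4
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl_tree(start: str, max_pages: int = 200, max_depth: int = 3) -> None:
    """Breadth-first crawl that prints internal URLs as an indented tree."""
    host = urlparse(start).netloc
    seen = {start}
    queue = deque([(start, 0)])
    while queue and len(seen) <= max_pages:
        url, depth = queue.popleft()
        print("  " * depth + url)
        if depth == max_depth:
            continue
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            child = urljoin(url, a["href"]).split("#")[0]
            if urlparse(child).netloc == host and child not in seen:
                seen.add(child)
                queue.append((child, depth + 1))

crawl_tree("https://www.example.com/")
```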