Has anyone seen negative SEO effects from using the Google Translate API?
-
We have a site currently in development that is using the Google Translate API, and I am having a massive issue getting Screaming Frog to crawl it. All of our non-native-English-speaking employees have read through the translated copy in their native languages, and the general consensus is that it reads at a 5th-grade level at best. My question to the community is: has anyone implemented this API on a site, and has it a) helped with gaining traffic from other languages/countries, and b) hurt the site from an SEO standpoint?
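(For illustration, a minimal sketch of the kind of machine-translation call involved, using the google-cloud-translate Python client's v2 interface. The package install, credentials setup, and example strings are assumptions for the sketch, not the site's actual integration.)

```python
# Minimal sketch: batch-translating page copy with the Google Cloud
# Translation client (v2 interface). Assumes `pip install google-cloud-translate`
# and that GOOGLE_APPLICATION_CREDENTIALS points at a service-account key.
from google.cloud import translate_v2 as translate

def translate_copy(blocks, target_language="es"):
    """Translate a list of text blocks, returning the machine translations."""
    client = translate.Client()
    results = client.translate(blocks, target_language=target_language)
    # Each result carries the translated text plus the detected source language.
    return [r["translatedText"] for r in results]

if __name__ == "__main__":
    copy = ["Our products ship worldwide.", "Contact us for a quote."]
    for original, translated in zip(copy, translate_copy(copy, "de")):
        print(f"{original} -> {translated}")
```

Output like this goes live with no human post-editing, which is exactly the readability gap the reviewers describe.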
-
Hi Bernadette, I completely agree that the translation should be done by a human. You are correct that it wasn't Google Translate messing with the crawl, but it was a great argument for getting it removed.

Where in Screaming Frog can you set it to crawl more slowly? I have dug around the program and can't find the option.
-
Overall, if you are going to translate your website, it really should be translated by a human rather than an API. There are certain ways things should be translated and written that an API just cannot do properly. It's more of a user experience and readability issue than anything else.
It sounds as if the Screaming Frog problem isn't related to the translation at all; it is simply a crawling issue. You may want to see if you can crawl much more slowly, which is a setting in Screaming Frog.
Related Questions
-
Advice on the right way to block country-specific users but not block Googlebot - and not be seen to be cloaking. Help please!
Hi, I am working on the SEO of an online gaming platform - a platform that can only be accessed by people in certain countries, where the games and content are legally allowed. Example: the games are not allowed in the USA, but they are allowed in Canada.
Present situation: When a user from the USA visits the site, they are directed to a restricted-location page with the following message: "RESTRICTED LOCATION - Due to licensing restrictions, we can't currently offer our services in your location. We're working hard to expand our reach, so stay tuned for updates!" Because USA visitors are blocked, Google - which primarily (but not always) crawls from the USA - is also blocked, so the company's webpages are not being crawled and indexed.
Objective / what we want to achieve: The website will have multiple region and language versions. Some of these will exist as standalone websites and others will exist as folders on the domain. Examples below:
domain.com/en-ca [English, Canada]
domain.com/fr-ca [French, Canada]
domain.com/es-mx [Spanish, Mexico]
domain.com/pt-br [Portuguese, Brazil]
domain.co.in/hi [Hindi, India]
If a user from the USA or another restricted location tries to access our site, they should not have access and should instead see a restricted-access message. However, we still want Google to be able to access, crawl and index our pages. Can anyone suggest how we do this without being penalised for cloaking? Would this approach be OK? (Please see below.) We continue doing what we do now, showing visitors from the USA a restricted message. However, rather than redirecting these visitors to a restricted-location page, we simply black out the page and show them a floating message, as if it were a modal window, while Googlebot is allowed to visit and crawl the website. I have also read that it would be good to put paywall schema on each webpage to let Google know that we are not cloaking and that it is a restricted paid page. All public pages are accessible, but only if the visitor is from a location that is not restricted. Any feedback and direction would be greatly appreciated, as I am new to this angle of SEO. Sincere thanks
International SEO | | MarkCanning
-
Country subfolders showing as sitelinks in Google, country targeting for home page no longer working
Hi There, Just wondering if you can help. Our site has 3 region versions (a general .com, /ie/ for Ireland and /gb/ for the UK), each submitted to Google Webmaster Tools as separate sites, with hreflang tags in the head section of all pages. Google was showing the correct results for a few weeks, but I resubmitted the home pages with slight text changes last week and something strange happened, though it may have been coincidental timing. When we search for the brand name in google.ie or google.co.uk, the .com now shows as the main site, whereas the sitelinks still show the correct country versions. However, the country subdirectories are now appearing as sitelinks, which is likely causing the problem. I have demoted these in GWT, but I'm unsure whether that will work, and sitelink demotion seems to take a while. Has anyone had anything similar happen? I thought perhaps it was a markup issue breaking the head section so that Google can no longer see the hreflangs pointing to each other as alternates. I checked the source code in the W3C validator and it doesn't show any errors. Anyway, any help would be much appreciated - and thanks to anyone who gets back, it's a tricky type of issue to troubleshoot. Thanks, Ro
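(For reference, a minimal sketch of reciprocal hreflang link elements of the kind described. The example.com URLs and the en-IE/en-GB/x-default codes are illustrative assumptions, not the poster's actual markup; every regional version needs to emit the same complete set, itself included, or Google may ignore the annotations.)

```python
# Minimal sketch: generating reciprocal hreflang <link> elements for three
# regional versions of a page. URLs and locale codes are illustrative only.
ALTERNATES = {
    "x-default": "https://www.example.com/",
    "en-ie": "https://www.example.com/ie/",
    "en-gb": "https://www.example.com/gb/",
}

def hreflang_tags(alternates):
    """Every version must list every other version (and itself) identically."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for code, url in alternates.items()
    )

print(hreflang_tags(ALTERNATES))
```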
International SEO | | romh
-
If I redirect based on IP, will Google still crawl my international sites if I implement hreflang?
We are setting up several international sites. Ideally, we wouldn't set up any redirects, but if we have to (for merchandising reasons etc) I'd like to assess what the next best option would be. A secondary option could be that we implement the redirects based on IP. However, Google then wouldn't be able to access the content for all the international sites (we're setting up 6 in total) and would only index the .com site. I'm wondering whether the Hreflang annotations would still allow Google to find the International sites? If not, that's a lot of content we are not fully benefiting from. Another option could be that we treat the Googlebot user agent differently, but this would probably be considered as cloaking by the G-Man. If there are any other options, please let me know.
International SEO | | Ben.JD
-
International SEO Subfolders / user journey etc
Hi, according to all the resources I can find on Moz and elsewhere about international SEO - say, in the context of having duplicate US and UK versions of a site - it's best to use subfolders, i.e. domain.com/en-gb/ and domain.com/en-us/. However, when it comes to the user journey and promoting the web address, it seems a bit odd to say "visit us at domain.com/en-us/"!? And what happens if someone just enters domain.com from the US or UK? My client wants to use an IP sniffer, but I've read that's bad practice and that we should use the country/language-code folders above instead. I'm confused about both the user journey and the experience in the case of multiple subfolders. Any advice much appreciated? Cheers, Dan
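(A minimal sketch of the "suggest, don't force" alternative to an IP sniffer: pick a likely locale from the Accept-Language header and offer it in a banner, leaving the requested URL untouched for visitors and Googlebot alike. The folder map and default are assumptions for illustration.)

```python
# Minimal sketch: choosing a locale folder to *suggest* (not redirect to)
# from the visitor's Accept-Language header. Folder map is illustrative.
SUPPORTED = {"en-us": "/en-us/", "en-gb": "/en-gb/"}
DEFAULT = "/en-us/"

def suggest_locale(accept_language_header: str) -> str:
    """Return the subfolder to suggest in a banner, based on language prefs."""
    for part in accept_language_header.lower().split(","):
        lang = part.split(";")[0].strip()          # e.g. "en-gb"
        if lang in SUPPORTED:
            return SUPPORTED[lang]
        base = lang.split("-")[0]                  # fall back on the bare language
        for code, folder in SUPPORTED.items():
            if code.startswith(base):
                return folder
    return DEFAULT

print(suggest_locale("en-GB,en;q=0.8"))  # -> /en-gb/
```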
International SEO | | Dan-Lawrence
-
Translating URLs worth it?
My company has content in 23 different languages in 30+ countries. We translate page content but we don't translate URLs. I am trying to figure out whether it would be worth the considerable extra overhead to translate the URLs as well. I'd really appreciate hearing the thoughts of the Moz community. Thanks in advance!
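(Part of that overhead is mechanical: turning translated titles into clean slugs. A minimal sketch using only the Python standard library; the accent-folding shortcut shown only suits Latin-script languages, and the example titles are illustrative.)

```python
# Minimal sketch: turning a translated page title into a URL slug.
# NFKD accent-stripping is a simplification; non-Latin scripts need a
# different strategy (keep the script and percent-encode, or transliterate).
import re
import unicodedata

def slugify(title: str) -> str:
    """ASCII-fold a translated title and reduce it to a hyphenated slug."""
    folded = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    return re.sub(r"[^a-zA-Z0-9]+", "-", folded).strip("-").lower()

print(slugify("Chaussures de course pour femmes"))  # chaussures-de-course-pour-femmes
print(slugify("Laufschuhe für Damen"))              # laufschuhe-fur-damen
```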
International SEO | | Logi
-
Google US vs Google UK
I could have posted this somewhere else, but I cannot find it. So, I have keywords that rank well in Google US and many that do well in Google UK too. I thought all of my keywords ranking well in the US would also rank well in the UK. I have figured out today that this is not the case. Why would I rank in the top 3 in the US and not even show up in the top 50 in the UK? It is very strange. Thanks for your help! I am not super new to SEO or web business. I have had a very good company that has been ranking well since 2004.
International SEO | | journeybeyondtravel
-
Targeting Different Countries... One Site or Separate?
I have a client who has 3 ecommerce sites. They are somewhat differentiated but for the most part sell the same stuff. Luckily 2 of them are quite authoritative, old and rank reasonably well. Most of the visitors and sales come from the US. He wants to start targeting Europe, Mexico and Canada. What are your suggestions for doing this? Are we better targeting on the main domains? Not really sure how to do that? Should we use a subdomain and a new store front for each geo? Should we use a .co.uk .co.mx and .co.ca each with a unique storefront? It looks like we are moving to a Magento platform so setting up multiple storefronts on a single database is not a big issue. Anyone have any experience with this?
International SEO | | BlinkWeb
-
Australia-specific SEO tips?
For those who are conducting SEO here in Australia: A lot of the info I read, and there is a lot, is generally from the States or UK it seems. Are there any things in particular I should look out for when doing SEO in Australia? Are there any SEO tips that are particular to Australia only? What directories are a must in Australia?
International SEO | | iSenseWebSolutions