How will Google respond to allowing multilingual search terms for a single-language website?
-
We would like to set up a website in English only and promote it in various European countries. The website will only be available in English, but we will keep translations (via Google Translate) in the backend. When a user in France then enters a search query in French, we can search the French content but present the relevant result in English. Does anyone have any experience with this? Will it be allowed, given that the result (in English) will probably not include any of the terms that were searched for (in French)?
-
It depends on which service you're using to translate, and on whether the URLs remain the same or change after translation. If they change, the translated pages will be discovered by Google.
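If the translated pages do end up on their own URLs, the standard way to tell Google which language version to serve is hreflang annotations. A minimal sketch, assuming hypothetical English and French versions at example.com:

    <!-- In the <head> of the English page; example.com and /fr/ are placeholder values -->
    <link rel="alternate" hreflang="en" href="https://example.com/" />
    <link rel="alternate" hreflang="fr" href="https://example.com/fr/" />
    <link rel="alternate" hreflang="x-default" href="https://example.com/" />
-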
This is very useful.
-
I'm also interested in this topic. I'm using WordPress right now with Polylang for all my translations, and so far I haven't had any problems with Google.
Related Questions
-
Temporarily redirecting a small website to a specific URL on another website
Hi, I would like to temporarily redirect a small website that contains info about a specific project to a specific URL about this project on my main website. The reason is that the small website no longer contains accurate info. We will update the content in the next few weeks and then remove the redirect again. Should I set up a 301 or a 302? Thanks
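Since the redirect is explicitly temporary, a 302 is the conventional choice; a 301 tells search engines the move is permanent. A minimal sketch of the temporary redirect in Apache .htaccess on the small site (the target domain and path are placeholders):

    # Send every path on the small site to the project page, temporarily (302)
    RedirectMatch 302 ^/.*$ https://www.main-site.com/project/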
Intermediate & Advanced SEO | Mat_C
-
Google News Sitemap in Different Languages
Thought I'd ask this question to confirm what I already think. I'm curious: if we're publishing something in two languages and both are verified by the publishing center, would the group recommend publishing two separate Google News sitemaps (one in each language) or a single sitemap covering both languages?
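For context, the Google News sitemap format declares the language per publication entry, which is why per-language files map cleanly. A minimal sketch of one entry (publication name, URL, and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
            xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
      <url>
        <loc>https://example.com/en/sample-article</loc>
        <news:news>
          <news:publication>
            <news:name>Example News</news:name>
            <news:language>en</news:language>
          </news:publication>
          <news:publication_date>2015-01-01</news:publication_date>
          <news:title>Sample article title</news:title>
        </news:news>
      </url>
    </urlset>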
Intermediate & Advanced SEO | mattdinbrooklyn
-
What is the difference between multilingual and multiregional websites?
Hi all, I have been studying multilingual and multiregional websites. As soon as possible, we will expand the website languages to English and Spanish. The URLs will be like this:
http://example.com/pt-br
http://example.com/en-us
http://example.com/es-ar
with corresponding hreflang tags for each. But my doubt is: for /es-ar/, will indexing only target Spanish speakers in Argentina? What about the other countries that speak the same language, like Spain, Mexico, etc.? I don't know if it will be possible to develop Spanish content especially for each region. Should I build a multiregional website or only a multilingual one? How does Google see this case? Thanks for any advice!
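As a point of reference, hreflang supports both language-region and language-only codes, and a language-only entry acts as the catch-all for speakers outside the annotated regions. A sketch using the URLs above (the generic /es page is a hypothetical addition, not something from the question):

    <link rel="alternate" hreflang="pt-br" href="http://example.com/pt-br" />
    <link rel="alternate" hreflang="en-us" href="http://example.com/en-us" />
    <link rel="alternate" hreflang="es-ar" href="http://example.com/es-ar" />
    <!-- Hypothetical: a generic Spanish page would cover Spain, Mexico, etc. -->
    <link rel="alternate" hreflang="es" href="http://example.com/es" />

Intermediate & Advanced SEO | mobic
-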
Is it better to not allow Google to index my Tumblr Blog?
I'm currently using a subdomain for my blog via Tumblr. In my SEO reports I see a lot of errors, mostly from the Tumblr blog. I made changes so there are unique titles and tags, but there are still too many errors, and I am wondering if it is best to just not allow it to be indexed via the Tumblr control panel. It certainly is doing a great job with engagement and social network follows, but I'm starting to wonder if, and how much, it is penalizing my domain. Appreciate your input. By the way, this theme is not Flash; the content is in a very basic, simple theme.
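If you do decide to block it, the standard mechanism is a robots meta tag on the blog's pages; whether Tumblr's control panel emits exactly this is an assumption to verify. A minimal sketch:

    <!-- In the <head> of each blog page: keep it out of the index, still follow links -->
    <meta name="robots" content="noindex, follow">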
Intermediate & Advanced SEO | wickerparadise
-
What is the best practice SEO approach to restructuring a website with multiple domains and associated search engine rankings for each domain?
Hello Mozzers, I'm trying to improve and establish rankings for my website, which has never really been optimised. I've inherited what seems to be a mess and have a challenge for you! The website currently has 3 different www domains all pointing to the one website; two are .com domains and one is a .com.au. The business is located in Australia and the website is primarily targeting Australian traffic. In addition, there are a number of other non-www domains for the same addresses pointing to the website in the CMS, which is Adobe Business Catalyst. When I check Google, each of the www domains has the following number of pages indexed:
www.Domain1.com 5,190 pages
www.Domain2.com 1,520 pages
www.Domain3.com.au 149 pages
What is the best practice approach from an SEO perspective to reorganising this domain structure? 1. Do I need to use the .com.au as the primary domain, given that we are in this market and targeting traffic here? That's what I have been advised, and it seems to be backed up by what I have read here. 2. Do we redirect all domains to the primary .com.au domain? This is easily done in the Adobe Business Catalyst CMS, however is this the same as a 301 redirect, which is the best approach from an SEO perspective? 3. How do we consolidate the current separate rankings for the 3 domains into the one domain within Google, to ensure improved rankings and a best practice approach? The website is currently receiving very little organic search traffic, so if it's simpler and faster to start fresh rather than go through a complicated migration or restructure, and you have a suggestion here, please feel free to let me know your ideas! Thank you!
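On question 2: whether Business Catalyst's redirect feature issues a true 301 is worth verifying in the response headers. Expressed as a generic Apache sketch (domain names are placeholders for the real ones), the consolidation pattern is a permanent host-level redirect:

    # On any request not already on the primary host, 301 to the .com.au equivalent
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^www\.domain3\.com\.au$ [NC]
    RewriteRule ^(.*)$ https://www.domain3.com.au/$1 [R=301,L]

Intermediate & Advanced SEO | JimmyFlorida
-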
Best practice for removing indexed internal search pages from Google?
Hi Mozzers, I know that it's best practice to block Google from indexing internal search pages, but what's best practice when the damage is done? I have a project where a substantial part of our visitors and income lands on internal search pages, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag because:
1. Google's guidelines say: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
2. Bad user experience
3. The search pages are (probably) stealing rankings from our real landing pages
4. Webmaster notification: "Googlebot found an extremely high number of URLs on your site", with links to our internal search results
I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how should we proceed with blocking them? I'm looking forward to your answer! Edit: Google has currently indexed several million of our internal search pages.
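For concreteness, the two mechanisms being weighed look like this (the /search path is a placeholder; note that a robots.txt block stops Googlebot from recrawling the pages, so it would never see a meta noindex on them):

    <!-- Option 1: meta tag on each internal search page - deindex, keep passing link equity -->
    <meta name="robots" content="noindex, follow">

    # Option 2: robots.txt - stop crawling entirely (already-indexed URLs can linger)
    User-agent: *
    Disallow: /search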
Intermediate & Advanced SEO | HrThomsen
-
Will Google Revisit a 403 Page?
Hi, We've got some pretty strict anti-scraping logic on our website, and it seems we accidentally snared a Googlebot with it. About 100 URL requests were responded to with a 403 Forbidden error. The logic has since been updated, so this should not happen again. I was just wondering if/when Googlebot will come back and try those URLs again. They are linked from other pages on the site, and they are also in our sitemap. Thanks in advance for any assistance.
Intermediate & Advanced SEO | dbuckles
-
Does Google crawl the pages which are generated via the site's search box queries?
For example, if I search for an 'x' item in a site's search box and the site displays a list of results based on the query, would that results page be crawled? I am asking because this would be a URL that doesn't otherwise exist on the site, so I am confused as to whether Google's bots would be able to find it.
Intermediate & Advanced SEO | pulseseo