Best practice for a multilanguage website (PHP detection based on browser language or geolocation)
-
Hi Moz Experts
I would like to know what the best practice is for choosing the default language on a multilanguage website.
There are several PHP techniques to help users land on the right language whether they arrive from search or direct traffic: presenting the default language based on the browser language, on geolocation, etc. Which one is the most appropriate for a Quebec company trying to expand outside Canada? Pros and cons?
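For context, the kind of browser-based detection I mean is roughly this (a minimal sketch of my own; the `/fr/` path, the function name, and the supported-language list are just illustrations, not our actual code):

```php
<?php
// Pick a language from the browser's Accept-Language header,
// e.g. "fr-CA,fr;q=0.9,en;q=0.8", falling back to a default.
function pick_language(string $acceptLanguage, array $supported = ['en', 'fr'], string $default = 'en'): string
{
    foreach (explode(',', $acceptLanguage) as $part) {
        $code = strtolower(trim(explode(';', $part)[0])); // drop the ";q=" weight
        $primary = explode('-', $code)[0];                // "fr-ca" -> "fr"
        if (in_array($primary, $supported, true)) {
            return $primary;
        }
    }
    return $default;
}

$lang = pick_language($_SERVER['HTTP_ACCEPT_LANGUAGE'] ?? '');
if ($lang === 'fr') {
    header('Location: /fr/'); // hypothetical French subdirectory
    exit;
}
```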
Thank you in advance.
-
I love it, Kate Morris.
It makes it simple with the small questions.
-
Thank you. So, for the first strategy I will need to add the hreflang annotations.
For Webmaster Tools, I'm not so sure, because we don't have separate sites per language: we have www.mycompany.com and www.mycompany.com/fr, and I currently have a Webmaster Tools account for www.mycompany.com. We also have a Google+ page, but we don't have a content marketing strategy.
That's all we have in place for the moment.
-
I built a tool to help people understand how to best go about international expansion. It's here: katemorris.com/issg
-
Not sure if you could really call it a best practice, but in Belgium (3 different languages) the normal configuration is not to determine the default language automatically, but rather to present a first-time visitor with a "choose language" page and store the choice in a cookie for future visits. This mainly applies to direct visits.
People coming in via search engines use queries in one of the languages, so normally Google will direct them to pages in that language. Again, on the first visit, the implicit choice of language is stored in a cookie.
All pages contain a link to switch to the other language(s) - which also changes the choice stored in the cookie.
The disadvantage of this system is that you add an extra layer to the site (the language choice page); the advantage is that the error margin is zero.
Systems based on IP, browser language, etc. are not 100% reliable, which can lead to unwanted results (in Belgium it's quite a sensitive issue if you serve a Dutch page to a French-speaking person, and vice versa).
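A rough PHP sketch of this cookie mechanism (the cookie name, the `?lang=` switch parameter, and the language codes are just illustrative, not a real implementation):

```php
<?php
// Resolve the visitor's language: an explicit switch link wins,
// then a previously stored cookie; null means "show the choice page".
function resolve_language(?string $cookieValue, ?string $switchParam, array $supported = ['nl', 'fr', 'en']): ?string
{
    // A language-switch link (e.g. ?lang=fr) always takes priority
    // and will be re-stored in the cookie below.
    if ($switchParam !== null && in_array($switchParam, $supported, true)) {
        return $switchParam;
    }
    // Otherwise fall back to the choice stored on a previous visit.
    if ($cookieValue !== null && in_array($cookieValue, $supported, true)) {
        return $cookieValue;
    }
    // No valid choice yet: first-time visitor.
    return null;
}

$lang = resolve_language($_COOKIE['site_lang'] ?? null, $_GET['lang'] ?? null);
if ($lang === null) {
    // First-time direct visitor: render the "choose language" page here.
} else {
    setcookie('site_lang', $lang, time() + 365 * 24 * 3600, '/'); // remember for a year
}
```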
Hope this helps,
Dirk
-
Hi there
Look into hreflang attributes - if you have the same site in multiple languages, this is a best practice.
You can also look into setting up Webmaster Tools accounts for the different sites (if they have different country-code domains or subdirectories) and geotargeting each one for the country it is supposed to appear in.
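For example, a bilingual English/French setup could annotate each page's `<head>` along these lines (URLs follow the www.mycompany.com example mentioned in this thread; this is a sketch, not your exact markup):

```html
<!-- On both language versions of the page -->
<link rel="alternate" hreflang="en" href="https://www.mycompany.com/" />
<link rel="alternate" hreflang="fr" href="https://www.mycompany.com/fr/" />
<!-- x-default: the page to show when no language matches -->
<link rel="alternate" hreflang="x-default" href="https://www.mycompany.com/" />
```

Each language version should list all alternates, including itself, and the annotations must be reciprocal or they are ignored.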
Does this help? Let me know - good luck!