One company, 3 countries, 3 sites - best solution?
-
Hi all, I'm working with a company that has three websites, each on a separate WordPress platform.
One is at .com, the others .fr and .de - they are essentially very similar.
I have suggested that it is worth exploring setting all of these websites up on the .com domain with country-specific directories, to combine their incoming links and authority and help all three sites rank more naturally.
Questions:
To ensure each country keeps control of its site, would you maintain a separate install of WordPress at each directory (i.e. .com/fr/ and .com/de/), or would you put it all on the same WordPress install?
Would you go down this route of combining all 3 sites onto one domain with country-specific directories? What are the pitfalls?
-
Gaston, your answer is correct (albeit not complete... see my answer below).
Plus: when linking to posts and guides, even if they are by Moz, always explain why they can be useful.
-
The answer offered by Gaston is correct, but it's not the only option available.
You can consolidate the three websites under the .com one and geo-target the /fr/ and /de/ subfolders via Google Search Console.
However, I wouldn't install three separate copies of WordPress, but only one, using the WPML plugin (https://wpml.org/documentation/getting-started-guide/), which works with no issues alongside all the major WordPress plugins usually used for SEO (like Yoast SEO).
I don't see pitfalls in a consolidation, but every international strategy really is unique and, not knowing the details of yours, I cannot give you a firm positive (or negative) answer.
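With the consolidated setup, each page would also declare hreflang alternates so Google can match users to the right country folder (WPML can generate these tags automatically). A minimal sketch, using example.com as a stand-in for the real domain:

```html
<!-- In the <head> of the English (default) pages; mirror on /fr/ and /de/ pages -->
<link rel="alternate" hreflang="en" href="https://example.com/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/" />
<link rel="alternate" hreflang="de" href="https://example.com/de/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Each language version must list all the alternates, including itself, for the annotations to be valid.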
-
Hello Bee,
For internationalization purposes, leave them on their own ccTLDs. Just add hreflang annotations pointing to the other country websites and you'll be just fine.
If you choose to go down the road of having everything under a .com/country/ configuration, I'd suggest installing a separate WordPress for each country. Also, if you plan on expanding to more countries and want overall control, take a look at WordPress Multisite Network Administration.
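Since Multisite comes up here: a subdirectory-based network is switched on through constants in wp-config.php. A minimal sketch of those constants as WordPress documents them, with example.com as a placeholder domain (not a complete setup procedure):

```php
/* Step 1: enable the network setup screen (Tools > Network Setup) */
define('WP_ALLOW_MULTISITE', true);

/* Step 2: after creating the network, WordPress asks you to add: */
define('MULTISITE', true);
define('SUBDOMAIN_INSTALL', false);   // false = subdirectory sites such as /fr/ and /de/
define('DOMAIN_CURRENT_SITE', 'example.com');
define('PATH_CURRENT_SITE', '/');
define('SITE_ID_CURRENT_SITE', 1);
define('BLOG_ID_CURRENT_SITE', 1);
```

Each country team can then be given admin rights on its own sub-site while one network admin keeps overall control.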
Also, just for clarifying, take a look at these articles:
Multi-regional and multilingual sites - Google Search Console
International checklist - Moz Blog
Using the correct hreflang tag - Moz Blog
Guide to international website expansion - Moz Blog
Tool for checking hreflang annotations - Moz Blog
Hope it helps.
Best Luck.
GR.
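GR's last link is about checking hreflang annotations; as a rough illustration of that kind of check, here is a self-contained sketch using only the Python standard library (the domains in the sample markup are placeholders, not the asker's real sites):

```python
from html.parser import HTMLParser

class HreflangParser(HTMLParser):
    """Collects hreflang -> href pairs from <link rel="alternate"> tags."""
    def __init__(self):
        super().__init__()
        self.alternates = {}

    def handle_starttag(self, tag, attrs):
        attr = dict(attrs)
        if tag == "link" and attr.get("rel") == "alternate" and "hreflang" in attr:
            self.alternates[attr["hreflang"]] = attr.get("href")

# Sample <head> markup for a ccTLD setup; example.com/.fr/.de are placeholders.
head = '''
<link rel="alternate" hreflang="en" href="https://example.com/" />
<link rel="alternate" hreflang="fr" href="https://example.fr/" />
<link rel="alternate" hreflang="de" href="https://example.de/" />
'''
parser = HreflangParser()
parser.feed(head)
for lang, url in sorted(parser.alternates.items()):
    print(lang, url)
```

In practice you would fetch each page's HTML and confirm every language version lists the same set of alternates, including itself.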
Related Questions
-
How to set up redirects with a company takeover
Hi there, We are about to take over a player in the market with some good DA and PA. We chose to redirect all the pages from the domain we are taking over to our main domain for now; later we want to redirect all categories to relevant and similar categories on our own domain. The company we are taking over uses a server which will be cancelled in a while. For now we have set up the 301 redirects on the server we are taking over, but because of the extra costs we will cancel that server in a few weeks/months. What is a common way to keep 301 redirects alive after cancelling the server of the company we take over? I hope someone can give me the help I need on this one. Thanks in advance! Cheers,
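One common pattern for keeping the 301s alive after the old server is cancelled: point the acquired domain's DNS at a server you already control (or a low-cost redirect-hosting service) and serve the redirects from there. A minimal nginx sketch, with placeholder domain names standing in for the real ones:

```nginx
# Assumes DNS for the acquired domain now resolves to your own server
server {
    listen 80;
    server_name olddomain.example www.olddomain.example;
    # Catch-all for now; swap in per-category redirects once the mapping is decided
    return 301 https://www.maindomain.example$request_uri;
}
```

The catch-all can later be replaced with specific `location` blocks mapping old categories to their closest equivalents on the main domain.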
Technical SEO | MarcelMoz
-
Would merging a site with strong DA with one that has weak DA be a smart move?
I am working on a project for a client that has two ecommerce sites, each with several thousand products. Site A has a strong DA, is ranking well on Google for thousands of competitive keywords, and generates high traffic and conversions. Site B has a poor DA, ranks poorly, and gets much less traffic. We are considering merging the 5,000+ product pages from site B into site A. How can we evaluate whether this would be a wise move with the least risk to site A?
Technical SEO | richdan
-
Tracing Redirects to a Site
I wonder if anyone has used a tool that can trace the redirects pointing to a site? I know there are a number of tools out there that can check where a single URL redirects to, but I was wondering if anyone has used a tool that traces all redirects along with the final URL. I am using this for competitor research, so I don't have access to Analytics or Webmaster Tools.
Technical SEO | BeattieGroup
-
Test site got indexed in Google - what's the best way of getting the pages removed from the SERPs?
Hi Mozzers, I'd like your feedback on the following: the test/development domain our sitebuilder works on got indexed, despite all warnings and advice. The content on these pages is in active use by our new site, so to prevent duplicate content penalties we have put a noindex in our robots.txt. However, of course, the pages are currently still visible in the SERPs. What's the best way of dealing with this? I did not find related questions, although I think this is a mistake that is often made, so perhaps the answer will also be relevant for others besides me. Thank you in advance, greetings, Folko
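One caveat on the approach described here: robots.txt only controls crawling, while a noindex directive has to be delivered per page, via a robots meta tag or an X-Robots-Tag response header, and the pages must remain crawlable for Google to see it. A minimal sketch for a development host running Apache with mod_headers enabled (the stack is an assumption):

```apache
# .htaccess at the root of the test/development site:
# tell crawlers to drop every page on this host from the index
Header set X-Robots-Tag "noindex, nofollow"
```

Once the pages drop out of the index, the development host can additionally be put behind authentication to keep this from recurring.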
Technical SEO | Yarden_Uitvaartorganisatie
-
One-Pager and SEO
We're building a page that is going to feature over 31 people as difference makers in their field. We're unveiling one a day for an entire month. The very early mockup of the page has name, pic, some bio info, and a link to open up a new window with the full bio. I would love to have all of the bio content for all of the people on the page (and indexable), but I'm not sure how to do that while still being able to hide the full bios until they are expanded. Anybody have any tips that are SEO-friendly and/or examples of a page that is built like this and ranks well. Thanks!
Technical SEO | spackle
-
Penalties on a brand new site: sandbox time, or a problem with the site?
Hi guys, 4 weeks ago we launched the site www.adsl-test.it. We just did some article marketing and developed a lot of functionality to run and share the results of the speed tests performed through the site. We sat for weeks on the 9th Google SERP page, then suddenly, for one day (29 February), on the second page; the next day the website's home page had disappeared even for brand searches like "adsl-test". The current situation is: it looks like we are not banned (site:www.adsl-test.it is still listed); GWT doesn't show any warnings and everything looks good there; and we rank quite high on bing.it and yahoo.it (4th place on the first page) for the search "adsl test". Can anybody help us understand? Another thing I thought of is that we create a unique ID for each test we run, and these tests are indexed by Google. E.g.: www.adsl-test.it/speedtest/w08ZMPKl3R or www.adsl-test.it/speedtest/P87t7Z7cd9. The content of these URLs differs a little (because the speed measured is different), but since it is just a badge, the rest of the page content is pretty much the same. Could this be a possible reason? I mean, could Google think we are creating duplicate content even if it is not effectively duplicate content but just the result of a speed test?
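If the per-test pages really are near-duplicates, one standard mitigation (not a guaranteed fix for the ranking drop) is a canonical tag on each result page pointing at the main test page. A sketch, assuming the home page hosts the main test (the target URL is an assumption):

```html
<!-- In the <head> of each www.adsl-test.it/speedtest/<test-id> page -->
<link rel="canonical" href="http://www.adsl-test.it/" />
```

This consolidates the thousands of thin result URLs into one canonical page while still letting users share their individual results.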
Technical SEO | codicemigrazione
-
Best blocking solution for Google
Posting this for Dave Sottimano. Here's the scenario: you've got a set of URLs indexed by Google, and you want them out quickly. Once you've managed to remove them, you want to block Googlebot from crawling them again, for whatever reason. Below is a sample of the URLs you want blocked, but you only want to block /beerbottles/ and anything past it:
www.example.com/beers/brandofbeer/beerbottles/1
www.example.com/beers/brandofbeer/beerbottles/2
www.example.com/beers/brandofbeer/beerbottles/3
etc.
To remove the pages from the index, should you: add the meta noindex,follow tag to each URL you want de-indexed, use GWT to help remove the pages, then wait for Google to crawl again?
If that's successful, to block Googlebot from crawling again, should you add this line to robots.txt:
DISALLOW */beerbottles/
Or add this line:
DISALLOW: /beerbottles/
"To add the * or not to add the *, that is the question" Thanks! Dave
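On the wildcard question itself: robots.txt rules are written as `Disallow:` (with the colon), paths match from the start of the URL path, and Googlebot supports `*` as a wildcard. A sketch under those rules:

```
User-agent: Googlebot
# Matches /beers/brandofbeer/beerbottles/1 etc., at any depth
Disallow: /*/beerbottles/
```

A plain `Disallow: /beerbottles/` would only match URLs whose path begins with /beerbottles/, which none of the sample URLs do, so the wildcard form is the one that fits this case.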
Technical SEO | goodnewscowboy
-
What's the best tool for site architecture?
Looking for tools that can visualise a site's architecture (ideally automated). Also looking for tools that can visualise internal linking structures.
Technical SEO | Motionlab