What is the best SEO strategy for a company operating in several countries?
-
Hello, I have to create an SEO marketing strategy for a company that provides services in Spain, Colombia, and Mexico.
I'm looking at two options:
1. Buy a different country-code domain (ccTLD) for each country: This option seems feasible but very expensive, and to manage and rank each domain, each one would need different content (otherwise they would all carry essentially the same site).
2. Put each service in country folders, e.g.:
www.dominio.com/mexico/training-financiero.html
www.dominio.com/espana/training-financiero.html
I have understood that option 1 is no longer necessary, since you can use HTML tags within the code to tell Google that the content targets customers in a different country.
In principle we would use the same content, changing only a few words and, of course, the currency to suit the local currency of each country. However, I believe customers might trust a domain from their own country more. Plus, I'm afraid Google would index it as duplicate content. Another question: which country would the main domain target, and could that confuse the visitor?
-
-
First of all, I really suggest you read this post by Aleyda Solis on the SEER Interactive blog, which answers practically all of your questions in great detail.
Second, the subfolder option seems the best one for your case, at least from what I understood about your business's needs. In fact, your business's medium- to long-term objectives should determine which international SEO option you choose, not the mere SEO aspect alone.
So, if you choose the subfolder option, you must:
1. Create the subfolders (obviously).
2. Geotarget them in Google Webmaster Tools.
3. Implement rel="alternate" hreflang markup, especially because you are using the same content in different subfolders; if you want to be really sure that Google shows the correct URLs for each country, it's better to use hreflang.
4. Even if you do 2) and 3), I strongly suggest you localize the content of your site. The Spanish spoken in Mexico, Argentina, and Spain is quite different, just as the English spoken in the UK and the USA differs. The more the language fits the culture of the countries you are targeting, the better, not only in SEO terms but also in conversion potential.
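As a sketch of the hreflang point above, the annotations for the financial-training page might look like this in the head of each country version (the es-es/es-mx/es-co codes and the x-default choice are illustrative assumptions; adjust them to your actual targeting):

```html
<!-- Placed in the <head> of EVERY country version of the page. -->
<!-- Each page must list all alternates, including itself. -->
<link rel="alternate" hreflang="es-es" href="http://www.dominio.com/espana/training-financiero.html" />
<link rel="alternate" hreflang="es-mx" href="http://www.dominio.com/mexico/training-financiero.html" />
<link rel="alternate" hreflang="es-co" href="http://www.dominio.com/colombia/training-financiero.html" />
<!-- Optional fallback for visitors outside the targeted countries. -->
<link rel="alternate" hreflang="x-default" href="http://www.dominio.com/" />
```

Note that the set must be reciprocal: if the Mexico page lists the Spain page as an alternate, the Spain page must list the Mexico page too, or Google may ignore the annotations.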
Finally, regarding also buying the country-code (ccTLD) versions of your main domain: buying them and 301-redirecting them to the respective subfolders doesn't really have any SEO effect, but it may be a useful way to "reserve" those domains for potential future use.
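For the "reserve the ccTLDs and 301-redirect them" scenario, the redirects could be sketched like this in an Apache .htaccess file on the parked ccTLD (the domain names are hypothetical placeholders following the example URLs in this thread, and mod_rewrite is assumed to be enabled):

```apache
# .htaccess on the parked ccTLD (e.g. dominio.mx).
RewriteEngine On
# Send every request on dominio.mx to the Mexico subfolder of the main
# .com site, preserving the requested path, with a permanent (301) redirect.
RewriteCond %{HTTP_HOST} ^(www\.)?dominio\.mx$ [NC]
RewriteRule ^(.*)$ http://www.dominio.com/mexico/$1 [R=301,L]
```

A plain `Redirect 301 / http://www.dominio.com/mexico/` directive would also work if the whole ccTLD should map to a single subfolder.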
-
-
Hey, things are a little clearer now. I'm planning to do the following:
- Separate the website into folders for each country and service, e.g. www.dominio.com/argentina/seo.html
- Place the respective geotargeting (hreflang) tags in the source code
- Sign into Google Webmaster Tools and set the geographic target for each subfolder
- Buy the ccTLD domains for each country
I'm not sure what to do with the ccTLD domains, though. What do you recommend?
I was thinking of putting up a landing page with a contact form that briefly summarizes what we do (original content, of course), with each service linking to the corresponding folder on the .com domain for that country.
Or simply having the ccTLD domain 301-redirect to the respective country subfolder on the .com domain.
-
Hi,
Two really good links below discussing this topic:
How to do SEO for different countries
International SEO - Whiteboard Session
Cheers,
-
First of all, thanks for your quick response. Actually, I wouldn't know how to approach the service differently for two countries with the same language. Suppose you offer an SEO service for Argentina and one for Colombia (in both countries the language is Spanish): how could you describe the Colombia SEO service differently from the Argentina SEO service?
Related Questions
-
Best Practices For Angular Single Page Applications & Progressive Web Apps
Hi Moz Community, is there a proper way to do an SPA (client-side rendered) and a PWA without having a negative impact on SEO? Our dev team is currently trying to convert most of our pages to an Angular client-side-rendered single-page application. I told them we should use a prerendering service for users that have JS disabled, or use server-side rendering instead, since this would ensure that most web crawlers would be able to render and index all the content on our pages even with all the heavy JS use. Is there an even better way to do this, or some best practices? In terms of the PWA that they want to add along with changing the pages to an SPA, I told them this is pretty much separate from the SPA work, because they are not dependent on each other; adding a manifest and a service worker to our site would just be an enhancement. Also, if we do a complete PWA with JS populating the content/data within the shell (meaning not just the header and footer, making the body a template with dynamic JS as well), would that affect our SEO in any way? Any best practices here as well? Thanks!
What is the best way to handle Product URLs which prepopulate options?
We are currently building a new site which has the ability to pre-populate product options based on parameters in the URL. We have done this so that we can send individual product URLs to Google Shopping. I don't want to create lots of duplicate pages, so I was wondering what you thought was the best way to handle this? My current thoughts are:
1. Sessions and parameters: On-site product page filters populate using sessions, so no parameters are required on-site, but options can still be pre-populated via parameters (product?colour=blue&size=100cm) if the user reaches the site via Google Shopping. We could also add "noindex, follow" to the pages with parameters and a canonical tag to the page without parameters.
2. Text-based parameters: Make the parameters into text-based URLs (product/blue/100cm/) and still use the "noindex, follow" meta tag and add a canonical tag to the page without parameters. I believe this is possibly the best solution, as it still allows users to link to and share pre-populated pages, but they won't get indexed and the link juice would still pass to the main product page.
3. Standard parameters: After thinking more today, I am considering that the best way may be the simplest: simply use standard parameters (product?colour=blue&size=100cm) so that I can then tell Google what they do in Webmaster Tools, and also add "noindex, follow" to the pages with parameters along with the canonical tag to the page without parameters.
What do you think the best way to handle this would be?
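The "noindex, follow" plus canonical combination described in the options above could be sketched like this on a parameterised variant page (the example.com URL is a hypothetical placeholder standing in for the question's product URLs):

```html
<!-- In the <head> of the pre-populated page, e.g. product?colour=blue&size=100cm -->
<!-- Keep this variant out of the index, but let crawlers follow its links. -->
<meta name="robots" content="noindex, follow" />
<!-- Point ranking signals at the clean product URL without parameters. -->
<link rel="canonical" href="http://www.example.com/product" />
```

One caveat worth noting: noindex and a canonical to another URL send somewhat mixed signals to Google, so many people pick one mechanism or the other rather than both.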
Best way to noindex long dynamic urls?
I just got a Moz crawl back and see lots of errors for overly dynamic URLs. The site is a villa rental site that gives users the ability to search by bedroom, amenities, price, etc., so I'm wondering what the best way is to keep these types of dynamically generated pages, with URLs like /property-search-page/?location=any&status=any&type=any&bedrooms=9&bathrooms=any&min-price=any&max-price=any, from being indexed. Any assistance will be greatly appreciated : )
Best way to deal with over 1000 pages of duplicate content?
Hi, using the Moz tools I have found over 1,000 pages of duplicate content, which is a bit of an issue! 95% of the issues arise from our news section and news archive, as it's been going for some time now. We upload around 5 full articles a day. The articles have a standalone page but can only be reached via a master archive. The master archive sits in a top-level section of the site and shows snippets of the articles, which, if a user clicks on them, take them to the full article page. When a news article is added, the snippets move onto the next page, and they move through the pages as new articles are added. The problem is that the standalone articles can only be reached via the snippet on the master page, and Google is stating this is duplicate content, as the snippet is a duplicate of the article. What is the best way to solve this issue? From what I have read, using a meta noindex seems to be the answer (not that I know what that is). From what I have read, you can only use a canonical tag on a page-by-page basis, so that's going to take too long. Thanks, Ben
Robots.txt best practices & tips
Hey, I was wondering if someone could give me some advice on whether I should block the robots.txt file from the average user (not from Googlebot, Yandex, etc.)? If so, how would I go about doing this? With .htaccess, I'm guessing, but I'm not an expert. What can people do with the information in the file? Maybe someone can give me some "best practices"? (I have a WordPress-based website.) Thanks in advance!
Best Way To Clean Up Unruly SubDomain?
Hi, I have several subdomains that present no real SEO value, but are being indexed. They don't earn any backlinks either. What's the best way of cleaning them up? I was thinking the following: 1. Verify them all in Webmaster Tools. 2. Remove all URLs from the index via the Removal Tool in WMT 3. Add site-wide no-index, follow directive. Also, to remove the URLs in WMT, you usually have to block the URLs via /robots.txt. If I'd like to keep Google crawling through the subdomains and remove their URLs, is there a way to do so?
Best strategy for redirecting domain authority from an acquired site...?
Hi all, I'm an in-house for a company that made several acquisitions last year prior to my starting. I'm just now hearing about several loose-ends websites that belong to companies that have been absorbed by us. The question is how to best approach the task of utilizing that site's domain authority to our site's benefit. There is already a link to the homepage in the header of the site in question (our logo's right under theirs) so we're already getting some linkjuice. Looks like the whois information never changed. Here are the options I'm considering: 1. Blanket redirect (all of their pages there into our home page) - not ideal. 2. Targeted redirect (try to "connect the dots" between content pages with similar subjects/keyword relevance - better than #1, but is it worth the extra effort? 3. More linking (add more strategically placed and keyword optimized links back to our site) - also more work, but certainly do-able if the consensus is to leave the site up. 4. Any other suggestions? Thanks for your help everyone!
Best free tool to check internal broken links
Question says it all I guess. What would your recommend as the best free tool to check internal broken links?