Language/Country-Specific Pages All in English
-
Hi Folks,
I have been checking how many pages our competitors have indexed in Google compared to our website, and I noticed that one of our main competitors has over 2 million indexed pages. I have figured out that this is because they have language/country-specific pages for every page on their website. That being said, these pages all contain the same content and the language doesn't actually change; it remains in English.
Now my question is this: won't this in fact hurt their rankings, in terms of duplicate content? Or am I missing something here?
The URLs essentially do something like www.competitor.com/fr/ for France, for example, but as I say the content is in English and duplicates their main website.
Seems odd to me but would love your opinions on this. Thanks.
Gaz
-
Thanks Keszi,
Will send you a PM, appreciate your help and advice. Thanks.
Gareth
-
In general, they should have different texts for all of their targeted regions. But without taking a closer look at this specific example, it is quite hard to tell what they are doing.
If you want, send me a private message (if you do not want to share it here) and we can take a look at the specific case. Ok?
Gr., Keszi
-
Thanks for the response, Keszi.
In terms of serving pages that are all in English, though, how would this affect rankings? It seems as though this should be penalised, as they're not providing content in the targeted language, so it is effectively duplicate content.
Thanks
Gaz
-
Hi Gareth,
I personally do not like the sub-domain approach, so if I had to choose between the sub-domain and sub-directory approaches, I would go for the latter.
The company where I work uses sub-directories for each of the languages; we have content written in each of those languages and we have also implemented hreflang markup. It works fine for us.
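For illustration, the hreflang part looks roughly like this in the head of each English page; example.com, the paths, and the set of languages are just placeholders, not a real setup:
<!-- hypothetical sketch: every language version lists itself and all of its alternates -->
<link rel="alternate" hreflang="en" href="https://www.example.com/en/widgets/" />
<link rel="alternate" hreflang="de" href="https://www.example.com/de/widgets/" />
<link rel="alternate" hreflang="fr" href="https://www.example.com/fr/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/en/widgets/" />
Every language version of the page carries the same set of annotations, and the x-default line tells Google which version to show to users who do not match any of the listed languages.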
Each approach to international targeting has its positive and negative aspects. Which one to choose really depends on many factors.
Gr., Keszi
-
After doing some research, it appears that the subdirectory approach to language-specific content is the least SEO-friendly, but it also seems that the content should be written in the language of that specific country.
What are your thoughts on this? Would this be detrimental to rankings, or would you recommend I implement a similar strategy but using subdomains like https://country.domain.com? Thanks.
Gaz
-
Related Questions
-
URL Structure for geo location for specific page
On the hackerearth.com/challenges page, there is an option to select a language. This option is in the footer. Once you select a language, the URL changes. For example, if we select French, the URL changes to hackerearth.com/fr/challenges. If we decide to change the URL of this page to include geo-targeting, what should the URL structure be so that it accommodates languages as well? My research says that it would be good to keep the URL like domainname.com/page/language.
Intermediate & Advanced SEO | Rajnish_HE
-
Our client's web property recently switched over to secure pages (https); however, their non-secure pages (http) are still being indexed in Google. Should we request in GWMT to have the non-secure pages deindexed?
Our client recently switched over to https via a new SSL certificate. They have also implemented rel canonicals for most of their internal web pages (pointing to the https versions). However, many of their non-secure web pages are still being indexed by Google. We have access to their GWMT for both the secure and non-secure pages.
Should we just let Google figure out what to do with the non-secure pages? We would like to set up 301 redirects from the old non-secure pages to the new secure pages, but we're not sure this is going to happen. We thought about requesting in GWMT that Google remove the non-secure pages, but we felt this was pretty drastic. Any recommendations would be much appreciated.
Intermediate & Advanced SEO | RosemaryB
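For reference, a minimal sketch of the canonical setup described in the question above, as it might appear on the http version of a page (example.com and the path are placeholders); the 301 redirects themselves would be configured server-side rather than in the markup:
<!-- hypothetical sketch: the http page declares the https version as canonical -->
<link rel="canonical" href="https://www.example.com/some-page/" />
-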
Pages with rel "next"/"prev" still crawling as duplicate?
Howdy! I have a site that is being flagged for "duplicate content pages" when it is really just pagination. The rel next/prev is in place and done correctly, but Roger Bot and Google are both showing duplicate content and duplicate page titles & metas respectively. The only thing I can think of is that we have a canonical pointing back at the URL you are on. We do not have a view-all option right now and would not feel comfortable recommending one given the speed implications and the size of their catalog. Any experience or recommendations here? Something to be worried about? /collections/all?page=15"/>
Intermediate & Advanced SEO | paul-bold
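For reference, a minimal sketch of the pagination setup described in the question above, as it might appear on page 2 of a series; example.com and the collection path are placeholders:
<!-- hypothetical sketch: a paginated page with a self-referencing canonical plus prev/next annotations -->
<link rel="canonical" href="https://www.example.com/collections/all?page=2" />
<link rel="prev" href="https://www.example.com/collections/all?page=1" />
<link rel="next" href="https://www.example.com/collections/all?page=3" />
-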
What to do with Authoritative footer pages?
Hello everyone! The site I'm working on has had a homepage that essentially used the footer as the main form of navigation on the site, and the PA of each of those pages reflects that. I'm helping them re-organize the site (I'm still a noob though), and was curious for some input on this particular situation. Some of the most authoritative pages are:
1. www.charged.fm/privacy - PA 29
2. www.charged.fm/terms - PA 29
My question: Is this just a consequence of previous mistakes that we live with, or is there something involving 301s and the creation of new pages that could help us utilize the link juice on these pages? Or should we come up with ways to internally link to 'money' pages from these pages instead? Thanks for any input, Luke
Intermediate & Advanced SEO | keL.A.xT.o
-
Site with fewer than 20 pages shows 1,400+ pages when crawled
Hello! I'm new to SEO, and have been soaking up as much as I can. I really love it, and feel like it could be a great fit for me: I love the challenge of figuring out the SEO puzzle, plus I have a copywriting/PR background, so I feel like that would be perfect for helping businesses get a great jump on their online competition. In fact, I was so excited about my newfound love of SEO that I offered to help a friend who owns a small business with his site. Once I started, though, I found myself hopelessly confused. The problem comes when I crawl the site. It was designed in WordPress, and is really not very big (part of my goal in working with him was to help him get some great content added!). Even though there are only 11 pages, and 6 posts, for the entire site, when I use Screaming Frog to crawl it, it sees HUNDREDS of pages. It stops at 500, because that is the limit for their free version. The campaign I started here at SEOmoz says over 1,400 pages have been crawled, with something like 900 errors. Not good, right? So I've been trying to figure out the problem. When I look closer in Screaming Frog, I can see that some things are being repeated over and over. If I sort by the Title, the URLs look like they're stuck in a loop somehow: one line will have /blog/category/postname, the next line will have /blog/category/category/postname, the next will have /blog/category/category/category/postname, and so on, with another /category/ added each time. So, with that, I have a few questions: Does anyone know what the problem is, and how to fix it? Do professional SEO people troubleshoot this kind of stuff all of the time? Is this the best place to get answers to questions like that? And if not, where is? Thanks so much in advance for your help! I've enjoyed reading all of the posts available here so far; it seems like a really excellent and helpful community. I'm looking forward to the day when I can actually answer the questions! 🙂
Intermediate & Advanced SEO | K.Walters
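For reference, one common cause of the repeating /category/category/... pattern described above is a relative link that gets resolved against deeper and deeper URLs; whether that is what is happening on this particular site is only an assumption. A minimal sketch:
<!-- hypothetical sketch: a relative link output on a page at /blog/category/
     resolves to /blog/category/category/post-name/, and the loop deepens at every level -->
<a href="category/post-name/">Post title</a>
<!-- a root-relative link avoids the problem -->
<a href="/blog/category/post-name/">Post title</a>
-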
Are there any negative effects to using a 301 redirect from a page to another internal page?
For example, from http://www.dog.com/toys to http://www.dog.com/chew-toys. In my situation, the main purpose of the 301 redirect is to replace the page with a new internal page that has a better-optimized URL. This will be executed across multiple pages (about 20). None of these pages hold any search rankings, but they do carry a decent amount of page authority.
Intermediate & Advanced SEO | Visually
-
Best strategy for moving a country subdirectory to a dedicated ccTLD without losing organic search volume?
Community, we are about to move one of our most popular country sub-directories from brandname.com/de/.. to brandname.de. We have just purchased the domain, so while the domain was registered back in 2009, the URL has zero domain authority. What is the best strategy to execute the move while being cautious about losing too much of the organic search volume the subdirectory is receiving right now? Obviously it will take some time to build up DA on the ccTLD, so maybe it is a good idea to keep the country directory for a little longer and start the ccTLD with just a static landing page, place some links, wait until some DA builds up, and then perform the move. Thoughts? /TomyPro
Intermediate & Advanced SEO | tomypro
-
Do in-page links pointing to the parent page make the page more relevant for that term?
Here's a technical question. Suppose I have a page relevant to the term "Mobile Phones". I have a piece of text on that page talking about "mobile phones", and within that text is the term "cell phones". Now, if I link the text "cell phones" to the page it is already placed on (i.e. the parent page), will the page gain more relevancy for the term "cell phones"? Thanks
Intermediate & Advanced SEO | James77