Same site serving multiple countries and duplicated content
-
Hello!
Though I browse Moz resources every day, I've decided to ask you a question directly, despite the numerous questions (and answers!) on this topic, as there are a few specific variants each time:
I have a site serving content (and products) to different countries, built using subfolders (one subfolder per country).
Basically, it looks like this:
site.com/us/
site.com/gb/
site.com/fr/
site.com/it/
etc.
The first problem was fairly easy to solve:
Avoid duplicate content issues across the board, considering that both the ecommerce part of the site and the blog are replicated for each subfolder in its own language. Correct me if I'm wrong, but using our copywriters to translate the content and adding the right hreflang tags should do the trick.
But then comes the second problem: how to deal with duplicate content when it's written in the same language? E.g. /us/, /gb/, /au/ and so on.
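To make the setup concrete, here's a rough sketch of the hreflang set I have in mind for the country homepages (the locale codes and URLs are illustrative placeholders based on the structure above, not our real markup):

```html
<!-- Placed in the <head> of every homepage in the group; each page also keeps a self-referencing entry -->
<link rel="alternate" hreflang="en-us" href="https://site.com/us/" />
<link rel="alternate" hreflang="en-gb" href="https://site.com/gb/" />
<link rel="alternate" hreflang="en-au" href="https://site.com/au/" />
<link rel="alternate" hreflang="fr-fr" href="https://site.com/fr/" />
<link rel="alternate" hreflang="it-it" href="https://site.com/it/" />
<link rel="alternate" hreflang="x-default" href="https://site.com/us/" /> <!-- assuming /us/ acts as the fallback -->
```

The same set would be repeated page by page (e.g. for each blog article), swapping in the corresponding URLs.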
Given the following requirements/constraints, I can't see any positive resolution to this issue:
1. The structure needs to be maintained (it isn't possible to consolidate the same language into a single subfolder, for example),
2. Articles can't be canonicalised from one subfolder to another, as that would interfere with our internal tracking tools,
3. The amount of content being published prevents us from producing bespoke content for each region of the world that shares the same language.
Given those constraints, I can't see a way out, and it seems I'm cursed to live with those duplicate content red flags right under my nose.
Am I right, or can you think of anything to sort this out?
Many thanks,
Ghill
-
Thanks Kristina, this is in place now!
-
I would recommend setting up each country's subdirectory as a separate property in Google Search Console. Then, go to the original Search Console, click Search Traffic > International Targeting, select the Country tab, and identify which country you're targeting users in.
That should give GSC enough information to not flag the content as duplicate.
Good luck!
-
A quick additional question on top of my initial one, though: it seems that there is no difference between using HTML tags, the HTTP header or the XML sitemap to include hreflang annotations.
But is there any difference when it comes to GSC, SEO tools, online hreflang checkers and so on?
E.g. if a [random] SEO tool spots duplicate content between two regions for a similar page whilst there are hreflang tags in the sitemap, shall I just ignore the warning (provided the job has been done correctly), or does it mean something is still wrong?
Pretty much the same for GSC: if I find warnings about duplicate content whilst hreflang is in place, what does that mean?
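For reference, this is roughly what I mean by the XML sitemap form, using the same placeholder URLs as in my first post (the HTTP header version would, as I understand it, be the same set of alternates expressed as a Link: response header, mainly used for non-HTML files such as PDFs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch of a sitemap carrying hreflang: one <url> entry per regional version, each listing the full set of alternates -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://site.com/us/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://site.com/us/"/>
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://site.com/gb/"/>
    <xhtml:link rel="alternate" hreflang="en-au" href="https://site.com/au/"/>
  </url>
  <url>
    <loc>https://site.com/gb/</loc>
    <xhtml:link rel="alternate" hreflang="en-us" href="https://site.com/us/"/>
    <xhtml:link rel="alternate" hreflang="en-gb" href="https://site.com/gb/"/>
    <xhtml:link rel="alternate" hreflang="en-au" href="https://site.com/au/"/>
  </url>
  <!-- ...and one <url> block per remaining version (/au/, /fr/, /it/, etc.) -->
</urlset>
```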
Thanks!
-
Hi Kristina,
Having read quite a lot of literature on the topic, I was confident that hreflang would not help with duplicate content, and then I realised it was mainly outdated, old blog posts.
Out of curiosity, has the use of hreflang evolved since its introduction, or is it just me going crazy?
Anyway, thanks loads for your help. It seems much "easier" (so to speak, as rolling out hreflang is not an easy job on huge international websites) than I thought.
-
It's for different regions as well. Check out the link I shared. Google lists the reasons for hreflang. The second reason is:
"If your content has small regional variations with similar content, in a single language. For example, you might have English-language content targeted to the US, GB, and Ireland."
-
Hi Kristina,
Thanks for your reply.
But from my understanding of hreflang, it mainly helps Google understand that the content is available in different languages/other regions. It doesn't sort out duplicate content issues if the language remains the same for different regions.
-
For any duplicate content you have between countries, use hreflang to differentiate regions. Google lays out how to do that here.
Hope this helps!