Subdomains or Subfolders for a multilingual site?
-
What kind of structure would you propose for a site with multiple languages: subdomains or subfolders?
-
Either one works, as long as you tell Google which URLs serve which language via rel="alternate" hreflang annotations.
Either is fine - Google's guidance on working with multilingual websites: http://googlewebmastercentral.blogspot.com/2010/03/working-with-multilingual-websites.html
How to mark up multilingual content: http://googlewebmastercentral.blogspot.com/2011/12/new-markup-for-multilingual-content.html
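For reference, the hreflang annotations those posts describe look roughly like this - a minimal sketch assuming an English/Spanish site, with example.com as a placeholder domain:

```html
<!-- Sketch only: placeholder URLs. Goes in the <head> of each language version (subfolder structure); -->
<!-- every version should carry the full set of annotations, including one pointing to itself. -->
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />

<!-- The same idea expressed with subdomains instead of subfolders. -->
<link rel="alternate" hreflang="en" href="https://en.example.com/" />
<link rel="alternate" hreflang="es" href="https://es.example.com/" />
```

The same annotations can also be supplied in an XML sitemap or in HTTP headers instead of the page head.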
Related Questions
-
White Label Subdomain Competing with Top Level Domain?
Hi All, We have a top-level domain that is a comparison site for companies in our industry. We also manage a white-label website for a specific company in the same industry, which was originally set up as a subdomain. In other words, we have "example.com" and "companyname.example.com." The sites are treated as separate websites -- the subdomain site isn't filling a role like a subfolder would. It has its own branding, navigation/URL structure, etc. Since these sites are in the same industry, there is obviously a huge overlap in the keywords we want each to rank for. In fact, 100% of the keywords for the subdomain are targets for the top-level domain. My question is: are we hurting ourselves in Google rankings by having two sites under the same top-level domain competing for the same keywords? We want both sites to be as successful as possible. Would we be better served by kicking the subdomain out into a new top-level domain? Thanks!
Algorithm Updates | Rodrigo-DC
-
Is "Author Rank," User Comments Driving Losses for YMYL Sites?
Hi, folks! So, our company publishes 50+ active, disease-specific news and perspectives websites -- mostly for rare diseases. We are also tenacious content creators: between news, columns, resource pages, and other content, we produce 1K+ pieces of original content across our network. Authors are either PhD scientists or patients/caregivers. All of our sites use the same design. We were big winners with the August Medic update in 2018 and the subsequent update in September/October. However, the Medic update in March and the de-indexing bug in April were huge losers for us across our monetized sites (about 10 in total). We've seen some recovery with this early June update, but also some further losses. It's a mixed bag. Take a look at this attached Moz chart, which shows the jumps and falls around the various Medic updates. The pattern is very similar on many of our sites. As per JT Williamson's stellar article on EAT, I feel like we've done a good job in meeting those criteria, which has left us wondering what isn't jiving with the new core updates. I have two theories I wanted to run past you all:

1. Are user comments on YMYL sites problematic for Google now? I was thinking that maybe user comments underneath health news and perspectives articles might be concerning on YMYL sites now. On one hand, a healthy commenting community indicates an engaged user base and speaks to the trust and authority of the content. On the other hand, while the AUTHOR of the article might be a PhD researcher or a patient advocate, the people commenting -- how qualified are they? What if they are spouting off crazy ideas? Could Google's new update see user comments such as these as degrading the trust/authority/expertise of the page? The examples I linked to above have a good number of user comments. Could these now be problematic?

2. Is Google "Author Rank" finally happening, sort of? From what I've read about EAT -- particularly for YMYL sites -- it's important that authors have "formal expertise" and are, according to Williamson, "an expert in the field or topic." He continues that the author's expertise and authority "is informed by relevant credentials, reviews, testimonials, etc." Well -- how is Google substantiating this? We no longer have the authorship markup, but is the algorithm doing its due diligence on authors in some more sophisticated way? It makes me wonder if we're doing enough to present our authors' credentials on our articles, for example. Take a look -- Magdalena is a PhD researcher, but her user profile doesn't appear at the bottom of the article, and if you click on her name, it just takes you to her author category page (how WordPress'ish). Even worse -- our resource pages don't even list the author.

Anyhow, I'd love to get some feedback from the community on these ideas. I know that Google has said there's nothing to do to "fix" these downturns, but it'd sure be nice to get some of this traffic back! Thanks! 243rn10.png
Algorithm Updates | Michael_Nace
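On the second theory above: one concrete, machine-readable way to present author credentials is schema.org Article markup with an author Person. A minimal sketch - the name, URL, and values are hypothetical placeholders, and nothing in the thread confirms how (or whether) Google weighs such markup:

```html
<!-- Sketch only: hypothetical author name and profile URL. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Hypothetical rare-disease news article",
  "author": {
    "@type": "Person",
    "name": "Magdalena Example",
    "honorificSuffix": "PhD",
    "jobTitle": "Research Scientist",
    "sameAs": "https://example.com/authors/magdalena-example"
  }
}
</script>
```
-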
Moving an established .COM site to a .ART domain
Hi! We have an existing website with a .com TLD and our brand name, which is completely unrelated to any of the terms we want to rank for, except of course for branded searches for our company. We have an online shop, and the .com site has been online for a good few years. The business activity is related to art; in fact, some of our customers search for "name of artist + art" and we appear in the results. From what I have read, Google is not going to give better rankings to a .art domain name, but will the extension be counted as a potential keyword and add relevancy to users' searches, based on the example above? Does anyone have any experience with regard to this consideration? Thanks!
Algorithm Updates | bjs2010
-
WordPress Blog Integrated into eCommerce Site - Should We Use One XML Sitemap or Two?
Hi guys, I wonder whether you can help me with a couple of SEO queries. We have an ecommerce website (www.exampleecommercesite.com) with its own XML sitemap, which we have submitted to Google Webmaster Tools. However, we recently decided to add a blog to our site for SEO purposes. The blog is on a subdomain of the site: blog.exampleecommercesite.com. (We wanted to have it as www.exampleecommercesite.com/blog, but our server made it very difficult and it wasn't technically possible at the time.)

1. Should we add blog.exampleecommercesite.com as a separate property in Google Webmaster Tools?

2. Should we create a separate XML sitemap for the blog content, or are there more benefits in terms of SEO if we have one sitemap for both the blog and the ecommerce site?

I'd appreciate your opinions on the topic! Thank you and have a good start to the week!
Algorithm Updates | Firebox
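One practical constraint worth knowing for question 2 above: under the sitemaps.org protocol, a sitemap is only valid for URLs on the host it is served from, so a blog on its own subdomain would normally get its own sitemap, hosted and submitted there. A minimal sketch, reusing the placeholder domain from the question:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch: hosted at blog.exampleecommercesite.com/sitemap.xml; URLs and dates are placeholders. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://blog.exampleecommercesite.com/first-post/</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
</urlset>
```

On question 1: subdomains are generally treated as separate hosts, so verifying blog.exampleecommercesite.com as its own property is the usual route. (Cross-host sitemaps are possible if the sitemap is referenced from the subdomain's robots.txt, but a separate sitemap per host is simpler.)
-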
How do I figure out what's wrong with my site?
I'm fairly new to SEO and can't pinpoint what's wrong with my site... I feel so lost. I am working on revamping www.RiverValleyGroup.com and can't figure out why it's not ranking for keywords. These keywords include 'Louisville homes', 'Homes for sale in Louisville KY', etc. Any suggestions? I write new blog posts every day, so I feel there's no shortage of fresh content. I'm signed up with Moz Analytics and Google Analytics.
Algorithm Updates | gohawks7790
-
Site refuses to improve rankings. Can someone else put a set of eyes on this for me and see what I am missing?
Hello! We've been successful with over 40 clients and getting them to great results in our industry, insurance. We recently acquired a new client who had an existing website with prior SEO work: a very spammy blog and many spammy links. We've removed many of the blog articles and disavowed links using the Google Disavow Tool. We've been monitoring this site in a campaign on Moz, but we're seeing zero improvement week to week. Can someone put another set of eyes on this and see if we're simply just missing something? Of all 30 of our tracked keywords, zero are in the top 50! I would guess this was an algorithm penalty, but it has been 3 months now since we've made the changes and nothing is changing... not even a little bit! Any help/suggestions would be GREATLY appreciated. Thank you and enjoy Labor Day weekend!
Algorithm Updates | Tosten
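For anyone following along: the file the Disavow Tool accepts is a plain UTF-8 text file with one entry per line, either a full URL or a whole domain, plus # comments. A sketch with hypothetical entries:

```text
# Hypothetical entries - replace with the actual spammy sources.
# Prefer domain: entries when an entire site is toxic.
domain:spammy-link-network.example
domain:paid-links-directory.example
http://low-quality-blog.example/post-linking-to-client/
```

Relevant to the timeline described above: disavowed links are only discounted as Google recrawls them, so the effect can lag the upload by weeks or months.
-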
Big site SEO: To maintain HTML sitemaps, or scrap them in the era of XML?
We have dynamically updated XML sitemaps which we feed to Google et al. Our XML sitemap is updated constantly and takes minimal hands-on management to maintain. However, we still have an HTML version (which we link to from our homepage), a legacy from back in the pre-XML days. As this HTML version is static, we're finding it contains a lot of broken links and is not of much use to anyone. So my question is this: does Google (or any other search engine) still need both, or are XML sitemaps enough?
Algorithm Updates | linklater
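For scale context on the XML side: a large site typically splits its URLs across multiple sitemap files (each capped at 50,000 URLs) tied together by a sitemap index, which is the file that gets submitted. A minimal sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical index; each child sitemap lists up to 50,000 URLs. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/sitemaps/products-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/sitemaps/categories.xml</loc>
  </sitemap>
</sitemapindex>
```
-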
Large site with faceted navigation using rel=canonical, but Google still has issues
First off, I just wanted to mention I did post this on one other forum, so I hope that is not completely against the rules here or anything. Just trying to get an idea from some of the pros at both sources. Hope this is received well. Now for the question...

"Googlebot found an extremely high number of URLs on your site:" Gotta love these messages in GWT. Anyway, I wanted to get some other opinions here, so if anyone has experienced something similar or has any recommendations, I would love to hear them. First off, the site is very large and utilizes faceted navigation to help visitors sift through results. I have implemented rel=canonical for many months now to have each page URL created by the faceted nav filters point back to the main category page. However, I still get these damn messages from Google every month or so saying that they found too many pages on the site. My main concern, obviously, is wasting crawler time on all these pages when I am trying to do what they ask in these instances and tell them to ignore them and find the content on page X. So at this point I am thinking about possibly using the robots.txt file to handle these, but wanted to see what others around here thought before I dive into this arduous task. Plus I am a little ticked off that Google is not following a standard they helped bring to the table. Thanks to those who take the time to respond in advance.
Algorithm Updates | PeteGregory
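A concrete sketch of the two mechanisms being weighed in that last question - the URL pattern and filter parameters are hypothetical:

```html
<!-- Sketch: on a filtered URL such as /widgets/?color=red&size=9, -->
<!-- the canonical points at the unfiltered category page. -->
<link rel="canonical" href="https://example.com/widgets/" />
```

And the robots.txt alternative being considered (Google honors the * wildcard in Disallow rules):

```text
User-agent: *
# Hypothetical filter parameters - block crawling of faceted URLs.
Disallow: /*?color=
Disallow: /*?size=
```

One interaction worth flagging before switching: a URL blocked in robots.txt is never fetched, so Google cannot see the rel=canonical on it; the two approaches substitute for each other rather than stack.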