Multi-language blog
-
Does Google consider it spam to have posts in multiple languages on one blog, such as English, Arabic, and Spanish blog posts?
-
Hi Akram,
Not if you tag them the right way, for example by using hreflang tags. A lot of websites do this: they create blog posts and translate them into multiple languages, using subfolders per country.
If you were to create these blog posts without making them unique to their language, that is, without making clear which language each post is written in, it would be hard for Google to tell whether or not you are trying to spam.
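For illustration, a minimal sketch of what the hreflang annotations on a translated post could look like (the example.com domain and subfolder URLs are hypothetical):

    <!-- hypothetical URLs; place the full set in the <head> of every language version -->
    <link rel="alternate" hreflang="en" href="https://example.com/en/my-post/" />
    <link rel="alternate" hreflang="ar" href="https://example.com/ar/my-post/" />
    <link rel="alternate" hreflang="es" href="https://example.com/es/my-post/" />
    <link rel="alternate" hreflang="x-default" href="https://example.com/en/my-post/" />

Each version carries the whole set, including a self-referencing tag, which tells Google the posts are translations of one another rather than duplicates.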
Hope this helps a bit!
Related Questions
-
Schema for blogs
When I run a WordPress blog through the Structured Data Testing Tool, I see that there is an @type of hentry. Is this enough for blogs, and is it a result of WordPress adding this markup itself? Do you recommend adding the BlogPosting type, and if so, why? What is the benefit of adding a more specific schema type, and how does it help with blogging? Thanks
Technical SEO | AL123al4
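For reference, explicit BlogPosting markup (as opposed to the hentry microformat many WordPress themes emit) is typically added as JSON-LD; a minimal sketch, with placeholder values, might look like this:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "BlogPosting",
      "headline": "Placeholder post title",
      "datePublished": "2018-01-01",
      "author": { "@type": "Person", "name": "Placeholder Author" }
    }
    </script>
-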
How do we build rank on a domain with an offsite blog?
We have a site (domain y) that we'd like to integrate a blog into, but our team doesn't have the bandwidth to do this. So we've been exploring the option of building the blog in WP, hosting it on a separate domain (domain x), and redirecting from domain y to it. My concern is how this affects or undermines ranking value on domain y (effectively, all the value from the blog resides on blog domain x). How might we go about integrating an offsite blog into the core domain while maintaining search value? Is there a way? Thanks!
Technical SEO | J-Me0
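One way to keep the blog's value on domain y while still building it in WordPress is to serve it from a subfolder through a reverse proxy instead of redirecting out to domain x. A minimal nginx sketch, assuming hypothetical hostnames:

    # inside the server block for domain-y.com; wp.domain-x.com is a hypothetical WP host
    location /blog/ {
        proxy_pass https://wp.domain-x.com/;
        proxy_set_header Host wp.domain-x.com;
    }

The posts then resolve at domain-y.com/blog/, so any links they earn accrue to domain y.
-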
Multi-language and multi-target eCommerce site in the EU
Hi, before I raised this question I read some other topics relevant to it; however, every topic had slight differences between the various scenarios. I run a webshop on a .eu domain (notebookscreen.eu) which currently runs in 3 languages. We use geoIP to determine the user's location for the following reasons: language, currency, and shipping fee. The site runs very slowly during tests, and most of the testers (including Moz) fail it since it has too many 302 redirects. We are rebuilding this part to fix the redirects and need some advice on the best way to optimise for multiple countries. As the title says, this is a shop mainly targeting EU countries, and next to the .eu domain we have 10 other country-level TLD registrations. Currently we use the subfolder style. Would it be better to do a subdomain per country, or a separate TLD for every country? The current option is best for backlinks; I don't think the second one has any gains. Having dedicated TLDs could help local SERPs in every country, but we would need a lot of link building. Also, if someone starts on the .eu page, a 3xx redirect is needed to the designated country. Different sites do it differently: some don't care (Apple), some stay on one page and give you local currency and shipping rates (eBay), and some move you to a different TLD (Amazon). Is there any better way to determine someone's location other than GeoIP?
Technical SEO | kukacwap0
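On the redirect point, one pattern that avoids geoIP 302s for crawlers is to declare the country versions with hreflang, either in the page head or in the XML sitemap, and let visitors switch manually. A sitemap-style sketch with hypothetical subfolder URLs (the xhtml namespace must be declared on the urlset element):

    <url>
      <loc>https://notebookscreen.eu/de/</loc>
      <xhtml:link rel="alternate" hreflang="de" href="https://notebookscreen.eu/de/" />
      <xhtml:link rel="alternate" hreflang="fr" href="https://notebookscreen.eu/fr/" />
      <xhtml:link rel="alternate" hreflang="x-default" href="https://notebookscreen.eu/" />
    </url>
-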
Blog separate from Website
One of my clients has a well established website, and a well established blog - each with its own domain. Is there any way to move the blog to his website domain without losing the SEO and links that he has built up over time?
Technical SEO | EchelonSEO0
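For what it's worth, the usual approach is to move the posts to the website domain and 301-redirect each old blog URL to its new location, so the accumulated links pass their value. An Apache .htaccess sketch with hypothetical domains:

    # hypothetical: old blog at oldblog.com, new home at clientsite.com/blog/
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^(www\.)?oldblog\.com$ [NC]
    RewriteRule ^(.*)$ https://clientsite.com/blog/$1 [R=301,L]
-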
Duplicate title/content errors for blog archives
Hi all, I would love some help; I'm fairly new to SEO and to using SEOmoz, and I've looked through the forums and just managed to confuse myself. I have a customer with a lot of duplicate page title/content errors in SEOmoz. It's an Umbraco CMS, and a lot of the errors appear to be blog archives and pagination, i.e. http://example.com/blog http://example.com/blog/ http://example.com/blog/?page=1 http://example.com/blog/?page=2 and then also http://example.com/blog/2011/08 http://example.com/blog/2011/08?page=1 http://example.com/blog/2011/08?page=2 http://example.com/blog/2011/08?page=3 (empty page) http://example.com/blog/2011/08?page=4 (empty page). This continues for different years, months, and blog entries, creating hundreds of errors. What's the best way to handle this for the SEOmoz report and the search engines? Should I rel=canonical the /blog page? I think that would probably affect the SEO of all the blog entries. Use robots.txt? Sitemaps? URL parameters in the search engines? I'd appreciate any assistance/recommendations. Thanks in advance, Ian
Technical SEO | iragless0
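As a sketch of one common fix for pagination variants like those above: point the ?page=1 duplicate at the base archive URL with rel=canonical, and give deeper pages self-referencing canonicals (URLs hypothetical):

    <!-- on http://example.com/blog/?page=1 -->
    <link rel="canonical" href="http://example.com/blog/" />

    <!-- on http://example.com/blog/?page=2 -->
    <link rel="canonical" href="http://example.com/blog/?page=2" />
-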
I am trying to correct an error report of duplicate page content, but among over 100 blogs I can't find the page whose content is similar to the one SEOmoz reported. Is my only option to just delete the blog page?
I am trying to correct duplicate content, but SEOmoz only reports and shows one page of the duplicate pair. I have 5 years' worth of blogs and cannot find the duplicate page. Is my only option to delete the page to improve my rankings? Brooke
Technical SEO | wianno1680
-
Feedburner - Why Is It Sending My Blog Posts a Day After I Post Them?
I have my feed set up through FeedBurner for my wife's blog, ktlouise.com. Whenever she posts a new blog post, it doesn't get emailed to her subscribers until the next day. Does anyone know how to change this so that the updates go out the same day? Thanks for the help! REF
Technical SEO | FergusonSEO0
-
What's the best way to eliminate duplicate page content caused by blog archives?
I (obviously) can't delete the archived pages, regardless of how much traffic they do or don't receive. Would you recommend a meta robots tag or a robots.txt file? I'm not sure I'll have access to the root directory, so I could be stuck with using a meta robots tag, correct? Any other suggestions to alleviate this pesky duplicate page content issue?
Technical SEO | ICM0
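A quick sketch of the meta robots option mentioned above, placed in the <head> of the archive page template:

    <meta name="robots" content="noindex, follow" />

noindex keeps the archive pages out of the index, while follow still lets crawlers pass link value through to the posts they list, which is why it is usually preferred over blocking the archives in robots.txt.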