Can dynamically translated pages hurt a site?
-
Hi all... looking for some insight, please. I have a site we have worked very hard on to get ranked well, and it is doing well in search. The site has about 1,000 pages and climbing; about 50 of those are translated pages, all static with unique URLs. I have had no problems there with duplicate content or that sort of thing, and all of those pages were manually translated, so no translation-quality issues. We have been looking at software that can dynamically translate the complete site into a handful of languages... let's say about 5. My problem is that these pages get produced dynamically, and I have concerns that Google will take issue with this, as well as with the huge sudden influx of new URLs... we could now be looking at an increase of 5,000 new URLs (which usually triggers an alarm).
My feeling is that it could risk the stability of the site we have worked so hard for, and that maybe we should just stick with the already-translated static pages.
I am sure the process could be fine, but I fear a manual inspection and a slap on the wrist for having dynamically created content, and also the risk of triggering a review period.
These days it is hard to know what could get you in "trouble," and my gut says keep it simple and don't shake things up. Am I being overly concerned? Would love to hear from others who have tried similar changes, and also from those who have held off due to similar "fear."
thanks
-
Stumbled upon some additional information and decided to update you...
According to the internationalization FAQ...
Q: Can I use automated translations?
A: Yes, but they must be blocked from indexing with the "noindex" robots meta tag. We consider automated translations to be auto-generated content, so allowing them to be indexed would be a violation of our Webmaster Guidelines. So if you decide to auto-translate the text, you should use a noindex tag instead of the hreflang tag.
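For what it's worth, a minimal sketch of what that FAQ answer implies for each auto-translated page (the URL path is just an illustration, not from your site):

```html
<!-- In the <head> of an auto-translated page, e.g. /es/some-page -->
<!-- Block indexing of the auto-generated translation, per the FAQ above -->
<meta name="robots" content="noindex">
```

With that tag in place, Googlebot can still crawl the page and visitors can still use it, but it should stay out of the index, which sidesteps the auto-generated-content issue.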
-
Considering they offer that service themselves, it would be hypocritical of them to penalize you for doing it. The hreflang tag would also protect you from having those pages marked as spam, since you are telling G "page a is the exact same as page å, just in a different language," avoiding "duplicate" content.
-
Thanks, Oleg. If the site were to get reviewed manually, would there be any issues with there being thousands of pages whose content is created dynamically?
Thanks for your time.
-
The problem with using software to translate your content is that it will never be perfect. There will be many grammatical and/or vocabulary errors that decrease the quality of the content. I'm not sure whether Google can assess content quality in other languages, but a worse user experience usually leads to worse rankings. In the ideal situation, you would have those pages manually translated (but I know that can cost a fortune).
If you do decide to auto-translate, be sure to use the rel="alternate" hreflang="x" annotation to tell Google that you have multiple pages with the same content, just in different languages.
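To illustrate, a hypothetical pair of annotations for an English page with a Spanish translation might look like this (the domain, paths, and language codes are made up for the example):

```html
<!-- In the <head> of BOTH pages: each language version lists all versions, including itself -->
<link rel="alternate" hreflang="en" href="https://www.example.com/page/">
<link rel="alternate" hreflang="es" href="https://www.example.com/es/page/">
```

Note that the annotations must be reciprocal: if the English page points to the Spanish one, the Spanish page must point back, or Google may ignore them. Also keep in mind the FAQ quoted earlier in this thread, which says machine translations should get noindex rather than hreflang.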
I don't think you should worry about a sudden influx of pages. Ideally, you'd drip-feed them in to take advantage of the freshness factor, but you shouldn't be penalized simply for creating a lot of new pages.