Spellcheck necessary for user-generated content?
-
We have a lot of user-generated reviews on our key landing pages.
Matt Cutts recommended using correctly spelled content.
Would you spellcheck all previously published user reviews, or would you leave those reviews intact and only spellcheck new reviews before they are published?
Since the reviews are marked up with schema.org, I am not sure whether editing lots of them after the fact might raise a flag with Google as review manipulation.
Thanks.
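(For context, the markup in question is along these lines, in JSON-LD; everything below is a simplified placeholder, not our actual markup:)

```html
<!-- Sketch of schema.org Review markup of the kind described;
     all names and values are hypothetical placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Review",
  "itemReviewed": { "@type": "Product", "name": "Example Product" },
  "author": { "@type": "Person", "name": "Jane Doe" },
  "reviewBody": "Grate product, works perfektly.",
  "reviewRating": { "@type": "Rating", "ratingValue": "5" }
}
</script>
```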
-
Very good point about the credibility of a review, and a great relief that I do not have to spellcheck thousands of reviews. Thanks.
I may still follow Christina's suggestion as well, since poor spelling in user reviews may also affect visitors' perception of the website's credibility.
-
Christina, excellent idea to integrate a spellcheck upon submission. I will have a look at the Google spellcheck API. Thanks.
-
I agree with EGOL that you shouldn't edit someone's existing review or post. I would offer a spell-check option within the review editor on your site and encourage your reviewers to use it.
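One low-effort way to offer that, as a minimal sketch: modern browsers ship a built-in spellchecker that can be enabled on form fields through the standard HTML spellcheck attribute, so reviewers see their misspellings underlined before they submit. Everything in the form below except the spellcheck attribute itself is a hypothetical placeholder.

```html
<!-- Minimal sketch of a review form that opts into the browser's
     native spellchecker; the action URL and field names are
     hypothetical placeholders. -->
<form action="/reviews/submit" method="post">
  <label for="review-body">Your review</label>
  <!-- spellcheck="true" asks the browser to underline misspelled
       words as the reviewer types; the author decides what to fix. -->
  <textarea id="review-body" name="review" rows="8"
            spellcheck="true" lang="en"></textarea>
  <button type="submit">Submit review</button>
</form>
```

This keeps the final decision with the author, which also fits EGOL's point about posting reviews as written.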
-
Sum ppl tink dat a rebiew wit a lota misshpeluns kan be a qality hint. So too edet will make it wortless.
I don't think that reviews should be spell checked or edited. I think that they should be posted "as written" by the author. The quality of the writing is, in my opinion, an important part of the review. It can be used as a measure of credibility.
If you get an email with a lot of grammar and spelling problems, it can be a sign that it is spam. I think the same applies to reviews.
=================
As for Matt Cutts... I think he is referring to article content, where the author should be taking some care.
Reviews, forum posts and blog comments are going to have spelling and grammar problems everywhere.
Related Questions
-
Content update on 24hr schedule
Hello! I have a website with over 1,300 landing pages for specific products. These individual pages update on a 24-hour cycle through our API, which pulls reviews/ratings from other sources and then writes/updates that content onto the page. Is that "bad"? Can it be viewed as spammy or dangerous in the eyes of Google? (My first thought is no, it's fine.)

Is there such a thing as "too much content"? For example, if we are adding roughly 20 articles to our site a week, is that OK? (I know news websites add much more than that on a daily basis, but I figured I would ask.) On that note, would it be better to stagger our posting? For example, 20 articles each week for a total of 80, or all 80 articles once a month? (I feel like trickle posting is probably preferable, but I figured I would ask.)

Are there any negatives to the process of an API writing/updating content? Should we have 800+ words of static content on each page? Thank you all, Mozzers!
Intermediate & Advanced SEO | HashtagHustler
-
What is the fastest way to deindex content from Google?
Yesterday we had a client discover that our staging URLs were being indexed in Google. This was due to a technical oversight from our development team (they forgot to upload the meta robots tags). We are trying to remove this content as quickly as possible. Are there any methods in Google Search Console to expedite this process? Thanks
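For reference, the tag the development team forgot is a one-liner in the <head> of each staging page; once crawlers re-fetch a page carrying it, the page drops out of the index. A sketch (the surrounding markup is hypothetical):

```html
<!-- In the <head> of every staging page; tells crawlers not to
     index the page or follow its links. -->
<meta name="robots" content="noindex, nofollow">
```

Search Console's temporary URL removal tool can hide the URLs faster while recrawling catches up.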
Intermediate & Advanced SEO | RosemaryB
-
Is This Considered Duplicate Content?
My site has entered SEO hell and I am not sure how to fix it. Up until 18 months ago I had tremendous success on Google and Bing, and now my website appears below my Facebook page for the term "Direct Mail Raleigh." What makes it even more frustrating is that my competitors have done no SEO and they are dominating this keyword.

I thought the issue was due to harmful inbound links, and two months ago I disavowed the ones that were clearly spam. Somehow my site has actually gone down! I have a blog that I have updated infrequently, and I do not know if I am being punished for duplicate content. Google Webmaster Tools says I have 279 crawled and indexed pages, but yesterday when I ran the Moz crawl check I was amazed to find 1,150 different webpages on my site. Despite the fact that they do not appear in Webmaster Tools, I have three different kinds of webpage URLs due to the way the WordPress blog was created: "http://www.marketplace-solutions.com/report/part2leadershi/", "http://www.marketplace-solutions.com/report/page/91/" and "http://www.marketplace-solutions.com/report/category/competent-leadership/page/3/"

What does not make sense to me is why Google only indexed 279 webpages AND why Moz did not identify these three webpages as duplicate content with the Crawl Test Tool. Does anyone have any ideas? Would it be as easy as creating a massive robots.txt file and just putting 2 of the 3 URL patterns in that file? Thank you for your help.
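On the robots.txt idea at the end of the question: blocking the two generated URL patterns would look roughly like the sketch below (paths taken from the question). One caveat worth knowing: robots.txt blocks crawling, not indexing, so on its own it will not remove URLs that are already indexed.

```
# robots.txt sketch using the URL patterns quoted in the question;
# this blocks crawling of the WordPress-generated pagination pages.
User-agent: *
Disallow: /report/page/
Disallow: /report/category/
```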
Intermediate & Advanced SEO | DR70095
-
Duplicate Content: Is a product feed/page rolled out across subdomains deemed duplicate content?
A company has a TLD (top-level domain) which lists every single product: company.com/product/name.html

The company also has subdomains (tailored to a range of products) which list a chosen selection of the products from the TLD, sort of like a feed: subdomain.company.com/product/name.html

The content on the TLD and subdomain product pages is exactly the same and cannot be changed; the CSS and HTML are slightly different, but the content (text and images) is exactly the same! My concern (and rightly so) is that Google will deem this to be duplicate content, therefore I'm going to have to add a rel canonical tag into the header of all subdomain pages, pointing to the original product page on the TLD. Does this sound like the correct thing to do? Or is there a better solution?

Moving on, not only are products fed onto subdomains, there are a handful of other domains which list the products; again, the content (text and images) is exactly the same: other.com/product/name.html

Would I be best placed to add a rel canonical tag into the header of the product pages on the other domains, pointing to the original product page on the actual TLD? Does rel canonical work across domains? Would the product pages with a rel canonical tag in the header still rank? Let me know if there is a better solution all-round!
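For what it's worth, rel canonical does work across domains (Google announced support for cross-domain canonicals), so the approach described above is workable in principle. A sketch of the tag as it would sit in the <head> of a subdomain or other-domain product page, using the question's placeholder URLs:

```html
<!-- On subdomain.company.com/product/name.html and on
     other.com/product/name.html; the href uses the question's
     placeholder URL for the original TLD page. -->
<link rel="canonical" href="http://company.com/product/name.html">
```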
Intermediate & Advanced SEO | iam-sold
-
Uppercase in URLs = Dupe Content
Hi Mozzers, My developers recently changed a bunch of the pages I am working on to all lowercase (something I know should ideally have been done in the first place). The URLs have sat for about a week as lowercase without 301 redirects from the old uppercase URLs to these pages. In Google Webmaster Tools, I'm seeing Google flag them for duplicate meta tags, title tags, etc. See image: http://screencast.com/t/KloiZMKOYfa

We're 301 redirecting the old URLs to the new ones ASAP, but is there anything else I should do? Any chance Google is going to noindex these pages because it sees them as dupes until I fix them? Sometimes I can see both pages in the SERPs if I use personalized results, and it scares me: http://screencast.com/t/4BL6iOhz4py3 Thanks!
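Since the old uppercase URLs follow a pattern, the 301s can be done with one rule rather than page by page. A hedged sketch for Apache using mod_rewrite's internal tolower map; this belongs in the server or virtual-host config (RewriteMap is not allowed in .htaccess) and would need adapting to your actual stack:

```apache
# Sketch: 301 any request whose path contains uppercase letters
# to the all-lowercase equivalent.
RewriteEngine On
RewriteMap lowercase int:tolower
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^(.*)$ ${lowercase:$1} [R=301,L]
```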
Intermediate & Advanced SEO | Travis-W
-
Bi-Lingual Site: Lack of Translated Content & Duplicate Content
One of our clients has a blog with an English and a Spanish version of every blog post. It's in WordPress and we're using the Q-Translate plugin. The problem is that my company is publishing blog posts in English only. The client is then responsible for having each piece translated, at which point we can add the translation to the blog. So the process works like this:

1. We add the post in English.
2. We literally copy the exact same English content to the Spanish version, to serve as a placeholder until it's translated by the client. (*Question on this below.)
3. We give the Spanish page a placeholder title tag, so at least the title tags will not be duplicates in the meantime.
4. We publish. Two pages go live with the exact same content and different title tags.
5. A week or more later, we get the translated version of the post and add that as the Spanish version, updating the content, links, and meta data.

Our posts typically get indexed very quickly, so I'm worried that this is creating a duplicate content issue. What do you think? What we're noticing is that growth in search traffic is much flatter than usual after the first month of a new client blog. I'm looking for any suggestions and advice to make this process more successful for the client.

*Would it be better to leave the Spanish page blank? Or to add a sentence like "This post is only available in English" with a link to the English version?

Additionally, if you know of a relatively inexpensive but high-quality translation service that can turn these translations around quicker than my client can, I would love to hear about it. Thanks! David
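One thing that may help Google understand the setup while the translations catch up, sketched below: hreflang annotations pair the English and Spanish URLs of the same post so the right version is served per language. The URLs here are hypothetical placeholders, and the translation plugin or an SEO plugin may already emit these tags.

```html
<!-- In the <head> of both language versions of a post; each page
     lists itself and its counterpart. URLs are hypothetical
     placeholders. -->
<link rel="alternate" hreflang="en" href="http://example.com/en/sample-post/">
<link rel="alternate" hreflang="es" href="http://example.com/es/sample-post/">
```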
Intermediate & Advanced SEO | djreich
-
What to do with WordPress-generated pages?
I'm an SEOmoz newbie and have a very specific question about the auto-generated WordPress pages. SEOmoz caught the auto-generated WP pages and labeled them with Crawl Warnings like:

- Long URL
- 302
- Title Element Too Long
- Missing Meta Description Tag
- Too Many On-Page Links

So I have learned the lesson and have now made those pages "nofollow" / "noindex". HOWEVER, WHAT DO I DO WITH THE ONES THAT HAVE ALREADY BEEN INDEXED? Do I...

1. Just leave them as is and hope they don't hurt me from an SEO perspective?
2. Redirect them all to a relevant page?

I'm sure many people have had this issue. What do you think? Thanks Dominic
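For reference, the "nofollow"/"noindex" treatment described above normally comes down to one meta tag in the <head> of each generated page, and already-indexed pages drop out on their own the next time Google recrawls them, which is why option 1 plus patience is often enough unless a page has links or traffic worth consolidating with a redirect. A sketch of the tag (how your theme or SEO plugin actually emits it may differ):

```html
<!-- In the <head> of the auto-generated WP pages; exact output
     depends on the theme or plugin used. -->
<meta name="robots" content="noindex, nofollow">
```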
Intermediate & Advanced SEO | amorbis
-
BEING PROACTIVE ABOUT CONTENT DUPLICATION...
So we all know that duplicate content is bad for SEO. I was just thinking... whenever I post new content to a blog, website page, etc., there should be something I can do to tell Google (in fact, all search engines) that I just created and posted this content to the web... that I am the original source... so that if anyone else copies it, they get penalised and not me. Would appreciate your answers... 🙂 regards,
Intermediate & Advanced SEO | TopGearMedia