User-generated content (comments) - what impact does it have?
-
Hello MOZ stars!
I have a question regarding user comments on article pages. I know that user generated content is good for SEO, but how much impact does it really have?
For your information:
1 - All comments appear in the source code and are crawled by spiders.
2 - A visitor can comment on a page for up to 60 days.
3 - The number of comments depends on the topic; we usually get between 3 and 40 comments.
My questions:
1 - If we were to remove comments completely, what impact would it have from an SEO perspective? (I know you can't be certain, but please make an educated guess if possible.)
2 - If it has a negative and/or positive impact, please specify why!
If anything is unclear or you need more information, don't hesitate to ask and I'll try to clarify.
Best regards,
Danne -
Not what you asked, but beyond SEO I would say comments do have an effect. I have heard advertisers say they were looking for sites with comments. Their thinking was that they wanted popular sites with followers, and that is how they judged it.
-
I do think that negative comments hurt UX and, eventually, the bottom line. No one wants to work with a company that has a ton of negative feedback, which is exactly why user-generated content is so important to searchers: it is a candid review of a company or product. There can also be middle-of-the-road reviews, like a 3-star rating because the customer service was great but the product stinks. I think those kinds of comments and reviews are necessary and overall good for UX.
In my opinion as a consumer, I want to see the bad comments. I always use the example of shoes and clothes. I don't want to find out when I get a pair of shoes in the mail that the sizes run a little small. If I see that in the comments or reviews ahead of time, I will know to buy a size bigger and save myself the trouble of returning the product. These kinds of "negative" reviews are useful to a searcher, and I wouldn't remove them.
-
In addition to what David said, I would still consider leaving the comments option open (as long as there is no over-use).
Another factor to consider (especially in Barry's case) is what kind of comments people post. Do they have a positive or a negative connotation? Are they on-topic or not?
If you have a community like Moz's, where in my opinion you see a lot of good comments that complement the posts, and responses to each of them, I'd consider indexing the comments.
What do you think? David, Monica?
-
I also read that article. Barry seemed to think that the comments were hurting the site rather than helping it. Comments can get off topic or stray from the original article. If I remember correctly, Barry made the comments viewable to users, but not readable by Google, as a result.
For return traffic, I think comments are great. After seeing the results that Barry shared, I'm not sure if it is still a good idea to have them included in the page crawl.
Here is the article where he spoke about this: https://www.seroundtable.com/google-panda-ser-poll-19675.html
IMO, I would leave the comments on the pages but block them from being indexed, or use JavaScript to load the comments, if possible.
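A minimal sketch of that JavaScript approach, for illustration: the comments are left out of the initial HTML and fetched only after the page loads, so visitors still see them while the article's crawlable source stays focused on the editorial content. The `/api/comments` endpoint and the `#comments-container` element are hypothetical placeholders, not part of any particular CMS, and crawlers that execute JavaScript may still render the comments unless the comments endpoint itself is disallowed in robots.txt.

```typescript
// Sketch: render comments client-side so they are absent from the served HTML.
// "/api/comments" and "#comments-container" are hypothetical placeholders.

interface ArticleComment {
  author: string;
  body: string;
}

async function loadComments(articleId: string): Promise<void> {
  const container = document.querySelector("#comments-container");
  if (!container) {
    return; // No comment section on this page.
  }

  try {
    const response = await fetch(
      `/api/comments?article=${encodeURIComponent(articleId)}`
    );
    if (!response.ok) {
      return; // Leave the page usable even if comments fail to load.
    }

    const comments: ArticleComment[] = await response.json();
    for (const comment of comments) {
      const item = document.createElement("div");
      item.className = "comment";
      item.textContent = `${comment.author}: ${comment.body}`;
      container.appendChild(item);
    }
  } catch {
    // Swallow network errors; the article itself still renders.
  }
}

// Fetch comments only after the DOM is ready, so the container exists.
document.addEventListener("DOMContentLoaded", () => {
  void loadComments("example-article-id");
});
```

The trade-off is the one discussed in this thread: the comments no longer contribute crawlable text to the page, which is the point if they are hurting quality, but it also forfeits whatever unique-content benefit they were providing.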
-
As I mentioned in my response, that is just one case.
But I must agree with Monica: you should place the most value on the searchers and the user experience.
-
User-generated content, in my opinion, is extremely useful. It is unique, it is informative most of the time, and it is valuable to future searchers. In this instance I would be more concerned about the value to the searchers and to the user experience than about the SEO effects.
-
Hi Danne,
I remember reading a post about this from Barry Schwartz on seroundtable.com: https://www.seroundtable.com/google-panda-ser-hurt-comments-19652.html
Read it through; it describes the effect of user-generated content (especially comments) quite well.
This is one specific case, though; I am sure it is not a general rule.
Gr., Keszi