How important is quality control for UGC?
-
This post, http://www.seomoz.org/blog/ugc-gets-an-a-on-google-test-with-panda-update-12260, mentions that UGC performed well after the Panda update.
One of Google's 23 questions mentions spelling errors. How important is it, post-Panda, to ensure there are no spelling or grammar errors in UGC?
-
The best type of user-generated content is where you require users to enter a minimum amount of content into the form.
You could offer the chance to win a prize, or something similar, to encourage users to submit that much content, because thin content that is spammy and duplicated all over the web will not do much for rankings.
I don't really think spelling errors are a big issue; that falls more under usability. If you are a big brand and you have spelling errors all over your website, it will not look professional, but from a UGC point of view I would not worry too much.
-
I think the bigger threat with UGC is that users will fill your site with spammy links.
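For what it's worth, a common way to defuse that spammy-link risk is to make sure user-submitted links pass no equity, for example by stamping them with rel="ugc nofollow" before publishing. Below is a minimal Python sketch of that idea, assuming the third-party BeautifulSoup library is available; the function name and sample markup are illustrative only, not something from the thread.

```python
# Minimal sketch (not from the thread): stamp user-submitted links with
# rel="ugc nofollow" before publishing, so they pass no link equity.
# Assumes the beautifulsoup4 package; names and markup are illustrative.
from bs4 import BeautifulSoup


def sanitize_ugc_links(html_fragment: str) -> str:
    """Add rel="ugc nofollow" to every anchor in a user-submitted HTML fragment."""
    soup = BeautifulSoup(html_fragment, "html.parser")
    for anchor in soup.find_all("a"):
        # Overwrite whatever rel the user supplied; bs4 stores rel as a list.
        anchor["rel"] = ["ugc", "nofollow"]
    return str(soup)


if __name__ == "__main__":
    submitted = '<p>Nice post! Visit <a href="http://example.com/spam">my site</a>.</p>'
    # Prints the fragment with rel="ugc nofollow" added to the link.
    print(sanitize_ugc_links(submitted))
```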
Related Questions
-
Control indexed content on a WordPress-hosted blog...
I have a client with a blog set up on their domain (example: blog.clientwebsite.com) and even though it loads at that subdomain it's actually a WordPress-hosted blog. If I attempt to add a plugin like Yoast SEO, I get the attached error message. Their technical team says this is a brick wall for them and they don't want to change how the blog is hosted. So my question is... on a subdomain blog like this... if I can't control what is in the sitemap with a plugin and can't manually add a sitemap because the content is being pulled from a WordPress-hosted install, what can I do to control what is in the index? I can't add an SEO plugin... I can't add a custom sitemap... I can't add a robots.txt file... The blog is set up with domain mapping, so the content isn't actually there. What can I do to avoid tags, categories, author pages, archive pages and other useless content ending up in the search engines?
Technical SEO | ShawnW
Yet Another, Yet Important URL structure query.
Massive changes to our stock media site and structure here. While we have an extensive category system, previously our category pages were just our search pages, with ID numbers for sorting categories. Now we have individual category pages. We have about 600 categories across a maximum of about 4 tiers, roughly 1,000,000 total products, and issues with products appearing to be duplicates. Our current URL structure for products looks like this: http://example.com/main-category/12345/product-name.htm
Here is how I was planning on doing the new structure:
Cat tier 1: http://example.com/category-one/
Cat tier 2: http://example.com/category-one/category-two/
Cat tier 3: http://example.com/category-one-category-two/category-three
Cat tier 4: http://example.com/category-one-category-two-category-three/category-four/
Product: http://example.com/category-one-category-two-category-three/product-name-12345.htm
Thoughts? Thanks! Craig
Technical SEO | TheCraig
GWT Soft 404 count is climbing. Important to fix?
In GWT I am seeing my mobile site's soft 404 count slowly rise from 5 two weeks ago to over 100 as of today. If I do nothing I expect it will continue to rise into the thousands. This is due to there being followed links on external sites to thousands of discontinued products we used to offer. The landing page for these links simply says the product is no longer available and gives links to related areas of our site. I know I can address this by returning a 404 for these pages, but doing so will cause these pages to be de-indexed. Since these pages still have utility in redirecting people to related, available products, I want these pages to stay in the index and so I don't want to return a 404. Another way of addressing this is to add more useful content to these pages so that Google no longer classifies them as soft 404. I have images and written content for these pages that I'm not showing right now, but I could show if necessary. But before investing any time in addressing these soft 404s, does anyone know the real consequences of not addressing them? Right now I'm getting 275k pages indexed and historically crawl budget has not been an issue on my site, nor have I seen any anomalous crawl activity since the climb in soft 404s began. Unchecked, the soft 404s could climb to 20,000ish. I'm wondering if I should start expecting effects on the crawl, and also if domain authority takes a hit when there are that many soft 404s being reported. Any information is appreciated.
Technical SEO | merch_zzounds
Creating a Landing Page with a Separate Domain to Control Bounce Rate
I work with a unique situation where we have a site that gets tons of free traffic from internal free resources. We do make revenue from this traffic, but due to its nature, it has a high bounce rate. Data shows that once someone from this source does click a second page, they are engaged, so they either bounce or click multiple pages. After testing various landing pages, I've determined that the best solution would be to create a landing page on a separate domain and hide it from the search engines (to prevent duplicate content and the appearance of link farming). The theory is that once they click through to the site, they will bounce at a lower rate and improve the stats of the website. The landing page would essentially filter out this bad traffic. My question is, how sound is this theory? Will this cause any issues with Google or any other search engines?
Technical SEO | jhacker
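On the landing-page question above: "hiding" a page from search engines usually just means serving it with a noindex signal. Here is a minimal sketch, assuming a Python/Flask stack purely for illustration (the original question names no platform); the route, markup, and destination URL are made up.

```python
# Minimal sketch, assuming a Flask stack purely for illustration: serve the
# landing page with a noindex signal so it stays out of the search index.
# The route and markup below are hypothetical.
from flask import Flask, make_response

app = Flask(__name__)


@app.route("/landing")
def landing():
    html = (
        "<html><head>"
        '<meta name="robots" content="noindex, nofollow">'
        "</head><body>"
        '<a href="https://www.example.com/">Continue to the main site</a>'
        "</body></html>"
    )
    resp = make_response(html)
    # Belt and braces: the same directive as an HTTP response header.
    resp.headers["X-Robots-Tag"] = "noindex, nofollow"
    return resp


if __name__ == "__main__":
    app.run()
```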
How important is using hreflang if you have plenty of other geo signals?
Hi, how important is it to use hreflang attributes and supporting sitemaps (and do you need both)? If sites are being set up on country-specific TLDs (but on top of a WP multisite network.domain.com environment) and geotargeted in GWT, as well as with country meta tags, local schema, etc., that should send enough signals, shouldn't it? 🙂 Implementation of hreflang seems like an absolute technical nightmare. All the best, Dan
Technical SEO | Dan-Lawrence
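As a rough illustration of why the hreflang question above calls the implementation a nightmare: every locale version of a page has to annotate all of its alternates, itself included, plus an x-default fallback. The Python sketch below uses made-up domains and locale codes just to show the shape of that markup; it is not an answer from the thread.

```python
# Rough illustration (hypothetical domains and locales) of why hreflang feels
# heavyweight: every locale's page must list ALL alternates, itself included,
# plus an x-default fallback.
LOCALE_URLS = {
    "en-gb": "http://example.co.uk/widgets/",
    "en-us": "http://example.com/widgets/",
    "de-de": "http://example.de/widgets/",
}
X_DEFAULT_URL = "http://example.com/widgets/"


def hreflang_link_block() -> str:
    """Build the <link rel="alternate"> block that every locale's page carries."""
    tags = [
        f'<link rel="alternate" hreflang="{locale}" href="{url}" />'
        for locale, url in LOCALE_URLS.items()
    ]
    tags.append(f'<link rel="alternate" hreflang="x-default" href="{X_DEFAULT_URL}" />')
    return "\n".join(tags)


if __name__ == "__main__":
    # The same block goes into the <head> of the en-gb, en-us and de-de pages
    # (or into an XML sitemap entry per URL), which is where the overhead lies.
    print(hreflang_link_block())
```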
Domain Switch - With lost control of original domain.
Hey all, A client finally sold a domain name after being harassed to sell for many years, without talking to us about it first. They moved the site to a new domain, and the purchasing company took over the original domain. Then they called me, wondering why the site is no longer showing up in Google. I've done some initial research, and everything I find for advice assumes that you have control over the original domain. We don't. I'm hoping someone here has some creative advice, so we don't have to start from the beginning, and/or painfully update links we've acquired. My only thought was that the new company may be kind enough to post 301s for us if we provided them.... Any thoughts / advice / life rings will be greatly appreciated! 🙂
Technical SEO | KBK
Damage Control
So here's the deal: I HAVE to link my external website, www.ldnwicklesscandles.com, and any shopping/ecommerce sections of it to my company-sponsored website, ukwicklessscents.scentsy.co.uk, so all transactions go through that. Why do a separate website, you might ask? Our company websites are extremely limited in the design and content we're allowed to add, like blogs and other things that add content to help with long-tail searches. So most of us involved with the company build external sites and link to our other sites. Which means I have link juice spread over the two sites, because so much of the site is linking out, which mostly means I've got to work twice as hard. Although my hands are really tied with a lot of this, I'm wondering if there's anything you might recommend to lessen the damage, so to speak, maybe like changing the navigational structure of my external website so I'm only linking out when absolutely necessary? I've been reading about navigational structure and it mentions the home page should only link out to the most important pages. Would this work? Maybe none of this is something I should worry about? It seems some of the high-ranking sites for these keywords, like scentcity.com, use a left sidebar like I have and a main page with lots of links to their company-sponsored site, and they rank despite all this?? Any suggestions or advice would be appreciated. x
Technical SEO | cmjolley
How important are unique titles and descriptions?
Hi there, I've recently started working on a very large travel website. One of my main duties is to get it ranking for certain terms (which it isn't at the moment, at all!). A large proportion of the website is dynamic, meaning that the pages and URLs are produced using sessions. I've already enquired with the company who provide the website about how I can get unique meta data for each page on our website. They came back and said it can be done for the static pages, but not for the dynamic pages. This leaves me with thousands of pages with duplicated meta data. Not at all ideal. I was just wondering how damaging this is likely to be for the SEO of my site. Am I going to be able to achieve rankings even with this issue? Or do I need to get it sorted ASAP? Thanks
Technical SEO | neilpagecruise
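On the duplicated meta data question above: where the platform exposes even a small template hook, unique titles and descriptions for dynamic pages are usually generated from each page's own attributes rather than written by hand. The Python sketch below uses made-up field names purely to illustrate that templating approach.

```python
# Minimal sketch of templating unique meta data from a dynamic page's own
# attributes; the field names (destination, nights, price_from) are made up.
def build_meta(page: dict) -> dict:
    title = f"{page['destination']} Holidays - {page['nights']} Nights from £{page['price_from']}"
    description = (
        f"Compare {page['nights']}-night breaks in {page['destination']} "
        f"from £{page['price_from']}. Book online or speak to our travel team."
    )
    # Trim to the lengths search engines typically display.
    return {"title": title[:60], "description": description[:155]}


if __name__ == "__main__":
    print(build_meta({"destination": "Lisbon", "nights": 7, "price_from": 299}))
```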