Dealing with thin content
-
Hi again! I've got a site where around 30% of URLs have less than 250 words of copy. It's a big site, though, so that's roughly 5,000 pages. It's an ecommerce site, so it isn't feasible to bulk up each one. I'm wondering if noindexing them is a good idea, and then measuring whether this has an effect on organic search?
-
Thanks guys! We're starting to add more content to each page; looks like the only way!
-
Does your competition have more content for these products?
If so, you need to ramp it up.
Either way, no-indexing them is not going to do any good.
-
Hi Blink
What would you be hoping to gain by de-indexing these pages?
-
The size of your site is important, and the value these pages add by bulking your site up is important. If you noindex them, you will significantly reduce the size of your site, which can affect your ability to rank on other pages as well. Noindexing them is not best practice, and it will cause more harm than good.
These pages aren't hurting your site by not ranking. They might also rank for terms you aren't tracking. The pages probably have some authority and links, and getting rid of that will definitely be detrimental.
-
Hi! I agree that they won't rank, but most aren't ranking now anyway. I'm more concerned that they are pulling everything else down. By noindexing them, I can at least see if that is the problem.
-
If you noindex the pages they will never rank, so they will essentially be useless for search.
Are these product pages? The best way to get content onto these pages is through user-generated reviews; that lets you add content without spending a ton of time writing copy.
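For context, the directive being debated in this thread is the robots meta tag. A minimal sketch of a "noindex, follow" tag, placed in the head of each thin page, would keep that page out of the index while still letting its links pass value:

```html
<!-- "noindex" asks search engines to drop the page from the index;
     "follow" still allows crawlers to follow the page's links. -->
<meta name="robots" content="noindex, follow">
```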
Related Questions
-
Site not showing up in search - was hacked - huge comment spam - cannot connect Webmaster tools
Hi Moz Community. A new client approached me yesterday for help with their site, which used to rank well for its designated keywords but is now not doing well. Actually, they are not on Google at all; it's like they were removed by Google. There is no reference to them when searching with "site:url". I investigated further and discovered the likely problem: 26,000 spam comments! All these comments have now been removed, and I cleaned up this WordPress site pretty well. However, I now want to connect it to Google Webmaster Tools. I have admin access to the WP site, but not FTP, so I tried using Yoast to connect. Google failed to verify the site. I then used a file-uploading console to upload the Google HTML verification file instead. I checked that the file is there, and Google still fails to verify the site. It is as if Google is so angry with this domain that it has wiped it completely from search and refuses to have any dealings with it at all. That said, I did run their "malware" or "dangerous content" check, which did not bring back any problems. I'm leaning towards the idea that this is a "cursed" domain in Google and that my client's best course of action is to build her business around another domain instead, and then point the old domain to the new one, hopefully without attracting any bad karma in that process (advice on that step would be appreciated). Anyone have an idea as to what is going on here?
Intermediate & Advanced SEO | AlistairC
Do search engine consider this duplicate or thin content?
I operate an eCommerce site selling various equipment. We get product descriptions and other info from the manufacturers' websites, which is offered to their dealers. Part of that info is in the form of User Guides and Operational Manuals written by the manufacturer, downloaded in PDF format and then uploaded to our site. We also embed, and link to, videos hosted on the manufacturers' respective YouTube or Vimeo channels. This is useful content for our customers.
Intermediate & Advanced SEO | MichaelFactor
My questions are: Does this type of content help our site by offering useful info, or does it hurt our SEO because it is thin and/or duplicate content? Or do the original content publishers get all the benefit? Is there any benefit to us publishing this stuff? What exactly is considered "thin content"?
Sitemaps during a migration - which is the best way of dealing with them?
Many SEOs I know simply upload the new sitemap once the new site is launched; some keep the old site's URLs in the new sitemap (for a while) to facilitate the migration; others upload both the old and the new sitemaps together to support the migration. Which is the best way to proceed? Thanks, Luke
Intermediate & Advanced SEO | McTaggart
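As a rough sketch of the usual recommendation (assuming Python and placeholder example.com URLs, not anyone's actual migration): the post-migration sitemap lists only the new canonical URLs, while the old URLs are 301-redirected rather than kept in the sitemap:

```python
# Build a sitemap containing only the NEW, canonical post-migration URLs.
# Old URLs should 301-redirect to these rather than live on in the sitemap.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return sitemap XML (sitemaps.org protocol) for a list of URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

new_urls = ["https://example.com/", "https://example.com/products/"]
print(build_sitemap(new_urls))
```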
What's the best way to deal with deleted .php files showing as 404s in WMT?
Disclaimer: I am not a developer. During a recent site migration I have seen a bit of an increase in WMT of 404 errors on pages ending in .php. Clicking the link in WMT just shows "File Not Found", with no 404 page. There are about 20 in total showing in Webmaster Tools, and I want to advise the IT department what to do. What is the best way to deal with this for on-page best practice? Thanks
Intermediate & Advanced SEO | Blaze-Communication
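A minimal sketch of what the IT department could do, assuming an Apache server and hypothetical paths (the real old-to-new mapping would come from the WMT crawl-error report): 301-redirect each dead .php URL to its nearest new equivalent, and serve a proper error page for anything genuinely gone.

```apache
# Hypothetical .htaccess rules; replace with the real old/new URL pairs.
Redirect 301 /about.php /about/
Redirect 301 /products/widget.php /products/widget/

# Serve a real 404 page for .php URLs with no new equivalent:
ErrorDocument 404 /404.html
```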
"noindex, follow" or "robots.txt" for thin content pages
Does anyone have any testing evidence of what is better to use for pages with thin content that are nonetheless important to keep on a website? I am referring to content shared across multiple websites (such as e-commerce, real estate, etc.). Imagine a website with 300 high-quality pages indexed and 5,000 thin product-type pages that would not generate relevant search traffic. The question is: does the interlinking value achieved by "noindex, follow" outweigh the negative of Google having to crawl all those "noindex" pages? With robots.txt, Google's crawling focuses on just the important pages that are indexed, which may give rankings a boost. Any experiments with insight into this would be great. I do get the story about "make the pages unique", "get customer reviews and comments", etc., but the above is the important question here.
Intermediate & Advanced SEO | khi5
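Whichever way that test comes out, it helps to verify what a page actually serves. A small standard-library Python sketch (the sample HTML is a made-up placeholder) that reads the robots directives out of a page's markup:

```python
# Detect "noindex"/"nofollow" in a page's robots meta tag using only
# the standard library. In practice you would fetch each page's HTML first.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False   # default: indexable
        self.follow = True     # default: links followed

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        d = dict(attrs)
        if d.get("name", "").lower() == "robots":
            content = d.get("content", "").lower()
            self.noindex = "noindex" in content
            self.follow = "nofollow" not in content

def robots_directives(html):
    """Return (noindex, follow) for the given HTML document."""
    p = RobotsMetaParser()
    p.feed(html)
    return p.noindex, p.follow

html = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(robots_directives(html))  # (True, True): out of the index, links still followed
```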
How to deal with DMCA takedown notices
How do you deal with DMCA takedown notices related to product descriptions? With Google it is simple enough for any person to submit a DMCA takedown notice, irrespective of whether they hold rights to the content. One such example is http://www.chillingeffects.org/notice.cgi?sID=1012391. Although Google dealt with that particular case properly (and did not remove the content), we find that nowadays more and more competitors use DMCA takedowns as an easy way to de-index competitive content. Since the person registering the DMCA takedown is not required to provide any proof of copyright, de-indexing happens quite quickly. Try this URL: http://www.google.com/transparencyreport/removals/copyright/domains/mydomain.com/ (replace your domain) to see if you have been affected. I would like your opinion if you have been affected by takedowns on product descriptions. In my mind, if product descriptions are informative and relate to the characteristics of the product, then takedowns should be denied.
Intermediate & Advanced SEO | MagicDude4Eva
What is the best approach for getting comments indexed, but also providing a great UX?
The way our in-house comments system was built, it uses AJAX to load comments as the page loads. I'm working on a set of requirements to convert the system to be more SEO-friendly. Today, we show a "load more comments" button after the first 20 comments, which calls the server and loads more. This is what I'm trying to figure out: should we load all the comments behind the scenes in the page and then lazy-load them, or keep the same "load more" behavior and just display what was already loaded behind the scenes? Or does anyone have a better suggestion about how to make the comments crawlable for Google?
Intermediate & Advanced SEO | JDatSB
Best way to deal with multiple languages
Hey guys, I've been trying to read up on this and have found that answers vary greatly, so I figured I'd seek your expertise. When dealing with the URL structure of a site that is translated into multiple languages, is it better SEO-wise to structure the site like this: domain.com/en, domain.com/it, etc., or to simply add URL parameters like domain.com/?lang=en, domain.com/?lang=it? In the first example, I'm afraid Google might see my content as duplicate even though it's in a different language.
Intermediate & Advanced SEO | CrakJason
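For the subdirectory option, a sketch of the hreflang annotations (example.com is a placeholder) that tell Google the language versions are translations rather than duplicates; each version should list every alternate, including itself:

```html
<!-- In the <head> of https://example.com/en/ -->
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
<link rel="alternate" hreflang="it" href="https://example.com/it/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```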