How to deal with very similar (thin) content by design?
-
Hello all,
I run a website which lists the direct contact details (telephone and email) of organisations.
I have hundreds of similar pages which are, by design, very thin on content. Each page has a couple of lines of somewhat unique content.
People find the site useful since it simply tells them which number to dial in order to speak to a real person at any given organisation. They can't easily find the information elsewhere and I believe it satisfies search intent.
Am I at risk of being flagged for duplicate / low-quality content?
Should I add more text simply to give each page 'unique' content, even though it adds no value to users? That doesn't seem right either!
Looking forward to hearing where you all stand on this,
Many thanks,
-
Personally I would find a way of thinking laterally about generating new and unique content for your existing and potential users. "Poetry is when you make new things familiar and familiar things new," according to marketing guru Rory Sutherland. This rings true with content.
-
I'd definitely consider adding a review section to each page rating the quality of the information, something along the lines of what "who called me" sites do. That way you're getting user-generated content about how quickly people were able to get hold of someone and whether it helped resolve their issue. That would add value and help differentiate the pages. Cheers!
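If you do go the user-review route, the ratings themselves can also be surfaced to search engines as structured data, which gives each page a piece of machine-readable content that is genuinely unique to it. A minimal sketch using schema.org markup; the organisation name, phone number, and rating values below are all hypothetical:

```html
<!-- Hypothetical example: aggregate rating markup for one organisation's
     contact page. All names and values are invented for illustration. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Organisation",
  "telephone": "+44 20 7946 0000",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.2",
    "reviewCount": "17"
  }
}
</script>
```

The rating figures would be generated from the real user reviews on the page, not hard-coded as shown here.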
Related Questions
-
How should I deal with this page?
Hey Mozzers, I was looking for a little guidance and advice regarding a couple of pages on my website. I have used 'shoes' for this example. The current structure is:
Intermediate & Advanced SEO | | ATP
Parent Category: Shoes
Sub Categories: Blue Shoes, Hard Shoes, Soft Shoes, Big Shoes, etc. (about 12 in total; each links back to the Parent Category with the keyword "Shoes")
Supporting Article: Different Types of Shoe and Their Uses
Every sub category has gone from ranking 50+ to 10th-30th for its main keyword, which is a good start, and as I release supporting articles I'm sure each one will climb. I am happy with this. The article ranks no. 1 for about 20 longtail terms around "different shoes". This page attracts around 60% of my website's traffic, but we know this traffic will not convert, as most visitors are people and children looking for information for educational purposes only and are not looking to buy. Many are also looking for a type of product we don't sell.
My issue is ranking the primary category page for the "Shoes" keyword. When I first made the changes we went from ranking nowhere to around 28th with the parent category page targeted at "Shoes". Whilst not fantastic, this was good, as it gave us something to work off. However, a few weeks later the article page ranked 40th for this term and the main page dropped off the scale. Then another week some of the sub category pages ranked for it. And now none of my pages rank in the top 50 for it. I am fairly sure this is due to some cannibalisation, simply because various pages have ranked for it at different times.
I also think that the additional content contributed by products on the sub category pages is giving them more content and making them rank better.
The Page Itself: the Shoes page contains 400 good unique words, with the keyword mentioned 8 times, including headings. There is an image at the top of the page with its title and alt text targeted towards the keyword. The 12 sub categories are linked to in the left navigation bar, and then again below the 400 words of content via a picture and a text link. This adds the keyword to the page another 18 or so times in the form of links to longtail subcategories. This could introduce a spam problem, I guess, but it's in the form of nav bars or navigation tables, and I understood this to be a necessary evil on eCommerce websites. There are no actual products linked from this page (a problem?).
With all the basic SEO covered and all sub pages linking back to the parent category, the only solutions I can think of are to add more content by:
1. Adding all shoe products to the Shoes page, as it currently only links out to the sub categories.
2. Merging the "Different Types of Shoe and Their Uses" article into the Shoes page to make a super page, making the article pages less likely to produce cannibalisation problems.
However, by doing solution 2, I remove a page bringing in a lot of traffic. The traffic it brings in, however, is of very little use; it inflates the bounce rate and lowers the conversion rate of my whole site by significant figures, and it distorts other useful reports I use to track progress. I hope I have explained well enough; thanks for sticking with me this far. I haven't posted links due to a reluctance by the company, so hopefully my example will suffice. As always, thanks for any input.
Is Automated Quality Content Acceptable Even Though It Looks Similar Across Pages?
I have some advanced statistics modules implemented on my website, which are high-level added value for users. However, the wording is similar across 1,000+ pages, with the only difference being the statistical findings.
Intermediate & Advanced SEO | | khi5
Page example 1: http://www.honoluluhi5.com/oahu/honolulu-condos/
Page example 2: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/
As you can see, the same wording is used ("Median Sales Price per Year", "$ Volume of Active Listings", etc.); the difference is that the findings/results are obviously different. Questions: are search engines smart enough to recognise the quality in this, or do they see similar wording across 1,000+ pages and potentially consider the pages low-quality content, because they are unable to identify the high-level added value and the complexity of pulling such quality data? If that is the case, does that mean I ought to make the pages more "unique" by including a little piece of writing on each page, even though it is of no value to users?
Best strategy for duplicate content?
Hi everyone, we have a site where all product pages have more or less similar text (same printing techniques, etc.). The main differences are prices and images; the text is highly similar. We have around 150 products in every language. Moz's tool tells me to do something about duplicate content, but I don't really know what we could do, since the descriptions can't be changed to be very different. We essentially sell paper bags in different colours and from different materials.
Intermediate & Advanced SEO | | JaanMSonberg
Can a website be punished by Panda if content scrapers have duplicated its content?
I've noticed recently that a number of content scrapers are linking to one of our websites and have duplicated our content on their pages. Can content scrapers affect the original website's ranking? I'm concerned that the duplicated content, even if hosted by scrapers, could be a bad signal to Google. What are the best ways to prevent this from happening? I'd really appreciate any help, as I can't find the answer online!
Intermediate & Advanced SEO | | RG_SEO
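One safeguard often suggested for the scraper problem (my addition, not something stated in this thread) is a self-referencing canonical tag with an absolute URL. Scrapers that copy your HTML wholesale then end up publishing a page that declares your original as the canonical version. A sketch with a hypothetical URL:

```html
<!-- On the original page, e.g. https://www.example.com/article/ -->
<head>
  <link rel="canonical" href="https://www.example.com/article/">
</head>
```

Using an absolute rather than relative URL matters here; a relative canonical would resolve to the scraper's own domain once copied.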
Duplicate content on sites from different countries
Hi, we have a client who currently has a lot of duplicate content between their UK and US websites. Both websites are geographically targeted (via Google Webmaster Tools) to their specific location and have the appropriate local domain extension. Is having duplicate content a major issue, given that the sites target two different countries and geographic regions of the world? Is there any statement from Google about this? Regards, Bill
Intermediate & Advanced SEO | | MBASydney
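For same-language sites aimed at different countries, Google's documented mechanism is hreflang annotations, which tell it the pages are regional alternates rather than accidental duplicates. A sketch assuming hypothetical .co.uk and .com domains for the client:

```html
<!-- Placed in the <head> of both the UK and US versions of a page -->
<link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/page/">
<link rel="alternate" hreflang="en-us" href="https://www.example.com/page/">
<link rel="alternate" hreflang="x-default" href="https://www.example.com/page/">
```

The annotations must be reciprocal: each version lists itself and all its alternates, or Google may ignore them.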
Penalised for duplicate content, time to fix?
OK, I accept this one is my fault, but I'm wondering about timescales to fix it... I have a website and I put an affiliate store on it, using merchant datafeeds in a bid to get revenue from the site. This was all good; however, I forgot to put noindex on the datafeed/duplicate content pages, and over a period of a couple of weeks the traffic to the site died.
I have since nofollowed or removed the products, but some three months later my site still will not rank for the keywords it was ranking for previously. It will not even rank if I type in the site's name (Bright Tights). I have searched for the name using bright tights, "bright tights" and brighttights, but none of them return the site anywhere. I am guessing that I have been hit with a 'drop x places' penalty by Google for the duplicate content. What is the easiest way around this? I have no warnings about bad links or the like. Is it worth battling on trying to get the domain back, or should I write the domain off, buy a new one and start again, minus the duplicate content?
The goal of having the duplicate content store on the site was to be able to rank the store's category pages, which had unique content on them, so I could foresee no problems with that. Like Amazon et al., the categories would have lists of products (amongst other content) and you would click through to the individual product description (the duplicate page). Thanks for reading.
Intermediate & Advanced SEO | | Grumpy_Carl
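For reference, the noindex the poster mentions forgetting is a one-line robots meta tag on each datafeed page. A sketch of the usual form:

```html
<!-- In the <head> of each datafeed/duplicate product page -->
<meta name="robots" content="noindex, follow">
```

The "follow" part keeps crawlers following links out of the page even though the page itself is excluded from the index, which is generally what you want for thin product pages that link to unique category pages.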
Duplicate Content Question
My client's website is for an organization that is part of a larger organization, which has its own website. We were given permission to use content from the larger organization's site on my client's redesigned site. The search engines will deem this duplicate content, right? I can rewrite the content for the new site, but it will still be closely based on the original, due to the scientific/medical nature of the subject material. Is there a way around this dilemma so I do not get penalized? Thanks!
Intermediate & Advanced SEO | | Mills
ECommerce syndication & duplicate content
We have an eCommerce website with original software products. We want to syndicate our content to partner and affiliate websites, but we are worried about the effect of duplicate content all over the web. Note that this is a relatively high-profile project, where thousands of sites will be listing hundreds of our products with the exact same names, descriptions, tags, etc. We read the wonderful and relevant post by Kate Morris on this topic (here: http://mz.cm/nXho02) and we realize that duplicate content is never the best option. Some concrete questions we're trying to figure out: 1. Are we risking penalties of any sort? 2. We can potentially get tens of thousands of links from this concept, all with duplicate content around them, but from PR3-6 sites, some with lots of authority. What will affect our site more: the quantity of mediocre links (good) or the duplicate content around them (bad)? 3. Should we sacrifice SEO for a good business idea?
Intermediate & Advanced SEO | | erangalp
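When syndication partners cooperate, the standard mitigation (an assumption on my part; check the linked post for the thread's actual recommendation) is a cross-domain canonical: each partner's copy of a product page declares the original as canonical, consolidating ranking signals back to the source. A sketch with a hypothetical product URL:

```html
<!-- On each partner/affiliate copy of the product page -->
<link rel="canonical" href="https://www.example-software.com/products/widget-pro/">
```

Google treats this as a hint rather than a directive, so partners who cannot add the tag could fall back to linking prominently to the original product page instead.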