Does Automated High Quality Content Look Like Low Quality to Search Engines?
-
I have 1,000+ pages that all have very similar writing, but different results.
Example:
Number of days on market
Average sales price
Median sales price
etc.
All the results are very different for each neighborhood. However, as noted above, the wording is similar. The content is very valuable to users, but I am concerned search engines may see it as low-quality content, since the wording is identical across all these pages (except the results). Any view on this? Any examples to back up such views?
-
Automated means that my web developers have an algorithm in place that recalculates all those statistical fields on an ongoing basis, so users always have up-to-date data. From the URL I included, you can change the neighborhood in the top bar and the statistics will change. Great insight for the user, but since wording like "median price per year," "$ volume of active listings," etc. is the same across all pages, I wonder how I should expect search engines to treat it.
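For context, the kind of recalculation described above is straightforward aggregation. A minimal sketch in Python (the neighborhood names, prices, and field labels here are all hypothetical, not taken from the actual site):

```python
from statistics import mean, median

# Hypothetical sales records per neighborhood (illustrative data only)
sales = {
    "Waikiki": [450000, 520000, 610000, 480000],
    "Kakaako": [700000, 735000, 810000],
}

def neighborhood_stats(prices):
    """Compute the aggregate fields shown on each landing page."""
    return {
        "average_sales_price": mean(prices),
        "median_sales_price": median(prices),
        "active_listings": len(prices),
    }

# Recompute every page's statistics from the current data
stats = {hood: neighborhood_stats(p) for hood, p in sales.items()}
print(stats["Waikiki"]["median_sales_price"])  # 500000.0
```

The numbers differ per neighborhood, but the surrounding labels ("average sales price," "median sales price") are identical on every page -- which is exactly the duplication concern being asked about.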
Any articles or experience to back up ideas highly appreciated.
-
Ah, OK. So, when you say "automated" content, what does that mean, exactly? And why are there thousands of pages? Are they all unique somehow? How are you deciding when it is worthwhile to create a new page?
I'd need more insight into your website hierarchy, content strategy and more to give more of an answer.
-
http://www.honoluluhi5.com/oahu/honolulu-condos/
High quality stats on the page. Many pages like that. Good for user.
-
My concern is whether your content is duplicated in ways that offer no additional value to search engines and website visitors. For example, do you have two pages with pretty much the exact same text, except that one uses the phrase "average sales price" and another has "median sales price" instead?
While I know that "average" and "median" mean two different things, if the only difference in the text of two pages is that one uses "average" and the other uses "median," then I would be very concerned about a Panda hit from Google. Panda hits websites that have duplicated, low-quality, and/or unoriginal content on a large scale.
My question: unless a website has thousands of products or thousands of blog posts, do you really need thousands of pages? Most websites with thousands of pages have spun content to target one specific keyword on one specific page -- and they do this many, many times over. One of my first "SEO" jobs years ago was to rewrite website pages in different words for exactly this purpose. (I know now that it was a black-hat job.) Today, Google is smart enough to know that a single page can be relevant for multiple keyword variations and themes -- so such tactics are no longer necessary. And rightly so!
My other concern is your use of the word "automated." 99% of the time, anything automated will appear to Google -- and, more importantly, to users -- as spam. Original, authoritative, quality, human-created content is always better. Five pages of this are better than 500 pages of automated text. I would look into consolidating a lot of your pages into a smaller set of completely original pages, each of which targets a set of related keyword themes.
Again, I don't know your specific case, so I could be wrong. But your post set off a bunch of warnings. If you need any clarifications, please feel free to reply!
Related Questions
-
Content From API - Remove or Redirect?
Hi Guys,
Intermediate & Advanced SEO | | PaddyM556
I am working on a site at the moment. The previous developer used an API to pull in healthcare content (HSE). The API basically generates landing pages within the site, and generates the content. To date it has generated over 2k pages. Some actually rank organically and some don't.
New site being launched: the "health advice" section where this content used to live will not be included in the new site, so this content will not have a place to be displayed.
My query: would you let the old content die off in the migration process and just become 404s, or would you 301 redirect all of the pages -- or only the ranking ones -- to the homepage? Other considerations: the site will be moved to https://, so it will be submitted to Search Console and re-indexed by Google. Would love to hear if anyone has had a similar situation or suggestions.
Best Regards
Pat
-
Penalty for duplicate content on the same website?
Is it possible to get a penalty for duplicate content on the same website? I have an old custom-built site with a large number of filters that are pre-generated for speed. Basically the only difference between the pages is the meta title and H1 tag, with a few text differences here and there. Obviously I could no-follow all the filter links, but it would take an enormous amount of work. The site is performing well in search. I'm trying to decide whether there is a risk of a penalty; if not, I'm loath to do anything in case it causes other issues.
Intermediate & Advanced SEO | | seoman10
-
No content in the view source, why?
Hi, I have a website where you don't see the article body in the view source, but if you use the inspect-element tool you can see the content. Do you know why? Thanks, Roy
Intermediate & Advanced SEO | | kadut
-
Duplicate Page Content
We have different plans that you can sign up for -- how can we rectify the duplicate page content and title issue here? Thanks.
http://signup.directiq.com/?plan=100
http://signup.directiq.com/?plan=104
http://signup.directiq.com/?plan=116
http://signup.directiq.com/?plan=117
http://signup.directiq.com/?plan=102
http://signup.directiq.com/?plan=119
http://signup.directiq.com/?plan=101
http://signup.directiq.com/?plan=103
http://signup.directiq.com/?plan=5
Intermediate & Advanced SEO | | directiq
-
Opinions on Boilerplate Content
Howdy, Ideally, uniqueness for every page's title, description, and content is desired. But when a site is very, very large, it becomes impossible. I don't believe our site can avoid boilerplate content for title tags or meta descriptions. We will, however, mark up the pages with proper microdata so Google can use this information as they please. What I am curious about is boilerplate content repeated throughout the site for the purpose of helping the user, as well as telling Google what the page is about (rankings). For instance, this page and this page offer the same type of services, but in different areas. Both pages (and millions of others) offer the exact same paragraph on each page. The information is helpful to the user, but it's definitely duplicate content. All they've changed is the city name. I'm curious: what makes this obvious duplicate content issue okay? The additional unique content throughout (in the form of different businesses), the small yet obvious differences in on-site content (title tags clearly represent different locations), or just the fact that the site is hugely authoritative and gets away with it? I'm very curious to hear your opinions on this practice, potential ways to avoid it, and whether or not it's a passable practice for large but new sites. Thanks!
Intermediate & Advanced SEO | | kirmeliux
-
Stolen website content
Hello, recently we had a lot of content written for our new website. Unfortunately, my partner and I have gone our separate ways, and he has used all my unique content on his own website: all our product descriptions, our about-us page, etc. He simply changed the name of the company. He has agreed to take the content down, so that I can now put this content on our new website, which is currently being designed. Will Google see this as duplicate content because it has been on a website before, even though the content has been removed from the original website? I was worried, as the content is no longer "fresh," so to speak. Can anyone help me with this?
Intermediate & Advanced SEO | | Alexogilvie
-
Duplicate content on ecommerce sites
I just want to confirm something about duplicate content. On an eCommerce site, if the meta titles, meta descriptions, and product descriptions are all unique, yet a big chunk at the bottom (featuring "why buy with us" etc.) is copied across all product pages, would each page be penalised, or not indexed, for duplicate content? Does the whole page need to be a duplicate for this to be a worry, or would this large chunk of text, bigger than the product description, have an effect on the page? If this would be a problem, what are some ways around it? Because the content is quite powerful, and is relevant to all products... Cheers,
Intermediate & Advanced SEO | | Creode
-
Duplicate content question? Thanks
Hi, in my time as an SEO I have never come across the following two scenarios. I am an advocate of using unique content, and therefore always suggest, and in some cases demand, that all content is written or re-written. This is the scenario I am facing right now: for example, we have www.abc.com (which has over 200 original recipes) and then we have www.xyz.com with the same recipes, but translated into another language, as they are targeting different audiences. Will Google penalize for duplicate content? The other issue is that the client got the recipes from www.abc.com (which have been translated) and used them on www.xyz.com as well. Both sites are owned by the same company, so it's not plagiarism -- they have the legal rights -- but I am not sure how Google will see it and whether it will penalize the sites. Thanks!
Intermediate & Advanced SEO | | M_81