Does Automated High Quality Content Look Like Low Quality to Search Engines?
-
I have 1,000+ pages that all have very similar writing, but different results.
Example:
Number of days on market
Average sales price
Median sales price
etc.

All the results are very different for each neighborhood. However, as noted above, the wording is similar. The content is very valuable to users, but I am concerned search engines may see it as low-quality content, as the wording is identical across all these pages (except for the results). Any views on this? Any examples to back up such views?
-
Automated means that my web developers have an algorithm in place that recalculates all those statistical fields on an ongoing basis, so users always have new, up-to-date data. From the URL I included, you can change the neighborhood in the top bar and the statistics will change. Great insight for the user, but since the wording of "median price per year," "$ volume of active listings," etc. is the same across all pages, I wonder how I should expect search engines to treat it.
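To make it concrete, the recalculation is essentially something like the sketch below (simplified; the field names and numbers are made up for illustration, and the real data comes from our listings feed):

```python
from datetime import date
from statistics import mean, median

# Made-up listing records for illustration; the real data would come
# from an MLS feed and be refreshed on a schedule.
listings = [
    {"neighborhood": "Kakaako", "price": 750000, "listed": date(2014, 4, 1), "sold": date(2014, 5, 10)},
    {"neighborhood": "Kakaako", "price": 1200000, "listed": date(2014, 3, 15), "sold": date(2014, 6, 1)},
    {"neighborhood": "Waikiki", "price": 480000, "listed": date(2014, 4, 20), "sold": date(2014, 5, 25)},
]

def neighborhood_stats(records, neighborhood):
    """Recompute the statistical fields shown on one neighborhood page."""
    rows = [r for r in records if r["neighborhood"] == neighborhood]
    prices = [r["price"] for r in rows]
    days_on_market = [(r["sold"] - r["listed"]).days for r in rows]
    return {
        "avg_days_on_market": mean(days_on_market),
        "average_sales_price": mean(prices),
        "median_sales_price": median(prices),
    }

print(neighborhood_stats(listings, "Kakaako"))
```

So the numbers on every page are genuinely different; only the surrounding labels and sentences repeat.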
Any articles or experience to back up these ideas would be highly appreciated.
-
Ah, OK. So, when you say "automated" content, what does that mean, exactly? And why are there thousands of pages? Are they all unique somehow? How are you deciding when it is worthwhile to create a new page?
I'd need more insight into your website hierarchy, content strategy, and more to give a fuller answer.
-
http://www.honoluluhi5.com/oahu/honolulu-condos/
High-quality stats on the page. Many pages like that. Good for users.
-
My concern is whether your content is duplicated in ways that offer no additional value to search engines and website visitors. For example, do you have two pages that have pretty much the exact-same text except that one uses the phrase "average sales price" and another has "median sales price" instead?
While I know that "average" and "median" mean two different things, if the only difference in the text of two pages is that one uses "average" and the other uses "median," then I would be very concerned about a Panda hit from Google. Panda hits websites that have duplicated, low-quality, and/or unoriginal content on a large scale.
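If you want to gauge this yourself, a rough way to measure how much two pages' wording overlaps is word-shingle Jaccard similarity. A minimal sketch, assuming you can extract each page's visible text — the sample texts and the threshold are illustrative, not anything Google has published:

```python
def shingles(text, n=3):
    """Break text into overlapping n-word sequences."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a, b):
    """Overlap between two shingle sets: 1.0 means identical, 0.0 disjoint."""
    sa, sb = shingles(a), shingles(b)
    union = sa | sb
    return len(sa & sb) / len(union) if union else 0.0

page_a = "The average sales price in Kakaako was $750,000 over the last 90 days."
page_b = "The median sales price in Kakaako was $690,000 over the last 90 days."

score = jaccard(page_a, page_b)
print(f"similarity: {score:.2f}")
if score > 0.8:  # illustrative threshold, not a published cutoff
    print("These two pages are near-duplicates.")
```

If most page pairs on the site score very high on a check like this, that is exactly the pattern that worries me.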
My question: unless a website has thousands of products or thousands of blog posts, does it really need thousands of pages? Most websites that have thousands of pages have spun content to target one specific keyword on one specific page -- and have done this many, many times over. One of my first "SEO" jobs years ago was to rewrite website pages in different words for exactly this purpose. (I know now that it was a black-hat job.) Today, Google is smart enough to know that a single page can be relevant for multiple keyword variations and themes -- so such actions are not necessary. And rightly so!
My other concern is your use of the word "automated." 99% of the time, anything automated will appear to Google -- and, more importantly, to users -- as spam. Original, authoritative, quality, human-created content is always better. Five pages of this is better than 500 pages of automated text. I would look into consolidating a lot of your pages into a smaller set of completely original pages that each target a set of related keyword themes.
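If you do consolidate, make sure each retired page 301-redirects to the page that replaces it, so existing links and rankings carry over. A minimal sketch using Flask -- all of the URL patterns here are hypothetical:

```python
from flask import Flask, redirect

app = Flask(__name__)

# Hypothetical map from retired thin pages to the consolidated pages
# that replace them; in practice this could live in a database.
REDIRECTS = {
    "/oahu/kakaako-median-price": "/oahu/kakaako-condos",
    "/oahu/kakaako-average-price": "/oahu/kakaako-condos",
    "/oahu/waikiki-days-on-market": "/oahu/waikiki-condos",
}

@app.route("/<path:old_path>")
def legacy_redirect(old_path):
    target = REDIRECTS.get("/" + old_path)
    if target:
        # 301 tells search engines the move is permanent, so link
        # equity is consolidated onto the surviving page.
        return redirect(target, code=301)
    return "Not Found", 404
```

In practice you would more likely set this up in your web server's rewrite rules rather than in application code, but the mapping idea is the same.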
Again, I don't know your specific case, so I could be wrong. But your post set off a bunch of warnings. If you need any clarifications, please feel free to reply!