Does Automated High Quality Content Look Like Low Quality to Search Engines?
-
I have 1,000+ pages that all have very similar writing, but different results.
Example:
Number of days on market
Average sales price
Median sales price
etc.

All the results are very different for each neighborhood. However, as noted above, the wording is similar. The content is very valuable to users, but I am concerned search engines may see it as low-quality content, since the wording is identical across all these pages (except the results). Any views on this? Any examples to back up such views?
-
"Automated" means that my web developers have an algorithm in place that recalculates all those statistical fields on an ongoing basis, so users always have up-to-date data. From the URL I included, you can change the neighborhood in the top bar and the statistics will change. Great insight for users, but since wording like "median price per year" and "$ volume of active listings" is the same across all pages, I wonder how I should expect search engines to treat it.
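A recalculation pipeline like the one described might look roughly like this sketch in Python (the listing records, field names, and neighborhood names here are hypothetical illustrations, not taken from the actual site):

```python
from statistics import mean, median

# Hypothetical listing records; in practice these would come from a live MLS feed.
listings = [
    {"neighborhood": "Waikiki", "price": 650_000, "days_on_market": 42, "active": True},
    {"neighborhood": "Waikiki", "price": 1_200_000, "days_on_market": 15, "active": True},
    {"neighborhood": "Kakaako", "price": 890_000, "days_on_market": 30, "active": False},
]

def neighborhood_stats(listings, neighborhood):
    """Recompute the statistical fields shown on one neighborhood page."""
    subset = [l for l in listings if l["neighborhood"] == neighborhood]
    active = [l for l in subset if l["active"]]
    return {
        "avg_days_on_market": mean(l["days_on_market"] for l in subset),
        "average_sales_price": mean(l["price"] for l in subset),
        "median_sales_price": median(l["price"] for l in subset),
        "active_listing_volume": sum(l["price"] for l in active),
    }

stats = neighborhood_stats(listings, "Waikiki")
```

In a setup like this, only the computed numbers change from page to page while the surrounding template text stays fixed, which is exactly the situation the question describes.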
Any articles or experience to back up ideas highly appreciated.
-
Ah, OK. So, when you say "automated" content, what does that mean, exactly? And why are there thousands of pages? Are they all unique somehow? How are you deciding when it is worthwhile to create a new page?
I'd need more insight into your website hierarchy, content strategy, and so on to give a fuller answer.
-
http://www.honoluluhi5.com/oahu/honolulu-condos/
High-quality stats on the page. Many pages like that. Good for the user.
-
My concern is whether your content is duplicated in ways that offer no additional value to search engines and website visitors. For example, do you have two pages that have pretty much the exact-same text except that one uses the phrase "average sales price" and another has "median sales price" instead?
While I know that "average" and "median" mean two different things, if the only difference in the text of two pages is that one uses "average" and the other uses "median," then I would be very concerned about a Panda hit from Google. Panda hits websites that have duplicated, low-quality, and/or unoriginal content on a large scale.
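To make that concern concrete: one rough way to estimate how similar two pages look is word-shingle (n-gram) overlap, measured with Jaccard similarity. This is purely an illustrative sketch, not how Google's Panda actually scores content, and the sample sentences are hypothetical:

```python
def shingles(text, n=3):
    """Split text into overlapping n-word shingles."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard_similarity(a, b, n=3):
    """Jaccard similarity of two texts' shingle sets (1.0 = identical)."""
    sa, sb = shingles(a, n), shingles(b, n)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page_a = "the average sales price in this neighborhood rose again this quarter"
page_b = "the median sales price in this neighborhood rose again this quarter"
score = jaccard_similarity(page_a, page_b)
```

Swapping a single word ("average" for "median") still leaves about 64% of the three-word shingles shared, which is the kind of near-duplication across thousands of pages that raises the Panda risk described above.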
My question: unless a website has thousands of products or thousands of blog posts, do you really need thousands of pages? Most websites with thousands of pages have spun content to target one specific keyword on one specific page -- doing this many, many times over. One of my first "SEO" jobs years ago was to rewrite website pages in different words for exactly this purpose. (I know now that it was a black-hat job.) Today, Google is smart enough to know that a single page can be relevant for multiple keyword variations and themes -- so such actions are not necessary. And rightly so!
My other concern is your use of the word "automated." 99% of the time, anything automated will appear to Google -- and, more importantly, to users -- as spam. Original, authoritative, quality, human-created content is always better. Five pages of this is better than 500 pages of automated text. I would look into consolidating a lot of your pages into a smaller set of completely-original pages that each targets a set of related keyword themes.
Again, I don't know your specific case, so I could be wrong. But your post set off a bunch of warnings. If you need any clarifications, please feel free to reply!