Is Automated Quality Content Acceptable Even Though It Looks Similar Across Pages?
-
I have some advanced statistics modules implemented on my website, which add a high level of value for users. However, the wording is similar across 1,000+ pages, with the only difference being the statistical findings.
Page Ex 1: http://www.honoluluhi5.com/oahu/honolulu-condos/
Page Ex 2: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/
As you can see, the same wording is used ("Median Sales Price per Year", "$ Volume of Active Listings", etc.); the difference is that the findings/results are obviously different.
Questions: Are search engines smart enough to recognize the quality in this, or do they see similar wording across 1,000+ pages and potentially consider the pages low-quality content, because they are unable to identify the added value and the complexity involved in pulling such quality data? If that may be the case, does that mean I ought to make the pages more "unique" by including a little piece of writing on each page, even though it is not of value to users?
-
Hey Khi5 —
I just took a closer look at your webpage, as well as the related questions that you've asked before.
I think an even bigger problem than "duplicate content" is "thin content". The main body of your page is 56 words, while the general rule of thumb is 300+ words of content.
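If you want to check that across all 1,000+ pages rather than eyeballing each one, here's a rough word-count audit sketch. To be clear, this is just my own illustration, not a Moz tool: it assumes Python with the requests and beautifulsoup4 packages installed, and the list of tags it strips out is a placeholder you'd adjust to your own templates.

```python
# Rough thin-content audit sketch -- assumes the requests and
# beautifulsoup4 packages are installed (pip install requests beautifulsoup4).
import requests
from bs4 import BeautifulSoup

# The two example pages from the question; swap in any URLs you want to audit.
URLS = [
    "http://www.honoluluhi5.com/oahu/honolulu-condos/",
    "http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/",
]

def visible_word_count(url: str) -> int:
    """Fetch a page and count the words in its visible body text."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Strip scripts, styles, and site chrome so only body copy is counted;
    # adjust this list to match your own markup.
    for tag in soup(["script", "style", "nav", "header", "footer"]):
        tag.decompose()
    return len(soup.get_text(separator=" ").split())

for url in URLS:
    print(url, visible_word_count(url), "words")
```

Anything that comes back well under ~300 words of body copy is worth flagging.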
To answer you more specifically:
- No, I don't believe that search engines have the ability to identify very similar content, because they go by keyword. Even if the search engines DON'T categorize the content as duplicate, the pages are all competing with each other for the same keywords - they're all competing in the same space. If you're trying to rank for "Honolulu" vs. "Waikiki" vs. <some other neighborhood>, then your pages also need many more repeats of the keywords you're trying to win.
- If the bulk of your page is unique (because you're writing about Honolulu as a category vs. Waikiki as a specific neighborhood), then you don't have to worry about duplicate content; most of your content is unique.
tl;dr: 300+ words per page, repeat your desired exact-match keywords several times on each page, and yes, create unique content to make the pages more distinct and specific.
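One more thought: if you want a rough, machine's-eye view of how interchangeable two of those pages look once the numbers are set aside, here's a minimal similarity sketch. Again, this is only my own illustration of the idea, not how Google actually scores pages: it assumes Python with requests, beautifulsoup4, and scikit-learn installed, and the two URLs are simply the examples from your question. It computes TF-IDF cosine similarity over the pages' visible text; a score close to 1.0 means the wording reads as nearly identical to a purely lexical model.

```python
# Rough near-duplicate check sketch -- assumes requests, beautifulsoup4,
# and scikit-learn are installed.
import requests
from bs4 import BeautifulSoup
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# The two example pages from the question.
URLS = [
    "http://www.honoluluhi5.com/oahu/honolulu-condos/",
    "http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/",
]

def visible_text(url: str) -> str:
    """Fetch a page and return its visible body text (same cleanup as above)."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for tag in soup(["script", "style", "nav", "header", "footer"]):
        tag.decompose()
    return soup.get_text(separator=" ")

docs = [visible_text(u) for u in URLS]
# TF-IDF over word unigrams; cosine similarity near 1.0 means the two pages
# are near-duplicates lexically, even if the underlying data differs.
tfidf = TfidfVectorizer(stop_words="english").fit_transform(docs)
score = cosine_similarity(tfidf[0], tfidf[1])[0][0]
print(f"cosine similarity: {score:.2f}")
```

It's a blunt instrument compared to whatever the search engines do, but it gives you a quick read on how much the unique statistics actually change the page copy.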
Hope that helps more.
-
Thank you, Andrew. I appreciate your help. I'm still looking for a more conclusive/direct answer to my question.
-
Hey khi5 —
I don't have a specific answer, but I have something better: a way of finding related answers!
If you search Moz Q&A for "real estate", you'll find some SEO best practices for real estate / multi-listing sites like yours:
- Moz Q&A Search for "real estate"
- [Moz Q&A] Real Estate and Duplicate Content
- [Moz Q&A] Canonicals for Real Estate
- [Moz Q&A] Seo & Real Estate Site?
Hope that helps point you in a good direction.
Related Questions
-
Duplicate content across different domains in different countries?
Hi Guys, We have 4 sites: one each in NZ, the UK, Canada, and Australia. All are geo-targeting their respective countries in Google Search Console. The sites are identical, and we recently added the same content to all 4 sites. Will this cause duplicate content issues, or any issues at all, even though they are in different countries and geo-targeting is set? Cheers.
Intermediate & Advanced SEO | wickstar
-
What is the proper way to execute 'page to page redirection'?
I need to redirect every page of my website to a new URL on another site I've made. I intend to add: "Redirect 301 /oldpage.html http://www.example.com/newpage.html". I will use a 301 per page to redirect every page of my site, but I'm confused: if I also add "Redirect 301 / http://mt-example.com/", will it redirect all of my pages to the homepage and ignore the URLs I have separately listed for redirection? Please guide me.
Intermediate & Advanced SEO | NABSID
-
Duplicate content issue with pages that have navigation
We have a large consumer website with several sections that have paginated navigation spanning several pages. How would I prevent those pages from getting duplicate content errors, and how best would I handle SEO for them? For example, we have about 500 events with 20 events showing on each page. What is the best way to prevent all the subsequent navigation pages from getting duplicate content and duplicate title errors?
Intermediate & Advanced SEO | roundbrix
-
Is it possible to have good SEO without links and with only quality content?
Is it possible to have good SEO without links and with only quality content? Have you any experience?
Intermediate & Advanced SEO | Alex_Moravek
-
Category Pages For Distributing Authority But Not Creating Duplicate Content
I read this interesting Moz guide: http://moz.com/learn/seo/robotstxt, which I think answered my question, but I just want to make sure. I take it to mean that if I have category pages with nothing but duplicate content (lists of other pages' h1 titles/on-page descriptions and links to same), and I still want the category pages to distribute their link authority to the individual pages, then I should leave the category pages in the sitemap and meta noindex them, rather than block them in robots.txt. Is that correct? Again, I don't want the category pages to index or to create a duplicate content issue, but I do want the category pages to be crawled enough to distribute their link authority to individual pages. Given the scope of the site (thousands of pages and hundreds of categories), I just want to make sure I have that right. Up until my recent efforts on this, some of the category pages have been robots.txt'd out and still in the sitemap, while others (with a different URL structure) have been in the sitemap but not robots.txt'd out. Thanks! Best, Mike
Intermediate & Advanced SEO | 94501
-
Should I "NoIndex" Pages with Almost no Unique Content
I have a real estate site with MLS data (real estate listings shared across the Internet by Realtors, which means the data already exists across the Internet). The important pages are the "MLS result pages" - the pages showing thumbnail pictures of all properties for sale in a given region or neighborhood. One MLS result page may be for a region and another for a neighborhood within that region: example.com/region-name and example.com/region-name/neighborhood-name. So all data on the neighborhood page will be 100% data from the region URL. Question: would it make sense to "NoIndex" such a neighborhood page, since it would reduce the number of non-unique pages on my site and also reduce the amount of data which could be seen as duplicate? Will my region page have a better chance of ranking if I "NoIndex" the neighborhood page? OR, is Google so advanced that they know Realtors share MLS data and, at worst, simply give such pages very low value without impacting the ranking of other pages on the website? I am aware I can work on making these MLS result pages more unique etc., but that isn't what my question is about. Thank you.
Intermediate & Advanced SEO | khi5
-
Indexation of content from internal pages (registration) by Google
Hello, we have quite a large amount of content on internal pages which can only be accessed as a registered member. What are the different options to get this content indexed by Google? In certain cases we might be able to show a preview to visitors; in other cases this is not possible for legal reasons. Somebody told me that there is an option to send the content of pages directly to Google for indexation, but unfortunately he couldn't give me more details. I only know that this is possible for URLs (sitemap). Is there really a way to do this for the entire content of a page without giving Google access to crawl that page? Thanks, Ben
Intermediate & Advanced SEO | guitarslinger
-
Pop Up Pages Being Indexed, Seen As Duplicate Content
I offer users the opportunity to email and embed images from my website. (See this page http://www.andertoons.com/cartoon/6246/ and look under the large image for "Email to a Friend" and "Get Embed HTML" links.) But I'm seeing the ensuing pop-up pages (Ex: http://www.andertoons.com/embed/5231/?KeepThis=true&TB_iframe=true&height=370&width=700&modal=true and http://www.andertoons.com/email/6246/?KeepThis=true&TB_iframe=true&height=432&width=700&modal=true) showing up in Google. Even worse, I think they're seen as duplicate content. How should I deal with this?
Intermediate & Advanced SEO | andertoons