Is Automated Quality Content Acceptable Even Though It Looks Similar Across Pages?
-
I have some advanced statistics modules implemented on my website, which add a high level of value for users. However, the wording is similar across 1,000+ pages, with the only difference being the statistical findings.
Page example 1: http://www.honoluluhi5.com/oahu/honolulu-condos/
Page example 2: http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/
As you can see, the same wording is used on both: "Median Sales Price per Year", "$ Volume of Active Listings", etc. The difference is that the findings/results are obviously different.
Questions: Are search engines smart enough to realize the quality in this, or do they see similar wording across 1,000+ pages and potentially consider the pages low-quality content because they are unable to identify the high level of added value and the complexity involved in pulling such quality data? If that may be the case, does that mean I ought to make the pages more "unique" by including a little piece of writing about each page, even though it is not of value to users?
-
Hey Khi5 —
I just took a closer look at your webpage, as well as the related questions that you've asked before.
I think an even bigger problem than "duplicate content" is "thin content". The main body of your page is 56 words, while the general rule of thumb is to have 300+ words of content.
To answer you more specifically:
- No, I don't believe search engines are smart enough to recognize the quality in very similar content, because they go by keywords. Even if the search engines DON'T categorize the content as duplicate, the pages are all competing with each other for the same keywords; the articles are all competing with each other in the same space. If you're trying to focus on "Honolulu" vs. "Waikiki" vs. some other neighborhood, then your pages also need many more repeats of the keywords you're trying to win.
- If the bulk of your page is unique (because you're writing about Honolulu as a category vs. Waikiki as a specific neighborhood), then you don't have to worry about duplicate content; most of your content is unique.
tl;dr: 300+ words, repeat desired exact-match keywords several times on a page; and yes, create unique content to make the pages more unique and specific.
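To make that concrete, here's a minimal Python sketch of the kind of check I mean, assuming the body copy sits in a hypothetical "main-content" container; the selector and the keyword targets are placeholders, so swap in whatever your templates and target phrases actually use.

```python
# Minimal sketch: check a page's visible word count and how often a
# target keyword phrase appears in its main content.
# Assumes the body copy sits in a hypothetical <div class="main-content">;
# adjust the selector to match the real template.
import requests
from bs4 import BeautifulSoup

def audit_page(url, keyword, min_words=300):
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    main = soup.select_one("div.main-content") or soup.body
    text = " ".join(main.get_text(separator=" ").split())
    word_count = len(text.split())
    keyword_hits = text.lower().count(keyword.lower())
    print(url)
    print(f"  word count: {word_count} (rule of thumb: {min_words}+)")
    print(f"  '{keyword}' appears {keyword_hits} time(s)")

# Example runs with hypothetical target phrases:
audit_page("http://www.honoluluhi5.com/oahu/honolulu-condos/", "Honolulu condos")
audit_page("http://www.honoluluhi5.com/oahu/honolulu/metro/waikiki-condos/", "Waikiki condos")
```

If the numbers come back low, that's your cue to add a unique, genuinely useful write-up to each page rather than boilerplate filler.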
Hope that helps more.
-
Thank you, Andrew. I appreciate your help. I'm still looking for a more conclusive/direct answer to my question.
-
Hey khi5 —
I don't have a _specific_ answer, but I have something better: a way of finding related answers!
If you search on Moz for "real estate", you might find some SEO best practices for real estate / multi-listing type sites:
- Moz Q&A Search for "real estate"
- [Moz Q&A] Real Estate and Duplicate Content
- [Moz Q&A] Canonicals for Real Estate
- [Moz Q&A] Seo & Real Estate Site?
Hope that helps point you in a good direction.
Related Questions
-
SEO implications of using Marketing Automation landing pages vs on-site content
Hi there, I'm hoping someone can help here... I'm new to a company where, due to the limitations of their WordPress instance, they've been creating what would ordinarily be considered pages in the standard sitemap as landing pages in their Pardot marketing automation platform. The URL subdomain is slightly different. Just wondering if anybody could quickly outline the SEO implications of doing this externally instead of directly on their site? Hope I'm making some sense... Thanks, Phil
Intermediate & Advanced SEO | philremington
-
If a page ranks in the wrong country and is redirected, does that problem pass to the new page?
Hi guys, I'm having a weird problem: a new multilingual site was launched about 2 months ago. It has correct hreflang tags and geo-targeting in GSC for every language version. We redirected some relevant pages (with good PA) from another website of our client's. It turned out that the pages were not ranking in the correct country markets (for example, the en-gb page ranking in the USA). The pages from our site seem to have the same problem. Do you think they inherited it due to the redirects? Is it possible that Google will sort things out over some time, given the fact that the new pages have correct hreflangs? Is there stuff we could do to help ranking in the correct country markets?
Intermediate & Advanced SEO | ParisChildress
-
Dislodged own ranking with poorer quality page?
Hi, So last week we were ranking within the top two pages of Google for a results page, at something like rank 21 - https://www.whichledlight.com/t/g9-led-bulbs We have been tracking this page and earned a few decent domain authority links here. Today when the new rankings came in, it looks like a blog page we wrote a while back has jumped several pages to number 13 in the rankings - OK, so not the end of the world, BUT our results page seems to have disappeared from the rankings. You can see from the screenshot in the attached link that at some point Moz / Google has noted that both pages are relevant in some capacity. Has Google decided that, because both pages are from the same site and both are showing ranking signals for the same traffic, one should go? It also seems weird that this blog page would come from nowhere to rank so highly in the course of a week. Perhaps they have changed their algorithm? This is a bit of a concern for us as it is unclear what's happened - especially after we were so close to getting to page one for a valued search term.
Intermediate & Advanced SEO | TrueluxGroup
-
Pagination on a product page with reviews spread out on multiple pages
Our current product pages' markup only has the canonical URL on the first page (each page loads more user reviews). Since we don't want to increase load times, we don't currently have a canonical view-all product page. Do we need to mark up each subsequent page with its own canonical URL? My understanding was that canonical and rel next/prev tags are independent of each other, so if we mark up the middle pages with a paginated URL, e.g.:
**Product page #1**: canonical = http://www.example.co.uk/Product.aspx?p=2692, rel="next" = http://www.example.co.uk/Product.aspx?p=2692&pageid=2
**Product page #2**: canonical = http://www.example.co.uk/Product.aspx?p=2692&pageid=2, rel="prev" = http://www.example.co.uk/Product.aspx?p=2692, rel="next" = http://www.example.co.uk/Product.aspx?p=2692&pageid=3
...then each canonical page would suggest to Google another piece of unique content, which this obviously isn't. Is the PREV/NEXT able to "override" the canonical and explain to Googlebot that it's part of a series? Wouldn't the canonical then be redundant? Thanks
Intermediate & Advanced SEO | Don34
-
Significantly reducing number of pages (and overall content) on new site - is it a bad idea?
Hi Mozzers - I am looking at a new site (not launched yet). It contains significantly fewer pages than the previous site - 35 pages rather than 107 before. Content on the remaining pages is plentiful, but I am worried about the sudden loss of a significant "chunk" of the website - significantly cutting the size of a website must surely increase the risk of post-migration performance problems? Further info: the site has run an SEO contract with a large SEO firm for several years. They don't appear to have done anything beyond tinkering with homepage content - all the header and description tags are the same across the current website. 90% of site traffic currently arrives on the homepage. Content quality/volume isn't bad across most of the current site. Thanks in advance for your input!
Intermediate & Advanced SEO | McTaggart
-
Blocking poor quality content areas with robots.txt
I found an interesting discussion on seroundtable where Barry Schwartz and others were discussing using robots.txt to block low-quality content areas affected by Panda. http://www.seroundtable.com/google-farmer-advice-13090.html The article is a bit dated, and I was wondering what current opinions are on this. We have some dynamically generated content pages which we tried to improve after Panda. Resources have been limited and, alas, they are still there. Until we can officially remove them, I thought it may be a good idea to just block the entire directory. I would also remove them from my sitemaps and resubmit. There are links coming in, but I could redirect the important ones (I was going to do that anyway). Thoughts?
Intermediate & Advanced SEO | Eric_edvisors
-
NOINDEX listing pages: Page 2, Page 3... etc?
Would it be beneficial to NOINDEX category listing pages except for the first page? For example, this site: http://flyawaysimulation.com/downloads/101/fsx-missions/ has lots of pages such as Page 2, Page 3, Page 4... etc.: http://www.google.com/search?q=site%3Aflyawaysimulation.com+fsx+missions Would there be any SEO benefit of NOINDEX on these pages? Of course, FOLLOW is the default, so links would still be followed and juice applied. Your thoughts and suggestions are much appreciated.
Intermediate & Advanced SEO | Peter264
-
Will having an image lightbox with content on a web page be SEO friendly?
This website is done in a CMS. Will having a lightbox pop up with content be SEO friendly? If you go to the web page and click on the images at the bottom of the page, there are lightboxes that will display information. Will this lightbox content be crawled by Google? Will it be considered as content for the URL http://jennlee.com/portfolio/bran.. Thanks, John
Intermediate & Advanced SEO | VizionSEO99