Opinions on Boilerplate Content
-
Howdy,
Ideally, every page would have a unique title, description, and content. But when a site is very, very large, that becomes impossible. I don't believe our site can avoid boilerplate content in title tags or meta descriptions. We will, however, mark up the pages with proper microdata so Google can use that information as it pleases.
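To make the structured-data point concrete, here's a rough sketch of generating schema.org markup for a listing page. I've used the JSON-LD form here (one of the formats Google accepts alongside microdata), and the business name and fields are made up for illustration:

```python
import json

def local_business_jsonld(name, city, region):
    """Build a schema.org LocalBusiness structured-data snippet
    for one listing page (fields here are illustrative only)."""
    data = {
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "address": {
            "@type": "PostalAddress",
            "addressLocality": city,
            "addressRegion": region,
        },
    }
    # Embed as a JSON-LD script block in the page's HTML
    return '<script type="application/ld+json">%s</script>' % json.dumps(data)

snippet = local_business_jsonld("Ace Plumbing", "Phoenix", "AZ")
```

The idea is that even if titles and descriptions are templated, the structured data still tells Google unambiguously which business and location each page is about.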
What I'm curious about is boilerplate content repeated throughout the site, both to help the user and to tell Google what the page is about (for rankings).
For instance, this page and this page offer the same type of services, but in different areas. Both pages (and millions of others) offer the exact same paragraph on each page. The information is helpful to the user, but it's definitely duplicate content. All they've changed is the city name.
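To illustrate just how little changes between such pages, here's a quick sketch (the blurb text and city names are made up) measuring the similarity of two city-swapped paragraphs:

```python
import difflib

# A templated listing-page blurb, identical except for the city name
# (text and cities here are hypothetical, for illustration only)
blurb = ("Looking for a reliable plumber in {city}? Browse our directory to "
         "compare customer reviews, check credentials, and request free "
         "quotes from licensed local professionals near you.")

a = blurb.format(city="Phoenix")
b = blurb.format(city="Scottsdale")

# autojunk=False so frequent characters aren't discarded as noise
similarity = difflib.SequenceMatcher(None, a, b, autojunk=False).ratio()
print(similarity)  # well above 0.9 -- near-duplicate text
```

A ratio that high is exactly what "duplicate content" detection is built to catch, which is why it surprises me that these pages rank.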
I'm curious what makes this obvious duplicate content issue okay. Is it the additional unique content throughout (in the form of different businesses), the small but obvious differences in on-page content (title tags clearly represent different locations), or just the fact that the site is HUGELY authoritative and gets away with it?
I'm very curious to hear your opinions on this practice, potential ways to avoid it, and whether or not it's a passable practice for large, but new sites.
Thanks!
-
The SEO of the site is probably fine. The problem with the site is that it takes one page of content and smears it across dozens of thin-content, duplicate-content, cookie-cutter pages. The SEO is lipstick on a pig.
-
Thanks again for the response, EGOL. It is appreciated.
Can you point to any examples of large-scale sites like this with better SEO for these pages? I mean, any site that targets every city, neighborhood, park, etc. with content like this should theoretically run into duplicate content and display thin result pages quite often.
And even so, these pages are helpful. I Google "restaurant + small cities near me" and Yelp pages come up, which benefit me.
Yelp is one of the biggest review sites on the web and their filtered search result pages are indexed and ranking ultra high all over the place. What are they doing so special?
This page and this page both offer nearly the exact same results, just shuffled a bit. Beyond simply being too big to get slapped, why is it okay when Yelp does this?
-
I agree. It is walking a very thin line. I believe that Google's Panda algorithm will eventually hit it. I look at lots of sites that people say lost traffic, and this one has a similar design and content style.
-
That's interesting. It seems to have been around for quite a while and ranks well. Of all the similar sites I've seen, though, Houzz seems to walk the thinnest line on bad SEO. Their filters create nearly identical pages, all of which get indexed, with no canonicals for any of them and virtually the same on-page SEO as well. Not to mention the same blurbs across millions of pages, etc.
It's weird to me though that a reasonably targeted blurb is such bad business when the rest of the site is so helpful to users. One would think Google would allow it since the blurbs apply to each page and the "results" are the real meat and potatoes of the site.
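For what it's worth, the canonical fix for filter pages like that isn't complicated. Here's a minimal sketch of the kind of logic such a site could use (the parameter names are assumptions on my part, not Houzz's actual setup): keep only the parameters that define genuinely distinct content, and point everything else at that URL.

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Parameters that identify genuinely distinct pages; sort, pagination,
# and display options get dropped (names are hypothetical examples)
CONTENT_PARAMS = {"city", "category"}

def canonical_url(url):
    """Return the canonical form of a filtered listing URL by stripping
    sort/pagination/display parameters."""
    scheme, netloc, path, query, _ = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k in CONTENT_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))

canonical = canonical_url(
    "https://example.com/listings?city=phoenix&sort=price&page=2")
```

The resulting URL would go into a `rel="canonical"` link element on every filtered variant, so only one version of each result set competes in the index.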
-
This site has lots of duplicate content from page to page and lots of thin content on a repeating template. It will be hit by Panda.
-
EGOL,
I think you're making unfair assumptions about our site. Each page visible to Google will have helpful information and content. The ones that don't will not be "published" for Google or our users.
I assure you, the site will be worthwhile and helpful to the end user, especially as time progresses. In fact, if you read above, I am asking specifically about adding content that is additionally helpful to the user while trying to avoid duplicate content issues from repeating it across the site.
I am not trying to shortcut anything, I'm curious why some sites are able to seemingly circumvent SEO tenets and was hoping for a helpful discussion.
And again, I'll reiterate: I am not interested in boilerplate content as a shortcut to anything. It would be in addition to existing useful content, and the boilerplate on similar pages would also benefit the end user. Using the examples above, I believe those small blurbs _can_ be helpful to the user. Do you agree?
Thanks for the response.
-
The problem you face is that you are trying to make a website with millions of pages for which you do not have adequate content. You are trying to take shortcuts by using a cookie-cutter instead of doing the work to make a worthy and unique website.
If you continue with your current business plan, I believe that Google will not treat your site very well. These sites used to work in Google over ten years ago and at that time they were ingenious. Today they are spam.
-
The paragraph of helpful content is identical (beyond a city being swapped out), but it still helps their rankings. If you tailor a search with one of their cities and a related keyword from the text, they usually pop up on the front page. That's what I'm asking about: why is Google ignoring this obvious duplicate content?
I'm assuming the business listings are making the page unique enough to override the duplicate paragraph + the site is huge and has TONS of authority.
-
They're not identical, and I notice many directories are set up like this. Two individual users with different interests would find unique information in both of these samples. The only issue is how your competition has set up their pages. For instance, if someone is targeting just Phoenix and really goes to town with unique information and links, that page may rank better because it may be viewed as more of an authority on the subject.