Opinions on Boilerplate Content
-
Howdy,
Ideally, every page's title, description, and content would be unique, but on a very large site that becomes impossible. I don't believe our site can avoid boilerplate content in title tags or meta descriptions. We will, however, mark up the pages with proper microdata so Google can use that information as it pleases.
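To be concrete, the microdata plan looks roughly like this: each city page would emit a structured-data blob describing the local service. This is only a sketch, generated here in Python — the schema.org LocalBusiness type is real, but the business name, properties, and wording are my own placeholders, not our final markup:

```python
import json

def local_business_jsonld(name, city, region, service):
    """Build a schema.org LocalBusiness JSON-LD blob for one city page.

    Sketch only: the exact property set we publish is still undecided.
    """
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "LocalBusiness",
        "name": name,
        "areaServed": {"@type": "City", "name": f"{city}, {region}"},
        "description": f"{service} in {city}, {region}.",
    }, indent=2)

# Hypothetical page: a plumbing listing for Phoenix, AZ.
print(local_business_jsonld("Acme Plumbing", "Phoenix", "AZ", "Plumbing services"))
```

The blob would sit in a `<script type="application/ld+json">` tag on each page, so Google can still read the location data even where titles and descriptions are boilerplate.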
What I am curious about is boilerplate content repeated throughout the site for the purpose of helping the user, as well as to tell Google what the page is about (rankings).
For instance, this page and this page offer the same type of services, but in different areas. Both pages (and millions of others) carry the exact same paragraph; all that changes is the city name. The information is helpful to the user, but it's definitely duplicate content.
I'm curious: what makes this obvious duplicate content issue okay? The additional unique content throughout (in the form of different business listings), the small but obvious differences in on-page content (title tags clearly represent different locations), or just the fact that the site is hugely authoritative and gets away with it?
I'm very curious to hear your opinions on this practice, potential ways to avoid it, and whether it's a passable practice for large but new sites.
Thanks!
-
The SEO of the site is probably fine. The problem with the site is that it takes one page of content and smears it across dozens of thin-content, duplicate-content, cookie-cutter pages. The SEO is lipstick on a pig.
-
Thanks again for the response, EGOL. It is appreciated.
Can you point to any examples of large-scale sites like this with better SEO for these pages? I mean, any site that targets every city, neighborhood, park, etc. with content like this should theoretically run into duplicate content and display thin result pages quite often.
And even so, these pages are helpful. I Google "restaurant + small cities near me" and Yelp pages come up, which benefit me.
Yelp is one of the biggest review sites on the web and their filtered search result pages are indexed and ranking ultra high all over the place. What are they doing so special?
This page and this page offer nearly identical results, just shuffled a bit. Beyond simply being too big to get slapped, why is it okay when Yelp does this?
-
I agree. It is walking a very thin line, and I believe that Google's Panda algorithm will eventually hit it. I look at lots of sites that people say lost traffic, and this one has a similar design and content style.
-
That's interesting. It seems to have been around for quite a while and ranks well. Of all the similar sites I've seen, though, Houzz seems to walk the thinnest line SEO-wise. Their filters create nearly identical pages, all of which get indexed, with no canonicals on any of them and virtually the same on-page SEO, not to mention the same blurbs across millions of pages.
It's weird to me though that a reasonably targeted blurb is such bad business when the rest of the site is so helpful to users. One would think Google would allow it since the blurbs apply to each page and the "results" are the real meat and potatoes of the site.
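To make concrete what I mean by "no canonicals": a directory could collapse its near-identical filter permutations onto one canonical URL, roughly like this (the URL scheme and parameter names are hypothetical — my own sketch of the technique, not how any particular site works):

```python
from urllib.parse import urlparse, parse_qs, urlencode

# Filter parameters that only reshuffle the same results and should
# collapse onto one canonical URL (hypothetical list).
NON_CANONICAL_PARAMS = {"sort", "view", "page_size", "session"}

def canonical_url(url):
    """Strip reshuffling filter params so every permutation of a
    listing page declares the same rel=canonical target."""
    parts = urlparse(url)
    params = {k: v for k, v in parse_qs(parts.query).items()
              if k not in NON_CANONICAL_PARAMS}
    # Sort keys so parameter order never creates a second "canonical".
    query = urlencode(sorted(params.items()), doseq=True)
    return parts._replace(query=query).geturl()
```

Every sort/view permutation of a city listing would then point its `rel=canonical` tag at the same URL, so only one version of the page competes in the index.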
-
This site has lots of duplicate content from page to page and lots of thin content on a repeating template. It will be hit by Panda.
-
EGOL,
I think you're making unfair assumptions about our site. Each page visible to Google will have helpful information and content. The ones that don't will not be published for Google or our users.
I assure you, the site will be worthwhile and helpful to the end user, especially as time progresses. In fact, if you read above, I am asking specifically about adding content that is additionally helpful to the user while trying to avoid duplicate content issues when posting it throughout the site.
I am not trying to shortcut anything; I'm curious why some sites are able to seemingly circumvent SEO tenets, and I was hoping for a helpful discussion.
And again, I'll reiterate: I am not interested in boilerplate content as a shortcut. It would be in addition to existing useful content, and the boilerplate on similar pages would also benefit the end user. Using the examples above, I believe those small blurbs _can_ be helpful to the user. Do you agree?
Thanks for the response.
-
The problem that you face is that you are trying to make a website with millions of pages for which you do not have adequate content. You are trying to take shortcuts by using a cookie cutter instead of doing the work to make a worthy and unique website.
If you continue with your current business plan, I believe that Google will not treat your site very well. These sites used to work in Google over ten years ago and at that time they were ingenious. Today they are spam.
-
The paragraph of helpful content is identical (beyond a city being swapped out), but it still helps those pages rank. If you tailor a search with one of their cities and a cousin keyword from the text, they usually pop up on the front page. That's what I'm asking about. Why is Google ignoring this obvious duplicate content?
I'm assuming the business listings make each page unique enough to override the duplicate paragraph, plus the site is huge and has tons of authority.
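For what it's worth, the overlap I'm describing is easy to quantify. A crude word-shingle comparison (my own sketch — the blurb text is invented, and this is nothing like Google's actual dedup pipeline) shows how little a city swap changes a page:

```python
def shingles(text, k=3):
    """Word k-grams ('shingles'), a standard unit for near-duplicate detection."""
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets: 1.0 means identical text."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb)

# Invented boilerplate blurb with only the city swapped out.
blurb = ("Find top-rated pros in {city}. Compare reviews and request "
         "quotes from local businesses in {city} today.")
phoenix = blurb.format(city="Phoenix")
tucson = blurb.format(city="Tucson")

# Most shingles survive the swap, so the two blurbs look largely identical.
print(round(jaccard(phoenix, tucson), 2))
```

Stacking enough unique business listings under the shared blurb drives that page-level similarity back down, which is presumably part of why these sites get away with it.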
-
They're not identical, and I notice many directories are set up like this. Two individual users with different interests would find unique information in both of these samples. The only issue is how your competition has set up their pages. For instance, if someone is targeting just Phoenix and really goes to town with unique information and links, that page may rank better because they may be viewed as more of an authority on the subject.