Artist Bios on Multiple Pages: Duplicate Content or not?
-
I am currently working on an eComm site for a company that sells art prints. On each print's page, there is a bio about the artist followed by a couple of paragraphs about the print.
My concern is that some artists have hundreds of prints on this site, and the bio is reprinted on every page. That makes sense from a usability standpoint, but I'm concerned it could trigger a duplicate content penalty from Google.
Some people are trying to convince me that Google won't penalize for this content, since the intent is not to game the SERPs. However, I'm not confident that this isn't being penalized already, or that it won't be in the near future.
Because only a section of the text is duplicated and the rest of each page is original, I can't use the rel=canonical tag. I've thought about putting each artist bio into a graphic, but that would be a huge undertaking and not the most elegant solution.
Could I put the bio on a separate page containing only the artist's info, place that content on each print page using an <iframe>, and then add a noindex,nofollow for it in the robots.txt file? Is there a better solution? Is this effort even necessary? Thoughts?
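To illustrate what I'm picturing, each print page would embed a shared bio page instead of repeating the text. A rough sketch (the path and artist name are just hypothetical examples):

    <!-- on each print page: pull in the shared bio rather than repeating it -->
    <iframe src="/artist-bios/john-doe.html" title="About the artist" width="600" height="300"></iframe>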
-
Hi Darin,
Let me add my 2 cents:
If it makes sense from a usability standpoint to have the artist bio on the page, then by all means leave it there.
What matters most, from a search engine's point of view, is the unique content on the page.
This means placing the paragraphs describing the print front and center on the page. Since Panda, Google seems to treat page content using something like the Reasonable Surfer model it applies to links: the higher up and more prominent the content, the more heavily it weighs into the calculation of what the page is "about."
Matt Cutts has previously said it only takes 2-3 sentences to make a page unique, but personally I think closer to a couple hundred words is a safer number.
Hope this helps! Best of luck with your SEO.
-
The <iframe> makes the most sense for this company's requirements. Do I need to do anything regarding noindex or nofollow if we create a dedicated page for each artist's bio and then pull the bio into the <iframe> on each print's page? Or does simply pulling that data via the iframe from the original "source" (that being the proposed artist bio page) eliminate the duplicate content concern?
-
Well, according to this post from a Google employee on a Google forum, Google ignores the noindex or nofollow on an <iframe>: http://productforums.google.com/forum/#!topic/webmasters/tSHq764AA0A
He also references this page on the robots.txt file: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
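For what it's worth, the robots.txt rule that article describes would only block crawling of the bio pages rather than applying noindex to them. Assuming the bios lived under a hypothetical /artist-bios/ directory, it would look something like:

    User-agent: *
    Disallow: /artist-bios/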
-
Chad, while posting a link instead of the duplicated content makes sense logically, it dramatically reduces the amount of content on the page. So from a usability standpoint for the visitor (as well as per the site owner's directive), the bios need to remain on each print's page.
-
If the artist bio is not the main content on the page and there is other content that is unique, there's less chance that Google will take issue with it. But you never know with Google, so it's better to play it safe.
If you want to play it safe, you have two choices. One is to have a dedicated page for each artist and, on each print's page, just put a clickable image that takes people to the artist's bio page (not really helpful from a conversion point of view).
The other is to use an iframe to show the bio on each page; that way Google will count it as a separate page.
-
Why can't you just have a link to an artist bio page?
For example:
Click to read: John Doe's bio
This seems to solve the issue of usability as well as the issue with duplicate content. Just a suggestion; I'm learning more myself.
-
I was actually going to suggest putting the artist's info into a graphic before I finished reading your post. If that is going to be too much of an undertaking, then yes, an iframe would be a reasonable solution. Instead of using robots.txt, I'd suggest putting the noindex tag into the head of the iframed content.
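In other words, the standalone bio page that gets loaded into the iframe would carry the tag in its own <head>, roughly like this sketch (the page itself is the hypothetical dedicated artist bio page discussed above):

    <!-- in the <head> of the dedicated artist bio page -->
    <meta name="robots" content="noindex, follow">

That keeps the bio page itself out of the index, while the print pages that embed it can still rank on their own unique content.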