Artist Bios on Multiple Pages: Duplicate Content or not?
-
I am currently working on an eComm site for a company that sells art prints. On each print's page, there is a bio about the artist followed by a couple of paragraphs about the print.
My concern is that some artists have hundreds of prints on this site, and the bio is reprinted on every page. This makes sense from a usability standpoint, but I am concerned that it will trigger a duplicate content penalty from Google.
Some people are trying to convince me that Google won't penalize for this content, since the intent is not to game the SERPs. However, I'm not confident that this isn't being penalized already, or that it won't be in the near future.
Because it is just a section of text that is duplicated, but the rest of the text on each page is original, I can't use the rel=canonical tag. I've thought about putting each artist bio into a graphic, but that is a huge undertaking, and not the most elegant solution.
Could I put the bio on a separate page with only the artist's info, place that data on each print page using an <iframe>, and then put a noindex,nofollow in the robots.txt file?
Is there a better solution? Is this effort even necessary?
Thoughts?
-
Hi Darin,
Let me add my 2 cents:
If it makes sense from a usability standpoint to have the artist bio on the page, then by all means leave it there.
From a search engine's point of view, what matters most is the unique content on the page.
This means placing the paragraphs describing the print front and center. Since Panda, Google seems to treat page content with something like the Reasonable Surfer model it applies to links: the higher up and more prominent the content, the more heavily it weighs into their calculation of what the page is "about."
Matt Cutts has previously said it only takes 2-3 sentences to make a page unique, but personally I think closer to a couple hundred words is a safer number.
Hope this helps! Best of luck with your SEO.
-
The <iframe> approach makes the most sense for this company's requirements. Do I need to do anything regarding noindex or nofollow if we create a dedicated page for each artist's bio and then pull the bio into the <iframe> on each print's page? Or does simply pulling that data via the iframe from the original "source" (the proposed artist bio page) eliminate the duplicate content concern?
-
Well, according to this post from a Google employee on a Google forum, Google ignores a noindex or nofollow placed on an <iframe> tag itself:
http://productforums.google.com/forum/#!topic/webmasters/tSHq764AA0A
He also references this link on the robots.txt file:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
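For reference, the mechanism that Google documentation describes is a robots meta tag placed in the <head> of the page you want kept out of the index, not a directive in robots.txt. A minimal sketch, assuming a hypothetical dedicated bio page:

```html
<!-- /artist-bio/john-doe.html — hypothetical dedicated bio page -->
<!DOCTYPE html>
<html>
<head>
  <title>John Doe - Artist Bio</title>
  <!-- Tells compliant crawlers not to index this page -->
  <meta name="robots" content="noindex">
</head>
<body>
  <h1>About John Doe</h1>
  <p>Artist bio text goes here...</p>
</body>
</html>
```

One caveat: this tag only works if the page is not blocked in robots.txt, because the crawler has to be able to fetch the page in order to see the tag.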
-
Chad, while posting a link instead of the duplicate content makes sense logically, it dramatically reduces the amount of content on the page. So from a usability standpoint for the visitor (as well as per the site owner's directive), the bios need to remain on each print's page.
-
If the artist bio is not the main content on the page and there is other content that is unique, there is less chance that Google will take issue with it. But you never know with Google, so it's better to play it safe.
If you want to play it safe, you have two choices. The first is to have a dedicated page for each artist and, on each print's page, place a clickable image or link that takes people to the artist's bio page (not ideal from a conversion point of view).
The other is to use an iframe to show the bio on each page; that way Google counts the iframed content as a separate page.
-
Why can't you just have a link to an artist bio page?
For example:
Click to read: John Doe's bio
This seems to solve the issue of usability as well as the issue with duplicate content. Just a suggestion; I'm still learning myself.
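In markup terms this is just an ordinary anchor on each print's page; a minimal sketch with a hypothetical URL:

```html
<!-- On each print's page: link out instead of repeating the bio -->
<p>Click to read: <a href="/artists/john-doe">John Doe's bio</a></p>
```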
-
I was actually going to suggest putting the artist's info into a graphic before I finished reading your post. If that is going to be too much of an undertaking, then yes, an iframe would be a reasonable solution. Instead of using robots.txt, I'd suggest putting the noindex tag into the head of the iframed content.
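To sketch the full setup (URLs and filenames here are hypothetical): the bio lives on its own page with a noindex meta tag in its head, and each print page pulls it in via an iframe:

```html
<!-- Excerpt from a print's page, e.g. /prints/sunset-no-7.html -->
<h1>Sunset No. 7</h1>
<p>A couple of unique paragraphs describing this print...</p>

<!-- The bio is embedded from its own URL; the noindex tag goes in
     /artist-bio/john-doe.html's own <head>, not on this tag -->
<iframe src="/artist-bio/john-doe.html"
        title="About the artist"
        width="100%" height="300"></iframe>
```

This way the unique print description remains the primary indexable content of each print page, while the bio is served from a single noindexed source.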