Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
Artist Bios on Multiple Pages: Duplicate Content or not?
-
I am currently working on an eComm site for a company that sells art prints. On each print's page, there is a bio about the artist followed by a couple of paragraphs about the print.
My concern is that some artists have hundreds of prints on this site, and the bio is reprinted on every page. That makes sense from a usability standpoint, but I'm worried it will trigger a duplicate content penalty from Google.
Some people are trying to convince me that Google won't penalize for this content, since the intent is not to game the SERPs. However, I'm not confident that this isn't being penalized already, or that it won't be in the near future.
Because only a section of the text is duplicated and the rest of the text on each page is original, I can't use the rel=canonical tag. I've thought about putting each artist bio into a graphic, but that is a huge undertaking and not the most elegant solution.
Could I put the bio on a separate page with only the artist's info, place that data on each print page using an <iframe>, and then put a noindex,nofollow in the robots.txt file?
Is there a better solution? Is this effort even necessary?
Thoughts?
-
Hi Darin,
Let me add my 2 cents:
If it makes sense from a usability standpoint to have the author bio on the page, then by all means leave it there.
From a search engine's point of view, what matters most is the unique content on the page, so give that content the most prominence.
This means placing the paragraphs about the print front and center on the page. Since Panda, Google seems to treat page content using more of a Reasonable Surfer model, in a similar manner to how it handles links: the higher up and more prominent the content, the more it seems to weigh in calculations of what the page is "about."
Matt Cutts has previously said it only takes 2-3 sentences to make a page unique, but personally I think closer to a couple hundred words is a safer number.
Hope this helps! Best of luck with your SEO.
-
The <iframe> makes the most sense for this company's requirements. Do I need to do anything regarding noindex or nofollow if we create a dedicated page for each artist's bio and then pull the bio into the <iframe> on each print's page? Or does simply pulling that data via the iframe from the original "source" (that being the proposed artist bio page) eliminate the duplicate content concern?
-
Well, according to this post from a Google employee on a Google forum, Google ignores the noindex or nofollow in an <iframe>:
http://productforums.google.com/forum/#!topic/webmasters/tSHq764AA0A
He also references this link on the robots.txt file:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=93710
-
Chad, while posting a link instead of the duplicated content makes sense logically, it dramatically reduces the amount of content on the page, so from a usability standpoint for the visitor (as well as per the site owner's directive), the bios need to remain on each print's page.
-
If the artist bio is not the main content on the page and there is other unique content available, there is less chance that Google will hold it against you. But you never know with Google, so it's better to play it safe.
If you want to play it safe, you have two choices. One is to have a dedicated page for each artist and, on each print's page, just put a clickable image or link that takes people to the artist's bio page (not really helpful from a conversion point of view).
The other is to use an iframe to show the bio on each print's page; that way Google will count the bio as belonging to a different page.
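For illustration, here's a rough sketch of that iframe setup (the URLs, file names, and artist name are made up, not the site's actual structure):

```html
<!-- Print page, e.g. /prints/sunset-over-harbor.html -->
<h1>Sunset Over Harbor</h1>
<p>Two or three unique paragraphs describing this particular print...</p>

<!-- The bio lives at its own URL; visitors still see it inline, but the
     intent is that search engines associate the repeated bio text with
     the bio URL rather than with this print page. -->
<iframe src="/artists/jane-doe-bio.html"
        title="About the artist Jane Doe"
        width="100%" height="320"></iframe>
```

Each print page then only "owns" its unique description, while the repeated bio text is served from a single source URL.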
-
Why can't you just have a link to an artist bio page?
For example:
Click to read: John Doe's bio
This seems to solve the usability issue as well as the duplicate content issue. Just a suggestion; I'm still learning myself.
-
I was actually going to suggest putting the artist's info into a graphic before I finished reading your post. If that is going to be too much of an undertaking, then yes, an iframe would be a reasonable solution. Instead of using robots.txt, I'd suggest putting the noindex tag into the head of the iframed content.
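As a rough sketch of that approach (the URL and markup are illustrative only), the dedicated bio page would carry the robots meta tag in its head; note that the bio URL must not also be blocked in robots.txt, or crawlers will never see the tag:

```html
<!-- /artists/jane-doe-bio.html : the page pulled in via the iframe -->
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Jane Doe - Artist Bio</title>
  <!-- Keeps this standalone bio page out of the index while still
       allowing it to be crawled and rendered inside the iframe. -->
  <meta name="robots" content="noindex">
</head>
<body>
  <h2>About Jane Doe</h2>
  <p>Jane Doe is a landscape painter whose limited-edition prints...</p>
</body>
</html>
```
-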
Related Questions
-
Local SEO - ranking the same page for multiple locations
Hi everyone, I am aware that the issue of local SEO has been approached numerous times, but the situation I'm dealing with is slightly different, so I'd love to receive your expert advice. I'm running the website of a property management company which services multiple locations (www.homevault.com). From our local offices in the city center, we also service neighboring towns and communities (e.g. we have an office in Charlotte, NC, from which we service Charlotte plus a dozen other towns nearby). We wanted to avoid creating dozens of extra local service pages, particularly since our offers are identical per metropolitan area and we're talking about 20-30 additional local pages for each area. Instead, we decided to create local service pages only for the main locations. Needless to say, we're now ranking for the main locations, but we're missing out on all searches for property management in neighboring towns (we're doing well on searches such as 'charlotte property management', but we're practically invisible for 'davidson property management', although we service that area as well). What we've done so far to try and fix the situation:
1. The current location pages do include descriptions of the areas that we serve.
2. We've included 1-2 keywords for the satellite locations in the main location pages, but we're nowhere near the optimization needed to rank for local searches in neighboring towns (i.e. some main local service pages rank on pages 2-4 for satellite towns, so not good enough).
3. We've included the serviced areas in our local GMBs, directories, social media profiles, etc.
None of these solutions appears to work great. Should I go ahead and create the classic local pages for each and every town and optimize them for those particular keywords, even if the offer is practically the same and the number of pages risks going out of control? Any other better ideas? Many thanks in advance!
Intermediate & Advanced SEO | HomeVaultPM
-
Same site serving multiple countries and duplicated content
Hello! Though I browse Moz resources every day, I've decided to ask you a question directly, despite the numerous questions (and answers!) about this topic, as there are a few specific variants each time: I have a site serving content (and products) to different countries, built using subfolders (one subfolder per country). Basically, it looks like this:
site.com/us/
site.com/gb/
site.com/fr/
site.com/it/
etc.
The first problem was fairly easy to solve: avoid duplicated content issues across the board, considering that both the ecommerce part of the site and the blog are being replicated for each subfolder in its own language. Correct me if I'm wrong, but using our copywriters to translate the content and adding the right hreflang tags should do (a sketch of those tags follows this question). But then comes the second problem: how to deal with duplicated content when it's written in the same language? E.g. /us/, /gb/, /au/ and so on.
Given the following requirements/constraints, I can't see any positive resolution to this issue:
1. The structure needs to be maintained (it's not possible to consolidate the same language within one single subfolder, for example).
2. Articles in one subfolder can't be canonicalized to another, as it would mess up our internal tracking tools.
3. The amount of content being published prevents us from getting bespoke content for each region of the world with the same spoken language.
Given those constraints, I can't see a way to sort this out, and it seems that I'm cursed to live with those duplicate content red flags right up my nose. Am I right, or can you think of anything to sort that out? Many thanks,
Ghill
Intermediate & Advanced SEO | GhillC
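For reference, a minimal sketch of the hreflang annotations mentioned above, assuming the site.com/us/, /gb/, /fr/ and /it/ subfolders from the question (the page path is made up):

```html
<!-- In the <head> of site.com/us/some-page/, with the equivalent set of
     tags repeated on each country variant of the same page -->
<link rel="alternate" hreflang="en-us" href="https://site.com/us/some-page/" />
<link rel="alternate" hreflang="en-gb" href="https://site.com/gb/some-page/" />
<link rel="alternate" hreflang="fr-fr" href="https://site.com/fr/some-page/" />
<link rel="alternate" hreflang="it-it" href="https://site.com/it/some-page/" />
<link rel="alternate" hreflang="x-default" href="https://site.com/us/some-page/" />
```
-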
Duplicate content on URL trailing slash
Hello, Some time ago we accidentally made changes to our site which modified the way URLs in links are generated. All at once, trailing slashes were added to many URLs (only in links). Links that used to point to example.com/webpage.html were now linking to example.com/webpage.html/. URLs in the XML sitemap remained unchanged (no trailing slash). We started noticing duplicate content (because our site renders the same page with or without the trailing slash). We corrected the problematic PHP URL function so that now all links on the site point to a URL without a trailing slash. However, Google had time to index these pages. Is implementing 301 redirects required in this case?
Intermediate & Advanced SEO | yacpro13
-
Duplicate content due to parked domains
I have a main ecommerce website with unique content and decent backlinks. I had a few domains parked on the main website as well as on specific product pages. These domains had some type-in traffic; some were exact product names. So the main website www.maindomain.com had domain1.com and domain2.com parked on it, and domain3.com parked on www.maindomain.com/product1. This caused a lot of duplicate content issues. Twelve months back, all the parked domains were changed to 301 redirects. I also added all the domains to Google Webmaster Tools, then removed the main directory from Google's index. Now I realize a few of the additional domains are indexed and causing duplicate content. My question is: what other steps can I take to avoid duplicate content for my website?
1. Provide a change of address in Google Search Console. Is there any downside to providing a change of address pointing to a website? Also, domains pointing to a specific URL cannot provide a change of address.
2. Provide a remove-page-from-Google-index request in Google Search Console. It is temporary and lasts 6 months. Even if the pages are removed from Google's index, would Google still see them as duplicates?
3. Ask Google to fetch each URL under the other domains and submit it to the Google index. This would hopefully remove the URLs under domain1.com and domain2.com eventually, due to the 301 redirects.
4. Add canonical URLs for all pages on the main site, so Google will eventually remove content from domain1.com and domain2.com due to the canonical links. This will take time for Google to update its index.
5. Point these domains elsewhere to remove the duplicate content eventually. But it will take time for Google to update its index with the new, non-duplicate content.
Which of these options are best suited to my issue, and which ones are potentially dangerous? I would rather not point these domains elsewhere. Any feedback would be greatly appreciated.
Intermediate & Advanced SEO | ajiabs
-
Duplicate Content From Indexing of non- File Extension Page
Google has somehow indexed a page of mine without the .html extension, so they indexed www.samplepage.com/page. I am showing duplicate content because Google also sees www.samplepage.com/page.html. How can I force Google (or Bing, or whoever) to only index and see the page including the .html extension? I know people are saying not to use the file extension on pages, but I want to, so please, anybody... HELP!!!
Intermediate & Advanced SEO | WebbyNabler
-
Does rel=canonical fix duplicate page titles?
I implemented rel=canonical on our pages, which helped a lot, but my latest Moz crawl is still showing lots of duplicate page titles (2,000+). There are other ways to get to this page (depending on what feature you clicked, it will have a different URL), but they all have the same page title. Does having rel=canonical in place fix the duplicate page title problem, or do I need to change something else? I was under the impression that the canonical tag would address this by telling the crawler which URL was the canonical one, and that the crawler would only use that one for the page title (a sketch of the tag follows this question).
Intermediate & Advanced SEO | askotzko
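For reference, a minimal sketch of the tag itself (the URLs are hypothetical): each URL variant of the page declares the same preferred URL as its canonical.

```html
<!-- In the <head> of /widgets?sort=price and every other variant
     of the same page -->
<link rel="canonical" href="https://www.example.com/widgets" />
```
-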
Can PDF be seen as duplicate content? If so, how to prevent it?
I see no reason why a PDF couldn't be considered duplicate content, but I haven't seen any threads about it. We publish loads of product documentation provided by manufacturers, as well as white papers and case studies. These give our customers and prospects a better idea of our solutions and help them along their buying process. However, I'm not sure if it would be better to make them non-indexable to prevent duplicate content issues. Clearly we would prefer a solution where we benefit from the keywords in the documents. Does anyone have insight on how to deal with PDFs provided by third parties? Thanks in advance.
Intermediate & Advanced SEO | Gestisoft-Qc
-
Capitals in url creates duplicate content?
Hey guys, I had a quick look around, however I couldn't find a specific answer to this. Currently, the SEOmoz tools come back and show a heap of duplicate content on my site, and there's a fair bit of it. However, a heap of those errors relate to random capitals in the URLs. For example, "www.website.com.au/Home/information/Stuff" is being treated as duplicate content of "www.website.com.au/home/information/stuff" (note the difference in capitals). Does anyone have any recommendations as to how to fix this server-side (keeping in mind it's not practical or possible to fix all of these links) or to tell Google to ignore the capitalisation? Any help is greatly appreciated. LM.
Intermediate & Advanced SEO | CarlS