Duplicate content
-
Are images considered duplicate content too?
Example:
I've got a size chart on each of my lingerie pages. All the written content is unique, but I'm using the same chart for all of those pages. -
I agree with Tom. Using images several times is quite common on the web and penalizing for this wouldn't make any sense.
But I disagree with the claim that Googlebot cannot see images. Google's image detection is quite advanced by now, and they can definitely tell if an image has already been indexed (even just comparing checksums gives good hints). Google's "search by image" also shows us that this is technically possible.
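To make the checksum idea above concrete, here's a minimal Python sketch. The "image" bytes are made up for illustration. Note that a plain hash only catches byte-identical copies; a re-encoded or resized version of the same chart would need perceptual hashing, which is presumably closer to what "search by image" actually does.

```python
import hashlib

def image_checksum(data: bytes) -> str:
    """Return the SHA-256 hex digest of raw image bytes."""
    return hashlib.sha256(data).hexdigest()

# Fake image bytes, just for illustration.
original = b"\x89PNG\r\n...size-chart..."
reused = b"\x89PNG\r\n...size-chart..."      # same file reused on another page
reencoded = b"\xff\xd8\xff...size-chart..."  # same chart saved as a JPEG

print(image_checksum(original) == image_checksum(reused))     # True: exact duplicate detected
print(image_checksum(original) == image_checksum(reencoded))  # False: checksum misses re-encodes
```

So a checksum alone is a cheap first pass for "is this the exact same file?", which is all the size-chart scenario involves.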
-
Thanks Tom!
Also a good idea to use images for long disclaimers, I will have to test this too. -
Hi Sylvana
No, images won't be seen as duplicate content - simply for the fact that they won't be seen by the Googlebot!
As the bot is not able to crawl images (which is why we use alt tags), you don't have to worry about it as duplicate content. It's something I've toyed with before for webpages that need to have long disclaimers in the footer of every page - make them an image so the content isn't duplicated.
In addition, you can also test how your pages look in the eyes of Google by using the SEO Browser. That way you'll be able to see what content, if any, is duplicated.
Hope this helps!
-
Sylvana, you won't have any problems with that.
Related Questions
-
I have an eCommerce site with, in some cases, hundreds of versions of the same product. How do I avoid "duplicate content" without literally writing hundreds of unique product descriptions for the exact same product?
For instance, one item where the only difference is the sports team logo, or where it comes in a variety of color variants. I'm using Shopify.
On-Page Optimization | pstone291 -
Duplicate product content/disclaimers for non-e-commerce sites
This is more of a follow-up to Rand's recent Whiteboard "Handling User-Generated & Manufacturer-Required Duplicate Content Across Large Numbers of URLs." I posed my question in the comments, but I'm unsure it will get picked up. My situation isn't exactly the same, but it's similar: our site isn't an e-commerce site and doesn't have user reviews yet, but we do have maybe 8 pages across 2 product categories featuring very similar product features with duplicate verbiage. However, we don't want to re-write it, because we want to make it easy for users to compare apples-to-apples and easily see which features are actually different. We also have to run disclaimers at the bottom of each page. Would i-framing the product descriptions and disclaimers be beneficial in this scenario, with the addition of good content? It would still be nice to have some crawlable content on those pages, so the i-framing makes me nervous unless we compensate with at least some above-the-fold, useful content that could be indexed. Thanks, Sarah
On-Page Optimization | sbs2190 -
Duplicate links from forum what to do?
After a crawl, it found over 5k errors and over 5k warnings: duplicate page content, duplicate page titles, overly-dynamic URLs, missing meta descriptions, and title elements that are too long. All of these come from domain.com/forum/. I don't need SEO on the forum, so what should I do? What would be an easy solution? Noindex? Nofollow? Please help
On-Page Optimization | OVJ0 -
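For a forum section like the one in the question above, the two standard levers are a robots.txt Disallow (which stops crawling entirely) and a noindex robots meta tag (which lets the bot crawl but keeps the pages out of the index). A minimal sketch of generating both; the `/forum/` path matches the question, everything else is illustrative:

```python
def robots_txt(disallowed_paths):
    """Build a robots.txt body blocking the given paths for all crawlers."""
    lines = ["User-agent: *"]
    lines.extend(f"Disallow: {path}" for path in disallowed_paths)
    return "\n".join(lines)

def noindex_meta():
    """The robots meta tag to place in the <head> of pages to keep out of the index."""
    return '<meta name="robots" content="noindex, follow">'

print(robots_txt(["/forum/"]))
print(noindex_meta())
```

One caveat worth knowing: a robots.txt block prevents the crawler from ever seeing a noindex tag on those pages, so pick one mechanism per section rather than combining them.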
Dealing with thin content/95% duplicate content - canonical vs 301 vs noindex
My client's got 14 physical locations around the country but has a webpage for each "service area" they operate in. They have a Croydon location, but a separate page for London, Croydon, Essex, Luton, Stevenage, and many other places (areas near Croydon) that the Croydon location serves. Each of these pages is a near duplicate of the Croydon page, with the word Croydon swapped for the area name. I'm told this was an SEO tactic circa 2001. Obviously this is an issue. So the question: should I 301 redirect each of the links to the Croydon page? Or (what I believe to be the best answer) set a rel=canonical tag on the duplicate pages? Creating "real and meaningful content" for each page isn't quite an option, sorry!
On-Page Optimization | JamesFx0 -
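To illustrate the rel=canonical approach from the question above: every near-duplicate area page carries the same canonical link element pointing at the one "real" location page, so the duplicates stay reachable for users while consolidating indexing signals. A sketch, with a made-up domain and URL scheme:

```python
CANONICAL_URL = "https://example.com/locations/croydon/"
AREAS = ["london", "croydon", "essex", "luton", "stevenage"]

def canonical_tag(url: str) -> str:
    """Return the <link rel="canonical"> element pointing at the chosen page."""
    return f'<link rel="canonical" href="{url}" />'

# Every area page declares the Croydon page as the canonical version.
for area in AREAS:
    print(f"/locations/{area}/ -> {canonical_tag(CANONICAL_URL)}")
```

Unlike a 301, this keeps each area page live at its own URL, which matters if those pages get direct traffic or links.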
Meta descriptions better empty or with duplicate content?
I am working with a Yahoo store. Somehow all of the meta description fields were filled in with random content from throughout the store. For example, a black cabinet knob product page might have the specifications for a drawer slide in its description field. I don't know how this happened. We had a programmer auto-populate certain fields to get them ready for product feeds, etc. It's possible they screwed something up during that; this was a long time ago. My question, regardless of how it happened: is it better for me to have them wipe these fields entirely clean? Or is it better to have them populate the fields with a duplicate of the text from the body? The site has about 6,500 pages, so I have made (and will make) custom descriptions for the more important pages after this process, but the workload to do them all is too much. So: nothing, or duplicate content for the pages that likely won't receive personal attention?
On-Page Optimization | dellcos1 -
Duplicate Page Titles?
I'm running a campaign report within SEOmoz and am getting 9 pages that appear on this report. They all happen to be our author pages: www.example.com/author/admin. We have multiple authors. Is there a proper way I should take care of this? Also, as a side note, I'm using the Yoast WordPress SEO plugin; is there a setting in it I should change that will fix this issue? Or is it an issue at all? Thanks, BJ
On-Page Optimization | seointern0 -
How would you deal with blog TAG & CATEGORY listings that are marked as 'duplicate content' in SEOmoz campaign reports?
We're seeing "Duplicate Content" warnings / errors in some of our clients' sites for blog / event calendar tags and category listings. For example, the link http://www.aavawhistlerhotel.com/news/?category=1098 provides all event listings tagged to the category "Whistler Events". The Meta Title and Meta Description for the "Whistler Events" category are the same as for every other category listing. We use Umbraco, a .NET CMS, and we're working on adding some custom programming within Umbraco to generate a unique Meta Title and Meta Description for each page, using the tag and/or category and post date in each Meta field to make it more "unique". But my question is: in the REAL WORLD, will taking the time to create this programming really positively impact our overall site performance? I understand that while Google, Bing, etc. are constantly tweaking their algorithms, as of now having duplicate content primarily means that this content won't get indexed; there won't be any really 'fatal' penalties for having this content on our site. If we don't find a way to generate unique Meta Titles and Meta Descriptions, we could 'nofollow' these links (for tag and category pages) or just not use them within our blogs. I am confused about this. Any insight others have and recommendations on what action you would take are greatly appreciated.
On-Page Optimization | RoyMcClean0 -
Would it be bad to change the canonical URL to the most recent page that has duplicate content, or should we just 301 redirect to the new page?
Is it bad to change the canonical URL in the tag, meaning does it lose its stats? If we add a new page that may have duplicate content, but we want that page to be indexed over the older pages, should we just change the canonical page or redirect from the original canonical page? Thanks so much! -Amy
On-Page Optimization | MeghanPrudencio0