Questions about duplicate photo content?
-
I know that Google is a mystery, so I'm not sure there are answers to these questions, but I'm going to ask anyway! I recently realized that Google is not happy with duplicate photo content. I'm a photographer and have sold many photos in the past (but retained the rights to them) that I am now using on my site. My recent revelation means that I'm now taking down all of these photos. So I've been reverse-image-searching all of my photos to see whether I let anyone else use them first, and in the course of this I found out that many of my photos are being used by other sites on the web. So my questions are:
With photos that I used first and others have stolen: if I edit these photos (to add copyright info) and then re-upload them, will the sites that are using these images then get credit for using the original image first?
If I have a photo on another one of my own sites and I take it down, can I safely use that photo on my main site, or will Google retain the knowledge that it's been used somewhere else first?
If I sold a photo and it's being used on another site, can I safely use a different photo from the same series that is almost exactly the same? I'm unclear on what data from the photo Google is matching, and whether it can tell the difference between photos that were taken a few seconds apart.
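For what it's worth, Google has never published what image data it matches. One common technique for finding near-duplicate images is perceptual hashing, which fingerprints what a photo looks like rather than its exact bytes, so two frames shot seconds apart can still register as near-identical. A minimal sketch using the Python imagehash library (file names are hypothetical, and whether Google uses anything like this is an assumption):

```python
# Sketch of perceptual hashing, a common near-duplicate detector.
# NOTE: whether Google uses this technique is unknown; file names are made up.
# Requires: pip install Pillow imagehash
from PIL import Image
import imagehash

hash_a = imagehash.phash(Image.open("series_shot_01.jpg"))
hash_b = imagehash.phash(Image.open("series_shot_02.jpg"))

# Subtracting two hashes gives the Hamming distance:
# a small distance means visually similar images.
distance = hash_a - hash_b
print(f"Hamming distance: {distance}")
if distance <= 8:  # threshold is a judgment call
    print("Likely near-duplicates, even if shot seconds apart")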
-
I do now see where he also claimed, vice versa, that using originals won't help either.
Why would Matt or Rand mention it? They don't work for Google; they won't know everything.
But he also said that was to the best of his knowledge at that time, and that it may change in the future, and that was two years ago. So I would still try to use original images if you can.
-
I don't know where you got the idea that Google is not happy with duplicate photo content, because there is no such thing. If there were, Matt or Rand would already have mentioned it. But there is nothing, and at the very least I have already given you evidence to support my claim. Would you rather rely on other presumptions?
FYI, even duplicate text content has its own standing with Google. You see, content and photos are meant to be shared, so as long as you're not doing spammy things on the web, Google has no reason to punish you. Cheer up!
-
But that video is two years old, and in it he says that duplicate photo content could become a quality signal in the future. So I'm not sure it's still relevant. But I guess we can't prove it one way or the other!
-
But they may reward you for unique images.
-
I think you might want to read this: https://www.plagiarismtoday.com/2013/07/29/google-we-dont-penalize-duplicate-images/.
-
Why do you think Google doesn't penalize for duplicate photo content? It seems like it would be a great way to find scrapers and those with low-value content.
-
Hi Lina,
I understand your trouble, Lina, but worry not, because Google doesn't penalize duplicate photo content. You can, however, optimize your photos so they can be found easily by adding a short but concise title tag and meta description to the pages they sit on.
You don't need to edit the photos, but you're free to add copyright info. However, I don't think it is necessary for the photos you sold.
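If you do want to stamp ownership without visibly editing the image, the copyright can go into the file's EXIF metadata instead. A minimal sketch with the Python piexif library (the file name and notice text are placeholders):

```python
# Sketch: embed a copyright notice in a JPEG's EXIF metadata.
# File name and notice are placeholders. Requires: pip install piexif
import piexif

path = "photo.jpg"
exif_dict = piexif.load(path)
exif_dict["0th"][piexif.ImageIFD.Copyright] = b"Copyright 2015, Lina. All rights reserved."
exif_dict["0th"][piexif.ImageIFD.Artist] = b"Lina"
piexif.insert(piexif.dump(exif_dict), path)  # writes the EXIF block back into the file
```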
It is also not necessary to take down existing photos from one of your sites. But if it were me, I would add a referral link (stating that readers can access more photos, or the original versions, on your main site). Not only will your audience take notice of your main site, but it may also improve your main site's ranking in search results.
True, Google can probably tell the difference between photos that were taken a few seconds apart, but there is no reason it should be a big issue. Photos were meant to be shared from the start.
It also sounds like you want to gather an audience for your main site right away. For that, you might try Google AdWords to bring in traffic faster.
Hope this helps.
Related Questions
-
Same site serving multiple countries and duplicated content
Hello! Though I browse Moz resources every day, I've decided to ask you a question directly, despite the numerous questions (and answers!) on this topic, because there are a few specific variants each time: I have a site serving content (and products) to different countries, built using subfolders (one subfolder per country). Basically, it looks like this:
site.com/us/
site.com/gb/
site.com/fr/
site.com/it/
etc. The first problem was fairly easy to solve:
Avoid duplicate content issues across the board, considering that both the ecommerce part of the site and the blog are replicated in each subfolder in its own language. Correct me if I'm wrong, but using our copywriters to translate the content and adding the right hreflang tags should do it (see the hreflang sketch below). But then comes the second problem: how to deal with duplicate content when it's written in the same language? E.g. /us/, /gb/, /au/ and so on.
Given the following requirements/constraints, I can't see any positive resolution to this issue:
1. The structure needs to be maintained (it's not possible to consolidate the same language into one single subfolder, for example),
2. Articles can't be canonicalized from one subfolder to another, as that would mess up our internal tracking tools,
3. The amount of content being published prevents us from producing bespoke content for each region of the world that shares a spoken language. Given those constraints, I can't see a way to sort this out, and it seems that I'm cursed to live with those duplicate content red flags right up my nose.
Am I right, or can you think of anything to sort this out? Many thanks,
Ghill
Intermediate & Advanced SEO | GhillC
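On the same-language problem above: hreflang is itself the usual answer, since annotating /us/, /gb/, /au/ as regional alternates of one another tells Google they are deliberate variants rather than duplicates, with no canonical needed. A sketch of generating the tags for one page, assuming an illustrative subfolder-to-locale mapping:

```python
# Sketch: emit hreflang alternate tags for every country subfolder.
# The locale mapping is illustrative; extend it to match the real site.
LOCALES = {"us": "en-us", "gb": "en-gb", "au": "en-au", "fr": "fr-fr", "it": "it-it"}

def hreflang_tags(path: str) -> str:
    """Build the <link rel="alternate"> block for one page, e.g. path="/blog/post/"."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="https://site.com/{sub}{path}" />'
        for sub, lang in LOCALES.items()
    ]
    # x-default gives crawlers a fallback for visitors matching no listed locale.
    tags.append(f'<link rel="alternate" hreflang="x-default" href="https://site.com/us{path}" />')
    return "\n".join(tags)

print(hreflang_tags("/some-article/"))
```

Note that each variant must carry the full set of tags, including a self-reference, for the annotations to be honored.
-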
Identifying Duplicate Content
Hi, I'm looking for tools (besides Copyscape or Grammarly) that can scan a list of URLs (e.g. 100 pages) and find duplicate content quickly, specifically small batches of duplicated text (as in the attached example). Does anyone have any suggestions? Cheers.
Intermediate & Advanced SEO | jayoliverwright
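No tool recommendation here, but the underlying check is simple enough to script: fetch each URL, strip the markup, and compare pages by shingle overlap, which surfaces exactly the "small batches" of shared text mentioned above. A rough Python sketch (the URLs and threshold are placeholders):

```python
# Sketch: flag pairs of pages that share unusually many 8-word "shingles".
# Requires: pip install requests beautifulsoup4
from itertools import combinations

import requests
from bs4 import BeautifulSoup

def page_text(url: str) -> str:
    html = requests.get(url, timeout=10).text
    return BeautifulSoup(html, "html.parser").get_text(" ", strip=True)

def shingles(text: str, n: int = 8) -> set:
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

urls = ["https://example.com/page-1", "https://example.com/page-2"]  # your 100 URLs
fingerprints = {u: shingles(page_text(u)) for u in urls}

for a, b in combinations(urls, 2):
    union = fingerprints[a] | fingerprints[b]
    score = len(fingerprints[a] & fingerprints[b]) / len(union) if union else 0.0
    if score > 0.05:  # low threshold so small duplicated batches still show up
        print(f"{a} <-> {b}: {score:.0%} shingle overlap")
```
-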
Duplicate content issue
Hello! We have a lot of duplicate content issues on our website. Most of the pages with these issues are dictionary pages (about 1,200 of them). They're not exactly duplicates, but each contains a different word with a translation, picture, and audio pronunciation (example: http://anglu24.lt/zodynas/a-suitcase-lagaminas). What's the better way of solving this? We probably shouldn't disallow the dictionary pages in robots.txt, right? Thanks!
Intermediate & Advanced SEO | jpuzakov
-
PDF for link building - avoiding duplicate content
Hello, we've got an article that we're turning into a PDF. Both the article and the PDF will be on our site. This PDF is a good, thorough piece of content on how to choose a product. We're going to strip out all of the links to our site in the article and create this PDF so that it will be good for people to reference and even print. Then we're going to do link building through outreach, since people will find the article and PDF useful. My question is: how do I use rel="canonical" to make sure that the article and PDF aren't duplicate content? Thanks.
Intermediate & Advanced SEO | BobGW
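One wrinkle worth knowing: a PDF has no head section to hold a rel="canonical" tag, so for the PDF copy the canonical is sent as an HTTP Link header instead, while the HTML article can self-canonicalize normally. A sketch for Apache with mod_headers enabled (the file name and URL are placeholders):

```apache
# Point the PDF's canonical at the HTML article via an HTTP header,
# since a PDF cannot carry a <link rel="canonical"> tag itself.
<Files "how-to-choose-a-product.pdf">
  Header add Link '<https://www.example.com/how-to-choose-a-product>; rel="canonical"'
</Files>
```
-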
Duplicate Content / Indexing Question
I have a real estate WordPress site that uses an IDX provider to add real estate listings to my site. A new page is created as a new property comes to market, and the page is deleted when the property is sold. I like the functionality of the service, but it creates a significant number of 404s, and I'm also concerned about duplicate content, because anyone else using the same service here in Las Vegas will have thousands of the exact same property pages that I do. Any thoughts on this? And is there a way I can have the search engines index only the core 20 pages of my site and ignore future property pages? Your advice is greatly appreciated. See this link for an example: http://www.mylvcondosales.com/mandarin-las-vegas/
Intermediate & Advanced SEO | AnthonyLasVegas
-
Could you use a robots.txt file to disallow a duplicate content page from being crawled?
A website has duplicate content pages to make it easier for users to find the information from a couple of spots in the site navigation. The site owner would like to keep it this way without hurting SEO. I've thought of using the robots.txt file to disallow search engines from crawling one of the pages. Do you think this is a workable/acceptable solution?
Intermediate & Advanced SEO | gregelwell
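Mechanically the rule is one line (the path below is a hypothetical placeholder):

```
# robots.txt -- stop crawlers fetching one of the two duplicate paths
User-agent: *
Disallow: /category-b/duplicate-page/
```

The catch is that robots.txt only blocks crawling: a disallowed URL can still end up indexed if other pages link to it, and it consolidates no ranking signals. A rel="canonical" on the duplicate pointing at the preferred version usually fits this situation better, since both navigation paths keep working for users while the engines see a single page.
-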
Duplicate Content / 301 redirect Article issue
Hello, we've got some articles floating around on our site nlpca(dot)com, like this one: http://www.nlpca.com/what-is-dynamic-spin-release.html, which is not linked to from anywhere else. The article exists as it's supposed to here: http://www.dynamicspinrelease.com/what-is-dsr/ (our other website). Would it be safe, in the eyes of both Google's algorithm (as far as you know) and Panda, to just 301 redirect http://www.nlpca.com/what-is-dynamic-spin-release.html to http://www.dynamicspinrelease.com/what-is-dsr/, or would no-indexing be better? Thank you!
Intermediate & Advanced SEO | BobGW
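For what it's worth, a 301 is generally the cleaner of the two options for an orphaned duplicate, since it forwards visitors and consolidates signals instead of just hiding the page. A sketch for an .htaccess file on www.nlpca.com, assuming Apache with mod_alias (the URLs are taken from the question above):

```apache
# Permanently forward the stray article to its canonical home
# on the other domain.
Redirect 301 /what-is-dynamic-spin-release.html http://www.dynamicspinrelease.com/what-is-dsr/
```
-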
Duplicate Content - Panda Question
Question: Will duplicate informational content at the bottom of indexed pages violate the Panda update?
**Total page ratio:** 1/50 of total pages will have duplicate content at the bottom of the page. For example, on 20 pages in 50 different instances there would be common information at the bottom of a page (out of a total of 1,000 pages). Basically, I just wanted to add informational data to help clients get a broader perspective when making a decision regarding the specific and unique information that will be at the top of the page.
**Content ratio per page:** What percentage of duplicate content is allowed per page before you are dinged or penalized?
Thank you, Utah Tiger
Intermediate & Advanced SEO | Boodreaux