Image Optimization & Duplicate Content Issues
-
Hello Everyone,
I have a new site that we're building which will incorporate some product thumbnail images cut and pasted from other sites and I would like some advice on how to properly manage those images on our site. Here's one sample scenario from the new website:
We're building furniture, and the client has the option of selecting from 50 plastic laminate finishes offered by the Formica company. We'll cut and paste those 50 thumbnails of the various plastic laminate finishes and incorporate them into our site. Rather than sending our website visitors over to the Formica site, we want them to stay put on our site and select the finishes from our pages.
The borrowed thumbnail images will not represent the majority of the site's content, and we have plenty of our own images and original content. As it does not make sense for us to order 50 samples from Formica and photograph them ourselves, what is the best way to handle the issue?
Thanks in advance,
Scott
-
If you have permission to use their images, just get the images from them, name them accurately, and give them accurate alt text. Duplicate content generally has to do with your own content. Since the point of filenames and alt text is to help Google understand the images, it's not a big issue if an image has the same alt text as another or appears multiple times on the site (especially since they should all be served from an images directory, no matter where they appear on the website). Also, images are much more likely to be naturally reused than text, as licensing photos is a long-accepted practice.
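To make the "name them accurately, give them accurate alt text" advice concrete, here is a minimal sketch of how you might generate descriptive filenames and `<img>` markup for a batch of swatch thumbnails. The finish names, directory path, and alt-text wording are illustrative assumptions, not Formica's actual product names or any required convention:

```python
# Sketch: build descriptive filenames and alt text for swatch thumbnails.
# Swatch names, directory, and alt-text phrasing are hypothetical examples.

def slugify(name: str) -> str:
    """Lowercase a swatch name and join its words with hyphens for a filename."""
    return "-".join(name.lower().split())

def img_tag(name: str, directory: str = "/images/laminate-finishes") -> str:
    """Build an <img> tag with a descriptive filename and descriptive alt text."""
    return (f'<img src="{directory}/{slugify(name)}.jpg" '
            f'alt="{name} plastic laminate finish sample">')

# Hypothetical finish names standing in for the 50 real swatches.
swatches = ["Natural Oak", "Graphite Twill"]
for swatch in swatches:
    print(img_tag(swatch))
```

The idea is simply that the filename and alt text should describe the swatch the way a visitor would search for it, rather than shipping opaque names like `IMG_0042.jpg`.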
-
Google does "see" a lot more than just the alt text. To decide which keywords an image should rank for, they take into account, amongst other things:
- The text surrounding the image (caption, article it illustrates, etc.)
- Which images it is similar to
- The filename of the image
- Text recognition
In this video, Google shows how much they can "see" when it comes to images: http://youtu.be/t99BfDnBZcI
-
Arjen, Thanks for your reply.
You are correct that we're not looking to rank for images of Formica samples (or any of our other samples, for that matter); in fact, we're just providing the sample images to help our clients decide which of our products to order. The sample tiles are just a means to an end.
Do you have any knowledge as to the extent to which Google can "see" an image the same way a human user sees it? Or does Google just rely on the alt text that you provide?
Thanks in advance,
Scott
-
Hello Keri,
Thanks for your reply. We do have an account with them and permission to use their images.
Do you have any opinions as to the best way to manage the images (i.e., title, alt text, etc.) so as not to run into any duplicate content issues? I'm not clear on whether Google has the ability to somehow scan the images themselves, or if they just rely on the alt text, titles, etc. that you provide along with the images. Any thoughts are appreciated.
Scott
-
I do not think using some images from another website will hurt your SEO. Logos on an "our clients" page, news photography delivered through news agencies, icon sets, and stock images are by definition used on more than one site. The fact that this form of "duplicate content" is so omnipresent proves that Google cannot devalue sites for using it.
If your goal is to rank high in image search for Formica in different colours, you should make sure to get your own high-res images. If this is not one of your primary SEO goals, you should not worry about using copied images.
My advice would be to focus on really good photography of the furniture you are building and not worry too much about the thumbnails of Formica samples.
PS: I agree with KeriMorget. You should get permission to use the photos before putting them on your site.
-
The first thing I would do would be to look at the copyright on the Formica site to see their policy on copying their content.