PDF Instructions come up in Crawl report as Duplicate Content
-
Hello,
My ecommerce site has many PDF instruction pages that are being flagged as duplicate content in the site crawl.
Each page has a different title, a PDF displayed in an iframe, a link back to the previous page, and a link to the category the product belongs to. Should I add text to the pages to help differentiate them?
I included a screenshot of the code that is on all the pages.
Thanks!
Justin
-
Yes, you absolutely should add unique text to each of these pages, not only so that they aren't flagged as duplicates, but because more good content is always an SEO benefit. If you don't have the capacity to write that content, however, you may want to exclude these pages from indexing instead.
The reason these pages are being flagged as duplicates is that Google isn't parsing the PDFs. That means all Google and other crawlers see are pages with no content beyond an iframe. It's also worth noting that Moz will flag any pages with more than 90% overlap as duplicates.
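If you do go the exclusion route, a minimal sketch of the standard approach (assuming you can edit each page's head section):

```html
<!-- added to the <head> of each PDF instruction page;
     noindex drops the page from the index, follow keeps
     the links back to the product and category crawlable -->
<meta name="robots" content="noindex,follow">
```

If you also want the PDF files themselves kept out of the index, an `X-Robots-Tag: noindex` HTTP response header on the PDF URLs does the same job, since you can't put a meta tag inside a PDF.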
I hope this helps!
Related Questions
-
Potential duplicate content issue?
We have a category on our website for PVC rolls sold as standard 50m rolls (this includes 15 products). We're also releasing PVC rolls sold per metre (10m rolls, 25m rolls, etc.), again with 15 products, which we are adding as a separate category because it makes more sense for our customers and removes the risk of offering too many options in one place. Would using the same description for both be bad practice for SEO? The product is exactly the same, just available in different roll sizes, but we definitely do not want to combine the categories as that doesn't work for our customers. Any help or suggestions would be appreciated, thanks.
On-Page Optimization | RayflexGroup
-
How does Indeed.com make it to the top of every single search despite having aggregated or duplicate content?
How does Indeed.com make it to the top of every single search despite having duplicate content? Google says it prefers original content and will give preference to sites that have it, but that statement seems contradicted by Indeed.com: they aggregate content from other sites yet still rank higher than the original content providers.
On-Page Optimization | vivekrathore
-
Content with changing URL and duplicate content
Hi everyone, I have a question regarding content (user reviews) that changes URL all the time. We get a lot of reviews from users who have been dining at our partner restaurants, which get posted on our site under (new) "reviews". My worry, however, is that the URL for these reviews changes all the time: they start on page 1 and then get pushed down to page 2, and so on, as new reviews come in. http://www.r2n.dk/restaurant-anmeldelser I'm guessing this could cause serious indexing problems? I can see in Google that some reviews are indexed multiple times with different URLs, and some are not indexed at all. We furthermore have the specific reviews under each restaurant profile; I'm not sure if this could be considered duplicate content. Maybe we should tell Google not to index the "new reviews" section by using robots.txt. We don't get much traffic on these URLs anyway, and all reviews are still under each restaurant profile. Or maybe the canonical tag can be used? I look forward to your input. Cheers, Christian
On-Page Optimization | Christian_T
-
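For reference, a hedged sketch of the robots.txt option mentioned in the question (the path comes from the URL quoted above; whether it matches the actual paginated URLs is an assumption):

```text
# robots.txt at the site root — stop crawlers from fetching
# the paginated "new reviews" listing and everything under it
User-agent: *
Disallow: /restaurant-anmeldelser
```

Note that robots.txt only blocks crawling; it doesn't remove URLs that are already indexed. If the same reviews also live on stable restaurant-profile URLs, a canonical tag on the paginated pages pointing at those stable URLs may be the better fit.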
WordPress - duplicate content
I'm using WordPress for my website. However, whenever I use the post section for news, I get a report back from SEOmoz saying there's duplicate content. The posts also appear in the Category and Archive sections. Does anyone know if Google sees this as duplicate content and, if so, how to stop it? Thanks
On-Page Optimization | AAttias
-
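One common fix being discussed here is to noindex the archive views while leaving the posts themselves indexable. A hedged sketch for a theme's functions.php, assuming a standard WordPress install (the function name is hypothetical; the conditional tags and `wp_head` hook are standard WordPress APIs):

```php
// Hypothetical helper: emit a noindex meta tag on category,
// date, and author archive pages only; single posts stay indexed.
function my_noindex_archives() {
    if ( is_category() || is_date() || is_author() ) {
        echo '<meta name="robots" content="noindex,follow">' . "\n";
    }
}
add_action( 'wp_head', 'my_noindex_archives' );
```

Most SEO plugins expose the same behaviour as a checkbox, which may be simpler than editing theme code.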
Tags creating a duplicate content issue?
Hello, I believe a lot of us use tags in our blogs as a way to categorize content and make it easily searchable, but this usually (at least in my case) creates duplicate content. For example, if one article has two tags like "SEO" and "Marketing", then this article will be visible and listed at two URLs inside the blog: domain.com/blog/seo and domain.com/blog/marketing. In the case of a blog with 300+ posts and dozens of different tags this becomes a huge issue. My questions are: 1. Is this really bad? 2. If yes, how do I fix it without removing tags?
On-Page Optimization | Lakiscy
-
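A minimal sketch of the usual fix for this, assuming the tag archive templates can be edited (the example paths come from the question above):

```html
<!-- on tag archive pages such as domain.com/blog/seo and
     domain.com/blog/marketing: keep them crawlable for visitors
     and link discovery, but out of the index -->
<meta name="robots" content="noindex,follow">
```

The article itself stays indexed at its own URL; only the overlapping tag listings are dropped, so the tags remain useful for navigation without creating duplicate indexed pages.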
Why does the on-page report flag a full path link as a cannibalize link?
On the SEOmoz on-page report I get a cannibalize error. This is due to a link using a full (absolute) path. When I change the link to a relative path, there is no cannibalize error. Should I change the internal links of the site to relative paths? I would appreciate your help.
On-Page Optimization | pickaweb
-
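To illustrate the two link styles being compared (example.com and the path are placeholders, not taken from the question):

```html
<!-- full (absolute) path -->
<a href="https://www.example.com/services/">Services</a>

<!-- root-relative path resolving to the same page
     when served from that domain -->
<a href="/services/">Services</a>
```

Both forms point at the same page, so switching between them shouldn't change how Google resolves the link; the difference in the report is presumably down to how the tool parses the two forms.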
Duplicate content on video pages
Hi guys, We have a video section on our site containing about 50 videos, grouped by category/difficulty. Each video page contains the embedded player, a sentence or two describing the video, a list of related video links, and pretty much nothing else. All of these pages appear as duplicate content by category. What should we do here? How long should a description be for those pages to appear unique to crawlers? Thanks!
On-Page Optimization | lgrozeva
-
Is the www and non-www issue really seen by Google as duplicate content?
I really don't understand how Google could possibly devalue a site because it displays the same content with www and without www. I mean, has anybody recently seen a domain devalued because of this issue? I somehow can't believe it, because when getting new webspace it is standard for the new website to display the same content with and without www. Is a redirect really necessary?
On-Page Optimization | MichaelJanik
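For completeness, a hedged sketch of the redirect in question, assuming an Apache server with mod_rewrite enabled (example.com is a placeholder domain):

```apache
# .htaccess: 301-redirect all non-www requests to the www host,
# so only one version of each URL is ever served
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

The redirect isn't strictly required for Google to cope, but it consolidates links and signals onto a single canonical host, which is why it's the usual recommendation.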