PDFs - Dupe Content
-
Hi
I have some PDFs linked to from a page with little content, so I'm thinking it's best to extract the copy from the PDFs and have it on-page as body text, with the PDFs still linked to as well. Will this count as dupe content?
Or is it best to use a PDF plugin so the page opens the PDF automatically and hence gives the page content that way?
Cheers
Dan
-
Should be different, but you would have to look at them to make sure.
-
PS - is a PDF-to-HTML converter different from a plugin that loads the PDF as an open page when you click it? Or the same thing?
-
That is what I was going to suggest - setting up a canonical in the HTTP header of the PDF pointing back to the article:
https://support.google.com/webmasters/answer/139394?hl=en
As another option, you can just block access to the PDFs to keep them out of the index as well.
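For anyone who wants to see what that header looks like in practice, here is a minimal Apache sketch (assuming mod_headers is enabled; the filename and URL are placeholders, not from this thread):

```
# .htaccess - point the PDF's canonical at the HTML article via an HTTP header
<Files "whitepaper.pdf">
  Header add Link "<https://www.example.com/whitepaper-article/>; rel=\"canonical\""
</Files>
```

The blocking option is just a robots.txt Disallow on the folder holding the PDFs, though note that robots.txt stops crawling rather than guaranteeing removal from the index.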
-
Thanks Chris
Yes, you can canonicalise the PDF to the HTML (according to the comments on that article I just linked to, anyway).
-
Hi Dan,
Yes, PDFs are crawlable (sorry for the confusion!). If you were to put one into, say, a .zip or .rar (or similar) it wouldn't be crawled, or you could noindex it, I guess. You would need to stick the PDF download behind something that can't be crawled. You could try rel=canonical, but I've never tried it with a PDF, so I'm not sure how that would go.
Hope that enlightens you a bit.
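As a side note, keeping the PDF itself out of the index doesn't require zipping it: Google also honours a noindex sent as an X-Robots-Tag HTTP header on non-HTML files like PDFs. A minimal Apache sketch, assuming mod_headers is available (the file pattern is illustrative):

```
# .htaccess - keep all PDFs out of the index while leaving them downloadable
<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```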
-
Thanks Chris, although I thought PDFs were crawlable?? See: http://www.lunametrics.com/blog/2013/01/10/seo-pdfs/
Hence why I'm worried about dupe content if I use the PDF's content as body text too. OR are you saying I should nofollow the link to the PDF if I use its content as body text, because it would be considered dupe content in that scenario?
Ideally I want both - the copy from the PDF used as body text on the page, and the PDF as a linkable download, or the page as an embed of the open PDF via a plugin.
-
What would give the user the best experience is the real question. I would say put it on the page; then, if the user is lacking a plugin, they can still read it. If you have it as a downloadable PDF it shouldn't be able to get crawled, thus avoiding the problem.
Hope that helps.
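If it helps, embedding with a plain download link as a fallback can cover both cases at once. A rough HTML sketch (the path is a placeholder, and the rel="nofollow" is an illustrative option from the discussion above, not a firm recommendation):

```
<!-- Embed the PDF inline; browsers without a PDF viewer fall through
     to the plain link below. The path is a placeholder. -->
<object data="/downloads/brochure.pdf" type="application/pdf"
        width="100%" height="600">
  <p>Your browser can't display the PDF inline.
     <a href="/downloads/brochure.pdf" rel="nofollow">Download it here</a>.</p>
</object>
```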
-
Related Questions
-
Duplicate page content
These two URLs are being flagged as 98% similar in the code. We're a large ecommerce site, and while it would be ideal to have unique product descriptions on each page, we currently don't have the bandwidth. Thoughts on what else might be triggering this duplicate content?
https://www.etundra.com/restaurant-parts/cooking-equipment-parts/fryers/scoops-skimmers/fmp-175-1081-fryer-crumb-scoop/
https://www.etundra.com/restaurant-equipment/concession-equipment/condiment-pumps/tablecraft-664-wide-mouth-condiment-pump/
Thanks, Natalie
On-Page Optimization | eTundra
-
How to Structure URLs for Multiple Locations
We are currently undergoing a site redesign and are trying to figure out the best way to structure the URLs and breadcrumbs for our many locations. We currently have 60 locations nationwide and our URL structure is as follows: www.mydomain.com/locations/{location}, where {location} is the specific street the location is on or the neighborhood the location is in (i.e. www.mydomain.com/locations/waterford-lakes). The issue is, {location} is usually too specific and is not a broad enough keyword. The location "Waterford Lakes" is in Orlando, and "Orlando" is the important keyword, not "Waterford Lakes". To address this, we want to introduce state and city pages. Each state and city page would link to each location within that state or city (i.e. an Orlando page with links to "Waterford Lakes", "Lake Nona", "South Orlando", etc.). The question is how to structure this.
Option 1: Use our existing URL and breadcrumb structure (www.mydomain.com/locations/{location}) and add state and city pages outside the URL path: www.mydomain.com/{area} and www.mydomain.com/{state}.
Option 2: Build the city and state pages into the URL and breadcrumb path: www.mydomain.com/locations/{state}/{area}/{location} (i.e. www.mydomain.com/locations/fl/orlando/waterford-lakes).
Any insight is much appreciated. Thanks!
On-Page Optimization | uBreakiFix
-
Over 12,000 302s?
Hi. I'm monitoring a Magento webshop. It has more than 12,000 temporary 302 redirects. Is it also a problem if the redirects are for an unimportant subpage, such as an enable-cookies page?
On-Page Optimization | Budskab
-
Duplicate Content - Potential Issue.
Hello, here we go again. If I write an article somewhere, let's say Squidoo for instance, then post it to the blog on my website, will Google see this as duplicate content and probably credit Squidoo for it, or is there something I can do to prevent this - maybe a link back to Squidoo from my website, or a nofollow on my website? I'm not sure, so any help here would be great. Also, if I use other people's material in my blog and link back to them - obviously I don't want the credit for the original material, I am simply collating some of it on my blog so others have a specific library, if you like - is this going to damage my website's reputation? Thanks again peeps. Craig Fenton IT
On-Page Optimization | craigyboy
-
Why is there a full-stop in the title of SEOMOZ's home page?
Hello, I see there's a full-stop (.) in the title of SEOMOZ's home page. Why is it so? Regards
On-Page Optimization | IM_Learner
-
Duplicate content
Hi everybody, I have been thrown into an SEO project for a website with a duplicate content problem because of a version with and a version without 'www'. The strange thing is that the www version has more than 10 times as many backlinks but is not in the organic index. Here are my questions:
1. Should I go on using the "without www" version as the primary resource?
2. Which kind of redirect is best for passing most of the link juice?
Thanks in advance, Sebastian
On-Page Optimization | Naturalmente
-
Duplicate content on video pages
Hi guys, We have a video section on our site containing about 50 videos, grouped by category/difficulty. On each video page, apart from the embedded player, a sentence or two describing the video, and a list of related video links, there's pretty much nothing else. All of those pages appear as duplicate content by category. What should we do here? How long should a description be for those pages to appear unique to crawlers? Thanks!
On-Page Optimization | lgrozeva
-
Another SEO's point of view
Hiya fellow SEOs, I have been working on a site - www.hplmotors.co.uk - and I must say it has become difficult due to flaws with the content management system. We are speaking with the website makers about being able to add a unique title and description to all pages. I know what is wrong, but I would also like some second opinions on this and welcome any suggestions for the site. A burnt-out SEO 🙂 thanks
On-Page Optimization | onlinemediadirect