Dealing with PDFs?
-
Hello fellow mozzers!
One of our clients does an excellent job of consistently providing great content, and we don't even have to nag them about it (imagine that!). This content is usually centered around industry reports, financial analyses, and economic forecasts; however, they always post it in the form of PDFs.
How does Google view PDFs, and is there a way to optimize them? Ideally, I am going to try to get this client set up with a blog-like platform that will use HTML text rather than PDFs, but I wanted to see what info was out there for PDFs.
Thanks!
-
Thank you Keri for the helpful resource. I actually ended up doing all of those things for our client. I also found out that the default Drupal 6 robots.txt file does not allow the search engines to index PDFs, images, and Flash, because it blocks the /sites/ directory where uploaded files live. Therefore, one must remove the "Disallow: /sites/" line from the robots.txt file.
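To illustrate the robots.txt point above, here is a rough sketch using Python's standard urllib.robotparser (the filenames are made up for the example) showing how a "Disallow: /sites/" rule blocks crawlers from PDFs uploaded to a Drupal site:

```python
from urllib.robotparser import RobotFileParser

# Drupal 6's default robots.txt includes "Disallow: /sites/", which blocks
# crawlers from everything uploaded to the site: PDFs, images, Flash.
blocking = RobotFileParser()
blocking.parse(["User-agent: *", "Disallow: /sites/"])

pdf_url = "http://example.com/sites/default/files/annual-report.pdf"
print(blocking.can_fetch("*", pdf_url))  # False: the PDF cannot be crawled

# With the "Disallow: /sites/" line removed, the same PDF becomes crawlable.
fixed = RobotFileParser()
fixed.parse(["User-agent: *"])
print(fixed.can_fetch("*", pdf_url))  # True
```

The same check works against a live robots.txt via set_url() and read(), which is a quick way to confirm the fix actually took effect after editing the file.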
-
This doesn't address ranking, but the YOUmoz post does talk about best practices for optimizing PDF content and may help you. http://www.seomoz.org/ugc/how-to-optimize-pdf-documents-for-search
-
To be honest Dana, outside of the basics mentioned, I tended not to go overboard, and many of them started to rank naturally as Google spidered the site. Just remember to give the link to the PDF strong anchor text and, if possible, add a little content around it to explain what visitors can expect in the document. Also remember to add a link to Adobe so that they can download the free Reader if they don't have it already.
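As a rough illustration of the advice above (the filename and wording are made up), the link to a PDF might look something like this:

```html
<!-- Descriptive anchor text plus surrounding context, rather than "click here" -->
<p>
  Our latest economic forecast covers interest-rate trends and sector
  growth projections for the year ahead.
</p>
<a href="/reports/2012-economic-forecast.pdf">
  2012 Economic Forecast (PDF, 1.2&nbsp;MB)
</a>
<p>
  Need a PDF viewer? Download the free
  <a href="http://get.adobe.com/reader/">Adobe Reader</a>.
</p>
```

The descriptive anchor text gives Google something to associate with the document, and the surrounding paragraph sets visitor expectations before the download.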
Hope this helps,
Regards,
Andy
-
Thank you, iNet SEO. Excellent resource...
I was also wondering if anyone had any posts or experience with understanding how PDF content gets indexed and ranked?
-
Yes, you can optimise PDFs - have a read of this as it seems to cover most points:
http://www.seoconsultants.com/pdf/seo
Sorry, I forgot to add that PDFs are useful for those who wish to download something to read at a later stage or whilst offline. Don't rush to advise them that HTML is the way to go unless it actually is. I have printed off many a PDF and taken it into meetings with me.
Regards,
Andy
Related Questions
-
How to deal with pages no longer present on the site
Hi, we need to cut some destinations from our tour operator's catalog, so basically we need to deal with destination pages and tour pages that are no longer present on the site. What do you think is the best approach to handle these pages without losing ranking? Do you think it is a good approach to 301 redirect these pages to the home page or to the general catalog page, or do you suggest another approach? Thanks for your help!
Technical SEO | Dreamrealemedia
-
How do I deal with /mobile/ page after responsive re-design?
Hi guys, One of our clients used to have a website that would redirect mobile traffic to a /mobile/ page. Thankfully we've finally gone fully responsive and there is no need for this /mobile/ page. Trouble is, www.clientsite.com.au/mobile/ is still in the Google index and currently returns a 404. What is the best way to deal with it? Should we be 301 redirecting /mobile/ to / (the home page)? Would be most grateful for any ideas. Thanks!
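A minimal sketch of the 301 approach described in the question, assuming an Apache server with mod_rewrite enabled (the paths mirror the question and are otherwise hypothetical):

```apache
# Permanently redirect the retired /mobile/ section.
# Goes in the site's .htaccess or virtual-host config; requires mod_rewrite.
RewriteEngine On

# If /mobile/ mirrored inner pages (e.g. /mobile/about/), map each one
# back to its responsive equivalent rather than sending everything to /.
RewriteRule ^mobile/(.+)$ /$1 [R=301,L]

# The bare /mobile/ page itself goes to the homepage.
RewriteRule ^mobile/?$ / [R=301,L]
```

Mapping each old mobile URL to its responsive equivalent preserves more of the accumulated link equity than a blanket redirect to the homepage.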
Technical SEO | WCR
-
What is the best way to deal with an event calendar
I have an event calendar that has multiple repeating items into the future. They are classes that typically all have the same titles but will occasionally have different information. I don't know the best way to deal with them and am open to suggestions. Currently Moz Analytics is showing multiple errors (duplicate page titles, duplicate descriptions, and overly dynamic URLs). I'm assuming that it's flagging duplicate elements way into the future. I thought of having the calendar nofollowed entirely, but the content for the classes seems valuable. Thanks,
Technical SEO | categorycode
-
Deal with links that need login to view
Hi All, We have member names in many places on the site, and when clicked they take the user to the login page, as only logged-in members can view the details. The redirection type is 302, and our Moz campaign says we have many of these and need to make them 301s. What is the best way, given we have a Drupal website? Thanks
Technical SEO | mtthompsons
-
Best way to deal with these urls?
Found overly dynamic URLs in the crawl report. http://www.trespass.co.uk/camping/festivals-friendly/clothing?Product_sort=PriceDesc&utm_campaign=banner&utm_medium=blog&utm_source=Roslyn What is the best way to deal with these? Cheers guys
Technical SEO | Trespass
-
How best to deal with www.home.com and www.home.com/index.html
Firstly, this is for an .asp site - and all my usual ways of fixing this (e.g. via htaccess) don't seem to work. I'm working on a site which has www.home.com and www.home.com/index.html - both URLs resolve to the same page/content. If I simply drop a rel canonical into the page, will this solve my dupe content woes? The canonical tag would then appear in both the www.home.com and www.home.com/index.html cases. If the above is OK, which version should I be going with? - or - Thanks in advance folks,
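For what it's worth, the rel canonical approach sketched in the question would look something like this, placed in the head of the page so it ships identically whether the URL is / or /index.html (the domain is the question's placeholder):

```html
<!-- Both www.home.com and www.home.com/index.html serve this same page;
     the canonical tag tells search engines which URL should get the credit. -->
<head>
  <link rel="canonical" href="http://www.home.com/" />
</head>
```

The extensionless root URL is the usual choice for the canonical target, since that is the version people tend to link to.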
Technical SEO | Creatomatic
James @ Creatomatic
-
Large Scale Ecommerce. How To Deal With Duplicate Content
Hi, One of our clients has a store with over 30,000 indexed pages but fewer than 10,000 individual products, plus a few hundred static pages. I've crawled the site in Xenu (it took 12 hours!) and found it to be a complex mess caused by years of hacked-on additions, which has led to duplicate pages and weird dynamic parameters being indexed. The inbound link structure is spread across duplicate pages, PDFs, and images, so I need to be careful to treat everything correctly. I can likely identify and segment blocks of thousands of URLs and parameters which need to be blocked; I'm just not entirely sure of the best method. Dynamic parameters: I can see the option in GWT to block these - is it that simple? (Do I need to ensure they are deindexed and 301'd?) Duplicate pages: would the best approach be to mass-301 these pages, then apply a noindex tag and wait for them to be crawled? Thanks for your help.
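As a rough sketch of the segmentation step described above (the parameter names are hypothetical examples of tracking/sorting parameters), a crawl export can be grouped by URL with content-irrelevant parameters stripped, to see how many distinct pages actually exist behind the duplicates:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that only carry campaign tracking or sort order,
# and never change the page content itself.
IGNORED_PARAMS = {"utm_campaign", "utm_medium", "utm_source", "Product_sort"}

def canonical_form(url):
    """Return the URL with content-irrelevant query parameters removed."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

crawled = [
    "http://www.example.com/store/widget?utm_source=blog&utm_medium=banner",
    "http://www.example.com/store/widget?Product_sort=PriceDesc",
    "http://www.example.com/store/widget",
]
# All three crawl entries collapse to a single real page.
print({canonical_form(u) for u in crawled})
```

Grouping the Xenu export this way gives a concrete list of parameter patterns to block in GWT and duplicate URLs to 301, instead of eyeballing thousands of rows.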
Technical SEO | LukeyJamo
-
How do i deal with duplicate content on the same domain?
I'm trying to find out if there's a way we can combat similar content on different pages of the same site without having to rewrite the whole lot. Any ideas?
Technical SEO | indurain