PDF or HTML Page?
-
One of our sales team members has created a 25-page Word document as a topical page. The plan was to turn it into an HTML page with a table of contents. My thought was: why not make it a PDF? Is there any con to using a PDF vs. an HTML page? If the PDF were properly optimized, would it perform just as well? The goal is to have folks click back to our products and hopefully buy after reading about how they work.
-
This is what I came to say: have the HTML document, then the link to the PDF download. That way the HTML document can rank and the PDF can too. I think some people overlook the fact that the page a PDF is downloaded from can rank AS WELL as the PDF itself.
-
Create the page(s) in HTML and offer a downloadable PDF format. Win/Win
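A minimal sketch of that setup (file names and headings are made up): the guide lives as a normal HTML page with an in-page table of contents, and the PDF is a plain link that can be crawled, and rank, alongside it.

```html
<!-- how-it-works.html: the crawlable, rankable version -->
<h1>How Our Products Work</h1>
<nav>
  <a href="#overview">Overview</a>
  <a href="#setup">Setup</a>
</nav>
<h2 id="overview">Overview</h2>
<p>…</p>
<h2 id="setup">Setup</h2>
<p>…</p>
<!-- same content as a download; the PDF URL can rank on its own too -->
<p><a href="/downloads/how-it-works.pdf">Download this guide as a PDF</a></p>
```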
-
As Kevin said, HTML is a better format for the web.
Perhaps you can offer this as a downloadable PDF on a lead generation page? You can certainly use this asset in more than one way.
-
PDFs are great, but HTML is better, for a few reasons: usability (HTML pages render better in a browser), navigation (PDFs can contain hyperlinks, but they lack a site's navigation), and the extra download/view step (there are ways around this, but it's there for most users).
Matt Cutts sums it up pretty well in an interview with Eric Enge (from 2010):
"We absolutely do process PDF files. I am not going to talk about whether links in PDF files pass PageRank. But, a good way to think about PDFs is that they are kind of like Flash in that they aren't a file format that's inherent and native to the web, but they can be very useful. In the same way that we try to find useful content within a Flash file, we try to find the useful content within a PDF file. At the same time, users don't always like being sent to a PDF. If you can make your content in a Web-Native format, such as pure HTML, that's often a little more useful to users than just a pure PDF file."
**"**People can certainly use that if they want to, but typically I think of PDF files as the last thing that people encounter, and users find it to be a little more work to open them. People need to be mindful of how that can affect the user experience."
Certainly, you can have both; there are a number of best practices to implement before doing this (one is sketched below).
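For example, if both versions stay live, you can point Google at the HTML page as the canonical by sending a Link header on the PDF (Google supports rel="canonical" HTTP headers for non-HTML files). A minimal Apache sketch with hypothetical file names, assuming mod_headers is loaded:

```apache
<Files "how-it-works.pdf">
  # Consolidate ranking signals onto the HTML version of the guide
  Header set Link "<https://www.example.com/how-it-works/>; rel=\"canonical\""
</Files>
```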
Related Questions
-
Validated pages on GSC: 5x more pages than a site:domain.com search shows?
Hi mozzers, when checking the Coverage report in GSC I see over 649,000 valid pages (https://cl.ly/ae46ec25f494), but when performing site:domain.com I only see 130,000 pages. Which one is the source of truth, especially since I've checked some of these "valid" pages and noticed they're not even indexed?
-
Massive Amount of Pages Deindexed
On or about 12/1/17 a massive number of my site's pages were deindexed. I have done the following: ensured all pages are "index,follow"; ensured there are no manual penalties; ensured the sitemap correlates to all the pages; resubmitted to Google. ALL pages are gone from Bing as well. In the new Search Console interface, there are 661 pages that are Excluded, with 252 being "Crawled - currently not indexed: The page was crawled by Google, but not indexed. It may or may not be indexed in the future; no need to resubmit this URL for crawling." What in the world does this mean, and how the heck do I fix it? This is CRITICAL. Please help! The URL is https://www.hkqpc.com
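One cause worth ruling out in a sitewide drop like this: a stray X-Robots-Tag response header, which overrides an index,follow meta tag in the markup. A hypothetical example of the kind of leftover Apache line to grep the server config and .htaccess files for:

```apache
# A forgotten header like this deindexes pages no matter
# what the meta robots tag in the HTML says.
Header set X-Robots-Tag "noindex"
```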
-
Image name and HTML page name the same: does it count as spammy content?
If an image name and an HTML page name are the same, does that count as spammy content? E.g. watertreatment-plan.jpg and watertreatment-plan.html
-
Will Using Attributes For Landing Pages In Magento Dilute Page Rank?
Hello Mozzers! We have an ecommerce site built on Magento. We would like to use attribute filters in our layered navigation for landing page purposes. Each page will have a unique URL, meta title, and meta description. For example: URL: domain.com/art/abstract (category is Art, attribute is Abstract) Title: Abstract Art For Sale Meta: Blah Blah Blah. Currently these attribute pages are not being indexed by Google, as they are set in Google's URL parameters tool. We would like to edit the parameter settings to start indexing some of the attribute filters that users search for, so they can be used as landing pages. Does anyone have experience with this? Is this a good idea? What are the consequences? Will this dilute PageRank? Could this destroy the world? Cheers! MozAddict
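For anyone weighing the same move, the usual hedge against dilution is a self-referencing canonical tag on each attribute page you want indexed, so other filter permutations consolidate to it. A sketch using the example URL from the question:

```html
<!-- on domain.com/art/abstract and on any further-filtered variants of it -->
<link rel="canonical" href="https://domain.com/art/abstract" />
```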
-
Orphan My Home Page
I want to orphan the home page on a site that I own so that the start page becomes site.com/home (or whatever) as opposed to site.com/. I need to accomplish this without associating the former with the latter, meaning no 301. Since this will not be a temporary move, a 302 does not seem to work either. And even if I could use one, I don't want to credit / with anything from /home. Is there any way to default the Apache handler to /home without rewriting the URL? Or is there any other solution? The bottom line is, at the end of the day, I need Google to forget about / and anything associated with it, without interrupting the user experience when users request /. Thanks in advance.
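On the narrow Apache question: serving the /home resource for requests to / without an external redirect is possible with mod_dir, and keeping / out of the index is a separate header step. A sketch, assuming mod_dir and mod_headers are loaded; note this only drops / from the index, it doesn't transfer anything to /home, and Google forgetting / entirely isn't guaranteed:

```apache
# Serve /home internally when the bare root is requested; no redirect
# is issued and the visitor's address bar still shows "/".
DirectoryIndex /home

# Ask search engines not to index the bare "/" URL itself.
<LocationMatch "^/$">
  Header set X-Robots-Tag "noindex"
</LocationMatch>
```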
-
I have removed 2,000+ pages but Google still says I have 3,000+ pages indexed
Good afternoon, I run an office equipment website called top4office.co.uk. My predecessor decided to make an exact copy of the content on our existing site, top4office.com, and place it on the top4office.co.uk domain, which included over 2k thin pages. Since coming in, I have hired a copywriter who has rewritten all the important content, and I have removed over 2k thin pages. I set up 301s, blocked the thin pages using robots.txt, and then used Google's removal tool to remove the pages from the index, which was done successfully. But although they were removed and can no longer be found in Google, when I use site:top4office.co.uk I still see over 3k indexed pages (originally 3,700). Does anyone have any ideas why this is happening and, more importantly, how I can fix it? Our ranking on this site is woeful in comparison to what it was in 2011. I have a deadline and was wondering how quickly, in your opinion, these changes will impact my SERP rankings. Look forward to your responses!
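One detail in this setup that commonly produces exactly this symptom: URLs blocked in robots.txt can't be recrawled, so Google never sees their 301s and the stale entries linger in the index. The usual fix is to drop the Disallow rules for the redirected paths; a sketch with a hypothetical path:

```
# robots.txt on top4office.co.uk
User-agent: *
# Remove rules like the following so Google can recrawl the old
# URLs and process their 301s:
# Disallow: /old-thin-pages/
```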
-
How do I index these parameter-generated pages?
Hey guys, I've got an issue with a site I'm working on. A big chunk of the content (roughly 500 pages) is delivered using parameters on a dynamically generated page. For example: www.domain.com/specs/product?=example, where "example" is the product name. Currently there is no way to get to these pages unless you enter the product name into the search box and access them from there. Correct me if I'm wrong, but unless we find some other way to link to these pages, they're basically invisible to search engines, right? What I'm struggling with is a method to get them indexed without doing something like creating a directory-map-type page of all of the links, which I guess wouldn't be a terrible idea as long as it was done well. I've not encountered a situation like this before. Does anyone have any recommendations?
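One low-effort way to expose URLs that have no inbound links is an XML sitemap that lists them, alongside whatever directory-style page gets built. A sketch using the URL pattern from the question:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.domain.com/specs/product?=example</loc>
  </url>
  <!-- one <url> entry per product, generated from the product list -->
</urlset>
```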
-
How to Preserve PageRank for Disappearing Pages?
Pretend that USA Today has a section of their site where they sell electronics, located at http://electronics.usatoday.com. The subdomain is powered by an online electronics store called NewCo via a white label. Many of the pages on this subdomain have relatively high PageRank, but few, if any, external sites link to the subdomain; the PageRank of the subdomain is largely due to internal links from the usatoday.com root domain. USA Today's deal with NewCo expires and they decide to partner with my startup instead. But, unlike NewCo, we won't be providing a white-label solution; rather, USA Today will be redirecting all of the electronics-related links on their root domain to my site instead of the electronics.usatoday.com subdomain. They also agree to direct all of the pages on electronics.usatoday.com to me. Ideally USA Today would add 301s to all of their pages on electronics.usatoday.com pointing to the corresponding pages on my site, but they don't have the engineering wherewithal or resources to do this. Therefore, what is the best way to pass the PageRank from the electronics.usatoday.com pages to my site? Would it work to have USA Today change the CNAME for electronics.usatoday.com to my site and then create pages on my site that mimic the USA Today URL structure? For example, let's say there was a page located at electronics.usatoday.com/ipods. Could we point the CNAME for electronics.usatoday.com at my site and then serve that path ourselves, so a request for electronics.usatoday.com/ipods gets a 301 to the iPod page on mysite.com? Would that preserve the PageRank?
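A sketch of what the receiving end of that CNAME could look like: an Apache virtual host answering for the old hostname that 301s each path to its counterpart on the new site (the hostnames and paths come from the hypothetical above, and whether Google passes full PageRank through such redirects is its call, not a guarantee):

```apache
<VirtualHost *:80>
  ServerName electronics.usatoday.com
  # mod_alias matches prefixes in order, so list specific paths
  # before the catch-all root redirect.
  Redirect permanent /ipods https://www.mysite.com/ipods
  Redirect permanent / https://www.mysite.com/
</VirtualHost>
```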