Will a PDF Pass PageRank?
-
I created a PDF - will it pass PageRank?
-
Adding more...
The buttons from some shopping carts will work in .pdf documents. So if you write one with a parts list, you can place a buy button beside each part to make it really easy for the person to purchase.
Also, type your domain name into the PDF. That way, if people print it and want to get back to your website, you might get a navigation query. Including the URL of the page where the PDF can be found might do the same thing.
-
Very interesting discussion indeed. I wonder if "creating a workpath" for the text would make it easier to read. But I guess OCR is going to have a difficult time associating the actual text with the link regardless.
-
We make our links very obvious. I would not try to hide them because I want them clicked (it is hard to monetize a .pdf but easy to monetize an .html page - so I want the visitor to get onto my .html pages).
You can lock .pdf documents so that they cannot be edited. Then other webmasters are free to post them on their own domains and give me backlinks. Of course, they could rewrite them as their own, just as any other content can be spun or rewritten.
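For anyone who wants to automate that, here is a minimal sketch, assuming the pypdf Python library; the file names and owner password are placeholders, and how strictly the edit restriction is enforced depends on the PDF viewer honoring the permission flags.
```python
# Minimal sketch, assuming the pypdf library (pip install pypdf).
# File names and the owner password are placeholders, not from this thread.
from pypdf import PdfReader, PdfWriter

reader = PdfReader("parts-list.pdf")
writer = PdfWriter()

# Copy every page into the new document unchanged.
for page in reader.pages:
    writer.add_page(page)

# Keep the existing document properties (title, author, etc.) if present.
if reader.metadata:
    writer.add_metadata(reader.metadata)

# An empty user password lets anyone open the file; the owner password is
# what most viewers require before allowing edits or permission changes.
writer.encrypt(user_password="", owner_password="change-me")

with open("parts-list-locked.pdf", "wb") as f:
    writer.write(f)
```
Note that this kind of lock only discourages casual editing; because there is no user password, the text stays selectable and crawlable.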
-
I'll help you by adding one.
... And a thumbs up well spent.
-
What an interesting topic. Has anyone done any testing on the effectiveness of PDF vs. HTML resources, and whether Google treats anchor text in the same way?
-
Thanks a ton EGOL, I have been looking around for more info on this subject for quite a while. What are your thoughts on how to create the links? Is it considered a black hat tactic to place invisible links in those PDFs? My thinking here is that I know competitors will start stealing our PDF documents to use for their own websites. I was thinking of placing invisible links on some key phrases that link to product pages, so that when competitors upload our PDFs to their sites, we get backlinks from their websites. Does that make sense, and does it seem like a viable strategy or a potentially penalizing one?
-
Thank you. I really like this subject and enjoyed preparing that answer!
-
Wow, way to give an absolutely excellent answer. I wish I could give more than 1 thumbs up!
-
Links in .pdf documents will be displayed in your Google Webmaster Tools backlinks report, they will accumulate PageRank (I have some PR6 .pdf documents), and they will pass PageRank.
It is a good idea to place links into .pdf documents that you give away on the web, not only for PageRank reasons but also to give users an easy link to visit your site for more information. Think about usability when you create .pdf documents in the same way that you think about usability for your website.
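As an illustration of embedding that kind of link when you generate the PDF yourself, here is a rough sketch assuming the reportlab Python library; the URL, coordinates, and file name are made-up examples rather than anything from this thread.
```python
# Rough sketch, assuming reportlab (pip install reportlab): draw anchor text
# and attach a clickable URL hotspot over it. URL and coordinates are examples.
from reportlab.lib.pagesizes import letter
from reportlab.pdfgen import canvas

c = canvas.Canvas("giveaway.pdf", pagesize=letter)

# The visible anchor text.
c.setFont("Helvetica", 12)
c.drawString(72, 700, "Read the full guide at www.example.com/guide")

# A clickable rectangle (x1, y1, x2, y2) roughly covering the text above.
c.linkURL("https://www.example.com/guide", (72, 695, 340, 715), relative=0)

c.showPage()
c.save()
```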
Also, if you complete the "properties" attributes of .pdf documents you can give them a title that will appear in the SERPs like a title tag on an .html webpage. I get lots of traffic from the SERPs that come straight into my .pdf documents and then click a link in the document that takes them to a relevant page on my website.
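To illustrate the "properties" point, here is a small sketch, assuming pypdf, that fills in the title, author, and subject of an existing PDF; the file name and values are placeholders.
```python
# Small sketch, assuming pypdf: set the document "properties" of an existing
# PDF. The /Title is what typically shows as the headline in the SERPs.
# File name and metadata values are placeholders.
from pypdf import PdfReader, PdfWriter

reader = PdfReader("buyers-guide.pdf")
writer = PdfWriter()
for page in reader.pages:
    writer.add_page(page)

writer.add_metadata({
    "/Title": "Widget Buyer's Guide | example.com",
    "/Author": "example.com",
    "/Subject": "How to choose the right widget",
})

with open("buyers-guide-titled.pdf", "wb") as f:
    writer.write(f)
```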
Finally... in addition to .pdf documents, you can also get viable backlinks and clickthroughs from .ppt (PowerPoint), .xls (Excel), and other types of files. Consider allowing other webmasters to include them on their sites. That way they can bring you links from other domains.
-
From what I have seen it's a little unclear, and it would largely depend on how you create the PDF. Provided your PDF has been created using a text editor for the text (and not made up of a bunch of images), then, if PDFs are crawled, you at least stand a chance of the text and so on ranking in the first place. (You can Google search by file type, including PDF, so one would assume they should rank in their own right if tagged and stored as text.)
Will a PDF pass rank or not? I would suggest it should, provided it actually ranks itself in the first place (sounds obvious, I know).
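One quick way to check the "real text vs. a bunch of images" point above is to see whether any text can actually be extracted from the file. Here is a minimal sketch assuming the pypdf Python library; the file name is a placeholder.
```python
# Minimal sketch, assuming pypdf: report whether a PDF contains extractable
# text (crawlable) or appears to be image-only. File name is a placeholder.
from pypdf import PdfReader

reader = PdfReader("brochure.pdf")
text = "".join((page.extract_text() or "") for page in reader.pages)

if text.strip():
    print(f"Extractable text found ({len(text)} characters) - crawlable as text.")
else:
    print("No extractable text - likely image-only; the content may not rank.")
```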
-
Depending on how it's uploaded onto the page, the page can still build links to it and gain in authority and trust. The content of the PDF will likely not be able to be crawled by the engines, though.
Related Questions
-
Do JavaScript links pass equity? (AngularJS)
Hi Moz Community, I am starting a link building campaign for one of my customers and I'm wondering if Google is able to pass link equity through JavaScript links. When I'm looking at my crawl report with Screaming Frog, I can see the number of links each page has, so I can conclude that Google is able to see all the links. I've read that if my CSS & JS files are available to Google's crawlers, Google can ultimately crawl those URLs, but what about passing link equity? Before I start the link building campaign, do you have any recommendations or case studies? Is it possible to include natural href links that pass link equity in a website that is entirely made with AngularJS? Thanks for all your inputs!
Intermediate & Advanced SEO
-
How will changing my website's page content affect SEO?
Our company is looking to update the content on our existing web pages and I am curious what the best way to roll out these changes is in order to maintain good SEO rankings for certain pages. The infrastructure of the site will not be modified except for maybe adding a couple of new pages, and existing domains will stay the same. If the domains are staying the same, does it really matter if I just update one page every week or so, versus updating them all at once? Just looking for some insight into how freshening up the content on the back-end pages could potentially hurt SEO rankings initially. Thanks!
Intermediate & Advanced SEO
-
Will Reducing the Number of Low Page Authority Pages Increase Domain Authority?
Our commercial real estate site (www.nyc-officespace-leader.com) contains about 800 URLs. Since 2012 the domain authority has dropped from 35 to about 20. Ranking and traffic have dropped significantly since then. The site has about 791 URLs, many of which are set to noindex. A large percentage of these pages have a Moz page authority of only "1". It is puzzling that some pages with content similar to the "1" page authority pages rank much better, in some cases "15". If we remove or consolidate the poorly ranked pages, will the overall page authority and ranking of the site improve? Would taking the following steps help?
1. Remove or consolidate poorly ranking unnecessary URLs?
2. Update content on poorly ranking URLs that are important?
3. Create internal text links (as opposed to links from menus) to critical pages?
A Moz crawl of our site's URLs is visible at the link below. I am wondering if the structure of the site is just not optimized for ranking and what can be done to improve it. Thanks. https://www.dropbox.com/s/oqchfqveelm1q11/CRAWL www.nyc-officespace-leader.com (1).csv?dl=0
Thanks, Alan
Intermediate & Advanced SEO
-
How long will this Site be punished? (place your bets!)
Got hit with a manual penalty in Feb 2014. Got it removed in 6 months (before the Penguin refresh). The whole site got deindexed at one point, including brand searches. (Brand searches have since come back.) Disavowed over 20k domains (yeah, the spam was bad). But we still have a good number of authority links such as Huff Post, .edu, wiki, Apple apps, Reddit, etc. About 97% of our links are on page 2 or beyond. Can't get past that spot 11 'wall'. The suppression machine is not kind. We even had a very popular article get tons of shares and media pickup, and the original article would not rank on the first page for its title. Our 'brand + keyword' gets about 2k searches a month. Just 'keyword' gets nothing, which I find amusing. So what's the prognosis, doc? Another year, 3, or never? Anyone else in the same boat? Wait for the next Penguin refresh and hope for the best? Cheers
Intermediate & Advanced SEO
-
If we remove all of the content for a branch office in one city from a web site, will it harm rankings for the other branches?
We have a client with a large, multi-city home services business. The service offerings vary from city to city, so each branch has its own section on a fairly large (~6,000 pages) web site. Each branch drives a significant amount of revenue from organic searches specific to its geographic location (ex: Houston plumbers or Fort Worth landscaping). Recently, one of the larger branches has decided that it wants its own web site on a new domain because they have been convinced by an SEO firm that they can get better results with a standalone site. That branch wants us to remove all of its content (700-800 pages) from the current site and has said we can 301 all inbound links to the removed content to other pages on the existing site to mitigate any loss to domain authority. The other branch managers want to know if removing this city-specific content could negatively impact search rankings for their cities. On the surface it seems like as long as we have proper redirects in place, the other branches should be okay. Am I missing something?
Intermediate & Advanced SEO
-
CMS generating thousands of links, will it hurt my SEO?
I've shifted my static (HTML) eCommerce website to Magento. I am facing a serious problem: my website has a total of 20 products (each product has a canonical URL), but I was surprised to see thousands of links indexed in Google as well as in my Webmaster Tools crawl stats. Later on I removed them all from Webmaster Tools and marked them as fixed, and also blocked crawlers from those specific directories through the robots.txt file. Now my question is: will these URLs still affect my website's SEO? They still exist and are accessible, but are blocked for crawlers. And is there any better way to block them other than robots.txt? Thanks
Intermediate & Advanced SEO
-
Will changing my WordPress permalinks add SEO value?
I'm considering changing my permalinks from: http://www.musicliveuk.com/corporate-entertainment-1 http://www.musicliveuk.com/corporate-entertainment-2 to: http://www.musicliveuk.com/corporate-entertainment-london etc... (these are example pages and don't actually exist) as I want to optimise pages for specific cities. This will create a load of 404 errors which I will have to 301 redirect (I presume that's the best way of doing it?). Does having the keyword in the URL help, and is the added SEO value (if there is any) worth it?
Intermediate & Advanced SEO
-
Best way to stop pages being indexed while keeping PageRank
If, for example, on a discussion forum, what would be the best way to stop pages such as the posting page (where a user posts a topic or message) from being indexed AND not dilute PageRank too? If we added them to the Disallow list in robots.txt, would PageRank still flow through the links to those blocked pages, or would it stay concentrated on the linking page? Your ideas and suggestions will be greatly appreciated.
Intermediate & Advanced SEO