Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While the content is not being removed entirely - many posts will remain viewable - both new posts and new replies are locked.
Do Backlinks to a PDF help with overall authority/link juice for the rest of the domain?
-
We are working on a website that offers some high-quality industry articles.
For each article, there is an abstract page with a link to the PDF, which is hosted on the same domain.
We have found in Analytics that many sites link directly to the PDF rather than to the page with the article's abstract.
Can we get any benefit from a direct PDF link, or do we need to modify our strategy?
-
You can (and should) also build internal links from within the domain to the PDF report you want to share. Links inside the PDF file itself will also help when people share the file via email rather than linking to it on the site, since those links bring readers back to your domain.
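To illustrate that last point, here is a minimal sketch of adding a clickable link back to your domain inside an existing PDF, using the pypdf library. The file names, URL, and rectangle coordinates are hypothetical; the coordinates are in PDF points measured from the bottom-left corner and would need adjusting to your layout.

```python
from pypdf import PdfReader, PdfWriter
from pypdf.annotations import Link

# Hypothetical input file: the report you want people to share
reader = PdfReader("industry-report.pdf")
writer = PdfWriter()
writer.append(reader)  # copy all pages into the writer

# Add a clickable rectangle on page 1 that points back to the domain.
# rect is (x1, y1, x2, y2) in PDF points, from the bottom-left corner.
link = Link(rect=(50, 40, 300, 60), url="https://www.example.com/articles/")
writer.add_annotation(page_number=0, annotation=link)

with open("industry-report-linked.pdf", "wb") as f:
    writer.write(f)
```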
-
Also, have a Google for optimising PDFs for search; I know there are a number of ways and options:
http://www.seomoz.org/ugc/how-to-optimize-pdf-documents-for-search
http://searchengineland.com/eleven-tips-for-optimizing-pdfs-for-search-engines-12156
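One of the simpler optimisations those articles cover is setting the PDF's document metadata, since search engines can use the embedded title as the result title. A minimal sketch with pypdf, assuming a hypothetical file name and metadata values:

```python
from pypdf import PdfReader, PdfWriter

reader = PdfReader("industry-report.pdf")
writer = PdfWriter()
writer.append(reader)  # copy all pages into the writer

# Set the document info fields; the /Title in particular can surface
# as the title of the PDF in search results.
writer.add_metadata({
    "/Title": "2024 Industry Report: Widget Market Trends",
    "/Author": "Example Co. Research Team",
    "/Subject": "Annual market analysis for the widget industry",
})

with open("industry-report-optimized.pdf", "wb") as f:
    writer.write(f)
```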
-
Add links to your site within the .pdf.
You can also add your branding to the .pdf.
Related Questions
-
Backlink quality vs quantity: Should I keep spammy backlinks?
Regarding backlinks, I'm wondering which is more advantageous for domain authority and Google reputation:

Option 1: More backlinks, including a lot of spammy links
Option 2: Fewer backlinks, but only reliable, non-spam links

I've researched this topic around the web a bit and understand that the answer is somewhere in the middle, but given my site's specific backlink volume, the answer might lean one way or the other. For context, my site has a spam score of 2%, and when I did a quick backlink audit, roughly 20% are ones I want to disavow. However, I don't want to eliminate so many backlinks that my DA goes down. As always, we are working to build quality backlinks, but I'm interested in whether eliminating 20% of backlinks will hurt my DA. Thank you!
Technical SEO | LianaLewis
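For reference, the disavow step mentioned in the question above is done with a plain-text file uploaded through Google Search Console's disavow tool. A minimal sketch of the file format (the domains and URL are hypothetical):

```text
# Hypothetical disavow.txt for Google Search Console.
# Lines starting with "#" are comments.

# Disavow every link from an entire domain:
domain:spammy-directory.example

# Disavow a single linking page:
http://link-farm.example/widgets/page-with-link.html
```
-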
Old domain to new domain
Hi, A website on server A is no longer required. The owner has redirected some URLs of this website (via plugin) to his new website on server B, but not all URLs. So when I use the site: command on website A, I see a mixture of redirected and non-redirected URLs; therefore two websites are still being indexed in some form, causing duplication. However, weirdly, when I crawl with Screaming Frog I only see one URL, which is 301 redirected to the new website. I would have thought I'd see lots of URLs which hadn't been redirected. How come it is different from using the site: command?

Anyway, how do I move to the new website completely without the old one being indexed anymore? I thought I knew this but have read so many blogs I've confused myself! Should I:

1. Redirect all URLs via the .htaccess file on the old website on server A? There are lots of pages indexed, so a lot of URLs. What if I miss some? or
2. Point the old domain via DNS to server B and do the redirects in website B's .htaccess file? This seems more sensible, but does this method still retain the website's rankings?

Thanks for any help
Technical SEO | AL123al
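For what it's worth, option 1 in the question above is typically handled with a single catch-all rule rather than one redirect per URL, so nothing gets missed. A minimal sketch for the old site's .htaccess, assuming Apache with mod_rewrite enabled and hypothetical domain names:

```apacheconf
# Minimal sketch: send every request on the old domain to the same
# path on the new domain with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-site\.example$ [NC]
RewriteRule ^(.*)$ https://www.new-site.example/$1 [R=301,L]
```
-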
Help: domain name change and Google News
Hi. I work for a regional news source, and our (separate) Spanish-language news publication recently changed its domain name. The publication lost its Google News inclusion. Most of their traffic came from Google News, so traffic tanked. They're trying to get back in. They reapplied but didn't get approved. They're now in the 30-day waiting period to reapply again. The website is run by a third-party company, which handled the domain name change in April (2015). That company has been running their site for a couple of years. Our in-house devs' hands are tied on helping, because we (at the mother company) don't manage their site. This third party has not been responsive. The Spanish pub folks have reached out to me to help them prepare for Round 2 of reapplication. I'm the mothership in-house SEO, but I've never experienced this situation before. Because everything seems to be in order besides the ham-handed changes, my best advice to them so far is: You'll have to wait until Google gets to know you again, unfortunately. Does that sound right? Any pointers out there for bringing their best possible A-game to the next round?
Technical SEO | | christyrobinson1 -
Remove page with PA of 69 and 300 root domain links?
Hi, We have a few pages within our website which were at one time a focus for us, but due to developing the other areas of the website, they are now defunct (better content elsewhere) and in some ways slightly duplicated, so we're merging two areas into one.

We have removed the links to the main hub page from our navigation, and were going to 301 this main page to the main hub page of the section which replaces it. However, I've just noticed the page due to be removed has a PA of 69 and 15,000 incoming links from 300 root domains. So not bad! It's actually stronger than the page we are 301'ing it to (but swapping isn't really an option, as the URL structure would look messy).

With this in mind, is the strategy to redirect still the best, or should we keep the page and turn it into a landing page with links off to the other section? It just feels as though we would be doing this just for the sake of Google; I'm not sure how much decent content we could put on it, as we've already done that on the destination page. The incoming links to that page will still be relevant to the new section (they are both very similar, hence the merging).

Any suggestions welcome, thanks
Technical SEO | benseb
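As a point of reference for the redirect option discussed above, a single-page 301 on Apache is one line of .htaccess (the paths shown are hypothetical):

```apacheconf
# Minimal sketch: permanently redirect the retired hub page to the
# hub page of the section that replaces it.
Redirect 301 /old-hub/ /new-hub/
```
-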
How much domain authority is passed on through a link from a page with low authority?
Hello, Let's say that there is a link to site A from site B. The domain authority of site B is 85, but the link is on a page that has a page authority of only 1. Does much authority get passed along from site B to site A? (Let's assume site A has a domain authority of 35, if that's relevant.) Thank you!
Technical SEO | | nyc-seo0 -
How does a search engine bot navigate past a .PDF link?
We have a large number of product pages that contain links to a .pdf of the technical specs for that product. These are all set up to open in a new window when the end user clicks. If these pages are being crawled, and a bot follows the link for the .pdf, is there any way for that bot to continue to crawl the site, or does it get stuck on that dangling page because it doesn't contain any links back to the site (it's a .pdf) and the "back" button doesn't work because the page opened in a new window?

If this situation effectively stops the bot in its tracks and it can't crawl any further, what's the best way to fix this?

1. Add a rel="nofollow" attribute
2. Don't open the link in a new window so the back button remains functional
3. Both 1 and 2, or
4. Create specs on the page instead of relying on a .pdf

Here's an example page: http://www.ccisolutions.com/StoreFront/product/mackie-cfx12-mkii-compact-mixer - The technical spec .pdf is located under the "Downloads" tab [the content is all on one page in the source code - the tabs are just a design element]

Thoughts and suggestions would be greatly appreciated. Dana
Technical SEO | danatanseo
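One way to check whether a given spec .pdf really is a dead end for crawlers is to look for URI link annotations inside it. A minimal sketch using the pypdf library, with a hypothetical file name:

```python
from pypdf import PdfReader

# Hypothetical file: a product spec sheet hosted on the site
reader = PdfReader("mackie-cfx12-specs.pdf")

found = False
for page_num, page in enumerate(reader.pages, start=1):
    # Link annotations live in each page's /Annots array, if present
    for annot in page.get("/Annots") or []:
        obj = annot.get_object()
        action = obj.get("/A")
        if action is not None and "/URI" in action:
            print(f"page {page_num}: links to {action['/URI']}")
            found = True

if not found:
    print("No outbound links found - this PDF is a crawl dead end.")
```
-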
Domain authority and keyword difficulty
I know there are too many variables for a certain answer; however, do people take their domain authority into account when using the keyword difficulty tool? I have a new domain which only has a score of seven at the moment. When using the keyword research tool, what is the maximum difficulty level of keywords people would target initially? Obviously I would seek to increase the difficulty of the words over time, but to start off it's a hard choice between keywords which can be ranked for in a reasonable period of time and keywords which are getting enough traffic to make the effort worthwhile.
Technical SEO | Grumpy_Carl
-
Drop Down Menu - Link Juice Depletion
Hi, We have a site with 7 top-level sections, all of which contain a large number of subsections, which may then contain further subsections. To try and ensure the best user experience, we have a top navigation with the 7 top-level sections and, when hovered, a selection of the key subsections.

Although I like this format for the user, as it makes it easier for them to find the most important sections and subsections, it does lead to a lot of links within every page on the site. In general each top section has a drop-down with approx 10-15 subsections. This has therefore led to SEOmoz's tools issuing the "too many internal links" warning.

Alongside this, I am left wondering whether I have too many links to my subsections and whether I would be better off being more selective about when I link to them. For instance, I could choose the top 5 subsections and place a link to them from our homepage, and by doing so I would be passing a greater amount of link juice down the line.

So I guess my dilemma is between ensuring the user has as easy a time traversing the site as possible, whilst keeping a close watch on where, and how, our link juice is distributed. One solution I am considering is whether nofollow links could be utilised within the drop-down menus. This way I could have the desired user navigation and I would be in greater control of which pages link to which subsections. Would that even work?

Any advice would be greatly appreciated, Regards, Guy
Technical SEO | guycampbell
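For clarity on what the question above is proposing: a nofollowed menu item is just an ordinary anchor with a rel attribute, as in this minimal sketch of one drop-down entry (the class name and URL are hypothetical). Note that search engines have long treated nofollow as a hint rather than a strict directive, so this is not a guaranteed way to control link-equity flow.

```html
<!-- Minimal sketch: one drop-down entry marked nofollow.
     Class name and URL are hypothetical. -->
<li class="dropdown-item">
  <a href="/section/subsection/" rel="nofollow">Subsection name</a>
</li>
```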