How not to lose link juice when linking to thousands of PDF guides?
-
Hi All,
I run an e-commerce website with thousands of products.
On each product page I have a link to a PDF guide for that product. Currently we link to it with a "nofollow" tag.
Should we change it to window.open in order not to lose link juice?
Thanks
-
Dear Egol,
I'm assuming from your answer that PDFs take link juice and JPGs don't (please correct me if I'm wrong).
Is that the case even if I have to enlarge the image? (I use an href with jQuery, not a regular target="_blank".)
Also, what should I do about the certificates that are hosted on other sites? (There are too many, and they change too rapidly, for me to bring them onto my site.)
-
Thanks,
I was assuming these contained product dimensions, usage instructions, and similar information. I would want all of that content on my site.
If these are certificates, then I would link to them as .jpg images, and that eliminates concerns about link juice.
-
Dear Egol,
Thanks for the reply. I'm guessing my example of product guides wasn't accurate.
These are not guides but rather certificates of authentication, which the customers expect to see. Some I have as PDFs, some as JPGs, and for some I refer to a certificate page on another site. The question is what to do... I currently use "nofollow" links on all types of certificates (JPG, PDF, and links to other sites). What do you suggest?
I was considering either window.open (which I fear would look spammy) or leaving it as is...
Thanks
-
Just saying what I would do if this were my website.
My employees would be told that getting that pdf data onto the product sales page is a top priority job... and I would call the car dealer and order my new Jaguar.
-
I don't think having them indexed would do me any good.
The important page is the product page itself - the way I see it, there is no good reason to lose link juice for these guides...
I'm trying to improve my site's overall performance.
Important note...
Also, I have some cases where the manuals are pages on the manufacturer's site. Should I change these links from follow to JS links in order not to lose the juice from my site to other sites?
-
Are you just trying to streamline your product pages, or are you experiencing a specific issue with your product pages that you're hoping this will solve?
Linking to the PDF's via a JavaScript function should help preserve link juice and crawl budget.
You should only block the already-indexed PDF's in robots.txt if you absolutely do not want them in the index; otherwise it won't really do anything to help.
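For reference, a robots.txt rule for the PDFs might look like this (the `/guides/` path is an assumption; adjust it to wherever the PDFs actually live). Keep in mind robots.txt stops crawling, not indexing, so URLs already in the index can linger for a while:

```
# Block all crawlers from fetching the PDF guides directory
User-agent: *
Disallow: /guides/

# Or, to block PDFs site-wide by extension (Google supports the
# * and $ wildcards in robots.txt rules):
# Disallow: /*.pdf$
```
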
-
Due to your answer I checked and I see that some of the PDF's are indexed.
I didn't know that Google does that.
It is probably from the time when the links were not "nofollow". Should I exclude them using robots.txt?
-
Are the PDF's currently being crawled and indexed?
If you want to hide the PDF's from search engines, and preserve link juice, then a javascript method like you've mentioned ought to keep things wrapped up.
You could also consider some data capture, such as requiring an e-mail address to be entered before you see the download link.
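A minimal sketch of that kind of gate, assuming a hypothetical form and download path: the anchor is only written into the page after an address is submitted, so there is never a crawlable href in the delivered HTML.

```html
<form id="pdf-gate">
  <input type="email" id="email" required placeholder="you@example.com">
  <button type="submit">Get the guide</button>
</form>
<div id="download"></div>
<script>
  document.getElementById('pdf-gate').addEventListener('submit', function (e) {
    e.preventDefault();
    // In practice you would POST the address to your server here,
    // then render the link on success.
    var a = document.createElement('a');
    a.href = '/guides/product-123.pdf'; // hypothetical path
    a.rel = 'nofollow';
    a.textContent = 'Download the PDF guide';
    document.getElementById('download').appendChild(a);
  });
</script>
```

Whether the data-capture step is worth the friction depends on how much customers expect instant access to these documents.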