JavaScript void and PageRank
-
Do JavaScript void links to on-page elements (not to a new page) consume PageRank?
I'm paring down links on a client's homepage, and we have JavaScript void links (wrapped in <a href=""> tags) that load videos, slider elements, etc. on the page itself. Basically, if I have a bunch of these, is it going to weaken the power of the other links on the page?
Related Questions
-
What are the best options for a website built with navigation drop-down menus in JavaScript, to get those menus indexed by Google?
This concerns f5.com, a large website with navigation menus that drop down when hovered over. The sub-nav items (example: “DDoS Protection”) are not cached by Google and therefore do not distribute internal links properly to help those sub-pages rank well. The best option, naturally, is to change the nav menus from JS to CSS, but barring that, is there another option? Will Schema SiteNavigationElement work as an alternative?
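On the SiteNavigationElement idea, a hedged sketch of what that markup could look like (the item names and URLs below are placeholders; whether Google actually uses this markup for link discovery is exactly the open question being asked):

```javascript
// Hypothetical sub-nav items; the real ones would come from the menu markup.
const navItems = [
  { name: 'DDoS Protection', url: 'https://example.com/ddos-protection' },
  { name: 'Load Balancing', url: 'https://example.com/load-balancing' },
];

// Build the JSON-LD payload that would sit in a
// <script type="application/ld+json"> block in the page head.
function buildSiteNavJsonLd(items) {
  return JSON.stringify({
    '@context': 'https://schema.org',
    '@type': 'SiteNavigationElement',
    name: items.map(i => i.name),
    url: items.map(i => i.url),
  }, null, 2);
}
```

Structured data is generally not a substitute for plain crawlable `<a href>` links, so this would supplement rather than replace a CSS-based menu fix.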
Technical SEO | CarlLarson
-
JavaScript redirects -- what are the SEO pitfalls?
Q: Is it dangerous (SEO fallout) to use JavaScript redirects? The tech team built a browser-side tool for me to easily redirect old/broken links. It's essentially a glorified 404 page: it pops a quick message that the page requested no longer exists and that we're automatically sending you to a page that has the content you are looking for. The tech team does not have the bandwidth to handle this via Apache, and this tool is what they came up with for me to deliver a better customer experience. Back story: it's a very large site and I'm dealing with thousands of pages that could/should/need to be redirected. My issue is incredibly similar to what Rand mentioned way back in a post from 2009: Are 404 Pages Always Bad for SEO? We've also decided to let these pages 404 and monitor for anything that needs an Apache redirect. The tool mentioned above was tech's idea to give me "the power" to manage redirects. What do you think?
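The core risk can be summed up by what the crawler sees on the old URL itself. A sketch (301/302 are standard HTTP semantics; the commentary strings are illustrative, not quotes from Google):

```javascript
// HTTP status the *old* URL returns under each redirect method.
const redirectMethods = {
  apache301:  { statusAtOldUrl: 301, crawlerSees: 'permanent move; equity consolidates to target' },
  apache302:  { statusAtOldUrl: 302, crawlerSees: 'temporary move' },
  jsRedirect: { statusAtOldUrl: 200, crawlerSees: 'a live page that navigates away only once JS runs' },
};

// The tool described serves a message page and then redirects with JavaScript,
// so the broken URL answers 200 -- a classic soft-404 shape from a crawler's view.
function statusSeenByCrawler(method) {
  return redirectMethods[method].statusAtOldUrl;
}

console.log(statusSeenByCrawler('jsRedirect')); // 200
```

That 200-then-navigate shape is why server-side 301s are usually preferred when consolidating old URLs, bandwidth permitting.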
Technical SEO | FR123
-
Infinite PageRank Dilemma
Hey Moz fans, I have an exciting question for you today: http://i.imgur.com/dl0r9s1.png
Technical SEO | atakala
I'll try to visualize it, but let me explain too. In the PageRank algorithm, as you know, PageRank flows through links, no matter whether they are internal or external. The link juice that passes through each link can be found by taking the page's PageRank, times 85%, divided by the total number of outlinks (whether or not they carry a nofollow attribute).
Everything is okay so far. But what would happen in the situation shown in the image above? Does it become a loop, with each of the pages pushing its PageRank up to 10? Thanks for your help.
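The loop question can be checked numerically. A minimal sketch of the iterative PageRank computation with the 0.85 damping factor, on two pages that link only to each other: the scores converge rather than inflating without bound, because each pass multiplies the forwarded value by 0.85.

```javascript
// links[i] lists the pages that page i links to: two pages in a closed loop.
const links = [[1], [0]];
const damping = 0.85;
const n = links.length;

function pagerank(iterations) {
  let pr = new Array(n).fill(1 / n);
  for (let it = 0; it < iterations; it++) {
    // Base share every page gets regardless of links.
    const next = new Array(n).fill((1 - damping) / n);
    // Each page splits 85% of its score evenly among its outlinks.
    for (let i = 0; i < n; i++) {
      for (const j of links[i]) next[j] += damping * pr[i] / links[i].length;
    }
    pr = next;
  }
  return pr;
}

console.log(pagerank(50)); // ≈ [0.5, 0.5] -- the loop converges, it does not run off to infinity
```

The 15% "leak" at every hop is exactly what prevents the infinite-amplification scenario in the image.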
Would using JavaScript onclick functions to override href targets be OK?
Hi all, I am currently working on a new search facility for my ecommerce site. It has very quickly dawned on me that this new facility is far better than my standard product pages from a user's point of view: lots of product attributes for customers to find what they need faster, the ability to compare products, etc. All in all, just better. BUT NO SEO VALUE!!!

I want to use this search facility instead of my category/product pages. However, as they are search pages, I have robots noindex on them and don't think it's wise to change that. I have spoken to the developers of this software and they suggested I could use some JavaScript in the navigation to change the onclick function to take the user to the search equivalent of the page. They said this way my normal pages are the ones that are still indexed by Google etc., but the user has the benefit of using the improved search pages.

This sounds perfect; however, it also sounds a little deceptive, and I know Google has loads of rules about these kinds of things. The last thing I want is to get any kind of penalty or any negative reaction from an SEO point of view. I am only considering this as it will improve the user experience on my website. Can anyone advise if this is OK, or a "no-no"?

P.S. For those wondering, I use an "off the shelf" cart system and it would cost me an arm and a leg to have these features built into my actual category/product pages.
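For reference, the mechanic the developers are describing usually looks roughly like the sketch below (the URL scheme and function names are hypothetical). The crawlable href stays intact; only a JavaScript-enabled click gets rerouted:

```javascript
// Map a crawlable category URL to its richer search-page equivalent
// (the '/search?q=' scheme is an assumption for illustration).
function searchEquivalent(href) {
  const slug = href.split('/').filter(Boolean).pop();
  return '/search?q=' + encodeURIComponent(slug);
}

// In the browser this would be wired up along these lines:
//   <a href="/category/widgets" onclick="return reroute(this)">Widgets</a>
//   function reroute(link) {
//     window.location.assign(searchEquivalent(link.getAttribute('href')));
//     return false; // cancel the default href navigation
//   }

console.log(searchEquivalent('/category/widgets')); // '/search?q=widgets'
```

The sketch only shows the mechanics; whether sending users and crawlers to systematically different pages is acceptable is precisely the policy question the post is asking.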
Technical SEO | isntworkdull
-
JavaScript to manipulate Google's bounce rate and time on site?
I was referred to this "awesome" solution to high bounce rates. It is supposed to "fix" bounce rates and lower them through this simple script. When the bounce rate goes way down, rankings dramatically increase (interesting study, but not my question). I don't know JavaScript, but simply adding a script to the footer and watching everything fall into place seems a bit iffy to me. Can someone with experience in JS help me by explaining what this script does? I think it manipulates the reporting it does to GA, but I'm not sure. It was supposed to be placed in the footer of the page, and then you sit back and watch the dollars fly in. 🙂
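Without seeing the script itself, an educated guess: this is the common "adjusted bounce rate" trick, which fires a Google Analytics event after a fixed delay so that anyone who stays past the threshold no longer counts as a bounce. A sketch with the tracking call stubbed out so the logic is visible (the 15-second cutoff is an assumption; real scripts vary):

```javascript
// Stub standing in for ga('send', 'event', ...) so the decision logic is testable.
const sentEvents = [];
function sendEvent(category, action) { sentEvents.push({ category, action }); }

const THRESHOLD_MS = 15000; // commonly used cutoff (assumption)

// Decide whether a visit still counts as a bounce after `elapsedMs` on the page.
function recordEngagementIfDue(elapsedMs) {
  if (elapsedMs >= THRESHOLD_MS) {
    sendEvent('engagement', 'time-on-page'); // real script: ga('send', 'event', ...)
    return false; // the event means GA no longer treats the visit as a bounce
  }
  return true; // still a bounce
}

// The real footer script would wire this to a timer, roughly:
//   setTimeout(() => recordEngagementIfDue(THRESHOLD_MS), THRESHOLD_MS);
```

Note what this does and does not do: it reshapes the bounce metric reported to GA; it does not change how visitors actually behave, and GA numbers are not a ranking input Google has confirmed, so the "dollars fly in" promise deserves skepticism.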
Technical SEO | BenRWoodard
-
Optimizing a website which uses JavaScript and jQuery
Just a quick question (or two). If I have divs which are hidden on my page, but are displayed when:
- a user clicks on a p tag and the hidden div is displayed using jQuery, or
- a user clicks on an a tag and the hidden div is displayed using jQuery,
with the href being cancelled in both examples, will the hidden content be optimized, or will the fact it is initially hidden make it harder to optimize? Thanks for any answers!
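One way to reason about it: the hidden div's text is still present in the HTML the crawler downloads; display:none only affects rendering, and the jQuery .show() call happens after the fact. A sketch (the markup is hypothetical):

```javascript
// The page as served: the div is hidden with CSS and revealed later by jQuery,
// e.g. $('#toggle').on('click', () => $('#details').show());
const pageSource = `
  <p id="toggle">Read more</p>
  <div id="details" style="display:none">
    Full product specification text lives here.
  </div>
`;

// A crawler fetching this URL receives the hidden text in the raw response.
function rawTextContains(source, phrase) {
  return source.includes(phrase);
}

console.log(rawTextContains(pageSource, 'Full product specification')); // true
```

So the content is indexable in principle; the open question, and one Google commented on around this era, is whether initially hidden content is weighted as strongly as visible content.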
Technical SEO | PhatJP
-
JavaScript
Hi mozzers, For my website I use various affiliate programs on Commission Junction. Some of the text ads are in JavaScript. Will Google read the text ads or not? Cheers, Peter
Technical SEO | PeterM22
-
PageRank Dropped?
The Symptoms
About a year ago, our site EZWatch-Security-Cameras.com had a PageRank of 5. Several months ago it sank to a 4, and we were a little worried, but it wasn't anything to really sweat over. At the end of January we noticed it had dropped again, to a PR3; again, we were a little more worried. When the Farmer update hit, we suddenly dropped to a PR1, but our traffic wasn't seriously affected, and in March most of the pages regained their PageRank. I noticed this morning that our homepage rank has once again dropped to a PR1. I am waiting to see if there has been any significant drop in traffic, but I haven't spotted anything that stands out as significant, aside from an increase in the average cost for our paid search account of about 5%.

The Problems We've Spotted
Keep in mind that our current website is fairly old (2005) and we are ready to launch a new one. Our current website is running on X-Cart, and we have a few modules added on.

Problem 1 - One such module handles a custom kit builder. This area has not been restricted from crawlers, and it could be generating a large number of needless page crawls.

Problem 2 - Another module allows "SEO-friendly URLs", according to the developer, but what actually happens is that a visitor could type in any-url-they-like-for-product-id-p-11111.html, where the leading portion can be any character string (or nothing), followed by either a product or category indicator and the ID for said item. This causes a massive amount of virtual page duplication, and the module is encrypted, so we aren't able to modify it to include rel="canonical" tags. Obviously this causes massive amounts of seemingly duplicate content.

Problem 3 - In addition to the regular URL duplication, we also recently acquired the domain EZWatch.com (our brand name, easier to remember). That domain name responds with the content from our regular website, and it will be the primary domain name when we change shopping carts. With the second domain name, the content could also be considered a duplication.

The Solutions We're Working On
The website we use was designed in 2005, and we believe that it's reached the end of its useful life. Over the past several months we have been working on an entirely new shopping cart platform, designed from the ground up to be more efficient operationally speaking and to provide more SEO control. The new site will be ready to launch within days, and we will start using the new domain name at the same time. We are planning on doing page-to-page 301 redirects for all pages with at least 1 visit within the past 180+ days, according to our Google Analytics reports. We are also including rel="canonical" on all pages. We will also be restricting dynamic sections of our website via the robots.txt file.

So What More Can We Do?
With your collective SEO experience, what other factors could also be contributing to this decline?
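Problem 2 can be made concrete: since any character string before the `-p-<id>.html` suffix resolves to the same product, a canonicalizing step (sketched here; the real fix is the rel="canonical" tag the new cart adds, and the `/product-` slug below is a hypothetical choice) maps every variant to one URL:

```javascript
// Every path of the form <anything>-p-<id>.html serves the same product page.
// Extract the numeric id and rebuild a single canonical URL for it.
function canonicalProductUrl(path) {
  const m = path.match(/-p-(\d+)\.html$/);
  return m ? `/product-p-${m[1]}.html` : path;
}

console.log(canonicalProductUrl('/any-url-they-like-p-11111.html'));      // '/product-p-11111.html'
console.log(canonicalProductUrl('/totally-different-text-p-11111.html')); // same canonical URL
```

Emitting that canonical URL in a rel="canonical" tag (or 301-redirecting variants to it) collapses the infinite variant space into one indexable page per product.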
Technical SEO | EZWatchPro