href="#" and href="javascript.void()" links. Is there a difference SEO wise?
-
I am currently working on a site redesign and we are looking at whether href="#" and href="javascript:void(0)" links have an impact on the site. We were initially looking at getting the number of links per page down, but I am thinking that rel="nofollow" is the best method for this. Has anyone had any experience with this? Thanks in advance.
-
All links consume link juice, even nofollow links.
What happens to that link juice is the question. Does the juice from an href="#" link just flow back to the same page? My first thought is yes, but if that were the case you would be able to manipulate how much link juice flows out of other links by adding more of them, so I think these links may simply waste link juice. JavaScript links also consume link juice, and there is no guarantee that Google is able to pass that link juice on. A lot of CMSs use href="#" on an A tag and then use that A tag for some other purpose, such as a button to fire JavaScript. I believe that if you want a button, use a button; if you want a JavaScript trigger, attach the event to a SPAN or DIV. Use A tags only for real links, and be sure you know what is happening to your link juice.
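A rough sketch of that advice in markup; the element IDs and the openMenu() handler below are just placeholders, not anything from the original post:

<!-- Instead of: <a href="#" onclick="openMenu()">Menu</a> -->

<!-- A real action gets a real button element -->
<button type="button" id="menu-toggle">Menu</button>

<!-- Or attach the behaviour to a SPAN or DIV if no button styling is wanted -->
<span class="menu-toggle" role="button" tabindex="0">Menu</span>

<!-- A tags stay reserved for real destinations -->
<a href="/products/">Products</a>

<script>
  // Attach the JavaScript behaviour to the button, not to an empty anchor.
  document.getElementById('menu-toggle').addEventListener('click', function () {
    openMenu(); // hypothetical function standing in for your own handler
  });
</script>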
-
Thanks for the response. The number of links really varies per page, but it could be around 170 in some cases, and some of these links are external as well as internal. The site itself has plenty of content, so it isn't a case of us trying to cheat any Google guideline, but rather trying to keep the number of links per page down.
Basically I wanted to know if we would be hurt by using a JavaScript link instead of the usual href="#".
-
How many links are on the page?
If the links are internal and there to help users navigate, then why not leave them as dofollow links? If there are so many links that you're concerned, it might be worth considering that there are too many links, not just as far as Google is concerned but also from the user's perspective.
Remember, using nofollow to sculpt PageRank is against Google's guidelines.
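For reference, the rel="nofollow" mentioned in the original question is just an attribute on the link itself; the URL below is a placeholder:

<a href="https://example.com/some-page/" rel="nofollow">Example link</a>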
-
Related Questions
-
Why does Google not disavow some bad links?
I have submitted bad links with a high Moz Pro spam score that I want to disavow on Google. It has been almost 4 months, yet a bad link with a high spam score still exists. Any solution? https://fortniteskinsgenerator.net/
White Hat / Black Hat SEO | marktravis
-
How to find a trustworthy SEO specialist?
How do you find a trustworthy SEO specialist if you don't know a lot about SEO?
White Hat / Black Hat SEO | DigiVital
-
Disavow or not? Negative SEO
Since last November we have been receiving a lot of low-quality backlinks from over 700 websites. It looks like one of the pages from our website has been copied, with the links kept as they are. I have left a link to an example of this here: https://goo.gl/eWQODJ Please note, all examples seem to be copied in the same way. We have also started seeing a decrease in organic traffic (Analytics picture). The decrease is not yet drastic, but it is still a decrease, and this is the third consecutive month we have seen it. Do you think it is worth using the Disavow tool for all of these bad links or not?
White Hat / Black Hat SEO | Tiedemann_Anselm
-
C-Block and link juice
We manage a couple of different domains on different hosting providers. I want to consolidate to one provider, but one site passes some good link juice to another site (actually just one link). Should I worry about having both sites on the same C-block, and probably the same IP address?
White Hat / Black Hat SEO | ThomasErb
-
Regular links may still
Good day: I understand guest articles are a good way to pass link juice, and some authors have a link to their website in the "Author Bio" section of the article. These links are usually regular links. However, I noticed that some of these sites (using WordPress) have SEO plugins with the following setting: "Nofollow: Tell search engines not to spider links on this webpage." My question is: if that setting was activated, I would assume the author's website link would look like a regular link, but some other code could still be present in the page (e.g., in the header) that would prevent this regular link from being followed. Therefore, the guest writer would not receive any link juice. Is there a way to see if this scenario is happening? What code would we look for?
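For illustration, a sketch of the kind of page-level directive such a plugin might output; the exact markup varies by plugin, so treat this as an assumption rather than any specific plugin's actual output. You would look in the page's head for a robots meta tag (the same directive can also be sent as an X-Robots-Tag HTTP response header):

<head>
  <!-- Page-level nofollow: asks search engines not to follow any link on this page,
       so an author-bio link can look normal yet pass no link juice -->
  <meta name="robots" content="nofollow">
</head>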
White Hat / Black Hat SEO | Audreythenurse
-
Linking C blocks strategy - Which hat is this tactic?
This relates to a previous question I had about satellite sites, where I questioned the white-hativity of their strategy. Basically, to increase the number of linking C-blocks, they created 100+ websites on different C-blocks that link back to our main domain. The issues I see are that the sites are 98% identical in appearance and content (only a small paragraph is different on the homepage), and the sites only have outbound links to our main domain, no inbound links. Is this legit? I am not an SEO expert, but I have received awesome advice here, so thank you in advance!
White Hat / Black Hat SEO | Buddys
-
How to recognize a Panda, Penguin or Unnatural Links penalty?
Hey guys, today I received the message below from Google, but I'm confused because there is NO such message in WMT. I've logged in and out a few times and the situation is still the same. Still nothing there? Has anybody had the same issue? Do I need to file a reconsideration request? Pleased to hear back from you guys. NikoT
Google Webmaster Tools notice of detected unnatural links to .com/ Dear site owner or webmaster of , We've detected that some of your site's pages may be using techniques that are outside Google's Webmaster Guidelines. Specifically, look for possibly artificial or unnatural links pointing to your site that could be intended to manipulate PageRank. Examples of unnatural linking could include buying links to pass PageRank or participating in link schemes. We encourage you to make changes to your site so that it meets our quality guidelines. Once you've made these changes, please submit your site for reconsideration in Google's search results. If you find unnatural links to your site that you are unable to control or remove, please provide the details in your reconsideration request. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support. Sincerely, Google Search Quality Team
White Hat / Black Hat SEO | NikoT
-
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a client-side JavaScript templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO friendly, because all the content is served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this as being "safe".
Read about LinkedIn switching over to Dust.js: http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more
Explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino."
Basically, what would be happening on the back end of our site is that we would detect the user agent of all traffic and, once we found a search bot, serve up our web pages server-side instead of client-side so the bots can index our site. Server-side and client-side content will be identical, and there will be NO black-hat cloaking going on. But this technique is cloaking, right?
From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable."
Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355
Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side fallbacks are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
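To make the mechanics concrete, here is a minimal sketch of the user-agent switch described above, assuming a Node.js/Express server; the bot pattern, the client shell markup, and the renderFullPage helper are placeholders of my own, not the poster's or LinkedIn's actual implementation. The key point, as stated above, is that both branches deliver identical content; only where the template gets rendered changes.

// Minimal sketch: serve server-rendered HTML to crawlers, the client-side
// shell to everyone else. Assumes Express; names below are hypothetical.
const express = require('express');
const app = express();

const BOT_PATTERN = /googlebot|bingbot|slurp|duckduckbot/i;

app.get('*', (req, res) => {
  const userAgent = req.get('User-Agent') || '';

  if (BOT_PATTERN.test(userAgent)) {
    // Crawler: render the same template on the server so the content
    // is present in the initial HTML response.
    renderFullPage(req.path, (err, html) => {
      if (err) return res.status(500).send('Render error');
      res.send(html);
    });
  } else {
    // Regular browser: send the client-side shell and let the JavaScript
    // templates fill in the same content in the browser.
    res.send('<html><head><script src="/app.js"></script></head>' +
             '<body><div id="app"></div></body></html>');
  }
});

// Hypothetical helper standing in for a server-side Dust.js render
// (e.g. via node.js, as the quoted explanation mentions).
function renderFullPage(path, callback) {
  callback(null, '<html><body>Server-rendered content for ' + path + '</body></html>');
}

app.listen(3000);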
White Hat / Black Hat SEO | Bodybuilding.com