href="#" and href="javascript.void()" links. Is there a difference SEO wise?
-
I am currently working on a site re-design and we are looking at whether href="#" and href="javascript:void(0)" have an impact on the site. We were initially looking at getting the number of links per page down, but I am thinking that rel="nofollow" is the best method for this. Has anyone had any experience with this? Thanks in advance.
-
All links consume link juice, even nofollow links.
What happens to that link juice is the question. Does the juice from an href="#" link just flow back to the same page? My first thought is yes, but if that were the case you could manipulate how much link juice flows out of a page's other links simply by adding more of them, so I think these links may waste link juice. JavaScript links also consume link juice, and there is no guarantee that Google is able to pass that link juice on.
A lot of CMSs put href="#" on an A tag and then use the A tag for some other purpose, such as a button to fire JavaScript. I believe that if you want a button, use a button; if you want a JavaScript trigger, attach the event to a SPAN or DIV. Use A tags only for real links, and be sure you know what is happening to your link juice.
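To illustrate the advice above, here is a minimal markup sketch (illustrative only; `openMenu` is a hypothetical handler, not from the original post): use a real button for scripted actions, and reserve anchors for crawlable destinations.

```html
<!-- Anchor abused as a JS trigger: the pattern the answer advises against -->
<a href="#" onclick="openMenu()">Menu</a>

<!-- Prefer a real button for actions... -->
<button type="button" id="menu-toggle">Menu</button>
<script>
  // openMenu is a hypothetical handler used only for this sketch
  document.getElementById('menu-toggle')
          .addEventListener('click', openMenu);
</script>

<!-- ...and keep <a> for real, crawlable destinations -->
<a href="/products/">Products</a>
```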
-
Thanks for the response. The number of links really varies per page, but could be around 170 in some cases, and some of these links are external as well as internal. The site itself has plenty of content, so it isn't a case of us trying to cheat any Google guideline, but of trying to keep the number of links per page down.
Basically, I wanted to know whether we would be hurt by using JavaScript links instead of the usual href="#".
-
How many links are on the page?
If the links are internal and there to help users navigate, then why not leave them as follow? If there are so many links that you're concerned, it might be worth considering that there are too many links, not just as far as Google is concerned but also from the user's perspective.
Remember, using nofollow to sculpt PageRank is against Google's guidelines.
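For reference, nofollow is just an attribute on the anchor; a minimal sketch (the URL is illustrative):

```html
<!-- Followed (default): can pass link equity -->
<a href="/pricing/">Pricing</a>

<!-- Nofollowed: hints that crawlers should not pass equity through it -->
<a href="/pricing/" rel="nofollow">Pricing</a>
```

Nofollow is meant for untrusted or paid links; applying it across internal navigation purely to steer PageRank is the sculpting practice the answer above warns against.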
Related Questions
-
Disappearing Links - Black Hat?
I have seen reports of black hat spamming with dodgy links, but we have another issue with a client's site. The site had a small number of solid following links (about 60) which had been in place for years, and in the past few weeks all but those directly under their control have ceased to link. At the same time, a very aggressive competitor owned by the officers of an SEO company has entered their market. Could it be that they have somehow disavowed the links to the site to damage it, and how do we find out? There are now just 10 following links.
White Hat / Black Hat SEO | Eff-Commerce
-
Negative SEO and when to use the Disavow tool?
Hi guys, I was hoping someone could help me with a problem that has arisen on the site I look after. This is my first SEO job and I've had it about 6 months now. I think I've been doing the right things so far: building quality links from reputable sites with good DA, working with bloggers to push our products, and only signing up to directories in our niche. So our backlink profile is very specific, with few spammy links.
Over the last week, however, we have received a huge increase in backlinks, which has almost doubled our linking-domain total. I've checked the links out in Webmaster Tools and they are mainly directories or web-stat websites like these: siteinfo.org.uk, deperu.com, alestat.com, domaintools.com, detroitwebdirectory.com, ukdata.com, stuffgate.com.
We've also just launched a new initiative where we will be producing totally new, good-quality content 4-5 times a week, and many of these new links are pointing to that page, which looks very suspicious to me. Does this look like negative SEO to anyone?
I've read a lot about the disavow tool, and it seems people's opinions are split on when to use it, so I was wondering if anyone had any advice on whether to use it or not. It's easy for me to identify what these new links are, yet some of them have decent DA, so will they do any harm anyway? I've also checked the referring anchors on Ahrefs, and now over 50% of my anchor term cloud is terms totally unrelated to my site; this has happened over the last week, which also worries me. I haven't seen any negative impact on rankings yet, but if this carries on it will destroy my link profile.
So would it be wise to disavow these links as they come through, or wait to see if they actually have an impact? It should be obvious to Google that there has been a huge spike in links, so the question is whether they will be ignored or whether I will be penalised. Any ideas? Thanks in advance. Richard
White Hat / Black Hat SEO | Rich_995
-
Does Google consider a followed affiliate link into my site a paid link?
Let's say I have a link coming into my domain like this: http://www.mydomain.com/l/freerol.aspx?AID=674&subid=Week+2+Freeroll&pid=120 Do you think Google recognizes this as a paid link? These links are follow links. I am working on a site that has tons of these, but it ranks fairly well. It did lose some ranking over the past month or so, and I am wondering if that might be related to a recent iteration of Penguin. These are very high-PR inbound links from a number of good domains, so I would not want to make a mistake and have the client get affiliates to nofollow if that is going to cause his rankings to drop more. Any thoughts would be appreciated.
White Hat / Black Hat SEO | Robertnweil1
-
"NOINDEX,FOLLOW" same as "NOINDEX, FOLLOW" ?
Notice the space between them. I am trying to debug my application, and sometimes it puts in a space. Will this small difference matter to the bots?
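As a rough illustration of why the space should not matter: robots directive parsers typically split the content attribute on commas and trim surrounding whitespace, so both spellings yield the same directives. A hedged sketch (not Google's actual parser):

```javascript
// Parse a robots meta content string into a normalized list of directives.
// Splitting on commas and trimming means "NOINDEX,FOLLOW" and
// "NOINDEX, FOLLOW" come out identical.
function parseRobotsContent(content) {
  return content
    .split(',')
    .map((directive) => directive.trim().toLowerCase())
    .filter((directive) => directive.length > 0);
}

console.log(parseRobotsContent('NOINDEX,FOLLOW'));  // [ 'noindex', 'follow' ]
console.log(parseRobotsContent('NOINDEX, FOLLOW')); // [ 'noindex', 'follow' ]
```

Under that common parsing behaviour the two forms are equivalent, though fixing the application to emit one canonical form is still the tidier debug outcome.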
White Hat / Black Hat SEO | bjs2010
-
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js. Dust.js is a JavaScript client-side templating solution that takes the presentation layer away from the data layer. The problem with front-end solutions like this is that they are not SEO-friendly, because all the content is being served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this as being "safe".
Read about LinkedIn switching over to Dust.js: http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more
Explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino."
Basically, what would be happening on the backend of our site is that we would be detecting the user-agent of all traffic, and once we found a search bot, serve up our web pages server-side instead of client-side so the bots can index our site. Server-side and client-side will be identical content, and there will be NO black hat cloaking going on. The content will be identical. But this technique is cloaking, right?
From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable."
Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355
Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side solutions are safe from getting us kicked out of search engines? Thank you in advance for ANY help with this!
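The user-agent detection step described above can be sketched as a simple substring check. This is a hedged illustration, not Dust.js or LinkedIn's actual implementation; the bot tokens are common crawler names, and a production check would be more careful (and, as the question stresses, must serve identical content either way):

```javascript
// Crude user-agent sniff: decide whether to serve the pre-rendered
// (server-side) HTML instead of the client-side Dust.js templates.
// Token list is illustrative, not exhaustive.
const BOT_TOKENS = ['googlebot', 'bingbot', 'yandexbot', 'baiduspider'];

function isSearchBot(userAgent) {
  const ua = (userAgent || '').toLowerCase();
  return BOT_TOKENS.some((token) => ua.includes(token));
}

console.log(isSearchBot(
  'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)'
)); // true
console.log(isSearchBot('Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36')); // false
```

Whether this counts as cloaking hinges on the content being byte-for-byte equivalent, exactly as the question argues; the detection mechanism itself is the same one used for either purpose.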
White Hat / Black Hat SEO | Bodybuilding.com
-
Banner ads: do they help SEO?
I see OSE counting banner ads as incoming links. My question is: has anyone done a study of a non-tagged banner ad link and its effect on SEO? Does Google count it as organic since it has no tagging, or is it ignored since it's in an ad spot?
White Hat / Black Hat SEO | DavidKonigsberg
-
"take care about the content" is it always true?
Hi everyone, I keep reading answer ,in reference to ranking advice, in wich the verdict is always the same: "TAKE CARE ABOUT THE CONTENT INSTEAD OF PR", and phrases like " you don't have to waste your time buying links, you have first of all to engage your visitors. ideally it works but not when you have to deal with small sites and especially when you are going to be ranked for those keywords where there's not too much to write. i'll give you an example still unsolved: i've got a client who just want to be ranked first for his flagship store, now his site is on the fourth position and the first ranked is a site with no content and low authority but it has the excact keyword match domain. tell me!!! what kind of content should i produce in order to be ranked for the name of the shop and the city?? the only way is to get links.... or to stay forth..... if you would like to help me, see more details below: page: http://poltronafraubrescia.zenucchi.it keyword: poltrona frau brescia competitor ranked first: http://turra.poltronafraubrescia.it/ competiror ranked second: http:// poltronafraubrescia.com/
White Hat / Black Hat SEO | guidoboem
-
Can our white hat links get a bad rap when they're alongside junk links busted by Panda?
My firm has been creating content for a client for years: video, blog posts and other references. This client's web vendor has been using bad links and link farms to bolster rank for key phrases, successfully. Until last week, when Google slapped them. They have been officially warned in WMT for possibly using artificial or unnatural links to build PageRank. They went from page one for the most popular term in Chicago for their industry, where they had been for over a year, to page 8 overnight. Other, less generic terms that we were working on felt the sting as well. I was aware of, and had warned the client about, the possibility of repercussions from these black hat tactics (http://www.seomoz.org/blog/how-google-makes-liars-out-of-the-good-guys-in-seo#jtc170969), but didn't go as far as to recommend they abandon them. Now I'm wondering if one of our legitimate sites (YoChicago.com), which has more than its share of the links into the client site, is being treated as a bad link source. All of our links are legitimate, i.e., anchor text matches the destination, and video links describe the entity linked to. Are we vulnerable? Any insight would be appreciated.
White Hat / Black Hat SEO | mikescotty