Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12 December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
href="#" and href="javascript:void(0)" links. Is there a difference SEO-wise?
-
I am currently working on a site re-design and we are looking at whether href="#" and href="javascript:void(0)" have an impact on the site. We were initially looking at getting the number of links per page down, but I am thinking that rel="nofollow" is the best method for this. Has anyone had any experience with this? Thanks in advance.
-
All links consume link juice, even nofollow links.

What happens to that link juice is the question. Does href="#" just flow back to the same page? My first thought is yes, but if that were the case you could manipulate how much link juice flows out of the other links simply by adding more of them, so I think these links may waste link juice. JavaScript links also use link juice, and there is no guarantee that Google is able to pass that link juice on.

A lot of CMSs put href="#" on an A tag and then use the A tag for some other purpose, such as a button that fires JavaScript. My view: if you want a button, use a BUTTON; if you want a JavaScript trigger, attach the event to a SPAN or DIV; use A tags only for real links, and be sure you know what is happening to your link juice.
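A minimal sketch of that advice: real navigation stays on an A tag with a crawlable destination, and UI actions get a real BUTTON with the behaviour attached in script (the element IDs and handler body here are illustrative, not from the thread):

```html
<!-- Real navigation: a genuine link with a crawlable destination -->
<a href="/pricing">Pricing</a>

<!-- UI action: a real button instead of <a href="#"> or href="javascript:void(0)" -->
<button type="button" id="toggle-menu">Menu</button>

<script>
  // Attach the behaviour in script rather than repurposing an A tag
  document.getElementById('toggle-menu').addEventListener('click', function () {
    document.querySelector('nav').classList.toggle('open');
  });
</script>
```

This keeps the link graph clean for crawlers while giving assistive technologies the correct element semantics for the button.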
-
Thanks for the response. The number of links really varies per page, but could be around 170 in some cases, and some of these links are external as well as internal. The site itself has plenty of content, so it isn't a case of us trying to cheat any Google guideline, but of trying to keep the number of links per page down.
Basically, I wanted to know if we would be hurt by using a JavaScript link instead of the usual href="#".
-
How many links are on the page?
If the links are internal and there to help users navigate, then why not leave them as follow? If there are so many links that you're concerned, it might be worth considering that there are too many links, not just as far as Google is concerned but also from the user's perspective.
Remember, using nofollow to sculpt PageRank is against Google's guidelines.
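For reference, the nofollow hint mentioned above is set per link; a minimal sketch (the URL is illustrative):

```html
<!-- Hints to search engines not to follow this link or pass equity through it -->
<a href="https://example.com/" rel="nofollow">Example</a>
```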
Related Questions
-
Significant "Average Position" dips in Search Console each time I post on Google My Business
Hi everyone, Several weeks ago I noticed that each Wednesday my site's Average Position in Search Console dipped significantly. Immediately I identified that this was the day my colleague published and back-linked a blog post, and so we spent the next few weeks testing and monitoring everything we did. We discovered that it was ONLY when we created a Google My Business post that the Average Position dipped, and on the 1st July we tested it one more time. The results were the same. I am 100% confident that Google My Business is the cause of the issue, but can't identify why. The image I upload belongs to me, the text isn't spammy or stuffed with keywords, the Learn More links to my own website, and I never receive any warnings from Google about the content. I would love to hear the community's thoughts on this and how I can stop the issue from continuing. I should note that my Google My Business insights are generally positive, i.e. no dips in search results etc. My URL is https://www.photographybymatthewjames.com/ Thanks in advance, Matthew
White Hat / Black Hat SEO | PhotoMattJames
How many links can you have on sitemap.html
We have a lot of pages that we want to create crawlable paths to. How many links can be crawled on one page of sitemap.html?
White Hat / Black Hat SEO | imjonny
The use of a ghost site for SEO purposes
Hi Guys, We have just taken on a new client (.co.uk domain) and during our research have identified they also have a .com domain which is a replica of the existing site, but all links lead to the .co.uk domain. As a result, the .com replica is pushing 5,000,000+ links to the .co.uk site. After speaking to the client, it appears they were approached by a company who said they could get the .com site ranking for local search queries and then push all that traffic to the .co.uk. From analytics we can see that very little referrer traffic is coming from the .com. It sounds remarkably dodgy to us: surely the duplicate site is an issue anyway for obvious reasons, and these links could also be deemed as being created for SEO gain? Does anyone have any experience of this as a tactic? Thanks, Dan
White Hat / Black Hat SEO | SEOBirmingham81
Adult Toy Store SEO
Hi fellows, I'm no stranger to SEO. I have been promoting our spiritual network through SEO and we have received great returns from it. I'm planning to promote an adult toy store via SEO. I have never done any adult store promotion before, but I think there are a lot of downsides to it, such as: #1 When I search related keywords, many porn websites show up; I assume the niche looks spammy in Google's eyes. Also, most of the links I will get will probably be from porn websites, due to relevancy. #2 Many of our returning customers come from retargeting, but I assume there is no adult promotion via Google Display. Is that right? (It's not SEO related.) I'm wondering whether Google is against adult content in any way. Any feedback is appreciated.
White Hat / Black Hat SEO | Arian-Ya
How do you change the 6 links under your website in Google?
Hello everyone, I have no idea how to ask this question, so I'm going to give it a shot and hopefully someone can help me! My company is called Eteach, so when you type Eteach into Google, we come in the top position (phew!), but there are 6 links that appear underneath it. How do you change these links? I don't even know what to call them, so if there is a particular name for them then please let me know! They seem to be an organic rank rather than PPC, but if I'm wrong then do correct me! Thanks!
White Hat / Black Hat SEO | Eteach_Marketing
Why do expired domains still work for SEO?
Hi everyone, I've been running an experiment for more than a year to see whether buying expired domains works. I know it's considered black hat, but like I said, I wanted to experiment; that is what SEO is about. What I did was buy domains that had just expired, immediately add content on a WP setup, fill it with content relevant to the expired domain, and then start building links to other relevant sites from these domains. (Here is a pretty good post on how to do it, and I did it in a similar way: http://searchenginewatch.com/article/2297718/How-to-Build-Links-Using-Expired-Domains ) This is nothing new, and SEOs have been doing it for a long time. There are a lot of rumors around the SEO world that domains become worthless after they expire. But after trying it out for more than a year with about 50 different expired domains, I can conclude that it DOES work, 100% of the time. Some of the domains are of course better than others, but I cannot see any sign that the expired domains or the sites I link to have been punished by Google. The sites I'm linking to rank great ONLY with those links 🙂 So to the question: WHY does Google allow this? They should be able to see that a domain has expired, right? And if it's expired, why don't they just "delete" all the links to that domain after the expiry date? Google is well aware of this problem, so what is stopping them? Is there anyone here who knows how this works technically?
White Hat / Black Hat SEO | Sir
Avoiding the "sorry we have no imagery here" G-maps error
Hi there, we recently did a redesign on a big site and added Gmaps locations to almost every page, since we are in Real Estate; listings, details, and search results pages all have a map embedded. While looking at GWT I found that the top keywords on our site (which is in Spanish) are the following: "have", "here", "imagery", "sorry". After a quick search I found out this is a Gmaps bug: when Googlebot accesses the pages, the map throws out an error with this text repeated several times. If you do a search for "sorry we have no imagery here" you will see lots of sites with this issue. My question is: is this affecting the overall SEO, since bots are actually crawling and indexing this (hence it's being reported in GWT)? Should I cloak this to robots? Has anyone noticed this or been able to fix it? Thanks in advance!
White Hat / Black Hat SEO | makote
How Is Your Approach Towards Adult SEO?
I would like to know how SEOMoz community members approach adult SEO. How do you approach a project when you get one (if you do it, that is)? If you don't do adult SEO, why not? Is it because it's much more difficult than normal SEO, or do you not want to associate yourself with that industry?
White Hat / Black Hat SEO | ConversionChamp