Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content - many posts will still be viewable - we have locked both new posts and new replies.
Do backlinks need to be clicked to pass link juice?
-
Hi all:
Do backlinks need to be clicked to pass link juice? If so, can someone explain how much traffic is needed from a backlink for it to count as link juice?
Thanks for the help.
Audrey.
-
Backlinks do not have to be clicked in order to pass link juice. Recently my org (missionquest.org) joined Moz, and it has helped our backlinks and improved our SEO.
-
I would be surprised.
Google knows a lot, but not everything. Unless GA tracking code is installed, Google won't know about things such as a user's click.
If they passed link juice only for clicked backlinks, they would be ruling out far too big a chunk of the web. It doesn't sound logical to me.
It also doesn't sound realistic to analyze every user click in the world when refreshing the Google index; they do have a lot of metal, but not that much.
-
So, are you saying that a link having traffic kind of disqualifies it as spammy? Or at least in the eyes of Google?
-
Absolutely not. Spam links still work fantastically for ranking a site (temporarily). Those are links that never get seen or clicked; they pretty much just get crawled. Don't go the spam route, but also don't worry too much about people clicking links. I've gotten a ton of great links that have sent very, very little referral traffic, meaning links on popular posts still don't guarantee getting any (or many) clicks.
-
I don't think so. I usually fetch and render, then submit, any time I add a page to my site or make a significant change, like adding content or changing images. Nothing unnatural about it.
-
Good idea. I wonder if it would seem "unnatural", however?
-
Submitting the page to Google for indexing doesn't guarantee that the backlinks will be crawled, but it can be a good way to try to force them to be crawled.
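If you want to automate that nudge, one lightweight option is to keep your XML sitemap fresh and ping Google whenever it changes. Below is a minimal Python sketch, not an official client; the sitemap URL is a placeholder, and note that Google has since deprecated this ping endpoint in favour of submitting sitemaps through Search Console, so treat it as illustrative.

```python
# Minimal sketch: tell Google a sitemap has been updated so new pages (and
# the links on them) get recrawled sooner. SITEMAP_URL is a placeholder.
# NOTE: Google deprecated this ping endpoint in 2023; Search Console is now
# the supported way to submit sitemaps.
from urllib.parse import quote
from urllib.request import urlopen

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical sitemap

def ping_google(sitemap_url: str) -> int:
    endpoint = "https://www.google.com/ping?sitemap=" + quote(sitemap_url, safe="")
    with urlopen(endpoint) as response:
        return response.status  # HTTP 200 means the ping was accepted

if __name__ == "__main__":
    print(ping_google(SITEMAP_URL))
```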
-
In that case, wouldn't it be ideal to submit the page to Google for indexing right after it's published?
-
I think it's about page popularity and user engagement. Popularity in search results means a lot of spider activity on the page. And when a user clicks the link, a spider follows them to the new page. It's all about the spider discovering your page, and your link as well (as I think).
-
In fact, it's not like that.
I will tell you a very important rule about backlinks, one that is really hard to find. The main point is that the link needs to be discovered by Google. Also, the page that contains the link must have popularity in Google search results, which means a lot of people entering the page through search results. This is what we call "the quality of the link".
Keep up with your link building journey.
-
The way I understand it is that the click helps the link to be found faster than if it had not been clicked. It might have equity and be able to pass link juice beforehand, but until Google finds it, it might not be counted as a link to your site. Does that make sense? The link needs to be discovered before the link juice is actually counted. At least, that is the way I understand it.
I do know a few professionals who believe that if a link isn't clicked, link juice is never passed. I don't know if that is necessarily true. It makes sense that a link could be discovered but not have any equity because it isn't being used. I wonder if someone has a better idea of whether or not that is true, or if it's another secret Google keeps.
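For context on why most of the answers here lean toward "no clicks required": the original PageRank model, which is where the "link juice" metaphor comes from, is computed purely from the link graph; no click or traffic data appears anywhere in it. Here is a toy sketch over a hypothetical three-page web (Google's modern systems go far beyond this, so it's illustrative only):

```python
# Toy PageRank over a hypothetical three-page graph. Equity flows along the
# links themselves; user clicks never enter the computation.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}  # page -> pages it links to
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}  # start with equal rank
damping = 0.85  # standard damping factor from the original paper

for _ in range(50):  # iterate until the ranks settle
    new_rank = {p: (1 - damping) / len(pages) for p in pages}
    for page, outlinks in links.items():
        share = damping * rank[page] / len(outlinks)  # split equity evenly
        for target in outlinks:
            new_rank[target] += share
    rank = new_rank

print({p: round(r, 3) for p, r in rank.items()})  # C ends up with the most equity
```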
Related Questions
-
Changing URL Removes Backlink
Hello Moz Community, I have a question regarding bad backlink removal. An image on one of my site's posts got 4 to 5k backlinks from unknown sites, and there are no contact details on those sites that I could use to request removal. So I have an idea I'd like suggestions on: if I change the URL that receives the backlinks, will that remove them? For example: https://example.com/test/ got 5k backlinks; if I change this URL to https://examplee.com/test-failed/, will that remove those 5k backlinks? If not, how can I remove those backlinks? I know about disavow, but that takes time.
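A side note on the disavow option mentioned above: the file itself is trivial to generate once you've exported the referring domains; it's Google's processing that takes time. A Python sketch with placeholder domains:

```python
# Hypothetical helper: build a disavow.txt from a list of referring domains.
# "domain:" is Google's documented prefix for disavowing an entire site;
# a bare URL on its own line disavows just that one page.
bad_domains = ["spammy-example-one.com", "spammy-example-two.net"]  # placeholders

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Sites with no contact details, so removal requests were impossible\n")
    for domain in bad_domains:
        f.write(f"domain:{domain}\n")
```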
Intermediate & Advanced SEO | Jackson210 -
What IP Address does Googlebot use to read your site when coming from an external backlink?
Hi All, I'm trying to find more information on what IP address Googlebot would use when arriving to crawl your site from an external backlink. I'm under the impression Googlebot uses international signals to determine the best IP address to use when crawling (US / non-US) and then carries on with that IP when it arrives at your website. E.g. Googlebot finds www.example.co.uk. Due to the ccTLD, it decides to crawl the site with a UK IP address rather than a US one. As it crawls this UK site, it finds a subdirectory backlink to your website and continues to crawl your website with the aforementioned UK IP address. Is this a correct assumption, or does Googlebot look at altering the IP address as it enters a backlink / new domain? Also, are ccTLDs the main signals that determine the possibility of Google switching to an international IP address to crawl, rather than the standard US one? Am I right in saying that hreflang tags don't apply here at all, as their purpose is to be used in SERPs, helping Google to determine which page to serve to users based on their IP etc.? If anyone has any insight, that would be great.
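One piece of this you can check directly from your server logs: Google documents a reverse-then-forward DNS test for confirming that a visiting IP really is Googlebot, which also shows you exactly which IPs are crawling you after following a backlink. A minimal Python sketch (the sample IP is just illustrative):

```python
# Verify a crawler IP using Google's documented method: reverse-DNS the IP,
# check the hostname ends in googlebot.com or google.com, then forward-resolve
# the hostname and confirm it maps back to the same IP.
import socket

def is_googlebot(ip: str) -> bool:
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse DNS lookup
    except socket.herror:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        resolved = socket.gethostbyname(host)  # forward confirmation
    except socket.gaierror:
        return False
    return resolved == ip

print(is_googlebot("66.249.66.1"))  # illustrative IP from Googlebot's range
```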
Intermediate & Advanced SEO | MattBassos0 -
If a page ranks in the wrong country and is redirected, does that problem pass to the new page?
Hi guys, I'm having a weird problem: A new multilingual site was launched about 2 months ago. It has correct hreflang tags and geotargeting in GSC for every language version. We redirected some relevant pages (with good PA) from another website of our client's. It turned out that those pages were not ranking in the correct country markets (for example, the en-gb page ranking in the USA). The pages from our site seem to have the same problem. Do you think they inherited it due to the redirects? Is it possible that Google will sort things out over time, given the fact that the new pages have correct hreflang tags? Is there anything we could do to help them rank in the correct country markets?
Intermediate & Advanced SEO | ParisChildress1 -
Multiple Landing Pages and Backlinks
I have a client that does website contract work for about 50 governmental county websites. The client has the ability to add a link back in the footer of each of these websites. I want my client to get backlink juice for a different key phrase from each of the 50 agencies (basically just my key phrase with the different county name in it). I also want a different landing page to rank for each term. The 50 different landing pages would be a bit like location pages for local search: each one targets a different county. However, I do not have a lot of unique content for each page. Basically, each page would follow the same format (but reference a different county name, and 10 different links from each county website). Is this a good SEO backlink strategy? Do I need more unique content for each landing page in order to prevent duplicate content flags?
Intermediate & Advanced SEO | shauna70840 -
Lazy Loading of products on an E-Commerce Website - Options Needed
Hi Moz Fans. We are in the process of re-designing our product pages and we need to improve the page load speed. Our developers have suggested that we load the associated products on the page using lazy loading. While I understand this will certainly have a positive impact on the page load speed, I am concerned about the SEO impact. We can have upwards of 50 associated products on a page, so we need a solution. So far I have found the following solution online, which uses lazy loading and escaped fragments - the concern here is with serving an alternate version to search engines. The solution was developed by Google not only for lazy loading, but for indexing AJAX content in general.

Here's the official page: Making AJAX Applications Crawlable. The documentation is simple and clear, but in a few words the solution is to use slightly modified URL fragments.

A fragment is the last part of the URL, prefixed by #. Fragments are not propagated to the server; they are used only on the client side to tell the browser to show something, usually to move to an in-page bookmark.

If instead of using # as the prefix you use #!, this instructs Google to ask the server for a special version of your page using an "ugly" URL. When the server receives this ugly request, it's your responsibility to send back a static version of the page that renders an HTML snapshot (the not-yet-indexed image in our case). It seems complicated, but it is not. Let's use our gallery as an example. Every gallery thumbnail has to have a hyperlink like:

http://www.idea-r.it/...#!blogimage=<image-number>

When the crawler finds this markup, it will change it to:

http://www.idea-r.it/...?_escaped_fragment_=blogimage=<image-number>

Let's take a look at what you have to answer on the server side to provide a valid HTML snapshot. My implementation uses ASP.NET, but any server technology will do.

```csharp
var fragment = Request.QueryString["_escaped_fragment_"];
if (!String.IsNullOrEmpty(fragment))
{
    var escapedParams = fragment.Split(new[] { '=' });
    if (escapedParams.Length == 2)
    {
        var imageToDisplay = escapedParams[1];
        // Render the page with the gallery showing
        // the requested image (statically!)
        // ...
    }
}
```

What's rendered is an HTML snapshot, that is, a static version of the gallery already positioned on the requested image (server side). To make it perfect, we have to give the user a chance to bookmark the current gallery image. 90% comes for free: we only have to parse the fragment on the client side and show the requested image.

```javascript
if (window.location.hash)
{
    // NOTE: remove the initial #
    var fragmentParams = window.location.hash.substring(1).split('=');
    var imageToDisplay = fragmentParams[1];
    // Render the page with the gallery showing the requested image (dynamically!)
    // ...
}
```

The other option would be to look at a recommendation engine to show a small selection of related products instead. This would cut the total number of related products down. The concern with this one is that we would be removing a massive chunk of content from the existing pages - some of it is not the most relevant, but it's content. Any advice and discussion welcome 🙂

Intermediate & Advanced SEO | JBGlobalSEO -
Changing my pages' URL names - HELP NEEDED FAST
Hello, I need to change the URL name for a few pages on my site. The site was launched just recently, so it has no obvious rankings or traffic. My question is: what is the best practice for changing/deleting a page name? After deleting the page, should I go to Google Webmaster Tools and use URL Removal to remove the old page? I know that I also have to create a new XML sitemap file, but I'm not sure about the old pages in Google search results. Thanks!
Intermediate & Advanced SEO | mdmoz0 -
Does a 302 redirect pass penalties?
I'm having problems finding a definitive answer to this question; there is a lot of rumour and gossip out there but nothing I can rely on. I'm working with a site that received an unnatural links notice followed by a massive drop in search traffic. Looking at the link profile, it's pretty much jacked beyond repair, and I have recommended that we move over to a fresh domain. However, it's an established brand with many more sources of traffic than organic search. There's no way we can burn all the repeat visits, loyal customers, and brand recognition that they've built up over the years, so I want to redirect from the old domain to the new. This is not to try and make any SEO gain from the previous site; frankly, we don't give a crap about that. We just want to maintain the brand. A 302 is a temporary redirect and this will be a permanent move, BUT a 301 will pass on the penalty. So can we safely use a 302 redirect in this situation, or is there a better alternative (meta refresh?) Thanks for your help! MB.
Intermediate & Advanced SEO | MattBarker0 -
Does 302 pass link juice?
Hi! We have our content under two subdomains, one for English and one for Spanish. Depending on the language of the browser, there's a 302 redirecting to one of these subdomains. However, our main domain (which has no content) is receiving a lot of links - people would rather link to mydomain.com than to en.mydomain.com. Does the 302 pass any link juice? If so, to which subdomain? Thank you!
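To make the setup concrete, here is a minimal sketch of the kind of language gateway being described, using Flask and the subdomains from the question; the detection logic is an assumption about how such a gateway is typically built, not the asker's actual code:

```python
# Minimal sketch of a language gateway: the content-less main domain answers
# every visit with a 302 to the subdomain matching the browser language.
from flask import Flask, redirect, request

app = Flask(__name__)

@app.route("/")
def language_gateway():
    # Pick "en" or "es" from the Accept-Language header, defaulting to English.
    best = request.accept_languages.best_match(["en", "es"], default="en")
    # code=302 marks the redirect as temporary, which is exactly why it's
    # unclear where the link equity pointed at mydomain.com ends up flowing.
    return redirect(f"https://{best}.mydomain.com/", code=302)
```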
Intermediate & Advanced SEO | bodaclick0