JavaScript encoded links on an AngularJS framework...bad idea for Google?
-
Hi Guys,
I have a site where we're currently deploying code in AngularJS. As part of this, on the page we sometimes have links to 3rd party websites.
We do not want followed links from our site to these 3rd-party sites: we have more than 1 million pages, many of which carry external 3rd-party links, and we're worried we could be perceived as a link farm.
My question is: if we use JavaScript to fire off the link to the 3rd party, is that enough to prevent Google from seeing that link? We do not currently have a NOFOLLOW on it.
The link anchor text simply says "Visit website" and the link is fired using JavaScript.
Here's a snapshot of the code we're using:
Visit website
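(The snippet above seems to have lost its markup when the question was posted. As a rough sketch of what a JS-fired link like this often looks like: the anchor carries no href, and the navigation happens entirely in script. The function and variable names below are illustrative assumptions, not the actual site's code.)

```javascript
// Hypothetical sketch of a JS-fired outbound link. The anchor has no
// href, so a crawler that does not execute JavaScript sees no link:
//
//   <a ng-click="visitWebsite()">Visit website</a>
//
// Names (makeVisitHandler, partnerUrl) are illustrative, not from the site.
function makeVisitHandler(partnerUrl) {
  return function visitWebsite() {
    // The URL lives only in script, never in an href attribute.
    window.open(partnerUrl, '_blank');
    return partnerUrl; // returned so the handler is easy to test
  };
}
```

Keep in mind that Google's renderer can execute JavaScript, so this only hides the link from crawlers that don't render JS.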
Does anyone have experience with anything like this, on their own site or a client's, that we can learn from to make sure we avoid any chance of being flagged as a link farm?
Thank you
-
Hm, I'd be a little concerned if GSC can see it. Maybe GSC can see that the JS turns it into a link, but can't figure out what that link points to?
Anyway, it sounds like your hands are kind of tied until you can get those nofollows! Definitely make a note in your analytics platform when you get them implemented - it'll be interesting to see what effect they have on your rankings.
Good luck!
Kristina
-
Hi Kristina,
First of all, thank you for taking the time out to respond.
Very valid rationale. I did have a look at the cached version before I posted here, and it didn't show the link I was looking for; however, the GSC screen showed the link highlighted as a link.
That's what got me confused. I guess it's safe to assume in that case that it won't be seen by Google, considering it's not in the text version of the cached page.
I'll work on getting a NOFOLLOW in there since there are no guarantees with Google when they change things around. But it's great to know that it isn't an immediate requirement at the moment...
Thank you again Kristina!
-
Hi Kavit,
The short answer is no. Google can render some JS - possibly even AngularJS - so never assume that something rendered in JS is invisible to Google. You should assume that Google can see all links visitors can, and really push for a nofollow tag.
I usually check what Google can render by loading Google's cache of the page (go to Google.com and type in "cache:" in front of the exact URL of one of your pages). Look at the text-only version of the cache, and see if Google puts a link there. If they do, it's safe to assume that they can see that link. Another option is to use GSC to Fetch as Google; Google claims this is exactly what they're seeing.
If both the cache and GSC show that Google can't see a link, Google's probably not crawling it. But, Google's always getting better, and could suddenly see the links any day now. If these links are really a concern to you, I'd strongly suggest that you push your dev team to add nofollow tags to these outgoing links.
Best,
Kristina
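(For reference, the nofollow Kristina recommends is a `rel` attribute on the anchor itself; the URL below is a placeholder, not the asker's actual destination:)

```html
<!-- rel="nofollow" tells Google not to pass link equity to the destination -->
<a href="https://example.com/partner" rel="nofollow">Visit website</a>
```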