In order for Google to recognize a hyperlink on your website, does it have to be written in a specific JavaScript?
-
Does it have to read as the following script?
-
Not a problem. I find that, all too often, if a question is a bit ambiguous, people will ignore it. If there are only a handful of interpretations, I will still try to answer.
-
Thank you, that was extremely insightful and helpful.
-
Just so you are aware, the code sample which you supplied is HTML and not JavaScript (or, for that matter, any type of script; scripting languages include JavaScript, Python, Ruby, Perl, etc.).
You may be asking one of two things (I think!):
1) Is there a set HTML format for hyperlinks which Google knows how to read?
Yes, and you can find information **all about** conventional use of the `<a></a>` (HTML) tag in the W3C / W3Schools documentation.
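For reference, here is a minimal sketch of a conventional, crawlable HTML hyperlink (the URL and anchor text are invented placeholders, not taken from the original question):

```html
<!-- A standard HTML hyperlink: Googlebot reads the href target and the anchor text -->
<a href="https://www.example.com/page">Descriptive anchor text</a>
```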
HTML is a static language and is not (unlike many scripting languages) 'object oriented'. You don't define `<a>`, and as such `<a>` is not interpreted based upon your programmed parameters; `<a>` always means the same thing (to a web browser). Sure, stuff like CSS can style links in different ways, and JavaScript can modify `<a>` tags by injecting event-tracking attributes etc. (also a common use of jQuery), but fundamentally the usage of `<a>` is (mostly) universally agreed. So yes - links are coded according to conventions, and Google will interpret those widely accepted conventional use-cases, as well as a few more experimental deployments (possibly through error handling in Google's algorithms). In general, you should follow W3C / W3Schools guidelines. There are many forms of link (no-followed links, text links, image links) and all are valid, but yes - they are predetermined. A few of those forms are sketched below.
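As a hedged illustration of those link forms (every URL, attribute value, and file name below is an invented placeholder):

```html
<!-- No-followed link: rel="nofollow" asks Google not to pass ranking credit -->
<a href="https://www.example.com/sponsor" rel="nofollow">Sponsored link</a>

<!-- Image link: Google treats the img alt text much like anchor text -->
<a href="https://www.example.com/products">
  <img src="/images/product-banner.png" alt="Example product page">
</a>

<!-- JavaScript decorating a link: a script injects an event-tracking
     attribute, but the underlying <a> element still means the same
     thing to a crawler -->
<a id="cta" href="https://www.example.com/signup">Sign up</a>
<script>
  document.getElementById('cta').setAttribute('data-track', 'signup-click');
</script>
```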
2) This is the HTML which my JavaScript will output - is it OK?
Yeah, it's fine dude. If you can handle JS, you can handle HTML (it's way simpler). One thing though: although Google can deploy rendered (JS-enabled) crawling, that involves using headless browsers and such to render the 'modified' source code. (So, what you see in 'inspect element' is the modified source; what you see in 'view page source' is different - that's the pre-modified, or base, source code.)
Generally speaking, this takes 10x longer than simple DOM / base-source scrapes. As such, if Google were to deploy that tech on every crawl for every page on the web, the efficiency hit to their 'index the web' mission would be colossal. Many studies show that Google will not render JS on all sites (especially ones perceived to be low value), and even on sites where they will use this tech, they won't deploy it all of the time. There really is no substitute for making your links and content readable in the unmodified base source code: it's way better for crawlers and way more efficient for them to work with. Just because Google 'can' do something, it doesn't mean they always will, and it doesn't mean it's a good idea to ignore basic SEO principles! The sketch below shows the difference.
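A hedged sketch of that difference (the URLs and link text are made up): the first link below exists in the base source and is visible to any crawl, while the second exists only after a browser - or Google's renderer - executes the script:

```html
<!-- What "view page source" shows: this link is in the base source,
     so every crawl can see it -->
<a href="/about">About us</a>

<script>
  // This link exists only in the rendered DOM ("inspect element"),
  // so a crawler must execute the JavaScript to discover it.
  var link = document.createElement('a');
  link.href = '/contact';
  link.textContent = 'Contact us';
  document.body.appendChild(link);
</script>
```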
Hope that helps
Related Questions
-
How long does it take my links to get indexed by Google?
How long does it take my links to get indexed by Google? Also, is it possible to get my links indexed faster?
Link Building | EugeneMot
-
Why are bit.ly and ow.ly links showing up as Inbound Links?
Hello! I noticed this question was asked a while back (around 2013, I think), but I still don't quite understand why I'm seeing these shortened URLs come up as Inbound Links. Can someone please explain? Thanks, Carlos
Link Building | txwildcard
-
Self linking - loss of link juice?
There are 10 links on the page that point to the page itself (the page links to itself). Does this influence ranking? And what happens to the link juice? How does it spread? Is there any loss of it?
Link Building | templatemonster
-
Should I continue adding new content to a website penalized by Google?
I have a website that is under a partial action imposed by Google due to unnatural links. We are in the process of cleaning up those links. At the same time, I want to add more content to the same website. My concern is whether the new content pages will be affected by Google's action as well. Should I wait until we finish the cleanup and get a green light from Google? In addition, would it be worth doing some on-page optimization on the pages that are affected by the unnatural links and Google's penalty? Please advise. Thanks. John
Link Building | pianomother
-
Indirect Link Earning via Dofollow Links in News Articles
Hello, Moz SEO gurus. I've been trying to think some deep thoughts on safe, effective link earning for news publishing sites, and I wanted to run this up the flagpole and see if you salute. Our site is a biotech news service - we pump out copious amounts of news content each day, which works well for driving traffic. That being said, we also want to rank some optimized landing pages. Take, for example, this page, which we'd like to rank for "secondary progressive MS" and related keywords: http://bionews-tx.com/secondary-progressive-ms/ Now, as far as I'm concerned, shopping this page around to MS influencers isn't easy. I can go to foundational websites, blogs, etc., and say, "hey, we have this info page on SPMS, and I thought that you might find it helpful/want to link to it." But chances are, the MS influencers already have their own proprietary content on SPMS, and there isn't much value in linking to ours. Therefore, I think we'll get few link-earning conversions on the effort. However, what if I take our secondary progressive MS landing page and link to it in a corresponding article about SPMS research, as I did here: http://bionews-tx.com/news/2014/01/30/secondary-progressive-ms-natalizumab-clinical-trial/ Then I go to the drug developer at the center of this story and say to them, "hey, we recently covered your drug in the news, and I thought you might want to link to it." Then we get a link from an MS drug developer to the news article, which in turn has a prominent anchor-text, dofollow internal link to the landing page for SPMS. If the link from the drug developer is dofollow, then we flow PageRank from the drug developer's page to our news page to our landing page. To me, it's much easier to earn safe links this way than to try to shop the landing page itself. That being said, if we get a dofollow link on the news piece, only a diminished portion of PageRank goes to the landing page. Is this strategy viable? Is the indirect flow of PageRank from a linking site to a news article to a landing page even worth it? I'd love to hear your thoughts. Thanks!
Link Building | bionewstx
-
Can high SERPs and/or social signals minimize Google penalties, plus a backlink removal question
As I continually size up my competition in the SERPs, I have scanned their sites with a fine-tooth comb. I have found that these sites engage in the very things that I have practiced in the past and since removed, thinking they may be some of the reasons I was hit by Penguin. Some of these factors are: link schemes with sites they own (C-blocks), content written for search engines (keyword-rich text), and exact-match anchor text in the backlink profile. Yet even though my competition uses these methods (one site even places exact anchor text in the footer and header of every page of one of their other forum sites), they seem to have not even been touched by any of the recent updates. In fact, it seems their rankings have increased. In scanning these sites, the only major difference I have been able to see between them and me is that their SERPs are higher than mine and they have far more social signals: one site has about 73k Facebook likes where I only have about 300. My first question is: can Google ignore penalties for sites that have higher SERPs and/or social signals, penalties that would affect another site with lower ones? My other question is related to backlinks. My main site has links from another site I built a long time ago (pre-SEO, when I didn't know what I was doing), somewhere in the 73k range - obviously a HUGE signal to Google that this might be spam, and I am aware of it. I have removed the links from that site, but unfortunately its average crawl rate per day is very low, so it is taking a very long time for Google to find those pages and re-crawl them to see the links are gone. Since the site that has those links pointing to my main site gets very low traffic, I am totally willing to kill the entire site with a 404. Could this help speed up the removal of those links? I figure that since the site would no longer exist, all links from it to my main site would be removed almost immediately. Any thoughts?
Link Building | cbielich
-
Google sending warnings about Artificial or Unnatural links in Google Webmaster Central
Has anyone seen the 'artificial or unnatural links' warning notice show up in their Google Webmaster Central yet? I just looked at each of our clients and had not, but after reading the Search Engine Land article at http://searchengineland.com/google-warning-more-about-bad-link-networks-117079 I was wondering what your thoughts were on this topic. We are not purchasing links for our clients' sites, but we are definitely going out there and building links. The article concerned me that I may run into issues for my link-building clients. Thoughts?
Link Building | jfeitlinger
-
SEOs and web developers frequently leave links to their site in the footer of their clients' sites. Does this negatively impact the site with the links?
Does this provide any SEO value to the receiving site? Has anyone experienced problems doing this?
Link Building | KatMouse