Does Google Read JavaScript?
-
I would like to include a list of links in a select-type box and have Google follow them. To do this, I'll restyle the element with JavaScript, converting the select box into a ul and the options into li elements. Each li would contain a link; if JavaScript is disabled, the page falls back to a normal CSS-styled select box.
My question is: would Google follow the links created by the JavaScript? Or would the bot just recognize the select box as a select box and not see any links?
Thanks for any help!
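For reference, here's a rough sketch of the swap I have in mind. The function name, CSS classes, and data shape are just illustrative, not a tested implementation:

```javascript
// A sketch of the enhancement described above: read the <option> data and
// emit a <ul> of crawlable <a> links. The { text, href } shape is an
// assumption for illustration.
function optionsToLinkList(options) {
  const items = options
    .map(opt => `  <li><a href="${opt.href}">${opt.text}</a></li>`)
    .join("\n");
  return `<ul class="nav-links">\n${items}\n</ul>`;
}

// In the browser, the generated markup would replace the <select>, e.g.:
//   const select = document.querySelector("select.nav-fallback");
//   const data = Array.from(select.options)
//     .map(o => ({ text: o.text, href: o.value }));
//   select.outerHTML = optionsToLinkList(data);
```

So the question is whether Googlebot would ever see the `<a>` elements that only exist after this script runs.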
-
Some additional notes...
Traditionally, Google hasn't followed JavaScript links, but it is getting much better at using JavaScript for link discovery.
Just a few weeks ago, Matt Cutts made a video in which he recommended making your JavaScript more readable:
http://www.youtube.com/watch?feature=player_embedded&v=8yTn_HLDaJs
And some evidence in the wild of Google becoming more script-friendly:
http://www.webpronews.com/is-googlebot-getting-more-human-like-2012-05
That said, it's still far better for SEO purposes to ensure that your links are plain HTML. Google may discover JavaScript links, but it's unclear whether link attributes like anchor text and PageRank pass through them. Best practice, for now, is still to use regular HTML links.
-
Matt says that Google can read some javascript...
Danny Sullivan speculates that they will be able to see those links:
http://searchengineland.com/google-can-now-execute-ajax-javascript-for-indexing-99518
But in my opinion, if you want your links to be seen without a doubt, don't put them in JavaScript.
-
I assume you're talking about using a JS method to create links in the DOM. The answer here is no: Googlebot will not recognize them. Google has improved its ability to read some JS (for example, Googlebot can understand an embedded JS link), but as far as I know it does not interpret or execute JS.
If you go this route, I would suggest enumerating your links in a noscript block.
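For example, a rough sketch of that fallback. The URLs, wrapper id, and link targets are placeholders, not a prescribed structure:

```html
<!-- JS builds the styled widget here; crawlers and no-JS users get plain links -->
<div id="link-widget"></div>
<noscript>
  <ul>
    <li><a href="/page-one">Page One</a></li>
    <li><a href="/page-two">Page Two</a></li>
  </ul>
</noscript>
```

Keep in mind that noscript content is reportedly weighted less than regular content, so this is a safety net rather than a substitute for real HTML links.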
-
Google doesn't follow JavaScript, so if you want the links on that page followed, you'll need to include them in your HTML as well.