JS loading blocker
-
Is there a tool or Chrome extension I can use to load a page, identify the .js files on the page, 'uncheck' selected files, and load the page again to check it still loads correctly? Even better would be the ability to defer a script, or move it to the end of the file, to test.
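To show what I mean by 'defer/move to the end', these are the kinds of variations I want to test (the file name is just a placeholder):

```html
<!-- As-is: a blocking script in the head -->
<script src="/js/example.js"></script>

<!-- Variation 1: keep it in place but defer execution until parsing is done -->
<script src="/js/example.js" defer></script>

<!-- Variation 2: move the tag to just before </body>, so the page
     renders before the script is fetched and run -->
```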
-
Thanks for checking in, Mick!
-
Sorry for the delay. I got sidetracked on another project, and this client decided they would leave the .js as-is for the time being, so I have not really tested. Initially I couldn't get the Chrome extension to do what I wanted, and I need to look at Firefox.
-
Hi Mick, did you find what you were looking for? We'd love an update. Thanks!
Christy
-
Thanks. I'll give it a try and let you know.
-
Hey Mick,
I use Firebug; there is a version for Chrome, but it was originally built for Firefox.
Full JavaScript debugging: breakpoints, conditional breakpoints, watch expressions, stepping in, and profiling.
Chrome Version Here: https://getfirebug.com/releases/lite/chrome/
Hope this helps,
Don
-
I've found this discussion about the same subject, if you want to have a look:
stackoverflow.com/questions/9698059/disable-single-javascript-file-with-addon-or-extension
Sorry, but I can't help you more than this.
Good luck
-
Thanks, that's quite handy, but not what I need in this case. This tool seems to switch off .js for the whole page. I'm looking for something where I can cherry-pick the .js files on the page I want to block, or ideally move.
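In case no extension fits, here is a rough sketch of the cherry-picking idea in code, assuming a snippet can be injected before any other script runs (e.g. via a user-script extension); the blocklist patterns are placeholders. Script-blocker libraries use roughly this approach:

```javascript
// Watch for script tags as the parser adds them, and neutralise the ones
// on the blocklist by rewriting their type before the browser runs them.
const BLOCKLIST = [/analytics\.example\.js/, /ads\.example\.js/]; // placeholders

new MutationObserver((mutations) => {
  for (const mutation of mutations) {
    for (const node of mutation.addedNodes) {
      if (node.tagName === 'SCRIPT' &&
          BLOCKLIST.some((pattern) => pattern.test(node.src))) {
        node.type = 'text/blocked'; // non-executable type: script is skipped
      }
    }
  }
}).observe(document.documentElement, { childList: true, subtree: true });
```

Inline scripts and dynamically injected ones need extra handling, so treat this as a starting point rather than a finished tool.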
-
Hi,
You can find what you're looking for here: https://chrome.google.com/webstore/detail/quick-javascript-switcher/geddoclleiomckbhadiaipdggiiccfje
Hope it helps you.
-
Related Questions
-
Consolidating 301 Redirects to Decrease Page Load Times - Major Concerns?
Hello, I am being pushed to consolidate the over 6k redirects that have accumulated over the course of 4 years. These redirects are one of the many factors causing extensive load times for our website. Many, if not most, are over a year old, have not been used, or simply redirect back to the home page. Other than keeping the redirects for pages that have external links (recommendations/tools for finding these are also welcome), are there other best practices from an SEO standpoint to ensure there are no major hits to our website? A little more info: I am looking to pare the 6K down by removing all redirects that have not been used, removing all redirects that are over a year old, and removing all redirects that simply point to the home page or a smaller big-bucket subfolder. This should take the number from 6K to around 300. Are there any major concerns? Pat
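As an aside on the "not been used" criterion: one rough way to check is to count hits against a server access log. A minimal Node.js sketch, where the file names and log format are assumptions for illustration only:

```javascript
// Flag redirect source paths that never appear in the access log.
const fs = require('fs');

// Hypothetical input: one redirect source path per line, e.g. "/old-page".
const redirects = fs.readFileSync('redirects.txt', 'utf8')
  .split('\n').map((line) => line.trim()).filter(Boolean);

// Hypothetical combined-format access log.
const log = fs.readFileSync('access.log', 'utf8');

const escapeRegExp = (s) => s.replace(/[.*+?^${}()|[\]\\]/g, '\\$&');

for (const path of redirects) {
  // Match requests for the path followed by a space or query string.
  const pattern = new RegExp(`"GET ${escapeRegExp(path)}[ ?]`, 'g');
  const hits = (log.match(pattern) || []).length;
  if (hits === 0) console.log(`never requested: ${path}`);
}
```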
Technical SEO | | Owner_Account
-
Fetch as Google - stylesheets and js files are temporarily unreachable
Fetch as Google often says that some of my stylesheets and js files are temporarily unreachable. Is that a problem for SEO? These stylesheets and scripts aren't blocked, and Search Console shows that a normal user would see the page just fine.
Technical SEO | | WebGain
-
In Facebook when I place my site URL the image does not load?
In Facebook, when I place my site URL the image does not load. It loads some generic image or logo, but not the image that's related to the page. Is there any tag we need to add to the website so the image loads? Is it good to use a tag like this for the description? <meta property="og:description" content="Some data" />
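For reference, Facebook's scraper picks the preview image from Open Graph meta tags in the page head; a minimal sketch with placeholder values:

```html
<!-- og:image should be an absolute URL to the image you want shown. -->
<meta property="og:title" content="Page title" />
<meta property="og:description" content="Some data" />
<meta property="og:image" content="https://www.example.com/images/page-image.jpg" />
<meta property="og:url" content="https://www.example.com/page/" />
```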
Technical SEO | | bsharath
-
Loading images below the fold? Impact on SEO
I got this from my developers. Does anyone know if this will be an SEO issue? We hope to lazy-load images below the fold where possible, to increase render speed. Are you aware of any potential issues with this approach from an SEO point of view?
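For what it's worth, the simplest form of this is the native loading attribute; a minimal sketch with placeholder paths (older lazy-load scripts instead swap a data-src in via JavaScript):

```html
<!-- The browser defers fetching the image until it nears the viewport. -->
<img src="/images/below-the-fold.jpg" loading="lazy"
     width="800" height="600" alt="Descriptive alt text" />
```

Keeping a real src on the tag, rather than injecting it with JavaScript, keeps the image visible to crawlers, which is the main SEO consideration here.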
Technical SEO | | KatherineWatierOng
-
Check my website loading time
Kindly check my website loading time for the home page and deep pages. Do I need to make it faster, or is it okay? Website - brandstenmedia.com.au
Technical SEO | | Green.landon
-
How does Progressive Loading, aka what Facebook does, impact proper search indexation?
My client is planning on integrating progressive loading into their main product-level pages (those pages most important to conversions and revenue). I am not skilled in "progressive loading" but was told this is what Facebook does. Currently, the site's pages are tabbed and use Ajax. Is there any negative impact from changing this up by including progressive loading? If anyone can help me understand what this is and how it might impact a site from an SEO perspective, please let me know. Thanks a ton!! Janet
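For reference, "progressive loading" usually means fetching more content as the user scrolls, roughly like this sketch (the endpoint and element IDs are placeholders, not your client's actual markup):

```javascript
// Fetch the next chunk of product content when a sentinel element
// near the bottom of the list scrolls into view.
const sentinel = document.querySelector('#load-more-sentinel'); // placeholder id

const observer = new IntersectionObserver(async (entries) => {
  if (!entries[0].isIntersecting) return;
  observer.unobserve(sentinel); // keep the sketch to a single extra page
  const response = await fetch('/products/page/2'); // placeholder endpoint
  const html = await response.text();
  document.querySelector('#product-list')
    .insertAdjacentHTML('beforeend', html); // placeholder container id
});

observer.observe(sentinel);
```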
Technical SEO | | ACNINTERACTIVE
-
OK to block /js/ folder using robots.txt?
I know Matt Cutts suggests we allow bots to crawl css and javascript folders (http://www.youtube.com/watch?v=PNEipHjsEPU). But what if you have lots and lots of JS and you don't want to waste precious crawl resources? Also, as we update and improve the javascript on our site, we iterate the version number: ?v=1.1... 1.2... 1.3... etc., and the legacy versions show up in Google Webmaster Tools as 404s. For example:
http://www.discoverafrica.com/js/global_functions.js?v=1.1
http://www.discoverafrica.com/js/jquery.cookie.js?v=1.1
http://www.discoverafrica.com/js/global.js?v=1.2
http://www.discoverafrica.com/js/jquery.validate.min.js?v=1.1
http://www.discoverafrica.com/js/json2.js?v=1.1
Wouldn't it just be easier to prevent Googlebot from crawling the js folder altogether? Isn't that what robots.txt was made for? Just to be clear: we are NOT doing any sneaky redirects or other dodgy javascript hacks. We're just trying to power our content and UX elegantly with javascript. What do you guys say: obey Matt, or run the javascript gauntlet?
Technical SEO | | AndreVanKets
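For reference, the blanket block being asked about is just two lines of robots.txt; this is shown to illustrate the question, not to recommend the approach:

```
User-agent: Googlebot
Disallow: /js/
```

-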
Mask links with JS that point to noindex'ed pages
Hi, in an effort to prepare our page for Panda we dramatically reduced the number of pages that can be indexed (from 100k down to 4k). All the remaining pages are being equipped with unique and valuable content. We still have the other pages around, since they represent searches with filter combinations which we deem less interesting to the majority of users (hence they are not indexed). So I am wondering if we should mask links to these non-indexed pages with JS, so that link juice doesn't get lost to them. Currently the targeted pages are non-indexed via "noindex, follow"; we might de-index them with robots.txt though, if the "site:" query doesn't show improvements. Thanks, Sebastian
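To make the idea concrete, a minimal sketch of masking a link with JS (the class name and URL are placeholders, and a crawler that executes JavaScript may still discover the target):

```html
<!-- Instead of: <a href="/search?filter=low-value-combo">Filter</a> -->
<span class="js-link" data-target="/search?filter=low-value-combo">Filter</span>

<script>
  // Navigate on click without exposing a crawlable href attribute.
  document.querySelectorAll('.js-link').forEach(function (el) {
    el.addEventListener('click', function () {
      window.location.href = el.dataset.target;
    });
  });
</script>
```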
Technical SEO | | derderko