JS loading blocker
-
Is there a tool or Chrome extension I can use to load a page, identify the .js files on the page, 'uncheck' selected .js files, and reload the page to check it still loads correctly? Even better would be the ability to defer selected scripts, or move them to the end of the file, to test.
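For context, the kind of change I'd want to test is moving a blocking script out of the critical path, e.g. (the file name is made up):

<!-- Blocking: fetched and executed before the rest of the page renders -->
<script src="/js/widgets.js"></script>

<!-- Deferred: fetched in parallel, executed only after the document is parsed -->
<script src="/js/widgets.js" defer></script>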
-
Thanks for checking in, Mick!
-
Sorry for the delay. I got sidetracked on another project, and this client decided to leave the .js as is for the time being, so I haven't really tested. Initially I couldn't get the Chrome extension to do what I wanted, and I still need to look at Firefox.
-
Hi Mick, did you find what you were looking for? We'd love an update. Thanks!
Christy
-
Thanks. I'll give it a try and let you know.
-
Hey Mick,
I use Firebug. There's a version for Chrome, but it was originally built for Firefox.
It offers full JavaScript debugging: breakpoints, conditional breakpoints, watches, stepping, and profiling.
Chrome Version Here: https://getfirebug.com/releases/lite/chrome/
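On the conditional-breaking point: you can also trigger a break from your own code with the standard JavaScript debugger statement, which any attached debugger (Firebug included) will stop on. A tiny sketch; the condition is made up:

// Pauses only when the condition holds and a debugger is attached
if (location.hash === '#debug') {
  debugger;
}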
Hope this helps,
Don
-
I've found this discussion about the same subject if you want to have a look:
stackoverflow.com/questions/9698059/disable-single-javascript-file-with-addon-or-extension
Sorry, but I can't help you more than this.
Good luck
-
Thanks, that's quite handy, but not what I need in this case. This tool seems to switch off JavaScript for the whole page. I'm looking for something where I can cherry-pick the .js files on the page I want to block, or ideally move.
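For anyone who needs to script this kind of cherry-picking, one option is request interception in a headless browser. A rough sketch, assuming Node.js with Puppeteer installed; the blocked file names are hypothetical:

const puppeteer = require('puppeteer');

// Hypothetical scripts to block on this run
const blocked = ['carousel.js', 'tracking.js'];

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.setRequestInterception(true);
  page.on('request', (req) => {
    // Abort script requests that match the blocklist; let everything else through
    if (req.resourceType() === 'script' && blocked.some((f) => req.url().includes(f))) {
      req.abort();
    } else {
      req.continue();
    }
  });
  await page.goto('https://example.com', { waitUntil: 'networkidle0' });
  // Capture the result so runs with different blocklists can be compared
  await page.screenshot({ path: 'page-without-blocked-js.png', fullPage: true });
  await browser.close();
})();

Re-running with different entries in the blocklist lets you compare how the page loads with each script removed.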
-
Hi,
You can find what you're looking for here: https://chrome.google.com/webstore/detail/quick-javascript-switcher/geddoclleiomckbhadiaipdggiiccfje
Hope it helps you.
Related Questions
-
What's the best way to test an Angular JS-heavy page for SEO?
Hi Moz community,
Our tech team has recently decided to try switching our product pages to be JavaScript-dependent; this includes links, product descriptions, and things like breadcrumbs in JS. Given my concerns, they will create a proof of concept with a few product pages in a QA environment so I can test the SEO implications of these changes. They are planning to use Angular 5 client-side rendering without any prerendering. I suggested Universal, but they said the lift was too great, so we're testing to see if this works.
I've read a lot of the articles in this guide to all things SEO and JS, and I'm fairly confident in understanding when a site uses JS and how to troubleshoot to make sure everything is getting crawled and indexed: https://sitebulb.com/resources/guides/javascript-seo-resources/
However, I'm not sure I'll be able to test the QA pages, since they aren't indexable and live behind a login. I will be able to crawl the pages using Screaming Frog, but that generally reflects what a crawler should be able to crawl, not what Googlebot will actually be able to crawl and index. Any thoughts on this? Is this concern valid? Thanks!
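One way to approximate what a rendering crawler sees on pages behind a login is to render them yourself in headless Chrome and inspect the post-JavaScript DOM. A rough sketch, assuming Node.js with Puppeteer installed; the URL and cookie values are placeholders:

const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  // QA pages sit behind a login, so reuse a session cookie first
  // (name, value, and domain here are placeholders)
  await page.setCookie({ name: 'session', value: 'PLACEHOLDER', domain: 'qa.example.com' });
  await page.goto('https://qa.example.com/product/123', { waitUntil: 'networkidle0' });
  // page.content() returns the DOM after client-side rendering,
  // which is roughly what a rendering crawler would index
  const renderedHtml = await page.content();
  console.log(renderedHtml);
  await browser.close();
})();

This is still only an approximation: Googlebot's renderer has its own timeouts and has historically lagged current Chrome, so treat it as a sanity check rather than proof of indexability.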
Technical SEO | znotes
-
I want to load my ecommerce site XML sitemap via CDN
Hello Experts.
My ecommerce site: abcd.com
My ecommerce site sitemap: abcd.com/sitemap.xml
My subdomain: xyz.abcd.com (this is a blank page, but it returns a 200 status and is served from the CDN)
My ecommerce site sitemap abcd.com/sitemap.xml contains only one link, to the subdomain sitemap xyz.abcd.com/sitemap.xml.
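For reference, a root sitemap whose only job is to point at another sitemap would normally be a sitemap index file, roughly like this sketch (domains taken from the question's placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://xyz.abcd.com/sitemap.xml</loc>
  </sitemap>
</sitemapindex>

As I understand it, cross-host sitemap references like this are generally only honored once you have verified ownership of both hosts in Search Console, which matches the plan described here.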
And this sitemap, xyz.abcd.com/sitemap.xml, contains all the category and product links of abcd.com. So my query is: is the above configuration okay? In Search Console I will add a new property, xyz.abcd.com, and add the sitemap xyz.abcd.com/sitemap.xml, so Google will be able to report errors for my website abcd.com. Purpose: I want to serve my XML sitemap from a CDN; that's why I have created the subdomain xyz.abcd.com. Hope you understood my query. Thanks!
Technical SEO | micey123
-
Can an .htaccess file affect page load times?
We have a large and old site. As we've transitioned from one CMS to another, there's been a need to create 301 redirects using our .htaccess file. I'm not a technical SEO person, but I'm concerned that the size of our .htaccess file might be a contributing source of long page download times. Can large .htaccess files cause slow page load times? Or is the coding of the 301 redirects a cause of slow page downloads? Thanks
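For illustration, migration redirect lists like the one described tend to look something like this (the paths are made up). One relevant detail: Apache re-reads and evaluates .htaccess on every request, so a very long rule list does add per-request work:

# Legacy-to-new 301 redirects -- example paths, made up for illustration
Redirect 301 /old-cms/about-us.html /about/
Redirect 301 /old-cms/contact.html /contact/
# ...imagine thousands of similar lines here, parsed on every request

# Where old URLs share a pattern, one rewrite rule can replace many lines
RewriteEngine On
RewriteRule ^old-cms/(.*)\.html$ /$1/ [R=301,L]

Collapsing pattern-shaped redirects into a single rule, or moving them into the main server config where they are loaded once, is the usual way to shrink the per-request cost.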
Technical SEO | ahw
-
Iframes, AJAX, JS, Etc.
Just started SEO on some legacy sites running JS navigation. Are there any proven ways to stop Google from parsing links and passing internal link juice, e.g. iframes, AJAX, JS, etc.? Google is parsing some JS links on a couple of our legacy sites. The problem is that some pages are getting link juice and others aren't, and it's unpredictable which links are parsed and which aren't. The choice is to rebuild the navigation (ouch), or to figure out a way to block JS links entirely and build a simple text-based secondary nav for link-juice distribution. I definitely don't want to use nofollow. Any thoughts?
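For what it's worth, the pattern usually discussed for this is to drop the <a href> entirely and wire navigation up in script, so there is no ordinary link to parse. A hedged sketch; the class and attribute names are made up, and Google may still discover URLs it finds in attributes, so this is not guaranteed:

// Markup would be e.g. <span class="js-nav" data-target="/some-page">Some page</span>
// instead of an <a href> element
document.querySelectorAll('.js-nav').forEach(function (el) {
  el.addEventListener('click', function () {
    window.location.href = el.getAttribute('data-target');
  });
});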
Technical SEO | AMHC
-
A website that will not load on a particular computer? Help Me Please!
We took on a new client about two weeks ago, took them off a proprietary CMS, placed them on a WordPress site, optimized the site, etc., and we were finishing up small details three days ago. Then the PC in my personal office suddenly would not load the site, whether from a Google search, from a direct URL, etc.
Our office was using a D-Link wireless router, but my PC is hardwired in the office. I cranked up my MacBook Pro with a solid-state drive (six months old), got on wireless, and... the site would not load. PCs and Macs in offices around me would all load the site. A search online brought up a fix for the PC and I tried it; it did not work. I had my lead dev try it; it did not work. I called a server-side friend, and he had never heard of such a thing. Every fix revolved around changing IP addresses, etc. I uninstalled my antivirus programs on my PC and installed every update that was outstanding; there was no new software installed on either box prior to the problem. Can you help??? Is there any chance someone not associated with us, just looking for my client or entering a direct URL, could experience the same thing?
Technical SEO | RobertFisher
-
JavaScript late-loaded content not read by Googlebot
Hi,
We have a page with some good "keyword" content (a user-supplied comment widget), but a design choice was made previously to late-load it via JavaScript. This was done to improve performance, and the overall functionality relies on JavaScript. Unfortunately, since it is loaded via JS, it isn't read by Googlebot, so we get no SEO value from it.
I've read that Google doesn't weigh <noscript> content as much as regular content. Is this true? One option is just to load some of the content via <noscript> tags; I just want to make sure Google still reads this content.
Another option is to load some of the content via simple HTML when loading the page. If JavaScript is enabled, we'd hide this "read-only" version via CSS and display the more dynamic, user-friendly version (a rough sketch of this is below). Would changing the display based on whether JS is enabled be deemed cloaking? Since non-JS users would see the same thing (and this provides a way for them to see some of the widget's functionality), it is an overall net gain for those users too.
In the end, I want Google to read the content, but I'm trying to figure out the best way to do so.
Thanks,
Nic
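A minimal sketch of that second option, under the assumption the comments can be rendered server-side; the IDs and content are made up:

<!-- Static, crawlable copy of the comments, present in the initial HTML -->
<div id="comments-static">
  <p>Sample user comment...</p>
</div>

<!-- Container the JavaScript widget fills in later -->
<div id="comments-widget"></div>

<script>
  // If JS runs, hide the static copy so users get the dynamic widget;
  // non-JS users (and non-rendering crawlers) still see the static copy
  document.getElementById('comments-static').style.display = 'none';
</script>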
Technical SEO | NicB1
-
Good technical parameters, worse load time.
I have recently created a page and added expires headers, unconfigured ETags, and gzip to the .htaccess code, and just after that, according to Pingdom Tools, my page load time doubled, although my YSlow points went from 78 to 92. I always get a little bit lost with these technical issues. I mean, obviously a site should not produce worse results from adding these parameters, and this increase in page load time should rather be due to bandwidth usage. I suppose I should leave this stuff in the .htaccess (the kind of block I mean is sketched below). Then what is an accurate way to know whether you have made a real improvement to your site, or whether your load time has really gone up? This question is even more relevant with CSS sprites, as I always read that spriting every picture is sometimes a waste of resources. How can you decide when to stop?
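For reference, the kind of .htaccess block being described looks roughly like this; a sketch assuming mod_expires and mod_deflate are available, with arbitrary cache lifetimes:

<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType text/css "access plus 1 week"
  ExpiresByType image/png "access plus 1 month"
</IfModule>

<IfModule mod_deflate.c>
  # gzip text responses on the fly -- this trades server CPU for bandwidth
  AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

# "Unconfigured" ETags are usually just switched off entirely
FileETag None

On measurement: a single Pingdom run is noisy, so repeated runs (or average load time over many real page views) is a fairer before/after comparison than one test.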
Technical SEO | sesertin
-
Mask links with JS that point to noindexed pages
Hi, in an effort to prepare our page for Panda, we dramatically reduced the number of pages that can be indexed (from 100k down to 4k). All the remaining pages are being equipped with unique and valuable content. We still have the other pages around, since they represent searches with filter combinations which we deem less interesting to the majority of users (hence they are not indexed). So I am wondering if we should mask links to these non-indexed pages with JS, so that link juice doesn't get lost to them. Currently the targeted pages are non-indexed via "noindex, follow"; we might block them with robots.txt though (a sketch below), if the "site:" query doesn't show improvements. Thanks, Sebastian
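For reference, the robots.txt fallback mentioned above would look something like this; the filter-parameter path is hypothetical:

User-agent: *
# Block crawling of the low-value filter-combination URLs (hypothetical pattern)
Disallow: /search?filter=

One caveat worth knowing: robots.txt blocks crawling, not indexing, and it also stops Google from seeing the "noindex, follow" on those pages, so the two mechanisms interact rather than stack.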
Technical SEO | derderko