Brushing up on my SEO skills - how do I check whether JavaScript is blocking search engines from crawling the links within a JavaScript-enabled drop-down menu?
-
I've set my user agent in Chrome to Googlebot and disabled JavaScript in my Chrome settings, but then what?
-
What you can do is basically refresh the page and see if the menu still works. If it does, you'll know that no JavaScript is needed to render the menu. The alternative is to check in the Console whether the menu code is being rendered there.
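That raw-HTML check can also be automated. Here is a minimal sketch using only the Python standard library; the sample markup is invented for illustration (in practice you would feed in the page source fetched without any JavaScript execution). Links that only appear after a script runs will be missing from the collected list:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags in raw (non-rendered) HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Raw HTML as a crawler would first fetch it, before any JS runs.
# A menu injected by JavaScript would NOT appear here.
raw_html = """
<nav>
  <ul id="menu">
    <li><a href="/products">Products</a></li>
    <li><a href="/about">About</a></li>
  </ul>
</nav>
<div id="js-menu"></div>  <!-- links added later by script -->
"""

parser = LinkCollector()
parser.feed(raw_html)
print(parser.links)  # links visible without JavaScript
```

If a drop-down's links show up in this list, they were in the server-sent HTML and crawling does not depend on JavaScript.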
-
Hi,
You can run a crawl in Screaming Frog without JavaScript rendering enabled to see how well your site can be crawled without JavaScript.
That will be a lot faster than trying to simulate a crawl manually.
Cheers,
David
Related Questions
-
How to check if my website is penalized?
Hi all, I just read a post saying that my page can get penalized for over-optimizing. I realized my page has quite a lot of H1 tags (6; it had 30) and a lot of "bold" keywords. Does the bolding affect the page's SEO or risk penalizing the page? The page I'm talking about is palmislander.com/dumaguete-travel-guide. Thanks
Technical SEO | i3arty0
-
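Before worrying about a penalty, it helps to get hard numbers. A stdlib-only sketch for counting the tags in question; the sample HTML here is invented, and in practice you would feed in the page source of the URL above:

```python
from html.parser import HTMLParser

class TagCounter(HTMLParser):
    """Counts heading and emphasis tags that often get over-used."""
    WATCHED = ("h1", "b", "strong")

    def __init__(self):
        super().__init__()
        self.counts = {tag: 0 for tag in self.WATCHED}

    def handle_starttag(self, tag, attrs):
        if tag in self.WATCHED:
            self.counts[tag] += 1

html_doc = "<h1>A</h1><h1>B</h1><p><b>kw</b> and <strong>kw</strong></p>"
counter = TagCounter()
counter.feed(html_doc)
print(counter.counts)  # {'h1': 2, 'b': 1, 'strong': 1}
```

Run against the live page, the counts make it easy to see whether "a lot" really means dozens of H1s or just a handful.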
Salvaging links from WMT “Crawl Errors” list?
When someone links to your website but makes a typo while doing it, those broken inbound links will show up in Google Webmaster Tools in the Crawl Errors section as "Not Found". Often they are easy to salvage by just adding a 301 redirect in the htaccess file. But sometimes the typo is really weird, or the link source looks a little scary, and that's what I need your help with.

First, let's look at the weird typo problem. If it is something easy, like they just lost the last part of the URL (such as www.mydomain.com/pagenam), then I fix it in htaccess this way:

RewriteCond %{HTTP_HOST} ^mydomain\.com$ [OR]
RewriteCond %{HTTP_HOST} ^www\.mydomain\.com$
RewriteRule ^pagenam$ http://www.mydomain.com/pagename.html [R=301,L]

But what about when the last part of the URL is really screwed up? Especially with non-text characters, like these:

www.mydomain.com/pagename1.htmlsale
www.mydomain.com/pagename2.htmlhttp://
www.mydomain.com/pagename3.html"
www.mydomain.com/pagename4.html/

How is the htaccess RewriteRule written to send these oddballs to the individual pages they were supposed to go to, without the typo?

Second, is there a quick and easy method or tool to tell us whether a linking domain is good or spammy? I have incoming broken links from sites like these:

www.webutation.net
titlesaurus.com
www.webstatsdomain.com
www.ericksontribune.com
www.addondashboard.com
search.wiki.gov.cn
www.mixeet.com
dinasdesignsgraphics.com

Your help is greatly appreciated. Thanks! Greg
Technical SEO | GregB1230
-
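One way to handle all of those oddball endings at once is a single rule that strips anything trailing the real ".html". This is a sketch only: `pagename[0-9]+` is a stand-in for the real page names, the host condition mirrors the one in the question, and it should be tested against the actual URL list before going live. In per-directory htaccess context the pattern is matched against the decoded path without a leading slash, so trailing junk like `"` or `/` is matched by `.+`:

```apache
RewriteEngine On
# Anything after the real ".html" is treated as typo garbage and stripped:
RewriteCond %{HTTP_HOST} ^(www\.)?mydomain\.com$ [NC]
RewriteRule ^(pagename[0-9]+\.html).+$ http://www.mydomain.com/$1 [R=301,L]
```

This avoids writing one rule per broken variant; each of the four examples above 301s to its clean `.html` URL via the captured group.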
Search Operator Link: How accurate is the data?
How often do you guys use the search operator "link:"? How accurate is the data? Why is the number of links, when we use the operator, way lower than the number in Google Webmaster Tools or Open Site Explorer? Does it only show the most powerful links?
Technical SEO | Felip30
-
Blocking Affiliate Links via robots.txt
Hi, I work with a client who has a large affiliate network pointing to their domain, which is a large part of their inbound marketing strategy. All of these links point to a subdomain, affiliates.example.com, which then redirects the links through a 301 redirect to the relevant target page. These links have been showing up in Webmaster Tools as top linking domains and also in the latest downloaded links reports.

To follow guidelines and ensure that these links aren't counted by Google for either positive or negative impact on the site, we have added a block to the robots.txt of the affiliates.example.com subdomain, blocking search engines from crawling the full subdomain. The robots.txt file is the following:

User-agent: *
Disallow: /

We have authenticated the subdomain with Google Webmaster Tools and made certain that Google can reach and read the robots.txt file. We know they are being blocked from reading the affiliates subdomain. However, we added this block a few weeks ago, and links are still showing up in the latest downloads report as first being discovered after we added the block. It's been a few weeks already, and we want to make sure that the block was implemented properly and that these links aren't being used to negatively impact the site.

Any suggestions or clarification would be helpful: if the subdomain is blocked for search engines, why are the search engines following the links and reporting them in the www.example.com GWMT account as latest links? And if the block is implemented properly, will the total number of links pointing to our site, as reported in the "links to your site" section, be reduced, or does this not have an impact on that figure?

From a development standpoint, it's a much easier fix for us to adjust the robots.txt file than to change the affiliate linking connection from a 301 to a 302, which is why we decided to go with this option.

Any help you can offer will be greatly appreciated. Thanks, Mark
Technical SEO | Mark_Ginsberg0
-
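What that robots.txt actually permits can be verified with Python's standard-library robot parser. This is a self-contained sketch: the rules are parsed inline from the text in the question, whereas in practice you would point set_url() at the live file and call read(). It also illustrates the catch the question runs into: robots.txt stops crawling of the blocked URLs, but links Google already discovered can keep appearing in reports for some time.

```python
from urllib.robotparser import RobotFileParser

# The robots.txt described in the question, parsed inline so the
# check is self-contained (use set_url()/read() for a live file).
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /",
])

# A crawler obeying robots.txt may not fetch anything on the subdomain:
blocked = not rp.can_fetch("Googlebot", "http://affiliates.example.com/some-offer")
print(blocked)
```

One side effect worth noting: if crawlers can't fetch the affiliate URLs, they also can't see the 301s those URLs serve, which may be why the reported link counts don't drop quickly.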
Are similar title tags frowned upon by search engines?
We are a B2B company that is looking to convert to one global portal very soon. It is only then that we will be able to address a lot of the IA SEO issues we are currently facing. However, we are looking to make some quick fixes, namely adding a proper title to the different country homepages. Will having the same title, with only the country modifier swapped out, be a good tactic? Since we want a unified title across all country sites, it just makes sense that we don't change it for every single country. For example:

Small Business Solutions for B2B Marketers | Company USA
Small Business Solutions for B2B Marketers | Company Italy
Small Business Solutions for B2B Marketers | France
Technical SEO | marshseo0
-
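The "one template, one modifier" idea from the question can be sketched in a few lines of hypothetical Python, with the template and country list taken from the examples above. Each generated title is unique even though only the modifier changes, which is the property the question is after:

```python
# Shared title pattern with a single swapped-out country modifier.
TITLE_TEMPLATE = "Small Business Solutions for B2B Marketers | Company {country}"

countries = ["USA", "Italy", "France"]
titles = [TITLE_TEMPLATE.format(country=c) for c in countries]

for title in titles:
    print(title)
```

Generating titles from one template also prevents the inconsistency visible in the third example above, where the company name was dropped.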
Site wide search v catalogue search
I have a client building a new web site who has agreed that a site search function is a good thing in order to get a view of how customers are using the site, the search terms they are using as a source of keywords, etc. The problem is the developer has implemented a catalogue/product search which only queries the products in the database. On the one hand this is fine, in that the search is directing users to products and not to other areas of the site. But the customer is disappointed that the search is not site-wide. Are there any solutions where a third-party search utility could be implemented within the site which will search both? The ecommerce platform is Magento. Any views would be very helpful!
Technical SEO | k3nn3dy30
-
CSS for SEO - can search engines see the text in the body?
We use CSS (absolute positioning) to arrange our content to make it easier to crawl. I am using your On-Page Keyword Optimization tool and other tools to check our pages (i.e. http://www.psprint.com/gallery/invitation-cards) to make sure it works. The On-Page Keyword Optimization tool gives a pretty good grade (I guess it sees the text in the body). However, when I use another tool to test the page (e.g. http://tools.seobook.com/general/spider-test/) it cannot see the text in the body. Did we do something wrong? Thanks, Tom
Technical SEO | tomchu0
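What a plain, non-rendering parser sees can be checked with the standard library alone. A sketch with made-up markup mimicking an absolutely-positioned block: CSS positioning does not hide text from an HTML parser, only from a particular spot on screen, so if the words are in the source a crawler can read them.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the text nodes a crawler can read from raw HTML."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

# Absolutely-positioned block; the inline style is irrelevant to
# the parser, which only sees the markup and the text inside it.
raw_html = '<div style="position:absolute; top:-50px">Invitation cards</div>'

extractor = TextExtractor()
extractor.feed(raw_html)
print(extractor.chunks)  # ['Invitation cards']
```

If two tools disagree on the same page, the difference usually comes down to how each tool fetches or parses the HTML, not the CSS. One caveat: positioning text far off-screen can look like cloaking to search engines, so it is safer to keep positioned text genuinely visible to users.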