Google and JavaScript
-
Hey there!
Google recently published announcements encouraging webmasters to let Googlebot crawl JavaScript: http://www.googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html
http://googlewebmastercentral.blogspot.com/2014/05/rendering-pages-with-fetch-as-google.html
We have always blocked JS and CSS in robots.txt, but we're now considering removing those blocks.
Any opinions on this?
-
When Google puts out recommendations like this, they rarely lead people on a self-destructive path. If JS and CSS files could contain relevant information to help Google crawl or index your site more appropriately, then I say let them see those. Sorry I have no data to back up my position, but the articles you listed make a good case. I read a similar article weeks ago and unblocked JS and CSS from robots.txt, but I haven't really thought about this since.
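For anyone weighing the same change, unblocking usually just means deleting (or overriding) the Disallow rules that cover script and style assets. A minimal robots.txt sketch, assuming hypothetical /js/ and /css/ directories:

```
User-agent: Googlebot
# These lines are what typically blocked rendering; remove them:
# Disallow: /js/
# Disallow: /css/

# Optionally, allow the asset types explicitly (Google supports * and $):
Allow: /*.js$
Allow: /*.css$
```

Note that a broader Disallow elsewhere in the file can still block assets for other user-agents, so it's worth re-testing with the Fetch as Google tool mentioned in the second announcement above.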
Related Questions
-
How to solve JavaScript paginated content for SEO
On our blog listings page, we limit the number of posts shown to 10. However, all of the posts are loaded in the HTML of the page, and page links are added at the bottom. Example page: https://tulanehealthcare.com/about/newsroom/ When a user clicks the next page, JavaScript simply filters the content on the same page to show the next group of postings; nothing in the HTML or URL changes. So the question is: does Google consider this hidden content, because all listings are in the HTML but only a handful are displayed? Or is Googlebot smart enough to recognize that the content is being filtered by JavaScript pagination? If this is indeed a problem, we have two possible solutions: 1) not building the HTML for the next pages until the user clicks 'next', or 2) adding parameters to the URL to show that the content has changed. Are there any other solutions that would be better for SEO?
Intermediate & Advanced SEO | MJTrevens1
-
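Of the two options above, URL parameters are generally the safer bet, because each paginated state gets its own crawlable, linkable address. A minimal JavaScript sketch of that idea (the paths and names here are illustrative, not taken from the site in question):

```javascript
// Sketch of solution 2: reflect the current page in the URL so each
// paginated state is addressable, instead of filtering in place.
const PAGE_SIZE = 10;

function pageSlice(posts, page) {
  // Return only the posts for the requested 1-based page.
  const start = (page - 1) * PAGE_SIZE;
  return posts.slice(start, start + PAGE_SIZE);
}

function pageUrl(basePath, page) {
  // Page 1 keeps the clean URL; later pages get an explicit parameter.
  return page === 1 ? basePath : `${basePath}?page=${page}`;
}

// In a browser click handler you would pair this with:
//   history.pushState({ page }, '', pageUrl('/about/newsroom/', page));
// so the address bar tracks the visible page.

const posts = Array.from({ length: 25 }, (_, i) => `post-${i + 1}`);
console.log(pageSlice(posts, 3));
console.log(pageUrl('/about/newsroom/', 2)); // "/about/newsroom/?page=2"
```

Ideally the server also renders the matching slice for a `?page=` request, so each paginated URL works as a standalone landing page rather than depending on JavaScript alone.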
Google Search Operators Acting Strange
Hi Mozers, I'm using search operators to get a count of how many pages have been indexed for each section of the site. I was able to download the first 1000 pages from Google Search Console, but there are more than 1000 pages indexed, so I'm using operators for a count (even if I can't get the complete list of indexed URLs). [Although, if there is a better way, PLEASE let me know!] Anyway, in terms of search operators: from my understanding, the more general the URL, the more results should come up. However, when I put in the domain site:www.XXX it gives me FEWER results than site:www.XXX/. When I add the trailing slash to the end of the domain, it gives me MORE results. And site:www.AAA/BBB/CC gives me MORE results than site:www.AAA/BBB. What's with this? Yael
Intermediate & Advanced SEO | yaelslater1
-
How good is Google at reading geo-targeted dynamic content -- Javascript?
We are using a single page application for a section of our website where it generates content based on the user's geographical location. Because Google's Search Console is searching from Virginia (where we don't have any content), we are not able to see anything render in Google Search Console. How good is Google at reading geo-targeted dynamic content? Do we have anything to worry about in terms of indexing the content because it's being served through JS?
Intermediate & Advanced SEO | imjonny1230
-
Sitemap Migration - Google Guidelines
Hi all. I saw on support.google.com the following text: "Create and save the Sitemap and lists of links: a Sitemap file containing the new URL mapping; a Sitemap file containing the old URLs to map; a list of sites with links to your current content." I would like to better understand "a list of sites with links to your current content."
Question 1: Do I need three sitemaps simultaneously?
Question 2: If yes, should I put this sitemap in the Search Console of the new website?
Question 3: Or is Google just giving context on how to do the migration, and will I really only need sitemaps for the new site? What is Google talking about here? Thanks for any advice.
Intermediate & Advanced SEO | mobic
-
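One common reading of that guidance is that you keep two sitemap files during the move: one listing the old URLs (left under the old site's Search Console property so Googlebot revisits them and follows the 301 redirects) and one listing the new URLs (submitted under the new property). A minimal sketch with hypothetical URLs:

```
<?xml version="1.0" encoding="UTF-8"?>
<!-- sitemap-old.xml: old URLs, kept in the OLD property so Google
     recrawls them and discovers the 301s. sitemap-new.xml has the
     same shape but lists the new URLs under the NEW property. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://old.example.com/page-a/</loc></url>
</urlset>
```

On that reading, the "list of sites with links to your current content" is not a third sitemap at all but an outreach list: external sites you would ask to update their links to point at the new URLs. That interpretation is mine, though; the excerpt itself doesn't settle it.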
Google + and Schema
I've noticed with a few of the restaurant clients I work with that Schema markup isn't contributing at all to their SERP appearance -- their Google+ page is. Is there any way to have more control over what Google pulls in, to make the UX better? I.e. showing photos of the restaurant without a logo, etc.
Intermediate & Advanced SEO | Anti-Alex0
-
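For comparison, this is roughly what restaurant structured data looks like as JSON-LD; the values below are placeholders, and which fields Google actually displays is ultimately Google's call, but making the markup explicit at least gives it something better than the Google+ profile to pull from:

```
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Restaurant",
  "name": "Example Bistro",
  "image": "https://www.example.com/photos/dining-room.jpg",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Anytown"
  },
  "servesCuisine": "French"
}
</script>
```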
Google Manual Action Disappear
Hi Guys, I have a website that received an unnatural links message. We started the link removal process, disavowed the links we couldn't get removed, and filed reconsideration requests four times, but each time Google sent some samples of unnatural links and rejected our request. Last week we again disavowed the unremovable links and planned to file another reconsideration request after a week, but today when I tried to file it, the manual action message had disappeared. We haven't received any message from Google. The same thing happened with another of our sites earlier. Does the manual action disappearing mean it has been revoked, or something else?
Intermediate & Advanced SEO | RuchiPardal0
-
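For context on the disavow side of this process, the file Google accepts is plain text with one entry per line: either a full URL or a `domain:` prefix to disavow a whole referring domain, with `#` comments allowed. A sketch with made-up examples:

```
# Links we could not get removed after three outreach rounds
# (domains and URLs below are hypothetical examples)

# Disavow an entire referring domain:
domain:spammy-directory.example

# Or a single page:
http://blog.example/low-quality-link-page.html
```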
Google Processing but Not Indexing XML Sitemap
Like it says above, Google is processing but not indexing our latest XML sitemap. I noticed this Monday afternoon - Indexed status was still Pending - and didn't think anything of it. But when it still said Pending on Tuesday, it seemed strange. I deleted and resubmitted our XML sitemap on Tuesday. It now shows that it was processed on Tuesday, but the Indexed status is still Pending. I've never seen this much of a lag, hence the concern. Our site IS indexed in Google - it shows up with a site:xxxx.com search with the same number of pages as it always has. The only thing I can see that triggered this is Sunday the site failed verification via Google, but we quickly fixed that and re-verified via WMT Monday morning. Anyone know what's going on?
Intermediate & Advanced SEO | Kingof50
-
Is Google Webmaster tools Accurate?
Is Google Webmaster Tools data completely inaccurate, or am I just missing something? I noticed a recent surge in 404 errors, detected 3 days ago (3/6/11), from pages that have not existed since November 2011. They are links to tag and author archives from pages initially indexed in August 2011. We switched to a new site in December 2011 and created 301 redirects from categories that no longer exist to new categories. I am a little perplexed, since the Google sitemap test shows no 404 errors, and neither does the SEOmoz Crawl Test, yet under GWT site diagnostics these errors, all 125 of them, just showed up. Any thoughts/insights? We've worked hard to ensure a smooth site migration and now we are concerned. -Jason
Intermediate & Advanced SEO | jimmyjohnson0
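For whole retired sections like tag and author archives, 301s are usually done with pattern rules rather than one redirect per URL, which also makes it easier to confirm every old archive URL is covered. An Apache .htaccess sketch with hypothetical paths (not Jason's actual structure):

```
# Map all retired tag/author archives onto the closest new section:
RewriteEngine On
RewriteRule ^tag/(.*)$ /blog/ [R=301,L]
RewriteRule ^author/(.*)$ /blog/ [R=301,L]

# One-off mapping for a renamed category:
Redirect 301 /old-category/ /new-category/
```

Once rules like these are in place, the 404s reported in GWT should clear on recrawl, since the old archive URLs now answer with a redirect instead of a 404.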