Googlebot soon to be executing JavaScript - Should I change my robots.txt?
-
This question came to mind as I was pursuing an unrelated issue and reviewing a site's robots.txt file.
Currently, this is a line item in the file:

Disallow: https://*

According to a recent post on the Google Webmaster Central Blog, [Understanding Web Pages Better](http://googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html), Googlebot is getting much closer to being able to properly render JavaScript. Pardon some ignorance on my part because I am not a developer, but wouldn't this require Googlebot to be able to execute JavaScript?

If so, I am concerned that disallowing Googlebot from the https:// versions of our pages could interfere with crawling and indexation, because as soon as an end-user clicks the "checkout" button on our view-cart page, everything on the site flips to https://. If that were disallowed, would Googlebot stop crawling at that point and simply leave because all pages were now https://? Or am I just waaayyyy overthinking it? ...wouldn't be the first time! Thanks all!
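For context, here's a minimal sketch of how that line sits in the file. The user-agent line and the /cart/ path are hypothetical, added only for illustration; note that standard robots.txt rules are path patterns relative to the host, rather than full URLs.

```
# Hypothetical robots.txt sketch; only the https://* line comes from our actual file
User-agent: *
Disallow: /cart/        # conventional path-based rule (illustrative)
Disallow: https://*     # the line item in question
```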
-
Excellent answer. Thanks so much, Doug. I really appreciate it! Adding a "nofollow" attribute to the Checkout button is a good suggestion and should be fairly easy to implement. I realize that internal nofollows are not normally recommended, but in this instance it may not be a bad idea.
-
Hi Dana,
When you click on the checkout button, what's the mechanism for taking people to the https:// site? Is it just that the checkout link uses https:// in its URL? Is there some JavaScript wizardry you're particularly concerned about?
Even though Googlebot follows this one link to the https version of the cart, it will still have all the other links on the previous page queued up to follow (non-https), so I don't think this will stop the crawl at that point. It would be a nightmare if Googlebot stopped crawling the entire site every time it went down a rabbit hole!
That's not to say that you wouldn't want to consider nofollowing your checkout button. I'm sure neither you nor Google wants the innards of the cart pages to be indexed. There are probably other pages you'd rather Googlebot spent its time finding, right?
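If it helps, here's a minimal sketch of what a nofollowed checkout button could look like, assuming it's a plain anchor tag; the URL and class name are hypothetical:

```html
<!-- Hypothetical markup: rel="nofollow" asks crawlers not to follow this link -->
<a href="https://www.example.com/checkout" rel="nofollow" class="btn-checkout">Checkout</a>
```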
My take on the Google blog post about understanding JavaScript is that the aim is to do a better job of discovering content that might be hidden by JavaScript/Ajax. It's a problem for Google when the raw HTML they're crawling doesn't accurately reflect the content that is displayed in front of a real visitor.
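As a rough illustration of that gap (the markup and script below are hypothetical), the raw HTML a non-rendering crawler fetches can be nearly empty, with the visible content only appearing once JavaScript runs in the browser:

```html
<!-- Hypothetical page fragment: the fetched HTML contains only an empty container... -->
<div id="product-description"></div>
<script>
  // ...and the content a real visitor sees is injected by JavaScript at load time.
  document.getElementById('product-description').textContent =
    'Hand-stitched leather wallet. Free shipping on orders over $50.';
</script>
```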