Googlebot soon to be executing JavaScript - Should I change my robots.txt?
-
This question came to mind as I was pursuing an unrelated issue and reviewing a site's robots.txt file.
Currently this is a line item in the file:
Disallow: https://*

According to a recent post on the Google Webmaster Central Blog, [Understanding web pages better](http://googlewebmastercentral.blogspot.com/2014/05/understanding-web-pages-better.html), Googlebot is getting much closer to being able to properly render JavaScript. Pardon some ignorance on my part because I am not a developer, but wouldn't this require Googlebot to be able to execute JavaScript?

If so, I am concerned that disallowing Googlebot from the https:// versions of our pages could interfere with crawling and indexation, because as soon as an end-user clicks the "checkout" button on our view cart page, everything on the site flips to https://. If this were disallowed, would Googlebot stop crawling at that point and simply leave because all pages were now https://? Or am I just waaayyyy overthinking it? ...wouldn't be the first time! Thanks all!
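For reference, robots.txt files are fetched and applied per protocol and host: http://example.com/robots.txt governs only the http:// pages, and https://example.com/robots.txt separately governs the https:// ones. Disallow values are URL paths relative to that host, not full URLs. A minimal sketch of a path-based rule (example.com and the /checkout/ path are hypothetical):

```
# http://example.com/robots.txt -- applies only to http:// URLs on this host
User-agent: *
Disallow: /checkout/
```

A pattern like `Disallow: https://*` doesn't fit that path-based format, so crawlers may well ignore it entirely rather than block the https:// site.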
-
Excellent answer. Thanks so much, Doug. I really appreciate it! Adding a "nofollow" attribute to the checkout button is a good suggestion and should be fairly easy to implement. I realize that internal nofollows are not normally recommended, but in this instance it may not be a bad idea.
-
Hi Dana,
When you click on the checkout button, what's the mechanism for taking people to the https:// site? Is it just that the checkout link uses https:// in its URL? Is there some JavaScript wizardry you're particularly concerned about?
Even though Googlebot follows this one link to the https version of the cart, it will still have all the other (non-https) links from the previous page queued up to follow, so I don't think this will stop the crawl at that point. It would be a nightmare if Googlebot stopped crawling the entire site every time it went down a rabbit hole!
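Googlebot's real scheduler is vastly more sophisticated, but a toy sketch of the queue behaviour described above might help (the `fetch_links` and `is_allowed` helpers are hypothetical stand-ins):

```python
from collections import deque

def crawl(seed_url, fetch_links, is_allowed):
    """Toy breadth-first crawler: a disallowed URL is skipped,
    but everything already queued still gets visited."""
    frontier = deque([seed_url])
    seen = {seed_url}
    while frontier:
        url = frontier.popleft()
        if not is_allowed(url):   # e.g. blocked by robots.txt
            continue              # skip this one URL; the crawl carries on
        for link in fetch_links(url):
            if link not in seen:
                seen.add(link)
                frontier.append(link)
```

Hitting one blocked URL only prunes that branch; the rest of the frontier is unaffected.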
That's not to say that you wouldn't want to consider nofollowing your checkout button. I'm sure neither you nor Google wants the innards of the cart pages to be indexed. There are probably other pages you'd rather Googlebot spent its time finding, right? A sketch of what that might look like follows below.
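Assuming the checkout button is a plain HTML anchor (the URL here is hypothetical), nofollowing it is a one-attribute change:

```html
<!-- rel="nofollow" hints that crawlers shouldn't follow this link -->
<a href="https://www.example.com/checkout" rel="nofollow">Checkout</a>
```

If the button instead submits a form or fires JavaScript, the equivalent fix would depend on how the jump to https:// is actually triggered, which is why the mechanism question above matters.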
My take on the Google blog post about understanding JavaScript is that the aim is to do a better job of discovering content that might be hidden behind JavaScript/AJAX. It's a problem for Google when the raw HTML they're crawling doesn't accurately reflect the content that is displayed in front of a real visitor.
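To illustrate the kind of gap the blog post is about, here's a minimal sketch of a page whose visible content only exists after JavaScript runs (all names and text hypothetical):

```html
<!-- The raw HTML a crawler fetches contains an empty container... -->
<div id="product-description"></div>
<script>
  // ...until this script injects the content client-side.
  document.getElementById('product-description').textContent =
    'Hand-made leather wallet, free shipping.';
</script>
```

A crawler that doesn't execute the script sees an empty div; one that renders JavaScript sees the description. That's the distance Googlebot appears to be closing.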