Paste 'do a barrel roll' into Google - See what happens!!
-
Hi,
Paste the words 'do a barrel roll' in Google and see what happens!
-
LOL that is funny!
-
You have to disable Google Instant predictions in your search settings, if you really want to know! Bit daft, isn't it?
-
Same happens when you search "askew"
-
Search 'Google Gravity', then click back into the search box and remove the 'y', and it should appear in the predictions (looks like it only works in Firefox and Chrome)
-
How does one hit "I'm Feeling Lucky" now, with the suggested searches jumping straight to the live SERP?
-
When I first saw it I honestly thought my PC had been infected with something for a split second.
-
I almost threw up!
-
What Sorcery is this?
-
Ah man, I thought I'd found something hidden then.
-
Not really a "Q&A", but here are a few more:
- Enter "Google Gravity" in the search bar and hit "I'm feeling lucky"
- Enter "askew" in the search bar
http://mashable.com/2011/11/03/google-easter-eggs-2/#33105Gravity
Related Questions
-
YELP: Legit, or is it wearing Pradas with a black hat?
Should I stop recommending that clients be on Yelp? http://finance.yahoo.com/news/yelps-newest-weapon-against-fake-100101689.html
Industry News | | Chenzo0 -
Did Google Search Just Get Crazy Local?
Hey all, I think it's a known fact at this point that when you search while signed into a personal Google account, the results are heavily oriented around keywords and phrases you have already searched for, as well as your account's perceived location; for instance, when I wanted to check one of my own web properties in the SE listings, I would sign out, or it would likely appear first and give a false reading.
Today I noticed something very interesting: even when not signed in, Google's listings were giving precedence to locality, and to an extreme degree; when searching for "web design," a firm a mile away ranked higher than one 1.5 miles away, and so on. It would seem that this level of location sensitivity in the algorithms would actually be a boon for the little guys, which I assume is why it was implemented. However, it brings up a couple of interesting questions for me.
1. How is this going to affect Moz (or any SE ranking platform, for that matter) reports? I assume that Google pulls locations from IP addresses, so would it not simply pull the local results most relevant to the Moz servers' IP?
2. What can one do to rise above this aggressive level of location-based search? My site (which has a DA of 37 and a PA of 48) appears above sites like webdesign.org (DA of 82, PA of 85). Not that I'm complaining at the moment, but I could see this being a fairly big deal for larger firms looking to rank on a national level.
What gives? I'd love to get some opinions from the community here if anyone else has noticed this...
Industry News | | G2W1 -
Is there a way to get a list (backlink profile) of all tiny URLs that point to my site or a competitor's site?
I have noticed that almost all of the links you find in the major backlink profile tools, such as OSE or GWM, do not show tiny URLs. If there is a service that shows all the tiny URLs pointing to your site, can someone please share it? It has already been proven that tiny URLs do pass link juice, so with that being said: if there is no way to find all the tiny URLs that point to a site, wouldn't it be a great strategy to build all my backlinks with tiny URLs to mask my profile from competitors? Thanks!
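For what it's worth, no backlink tool will discover short URLs for you, but if you do collect candidate short links (from referrer logs, mentions, etc.), here is a minimal sketch, assuming Python with the requests library, that expands a shortened URL by following its redirect chain. It only resolves links you already have; it does not find unknown ones.
```python
# Minimal sketch: expand a shortened URL (tinyurl, bit.ly, etc.) by following
# its redirects to the final destination. Assumes the requests library.
import requests

def resolve_short_url(short_url, timeout=10):
    """Return the final URL that a short link ultimately redirects to."""
    # Some shorteners dislike HEAD requests; switch to requests.get if needed.
    response = requests.head(short_url, allow_redirects=True, timeout=timeout)
    return response.url

if __name__ == "__main__":
    # Hypothetical example short link.
    print(resolve_short_url("http://tinyurl.com/example"))
```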
Industry News | | johnd57890 -
Can't seem to submit my XML sitemap to Baidu, can someone help?
Hello Mozzers and SEMs, I'm sure all of you know and understand the importance of uploading your sitemap to search engines. One search engine in particular is Baidu. For some reason, I can't find a way to submit my link www.mysite.com/sitemap.xml. For Google, Bing, and Yandex it was easy, but Baidu is giving me problems. Don't tell me to "wait until they crawl your site"; I have over 1000+ pages of unique content that those other three search engines found because of my sitemap, so Baidu can't be that slow. Their robots only found 147 pages. 😞 If you take a look at the image attachment, you can see why I'm stuck - I can't read Chinese! (And it's an image, so I couldn't translate it.) Has anybody had any luck submitting their .xml link to Baidu? Can someone walk me through it? Let me know! Shawn
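One stopgap while the submission form remains a mystery: the sitemaps protocol lets crawlers discover a sitemap from robots.txt. Whether Baiduspider honors the Sitemap directive is worth verifying, but adding it costs nothing and the other engines use it too. Using the placeholder domain from the question, the robots.txt entry would look like this:
```
# robots.txt at http://www.mysite.com/robots.txt (placeholder domain)
User-agent: *
Sitemap: http://www.mysite.com/sitemap.xml
```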
Industry News | | Shawn1240 -
Google Search Quality Team - Commission-Based Reviews
I have been busy this past week writing articles for various sources about the recent Google update. A number of people contacted me about the analysis I was doing and the report; some were members of the Google Search Quality Team. I knew manual reviews were done before, but after seeing the documents they showed me regarding the reports they do and the compensation for doing them, I am left pretty shocked. Maybe I have been naive all these years, but I didn't realize that:
- Google outsourced review and reconsideration requests to individual reviewers for compensation.
- Google's position in terms of checking the qualifications and experience of these "reviewers" was insufficient at best; the three contacts I spoke to who had done reports had very little training or experience.
I went through the GSQT REVIEWERS PDF (a very long and thorough document) that I was sent, together with them. We went through some sites I wanted them to review, and the comments that came back were astounding, to say the least, and would have made many of you Mozzers laugh. Obviously I don't want to post said document online here... BUT I wanted to know:
a) Have any Mozzers ever been part of such a group - the GSQT?
b) Have you had any dealings with them, in terms of having your website reviewed and knowing about it?
I knew about this group way back, like in 2005 or 2006 or sometime around then; I was told at the time that it had been stopped and Google was no longer paying these subcontractor reviewers. Please don't get me wrong here: I am totally on board with manual reviews. I would just prefer them done by a trained team, whether a professional company that maintains high-quality review testing and standards or, for that matter, trained Google employees. I am just a little unsure about them being done by individual subbies who get paid by the number of reviews they do. What if that subbie has some skin in the game for a particular keyword? What if their knowledge about certain aspects isn't up to par, or isn't tested on a regular basis? This space is always changing, and as you guys and girls on this forum know, it can change pretty quickly. I just want all websites to be judged fairly and equally by a group trained equally and to the same standards. I don't care if this is a Google team or not; I just want it to be a team that is trained equally and continuously, as opposed to paying outside people based on the number of reviews done. When the livelihood of a small business is in the balance, I don't want a commission-hungry toe rag with one year's experience being the gatekeeper for me or any of our clients. Carlos
Industry News | | CarlosFernandes0 -
Chrome blocked sites used by Google's Panda update
Google said its Panda update used Chrome users' blocked-sites lists as a benchmark for what they now term poor-quality content, and that the update effectively took about 85% of those sites out of the search results. This got me thinking: it would be very nice to discover exactly which sites they don't like. Does anyone know if there is an archive of what these sites might be? Or, if none exists, maybe if people shared their Chrome blocked sites on here we might get an idea?
Industry News | | SpecialCase0 -
Google guidelines 2011
Guys, I have asked for the leaked SEO guidelines just to fine-tune my SEO campaigns, and it seems no one wanted to send them to me. Can anyone do it here, please?
Industry News | | SearchOfficeSpace230 -
What is the best method for getting pure JavaScript/AJAX pages indexed by Google for SEO?
I am in the process of researching this further and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions, or add some insight, would be appreciated.
Option 1:
If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and you can use meta nofollow tags etc. to prevent the crawlers from accessing the JavaScript versions of the page. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.
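To make the Hijax idea above concrete, here is a minimal server-side sketch, assuming Python with Flask (the route and template names are hypothetical, not from any official guide). Every link on the site points at a real, crawlable URL; client-side JavaScript intercepts clicks and re-requests the same URL via XHR, and the server then returns only the content fragment instead of the full page.
```python
# Sketch of the server side of the Hijax pattern, assuming Flask.
from flask import Flask, request, render_template

app = Flask(__name__)

@app.route("/products/<slug>")
def product(slug):
    is_xhr = request.headers.get("X-Requested-With") == "XMLHttpRequest"
    if is_xhr:
        # AJAX request from an enhanced browser: return just the HTML fragment.
        return render_template("product_fragment.html", slug=slug)
    # Plain request from Googlebot or a non-JS browser: return the full page.
    return render_template("product_full.html", slug=slug)

if __name__ == "__main__":
    app.run()
```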
Option 2:
In order to make your AJAX application crawlable, your site needs to abide by a new agreement, which rests on the following:
- The site adopts the AJAX crawling scheme.
- For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed.
- The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results.
In order to make this work, the application must use a specific syntax in the AJAX URLs (call them "pretty URLs"). The search engine crawler will temporarily modify these "pretty URLs" into "ugly URLs" and request those from your server. The request for an "ugly URL" tells the server that it should not return the regular web page it would give to a browser, but an HTML snapshot instead. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users will always see the pretty URL containing the hash fragment.
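As a rough sketch of what the server side of that agreement can look like (again assuming Python with Flask; the snapshot helper is hypothetical): in the scheme's syntax, pretty URLs use a "#!" fragment, and the crawler rewrites it into an "?_escaped_fragment_=" query parameter, which the server answers with the pre-rendered HTML snapshot.
```python
# Sketch of handling the "ugly URL" requests from the AJAX crawling scheme.
# Pretty URL seen by users:          http://www.example.com/index.html#!key=value
# Ugly URL requested by the crawler: http://www.example.com/index.html?_escaped_fragment_=key=value
from flask import Flask, request, render_template

app = Flask(__name__)

def render_html_snapshot(fragment):
    # Hypothetical helper: return fully rendered HTML for the application state
    # described by the hash fragment (e.g. via a headless browser or templates).
    return render_template("snapshot.html", fragment=fragment)

@app.route("/index.html")
def index():
    fragment = request.args.get("_escaped_fragment_")
    if fragment is not None:
        # The crawler asked for the snapshot of this AJAX state.
        return render_html_snapshot(fragment)
    # Normal browsers get the regular JavaScript-driven page.
    return render_template("index.html")
```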
See more in the Getting Started Guide. Make sure you avoid this:
http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Here are a few example pages that are mostly JavaScript/AJAX: http://catchfree.com/listen-to-music#&tab=top-free-apps-tab and https://www.pivotaltracker.com/public_projects. This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab. The best resource I have found regarding Google and JavaScript is http://code.google.com/web/ajaxcrawling/ - step-by-step instructions.
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Some additional Resources: http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.google.com/support/webmasters/bin/answer.py?answer=357690
Industry News | | webbroi