New Search Engine.... Vanoogle.com
-
I'd like to see Google start a new search engine. They might call it Vanoogle.com (Vanilla Google).
This search engine would not be stunk up with social data, freshness inclusions, or crap from my last query; it wouldn't be skewed based upon my IP, warped because of my browser, or targeted because of my cookies. No personalization, no image results, no product results, none of that stuff.
Ads are OK if labeled.
I just want a plain vanilla search. Something that I know is "clean".
Just like the good olde days. Millions of people will start using it right away.
Would you use Vanoogle.com?
-
I wonder how much money Google make per user of their search engine. Would you pay for vanoogle.com? Say, US$20 a year? $50? $100?
TV channels without commercials aren't such a strange concept - here in the UK we have the BBC! Though we have to pay a yearly license. Partly as a result of the lack of adverts, I watch more television on the BBC than all other channels combined. The quality is often higher too. The TV license converts to about US$240.
-
This has to be the most entertaining thread I have read since Q&A started!
http://blekko.com/ states right on their home page: "the spam free search engine".
I don't see any AdWords stuff on there. Maybe Blekko will take over the world. Oh wait, Facebook search might take over. No wait... Bing is taking over... No no no, my Xbox 360 is taking over! Yeah, that's it. My Xbox 360. Nothing but Bing.
-
I use Google Custom Search to filter out a lot of things I do not need or want. You can put in as many URLs as you want. Very useful for looking at your competition's SERPs.
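For anyone curious, here's a minimal sketch (Python, with placeholder credentials) of pulling results from a Custom Search Engine through its JSON API - the actual list of URLs the engine searches is configured in the CSE control panel, not in this code:

```python
# Minimal sketch: query a Google Programmable/Custom Search Engine via its
# JSON API and print a restricted result set. API_KEY and CX are placeholder
# assumptions - substitute your own credentials and engine ID.
import requests

API_KEY = "YOUR_API_KEY"   # hypothetical key
CX = "YOUR_CX"             # ID of a CSE already loaded with the URLs you care about

def cse_search(query, num=10):
    resp = requests.get(
        "https://www.googleapis.com/customsearch/v1",
        params={"key": API_KEY, "cx": CX, "q": query, "num": num},
        timeout=10,
    )
    resp.raise_for_status()
    items = resp.json().get("items", [])
    # Keep just the bits useful for eyeballing competitor rankings
    return [(i + 1, item["title"], item["link"]) for i, item in enumerate(items)]

if __name__ == "__main__":
    for rank, title, link in cse_search("example search term"):
        print(f"{rank:>2}. {title}  ->  {link}")
```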
-
We currently have a browser session on a local server used for serving search results without any of the cr*p that Google like to push. This gives a completely clean and accurate search results page in any brand of search engine (Google, Yahoo, Bing, etc.) and type of search engine (web, image, maps, etc.).
This is mainly controlled via query parameters in the URL string. No results are ever clicked within the session, so as not to influence rankings. There is no web history, no personalisation and no geo-targeting within the results.
I hate everything Google have done to make search results more 'personalised' or 'targeted'. And that's not just because I work in the SEO industry either.
Granted, stripping your search experience back to the raw criteria as we have done shouldn't be this difficult, and I would certainly be a solid user of Vanoogle, but what we've done works for us and ensures we don't see skewed results when we don't want them.
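For illustration, here's a rough sketch (Python) of the kind of URL-parameter approach described above - treat the specific parameters (pws=0, gl, hl) as an assumption about what Google currently honours, not a guarantee:

```python
# Rough sketch of building a "clean" Google results URL with explicit query
# parameters instead of relying on whatever Google infers about you. The
# parameters pws, gl and hl are assumptions about what Google honours.
from urllib.parse import urlencode

def clean_google_url(query, country="us", language="en", results=10):
    params = {
        "q": query,     # the raw search term
        "pws": "0",     # request non-personalised results
        "gl": country,  # pin the country rather than geo-detecting from IP
        "hl": language, # pin the interface language
        "num": results, # results per page
    }
    return "https://www.google.com/search?" + urlencode(params)

print(clean_google_url("plain vanilla search"))
# https://www.google.com/search?q=plain+vanilla+search&pws=0&gl=us&hl=en&num=10
```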
-
Like SEOmoz has Roger, Bulloogle.com could have this as its mascot: http://bit.ly/HUIovX
I think advertising is such overkill, and it's only getting worse. I really don't like the route search engines are taking when showing their SERPs and other content. Personally, I use Google with JavaScript disabled; the "page preview" on hover of the link/arrow is useless and really naff.
FYI: The twocents HTML tag is deprecated and won't help your SERP rankings on bulloogle.com.
-
I tested this out myself but couldn't replicate it; however, I can imagine it happening - like you and others have said, they are testing things all the time.
Maybe they ran out of bananas that day! Just imagine: all those years we have spent trying to second-guess the Googlebot algo, and the key was a monkey, haha.
The only problem with vanilla is that it is easily influenced by other flavours around it, don't you think?
-
Bulloogle.com would definitely have to be a metacrawler - putting emphasis on metatags - oh the good old days haha!
-
I remember what Google was like a few years ago. The SERPs were full of relevant information (in my opinion). Now they have a few relevant at the top and marginally relevant below... and some other things that are tangents.
-
It would also enable us to see with our own eyes how much better these additional factors make the search results, and not have to rely on Google's promise that they do. Show us the evidence and let us come to our own conclusions!
At the moment it's a bit like a kid being told to eat their greens...
-
I would most definitely use it! Dare to dream, dare to dream.................
-
When I want to access the "official" site without having to dig through the commercialized sites in the SERPs, I use Bing instead of G. I'm much more pleased with the results when I'm not searching for "long tail" phrases. Vanoogle (your idea of a toggle to get "pure" results) is a great idea, but G wants ALL the ad revenue it can bleed out of a page.
-
For pure results we should have all the sites that match the search term listed in alphabetical order.
-
Thanks for your dad's perspective.
He thinks any weakness in the results returned is because he "must have typed the wrong thing."
That is eyeopening!
Experienced people might enjoy the toggle feature you suggest... that would allow them to filter out the "fluff" and get pure results.
-
That's all well and good, but how do you get the average man on the street to switch?
For example, my dad has never "chosen" a search engine in his life. He just goes with whatever his browser defaults to / the manufacturer set up as a default, and failing that, "Google", because it's the only one he's heard of... He thinks any weakness in the results returned is because he "must have typed the wrong thing."
It would be really nice to be able to freely toggle all the factors you mentioned on/off (and set defaults) so that you could have the search that you wanted.
-
I used to have a "clean machine" that I used to check rankings, never signed in and never clicked anything in the SERPs. That has stopped working because previous searches are stinking up the SERPs.
I want a button to "turn off all bias".
-
Yeah, that would be nice. The nearest thing I've got to that is going 'incognito' in Chrome.
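To take that a step further, here's a minimal sketch (Python; the Chrome binary name is an assumption - adjust for your OS) that opens a search in an incognito window with a throwaway profile, so nothing from your normal browsing colours the SERPs:

```python
# Minimal sketch of a "clean machine" in software: launch Chrome incognito
# with an empty, disposable profile and run a search. The binary name
# "google-chrome" is an assumption; on other systems it may be "chrome" or
# a full path.
import subprocess
import tempfile
from urllib.parse import quote_plus

def open_clean_search(query):
    profile_dir = tempfile.mkdtemp(prefix="clean-serp-")   # empty, throwaway profile
    url = "https://www.google.com/search?q=" + quote_plus(query)
    subprocess.Popen([
        "google-chrome",                   # assumed binary name
        "--incognito",                     # no history or cookies persisted
        f"--user-data-dir={profile_dir}",  # ignore the normal profile entirely
        url,
    ])

open_clean_search("plain vanilla search")
```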
-
It would be nice if they gave you Google Classic (AKA Vanoogle.com) as an option. That way everyone would be happy.
-
The other 20% with the yellow pages.
No need for Vanoogle - why not just go back to the very beginning and use http://www.dmoz.org/?
-
Your sites ranking well is the most important criterion for Vanoogle!!
-
I like it. The favicons add character. (... and my sites rank well)
-
So, you would use Vanoogle for the other 20%?
I think that most people would use it all of the time... so if 80% of people use it all of the time and the rest use it 20% of the time, that would be an 84% market share.
-
Nice post, EGOL. You don't like Google with all the "improvements" - like how I rank 6th on page 1 or 17th, depending on what Google decides to display in the SERPs.
How about DuckDuckGo? They are pretty generic and without personalization.
-
I don't think so, EGOL - maybe you are just looking at it from the SEO side of the fence.
When I'm searching for my own purposes, Google delivers everything I want 80% of the time, whether it's a map of places to eat in my local city or YouTube rich snippets of a band I've heard about.
-
Right! It might replace StumbleUpon.
-
SEOs would like to have it to know "where they really rank".
The average guy would like to have it just to enjoy "crap-free SERPs".
-
Now Bulloogle.com, that is something I can get behind - all BS all the time, you never know what you will get!
-
ha... That's really funny.... and I think you are right!
-
Never seen BS tags before - is that a way to rank higher in Vanoogle?
Heavens, no!
We will need yet another search engine for that.... Bulloogle.com
Lots of what I write should be indexed there.
-
I was surprised last week when I searched for "georgia" and then searched for "guitars" a moment later, and found that Google was delivering results contaminated by previous queries. http://www.seomoz.org/q/google-query-contamination
They monkey with the SERPs and don't tell.
So, I agree, sometimes vanilla is the best flavor.
That's why I want Vanoogle.com
-
I would not, sorry.
I am a convert; I like the way search is going. Of course there are gonna be bumps along the way, but I think the social integration is a better way to connect people. We have already shown our predisposition to loving this mentality of online communities, so I think this is just another stepping stone to the new social "it" product.
I also like geolocation. As an SEO/internet marketer it makes my life more confusing, and more confusing to clients/employers, but as a general user I think it is definitely on the right track to connecting people with local resources, as well as brands, which I think makes for a more informed consumer.
Just my 2 cents.
-
Yeah, I'd like to see TV channels with no commercials too.
-
mmmmmmm.... I like Vanilla!!
My life would be complete if Google decided to do that!
-
Vanilla sometimes is the best flavour - I'd definitely give it a go! Here's to making the web a better place, EGOL.