Chrome blocked sites used by Google's Panda update
-
Google said it used Chrome users' blocked-sites lists as a benchmark for what they now term poor-quality content, and that the Panda update effectively took about 84% of those sites out of the search results.
This got me thinking: it would be very nice to discover exactly which sites they don't like.
Does anyone know if there is an archive of what these sites might be?
Or, if none exists, maybe if people shared their Chrome blocked sites on here, we could get an idea?
-
Alan,
Google was sued back in the mid-2000s, and the courts affirmed (Google won that case) that Google's search results are First Amendment-protected opinion speech, so Google has every right to do as it pleases with its results. So be it if its choices are politically biased.
Having said that, Google will never use a single factor to penalize a site. If every factor looks great for a Rush L. site or fan site except the Google Chrome user blockage stats, then I strongly doubt those stats will have any significant impact on rankings, if any at all. After all, Google nowadays considers such abuse when introducing a factor, and may do away with a factor if it becomes abused, especially something as little known as Chrome site blocking. I mean, I have been doing SEO on and off for a few years now and consider myself fairly up to date and tech savvy, and I just found out about this option; I doubt the vast majority of Chrome users even know it exists, let alone try to use it. As such, any potential abuse of this should, for now, stick out easily to Google.
Let us not forget that Google has manual review and appeals in place as well, and if they find their automated code screwed up, they will re-evaluate and readjust the rankings.
-
Thanks for the info; I didn't know Google was going as far as deranking sites based on the blocks.
I'm not sure the Panda update has done a great job of deranking the crap in my industry, though. All the update has done for me so far is lower my rankings and change the angle from which I attack SEO. I'm hoping my competition doesn't know about writing decent content, giving me the edge over them once I've overhauled my pages.
"Google was clear that they did not use that as a ranking signal in the Panda algorithm change; however, they did compare the effects of the Panda change to that block list, and found an 84% overlap, which they considered an excellent signal that they were on the right path."
The question still remains, though, about which sites Google deems to be poor quality (the 84%). I'm probably just dreaming if I think I'd ever get to see the compiled list of Chrome users' blocked sites!
-
I think this is venturing into dangerous ground. Imagine if thousands of anti-Rush Limbaugh activists put his site into their block lists and Google stopped displaying his site, and then all his fans did the same to Obama. We would end up with people blocking their competition.
-
Hi Special,
Google unveiled the "Panda" algorithm change, which targeted low-quality content sites. Around the same time, they released a feature for the Google Chrome browser that gives users an option to block a site from showing up in their personal search results.
Thus, if you're using Google Chrome as your browser and you search for something and get what you think is a crummy content farm, you can click a button and never see that site show up in your results again.
And of course, Google gets to gather all that information on how many times each site has been blocked.
Google was clear that they did not use that as a ranking signal in the Panda algorithm change; however, they did compare the effects of the Panda change to that block list, and found an 84% overlap, which they considered an excellent signal that they were on the right path.
Google has since announced an update to their algorithm that takes all of that data on blocked sites and uses it as a ranking signal. So now, when tons of people choose to delete a site from their search results, that information may be used to downrank the site in everyone's search results. From Google's announcement:
"In some high-confidence situations, we are beginning to incorporate data about the sites that users block into our algorithms. In addition, this change also goes deeper into the 'long tail' of low-quality websites to return higher-quality results where the algorithm might not have been able to make an assessment before."
Google also stated that this change will be much smaller in scope than the original Panda algorithm update, affecting only about 2% of search queries, rather than the 12% that Panda affected.
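To make "high-confidence situations" concrete, here is a minimal sketch of how a block-rate signal might be folded into a ranking score. Everything in it (the thresholds, the data volumes, the scoring formula) is a hypothetical illustration, not Google's actual algorithm:

```python
# Hypothetical sketch: demote a site only when the block data is strong enough
# to count as "high confidence". Thresholds and scoring are illustrative guesses.

def adjust_score(base_score: float, blocks: int, impressions: int) -> float:
    """Fold user block data into a ranking score, conservatively."""
    if impressions < 10_000:  # too little data: leave the score untouched
        return base_score
    block_rate = blocks / impressions
    if block_rate > 0.05:     # act only on a clear, widespread signal
        return base_score * (1.0 - block_rate)
    return base_score

# A site blocked by 8% of 50,000 searchers is demoted; sparse data is ignored.
print(adjust_score(100.0, blocks=4_000, impressions=50_000))  # 92.0
print(adjust_score(100.0, blocks=40, impressions=500))        # 100.0
```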
We'll have to wait and see whether this is a change that can be effectively gamed (say, by encouraging a mass of Twitter followers to do a search with Chrome and block the competition), but if this algorithm update is finally going to get the crummy eHow and Yahoo Ask garbage out of my search results, I welcome it!
For more details, please refer to the website: Factors affecting when Panda updates.
I hope this helps you find a solution.
Related Questions
-
Why Did Our Site Disappear for 6 Months?
Hello! The company that I work for recently hired a company to do their SEO. After they took over we fell out of the search engines for our keywords for 6 months! (I had previously done the SEO using Moz, but had to step down due to my husband getting cancer. He's all better and I'm back. 🙂 ) We contacted the SEO company recently to find out why the only way we could be found in the search engines was if we searched on our name. The reason they gave was Penguin, Panda, and Hummingbird. I personally don't believe that would cause our site to disappear for 6 months then miraculously reappear for our keywords a few days after us complaining. One of my main concerns is they have submitted us to 140 directories and 20 social bookmarking sites. How do I tell if these are good or bad sites? A few of them I've clicked are suspended and one Google warns about malicious content and to leave immediately - so those are obvious. Thanks for your help! Kelley Insana
Industry News | Kelley_I
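As a starting point for triaging those 140 directories, a script can catch the obvious red flags (dead domains, suspensions, shady redirects) before you review the rest by hand. A minimal sketch, assuming the widely used requests package; the URLs are placeholders for your actual list:

```python
# Rough triage of directory sites: flag dead, suspended, or redirecting domains.
# Deeper checks (spam scores, whether Google still indexes them) need an SEO
# tool or manual review; this only catches the obvious failures.
import requests

directories = [  # placeholders: substitute the directories you were submitted to
    "http://example-directory-one.com/",
    "http://example-directory-two.com/",
]

for url in directories:
    try:
        resp = requests.get(url, timeout=10, allow_redirects=True)
        redirected = resp.url.rstrip("/") != url.rstrip("/")
        note = f", redirects to {resp.url}" if redirected else ""
        print(f"{url} -> HTTP {resp.status_code}{note}")
    except requests.RequestException as exc:
        print(f"{url} -> FAILED ({exc.__class__.__name__})")
```
-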
Get Google To Crawl More Pages Faster on my Site
We opened our database of about 10 million businesses to be crawled by Google. Since Wednesday, Google has crawled and indexed about 2,000 pages. Google is crawling us at about 1,000 pages a day now. We need to substantially increase this amount. Is it possible to get Google to crawl our sites at a quicker rate?
Industry News | Intergen
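There is no switch that forces Googlebot to crawl faster, but the usual levers are a flatter internal link structure, a faster server, and, above all, complete XML sitemaps: split the URLs into files of at most 50,000 each, tie them together with a sitemap index, and submit the index in Webmaster Tools. A minimal sketch of generating those files; the domain and URL pattern are placeholders:

```python
# Sketch: write sitemap files of <= 50,000 URLs each, plus a sitemap index
# that ties them together (the sitemap protocol caps URLs per file at 50,000).

CHUNK = 50_000

def write_sitemaps(urls, base="https://www.example.com/sitemaps"):
    index_entries = []
    for i in range(0, len(urls), CHUNK):
        name = f"sitemap-{i // CHUNK:05d}.xml"
        with open(name, "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in urls[i:i + CHUNK]:
                f.write(f"  <url><loc>{url}</loc></url>\n")
            f.write("</urlset>\n")
        index_entries.append(f"{base}/{name}")
    with open("sitemap-index.xml", "w") as f:  # submit this file in Webmaster Tools
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for loc in index_entries:
            f.write(f"  <sitemap><loc>{loc}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")

# Placeholder URL pattern standing in for the 10 million business pages:
write_sitemaps([f"https://www.example.com/business/{n}" for n in range(120_000)])
```
-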
Build a site, do SEO work on it and sell it?
Does anybody do this? With success? I keep finding industries right here in my local area (concrete work, home security, painting) that have 4-5 local companies competing, and NONE of them are doing even the most BASIC things to SEO their sites or capitalize on ANYTHING online. I could pick 7-8 of these industries and have somebody who works for me spend a couple of hours a week on each, building links and writing a halfway interesting blog post, etc., and once they rank higher than most of the competition, sell them for 2-3 grand, I bet, especially since I can prove how much traffic they are getting. Thoughts? Thanks for weighing in. Matthew
Industry News | Mrupp44
-
So, Google is the best site on the internet... right? Or is that just what most people tend to think offhand?
LOL, woah, put the guns away. I'm not about to rant; I just have a question and wanted to present it well. Then again, I might have actually found some easy fixes Google could make to some of its tools. So here's the thing. I noticed how annoyed I always get when I have to sign in every time I go to the AdWords keyword tool or Analytics. Why do you have to sign in a million times? I think it is a problem that can be fixed, because if you go to check your Webmaster Tools, you go straight into your account, where you can then select which site you want to explore. It knows that I am already signed in to Google Accounts when I go to Webmaster Tools, but it doesn't recognize that fact when I go to my Analytics account or to use the AdWords Keyword Tool. Now, every site has things that it needs to work on, but not necessarily things that need to be 'fixed'. Google being so commonly accepted as the best site on the net, I thought it was funny/interesting at the least to point out the problem. Even funnier is the fact that I could submit it as a problem to see if they could fix it or not, but they do such a good job of making it hard for people to contact them that (a) I don't feel like wasting my time trying and (b) I don't even really know if it is possible to do that. Also, why is there no official Google Analytics app / mobile site?? Google has been pushing how important mobile is to us webmasters, but then it doesn't seem to be very high on their priority list for the tools that we use. I mean, you can't view graphs on phones/tablets (mine at least) in Webmaster Tools OR Google Analytics. Also, it's a pain in the butt to click the sign-in button on Google Analytics when using my phone/tablet; it disappears really fast for me (needs more research from others to see if everyone has the same problem). Thanks for the interest/answers, everybody. I look forward to hearing from you. Also, tips and help would be nice if anybody knows a solution to my sign-in issue.
Industry News | TylerAbernethy
-
Searching for a keyword in the HTML source code of a website via Google
Is such a thing possible? Can we google for a specific keyword that can be found in the source code of a website? Is there any search operator for this? Thanks in advance!
Industry News | merkal2005
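As far as I know, Google has no operator that searches raw HTML source; operators like intext: only match the rendered text. For a known list of pages, though, you can fetch and search the source yourself. A minimal sketch; the keyword and URLs are placeholders:

```python
# Fetch pages and search their raw HTML source for a keyword,
# since Google's search operators don't look inside the markup itself.
import requests

keyword = 'generator="WordPress"'  # placeholder: e.g., find a meta generator tag
pages = ["https://example.com/", "https://example.org/"]  # placeholder list

for url in pages:
    try:
        html = requests.get(url, timeout=10).text
        verdict = "MATCH" if keyword.lower() in html.lower() else "no match"
        print(f"{verdict}  {url}")
    except requests.RequestException as exc:
        print(f"error  {url} ({exc.__class__.__name__})")
```
-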
Subdomains seen as one site
Earlier this month Google announced that sub-domains will now be treated as one site. At first I thought this was good news, as I like to use sub-domains for separation of categories and the like. But what about links from one sub-domain to the other? They used to be external links; now they are internal links. If you don't have many external links, I would say that the cross-sub-domain links would have been important; if you have a lot of external links, then the flow of link juice would be of more benefit. I think overall it is a good thing. Does anyone have any opinions about this or know of any writings on the subject since this announcement? http://googlewebmastercentral.blogspot.com/2011/08/reorganizing-internal-vs-external.html
Industry News | AlanMosley
-
What is the best method for getting pure JavaScript/AJAX pages indexed by Google for SEO?
I am in the process of researching this further and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions or add some insight would be appreciated.
Option 1: If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and meta nofollow tags etc. to prevent the crawlers from accessing the JavaScript versions of the pages. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used.
Option 2: In order to make your AJAX application crawlable, your site needs to abide by a new agreement, which rests on the following. The site adopts the AJAX crawling scheme. For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed. The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results. To make this work, the application must use a specific syntax in the AJAX URLs (let's call them "pretty URLs"; you'll see why shortly). The search engine crawler will temporarily modify these "pretty URLs" into "ugly URLs" and request those from your server. A request for an "ugly URL" tells the server that it should not return the regular web page it would give to a browser, but instead an HTML snapshot. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users will always see the pretty URL containing a hash fragment. See more in the Getting Started Guide.
Make sure you avoid this: http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Here are a few example pages that are mostly JavaScript/AJAX: http://catchfree.com/listen-to-music#&tab=top-free-apps-tab and https://www.pivotaltracker.com/public_projects. This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab
The best resources I have found regarding Google and JavaScript:
http://code.google.com/web/ajaxcrawling/ (step-by-step instructions)
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Some additional resources:
http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.google.com/support/webmasters/bin/answer.py?answer=35769
Industry News | webbroi
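To make the "ugly URL" handshake above concrete: under the scheme, a crawler rewrites a hashbang URL like example.com/page#!key=value into example.com/page?_escaped_fragment_=key=value, and your server has to spot that parameter and return the pre-rendered snapshot. A minimal sketch; the framework choice (Flask) and the stubbed-out snapshot renderer are my own assumptions for illustration:

```python
# Sketch of Google's AJAX crawling scheme from the server side:
# answer _escaped_fragment_ requests with a pre-rendered HTML snapshot.
from flask import Flask, request

app = Flask(__name__)

def render_snapshot(fragment: str) -> str:
    # Stub: a real implementation would execute the page's JavaScript
    # (e.g., in a headless browser) and return the resulting HTML.
    return f"<html><body><h1>Snapshot for {fragment}</h1></body></html>"

@app.route("/page")
def page():
    fragment = request.args.get("_escaped_fragment_")
    if fragment is not None:  # the crawler's "ugly URL" request
        return render_snapshot(fragment)
    # Normal browsers get the AJAX shell; client-side JS reads the #! fragment.
    return "<html><body><div id='app'></div><script src='/app.js'></script></body></html>"

if __name__ == "__main__":
    app.run()
```
-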
Rebranding Sites
Our company just went through a rebranding, and this includes the web sites. We have a new website (same domain) hosted on a new server. We implemented 301s for outdated domain names and URLs. The content has been overhauled to be simpler, and we are building this from the ground up with SEO in mind. I have canonical tags in place, am being mindful of follow and nofollow (without trying to page sculpt), and am ensuring the URLs are descriptive and free of auto-generated CMS trash. To prepare for the launch, I captured all the analytics and AdWords campaigns before the switch and began making lists of where we need to change the name around the web. We were performing pretty well on SERPs beforehand, and now we want to keep the momentum going with a cleaner, newer site. Does anyone have any more suggestions on what more I should be doing to start off on the right foot?
Industry News | KJ-Rodgers
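One more item for that launch checklist: script a check that every old URL returns a single-hop 301 (not a 302 or a redirect chain) to the intended new page. A minimal sketch, assuming the requests package; the URL pairs are placeholders for your actual redirect map:

```python
# Verify that each old URL 301-redirects, in one hop, to the intended new URL.
import requests

redirect_map = {  # placeholder old -> new pairs from the rebrand
    "http://old-brand.com/services": "https://new-brand.com/services",
    "http://old-brand.com/about":    "https://new-brand.com/about-us",
}

for old, expected in redirect_map.items():
    resp = requests.get(old, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "")
    ok = resp.status_code == 301 and target == expected
    print(f"{'OK ' if ok else 'BAD'} {old} -> {resp.status_code} {target}")
```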