I think Panda was a conspiracy.
-
It's just a theory, but I think that Panda was not really an algorithm update but rather a conspiracy.
Google went out of their way to announce that a new algorithm was being rolled out. The word on the street was that content farms would be affected. Low quality sites would be affected. Scrapers would be affected. So, everyone with decent sites sat back and said, "Ah...this will be good...my rankings will increase."
And then, the word started coming in that some really good sites took a massive hit. We've got a lot of theories on what could be causing the hit, but there doesn't seem to be an obvious fix.
Many of the key factors that have been suggested as causes of a site looking bad in Panda's eyes are present on one of my sites, yet this site actually increased in rankings after Panda.
So, this is my theory: I think that Google made some random changes that made no sense. They made changes that would cause some scraper sites to go down but they also knew that many decent sites would decline as well.
Why would they do this? The result is fantastic in Google's eyes. They have the whole world of web design doing all they can to create the BEST quality site possible. People are removing duplicate content, reducing ad clutter and generally creating the best site possible. And this is the goal of Larry Page and Sergey Brin: to make it so that Google gives the user the BEST possible sites to match their query.
I think that a month or so from now there will be a sudden shift in the algo again and many of those decent sites will have their good rankings back again. The site owners will think it's because they put hard work into creating good quality, so they will be happy. And Google will be happy because the web is a better place.
What do you think?
-
hahahaha. I agree with you. First Google manipulates results to see if Bing is copying them. Then soon after comes this update, while Google itself is busy creating its own content sites optimized for the robots, and meanwhile we can't even think about duplicate content. hehehehe
-
Actually I really liked the "content registry" idea.
A library of content where you could register what you have created and, optionally, a link to where you want to be considered the main source.
At least it would be 10x more useful than the Google Knol idea...
-
I would pay a fee to protect my best content in the Google SERPs.
-
Actually you have a point there... if JCPenney was indeed the catalyst (which I could easily imagine it being), then the short time between that and the update would surely mean it had to have been rushed. I never considered that before.
-
ha ha... I think they did rush this out.... they were quickly trying to pull up their pants after getting embarrassed by the JCPenney problem... they needed to bust a few heads quickly...
-
Ha ha, maybe
I think it's something infinitely less planned out: they simply rushed this change out the door without fully understanding what it would do to the SERPs.
Although I do think you're right that in a few months (in what will be claimed to be a second Panda sweep) that things will go back and only the very worst offenders will stay penalised.
-
Yes, I like the content registry idea! It would probably need to be a paid service, though. And to cover dupes that are okay, maybe they could just allow duplicates as long as they reference back to the source in the registry (for news, quotations, etc., where dupes can't be avoided).
-
Interesting ideas. Thanks for sharing them.
I think that Google is talking a lot about this as a "quality website update"... and that is getting them attention in the media but it is also kicking a lot of webmasters in the butt to clean up their websites.
I think that Google should make a "content registry" where I can submit my content and say "this is mine", and then copies or spins of that content will not get traction in the SERPs. (A rough sketch of the idea is below.)
And I think that they should take a closer look at websites in the AdSense program, because the ability to monetize crap and theft is driving a lot of bad odor in the SERPs.
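Just to make the registry idea concrete, here is a minimal sketch of how such a service might behave: fingerprint the text, let the first registrant claim a canonical URL, and resolve any later copy back to that source. This is pure speculation to illustrate the proposal; every function and field name here is hypothetical, and no such Google service exists.

```python
import hashlib
import time

# Hypothetical sketch of the proposed "content registry". This is NOT a real
# Google API -- it just illustrates the idea: the first registrant claims the
# canonical slot for a given piece of content.

registry = {}  # fingerprint -> registration record

def fingerprint(text: str) -> str:
    """Fingerprint content by hashing its whitespace-normalized, lowercased text."""
    normalized = " ".join(text.lower().split())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def register_content(text: str, canonical_url: str) -> str:
    """Claim a piece of content; returns its fingerprint."""
    fp = fingerprint(text)
    if fp not in registry:  # first come, first served
        registry[fp] = {"canonical_url": canonical_url, "registered_at": time.time()}
    return fp

def lookup_source(text: str):
    """Return the registered canonical URL for this content, if any."""
    record = registry.get(fingerprint(text))
    return record["canonical_url"] if record else None

# A scraper republishing the same text would resolve back to the original:
register_content("My original article body...", "https://example.com/original")
print(lookup_source("My original article body..."))  # https://example.com/original
```

Exact hashing would of course miss spun copies, so a real registry would need fuzzy matching (shingling, simhash, or similar), but the registration flow would be the same.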
-
Haha I like it!!
Well, if it's not what happened, they'll wish they thought of it anyway lol
My view on why other sites got hit is just that they had at least some links coming from sites that got hit... i.e. a site has 100 backlinks, 10 are from articles on article sites, the article sites get hit... it loses 10 backlinks (or at least some of the value of those backlinks)... hence the good site takes a hit too. A toy version of that math is below.
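To put toy numbers on that theory (all invented for illustration, not taken from any real link-value model):

```python
# Toy version of the backlink theory above: 10 of a site's 100 backlinks come
# from article sites that Panda devalued. All numbers are made up.

total_links = 100
links_from_hit_sites = 10
devaluation = 0.8  # assume links from hit sites keep only 20% of their value

# Treating every link as equally valuable, the collateral damage is:
remaining_equity = (total_links - links_from_hit_sites) \
                   + links_from_hit_sites * (1 - devaluation)
loss_pct = 100 * (1 - remaining_equity / total_links)
print(f"Link equity lost: {loss_pct:.0f}%")  # Link equity lost: 8%
```

So even a modest devaluation of a small slice of a link profile could move rankings, which would fit the "good sites taking a hit" pattern without any direct penalty.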
Related Questions
-
How often does Google review Featured Snippets? What do you think?
Hi, I was wondering if anyone knows how often Google reviews Featured Snippets, and if there is a way to find out? Thanks
-
Panda 4.0 Suggestions
My site was hit pretty negatively by Panda 4.0 and I am at a loss for the best way to address it. I have read just about every article that I can, and I know there are some duplicate manufacturer product descriptions, but I don't hear many other ecoms complaining about Panda, so I figure it must be something else. Also, the pages that seem most negatively affected are category and product list pages. Any help or suggestions would be much appreciated. Thanks! http://bit.ly/1plgOzM
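For what it's worth, one quick way to triage the duplicate manufacturer descriptions is to score each product page's copy against the stock text and rewrite the closest matches first. A rough sketch follows; the strings and the 0.85 threshold are placeholders, and this is a diagnostic, not a guaranteed Panda fix.

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough textual similarity between two descriptions, from 0.0 to 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

manufacturer_copy = "This premium widget features a durable steel frame..."
page_copy = "This premium widget features a durable steel frame..."

# Flag pages whose copy is nearly identical to the manufacturer's stock
# description so they can be prioritized for rewriting.
if similarity(manufacturer_copy, page_copy) > 0.85:
    print("Near-duplicate of manufacturer copy -- rewrite candidate")
```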
-
What do you think Google analyzes for SERP ranking?
I've been doing some research trying to figure out how the Google algorithm works. The one thing that is constant is that nothing is constant. This makes me believe that Google takes a variable that all sites have and divides by it. One example would be taking the load time in ms and dividing it by the total number of points the website scored. This would give all of the websites a random appearance, since that variable would throw off all the other constants. (A toy version of this guess is sketched below.) I'm going to continue doing research, but I was wondering what you guys think matters in the Google algorithm. -Shane
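Purely to illustrate Shane's guess, here it is rendered as code. This is not how Google's algorithm works; it is only the hypothesized formula, with invented numbers.

```python
# Shane's speculation above, as code: divide a variable every site has (load
# time) by the site's accumulated "points". Entirely made up; lower = better.

def hypothesized_signal(load_time_ms: float, quality_points: float) -> float:
    """Under this toy model, fast sites with many points get a low (good) value."""
    return load_time_ms / quality_points

site_a = hypothesized_signal(load_time_ms=400, quality_points=80)    # 5.0
site_b = hypothesized_signal(load_time_ms=1200, quality_points=90)   # ~13.3
print(sorted([("A", site_a), ("B", site_b)], key=lambda pair: pair[1]))
# A would outrank B despite scoring fewer "points", because it loads faster.
```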
-
How important are links after Panda
I have noticed that the sites in my niche that were at the top of the SERPs are still at the top of the SERPs after Panda. I have also heard people theorizing that links are no longer important, and that it's now all about bounce rates, time on site, etc. Is there any consensus about how important links are after Panda? thx Paul
-
Panda Updates?
Anyone aware of any algo updates/refreshes today? Been seeing some SERP movements on .co.uk
-
Regarding Google Panda: would it be wise to use automatically generated content when there is no content?
Hi guys, I am currently creating a local business directory. When we first start, there will be a lot of business listings without a business description until the owner of that business comes to submit one. So when a business listing has no description, would it be better to have an automatically generated description like this one:
www.startlocal.com.au/retail/books/tas_hobartandsouth/Scene_Magazine_2797040.html
The auto-generated description for the listing on that page is:
Scene Magazine is a business that is based in Kingston, 7050, TAS: Hobart And South. Scene Magazine is listed in 2 categories including: Magazines and Periodicals Shops and Book Stores and Shops. Within the Magazines and Periodicals Shops category there are 5 businesses within 25 km of Scene Magazine. Some of those businesses included within the radius of 25 km are Island Magazine, Artemis Publishing Consultants and Bride Tasmania Magazine.
Would Google Panda affect this or not, and would it be wise to use this auto content when there is no description for a business?
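For reference, the kind of template-fill the question describes can be reproduced in a few lines. The field names below are hypothetical, and whether Panda treats this as thin content is exactly the open question.

```python
# Sketch of the template-fill described above: a fallback description built
# from structured fields a directory would already have on file.

def generate_description(biz: dict) -> str:
    cats = biz["categories"]
    return (
        f"{biz['name']} is a business based in {biz['suburb']}, "
        f"{biz['postcode']}, {biz['region']}. "
        f"{biz['name']} is listed in {len(cats)} categories including: "
        f"{' and '.join(cats)}. Nearby businesses in the same category "
        f"include {', '.join(biz['nearby'])}."
    )

print(generate_description({
    "name": "Scene Magazine",
    "suburb": "Kingston",
    "postcode": "7050",
    "region": "TAS: Hobart And South",
    "categories": ["Magazines and Periodicals Shops", "Book Stores and Shops"],
    "nearby": ["Island Magazine", "Artemis Publishing Consultants"],
}))
```

The risk is that every listing ends up with near-identical boilerplate differing only in the substituted values, which is the pattern Panda is widely believed to target.
-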
How influential do you think user behavior is in the algorithm?
I'm one of the guys out there who is super focused on user behavior right now. I think the implementation of different things, such as the block feature and +1, points to the fact that Google is putting a lot more power in the hands of users. How influential do you think factors such as bounce rate, CTR, time on site, and other user behavior metrics are in the algo?
-
When Pandas attack...
I have a predicament. The site I manage (www.duhaime.org) has been hit by the Panda update, but the system seems fixed against this site's purpose. I need some advice on what I'm planning and what could be done. First, the issues:
Content Length
The site is legal reference, including dictionary and citation look-up. Hundreds (perhaps upwards of 1000) of pages, by virtue of the content, are thin. The acronym C.B.N.S. stands for "Common Bench Reports, New Series", a part of the English reports. There really isn't too much more to say, nor is there much value to the target audience in saying it.
Visit Length as a Metric
There is chatter claiming Google watches how long a person uses a page to gauge its value. Fair enough, but a large number of people who visit this site are looking for one small piece of data. They want the definition of a term or citation, then they return to whatever caused the query in the first place.
My strategy so far...
Noindex some Pages
Identify terms and citations that are really small (less than 500 characters) and put a noindex tag on them (a rough way to find them is sketched below). I will also remove the directory links to the pages and clean the sitemaps. This should remove the obviously troublesome pages. We'll have to live with the fact that these pages won't be found in Google's index despite their value.
Create more click incentives
We already started with related terms and now we are looking at diagrams and images. Anything to punch up the content for that ever-important second click.
Expand Content (of course)
The author will focus the next six months on doing his best to extend the content of these short pages. There are images and text to be added in many cases, perhaps 200 pages. Still won't be able to cover them all without a heavy cut-and-paste feel.
Site Redesign
Looking to lighten up the code and boilerplate content shortly. We were working on this anyway. Resulting pages should have fewer than 15 hard-coded site-wide links, and the disclaimer will be loaded with AJAX upon scroll. Ad units will be kept at 3 per page.
What do you think? Are the super-light pages of the citations and dictionary why site traffic is down 35% this week?
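A minimal sketch of how the "noindex the thin pages" step might be automated, assuming a local export of the site's HTML. The 500-character threshold comes from the plan above; the directory name and everything else here is hypothetical.

```python
import os
from html.parser import HTMLParser

# Walk a local copy of the site, measure each page's visible text, and list
# pages under the 500-character threshold as noindex candidates.

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip_depth = max(0, self._skip_depth - 1)

    def handle_data(self, data):
        if self._skip_depth == 0:
            self.chunks.append(data)

def visible_text_length(html: str) -> int:
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join("".join(parser.chunks).split()))

THRESHOLD = 500  # characters, per the plan above

for root, _dirs, files in os.walk("site_export"):  # hypothetical local export
    for name in files:
        if name.endswith(".html"):
            path = os.path.join(root, name)
            with open(path, encoding="utf-8") as f:
                if visible_text_length(f.read()) < THRESHOLD:
                    print("noindex candidate:", path)
```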