Google Webspam Algo Update 24/4/12
-
Having just checked our clients' rankings, 95% have not been affected; in fact, many have moved up. One or two have had big drops.
Who has been affected by this? The forums are full of people talking about sites being floored in the SERPs.
It will be interesting to follow the aftermath of this and get some insight into what exactly has changed!
-
"Something to note is that the problem could fix itself. Out of that 3-5% of the queries impacted, some sites are gonna get hit incidentally. Google seems to be decent at spotting these over time and fixing it. Sometimes people get the short end of the stick though."
That's a very good point. We have often noticed websites take a hit after an algo update, only to come back a few days or weeks later.
One thing is sure: this seems to have affected quite a lot of sites. The site we have seen drop has a very natural link profile and no "over optimisation" in terms of keyword stuffing etc.
There seem to be a lot of poor websites that have benefited; as mentioned, blank sites are now ranking #1 on Google! Let's hope we see some further tweaks in the near future, as currently it seems there have been a lot of "incidental" hits!
-
Did you apply any SEO methods that aren't white hat? Perhaps you bought links for a couple of bucks? What efforts did you make to optimise the site?
If you could let us know, perhaps we can give some direct advice on how to recover your rankings.
Kind regards,
Jarno
-
Something to note is that the problem could fix itself. Out of that 3-5% of the queries impacted, some sites are gonna get hit incidentally. Google seems to be decent at spotting these over time and fixing it. Sometimes people get the short end of the stick though.
There is one thing I always do in the week or two following an update: nothing at all.
Let the fallout happen, and see what other people are saying. While we know they did the update, we don't have a ton of metrics yet on which bigger sites got hit. Once you do, you will be better equipped to assess the situation.
I see a lot of people panic when they drop. They want to build links, change the title tags, tweak the KW density and change their menu structure all at once. Even if these were all practical, how could you know which one was the most effective if you did them all within a few weeks of each other?
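If you do decide to make changes later, a simple way to keep them attributable is to log one change at a time and compare average positions before and after each change date. The sketch below is purely illustrative: it assumes you already export daily rank data from whatever tracking tool you use, and the dates, numbers, and the movement_after helper are hypothetical, not part of any particular tool's API.

```python
# Minimal sketch: log one change at a time and check rank movement around each change date.
# All data here is made up -- in practice, pull daily positions from your rank tracker's export.
from datetime import date

# One entry per deliberate change, so later movement can be attributed to something specific.
change_log = [
    {"date": date(2012, 5, 1), "change": "rewrote title tags on category pages"},
    {"date": date(2012, 5, 15), "change": "removed sitewide footer links"},
]

# Daily rank snapshots for one keyword (hypothetical positions).
rank_history = {
    date(2012, 4, 30): 14,
    date(2012, 5, 7): 11,
    date(2012, 5, 14): 10,
    date(2012, 5, 21): 6,
}

def movement_after(change, history, window_days=14):
    """Average position before vs. after a change date; positive result = improvement."""
    before = [r for d, r in history.items() if 0 < (change["date"] - d).days <= window_days]
    after = [r for d, r in history.items() if 0 <= (d - change["date"]).days <= window_days]
    if not before or not after:
        return None  # not enough data around this change
    return sum(before) / len(before) - sum(after) / len(after)

for change in change_log:
    print(change["change"], "->", movement_after(change, rank_history))
```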
Cheers,
Vinnie
-
So far, I see two sites that we work with that have notably dropped. Hopefully over the next few days we'll start to figure out specific causes and solutions.
-
Looks like we posted this at about the same time! I will just close mine and discuss it in yours.
A few references:
http://searchengineland.com/google-launches-update-targeting-webspam-in-search-results-119295
http://googlewebmastercentral.blogspot.com/2012/04/another-step-to-reward-high-quality.html
Cheers,
Vinnie