How do you measure impacts of Google Updates Like Penguin 4?
-
I was having a conversation with a fellow SEO via Twitter, and we were discussing how to measure algorithm updates.
In the aftermath of Google Penguin 4, how do you determine the effects it has had on your site or sites, and on your respective verticals?
-
I look at the general factors of traffic and ranking first. Then I move to pages that receive new visitors.
But I also try to monitor the top 100 results for the 'pest control' search at the US level. Someone said this was irrelevant because it's not how users search.
I am curious about other factors: Webmaster Tools for indexed pages? Traffic? Impressions? Clearly a link warning would be an indicator. So what other tools could indicate how much of an impact updates are having?
-
I like these. Visibility is a great addition to the simple traffic and ranking.
-
Thomas,
I look at 4 key things:
Rankings, Traffic, Visibility, & our Competition's Results.
Rankings - Which keywords went up or down, etc.
Traffic - Detailed traffic analysis to see changes by keyword and overall
Visibility - This is also important. For example, we had a site that was #1 in Google for its main keyword (and still is) but now has less visibility, because we used to hold 5 of the 10 spots on page 1 and now we have 2 out of 10. Still #1, though, so no complaints.
Competition - In addition, I look at our competition's results and changes in the 3 criteria above as well.
How about you?
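The "visibility" idea in the answer above (holding 5 of 10 page-one spots vs. 2 of 10) can be sketched as a simple share-of-SERP calculation. This is just an illustration, not anyone's actual tool: the function name and sample URLs are hypothetical, and you would feed it real ranked URLs from your rank tracker.

```python
from urllib.parse import urlparse

def serp_visibility(result_urls, domain, top_n=10):
    """Fraction of the top-N results owned by `domain`."""
    owned = sum(
        1 for url in result_urls[:top_n]
        if urlparse(url).netloc.lower().endswith(domain.lower())
    )
    return owned / top_n

# Hypothetical SERP where the site holds 2 of the top 10 spots
# (positions 1 and 4), as in the example above after the update.
serp = [
    "http://www.example.com/main-keyword",
    "http://competitor-a.com/page",
    "http://competitor-b.com/page",
    "http://www.example.com/blog/main-keyword-guide",
    "http://competitor-c.com/page",
    "http://competitor-d.com/page",
    "http://competitor-e.com/page",
    "http://competitor-f.com/page",
    "http://competitor-g.com/page",
    "http://competitor-h.com/page",
]
print(serp_visibility(serp, "example.com"))  # 0.2
```

Tracking that share per keyword over time, for your own site and for competitors, gives you a single number to compare before and after an update even when your #1 ranking never moved.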
-
I would look at two things. I would look in Google Analytics at a daily graph of overall organic traffic to the site over time and add date markers for known Google updates to see if there are any changes that align with updates. I would also look in Google Webmaster Tools at the average ranking for the top 20-100 keywords before and after the date of a known Google update.
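As a rough sketch of that before/after comparison (the function name, window size, and sample numbers are all hypothetical; a real Webmaster Tools export would supply the daily positions), assuming (date, position) pairs for your top keywords:

```python
from datetime import date
from statistics import mean

def avg_position_around(rows, update_day, window=14):
    """Average ranking position in the `window` days before vs. after
    a known Google update date. `rows` are (date, position) pairs."""
    before = [pos for d, pos in rows if 0 < (update_day - d).days <= window]
    after = [pos for d, pos in rows if 0 <= (d - update_day).days < window]
    return mean(before), mean(after)

# Hypothetical daily average positions around the Penguin 4.0
# rollout announcement (September 23, 2016).
rows = [
    (date(2016, 9, 20), 8.0),
    (date(2016, 9, 21), 8.2),
    (date(2016, 9, 26), 5.9),
    (date(2016, 9, 27), 6.1),
]
before_avg, after_avg = avg_position_around(rows, date(2016, 9, 23))
print(before_avg, after_avg)  # 8.1 6.0
```

The same windowed comparison works for sessions or impressions; the point is simply to anchor the metric to the known update date rather than eyeballing the graph.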
Related Questions
-
Is this a violation of Google guidelines and current industry best practices?
I have read Moz for a good deal of time, but I have never gotten involved until now. While watching a YouTube video in the app on my smartphone, an advertisement came on using still screenshots from my website, www.dleichtweis.com. This is a video of the advertisement: http://www.youtube.com/watch?v=PCQFm7PjWb8 I have reason to believe this is a violation of numerous policies, procedures, conditions, and/or best practices. I value the Moz community's opinion. Google has been contacted in this regard: https://www.en.adwords-community.com/t5/Ad-Approvals-and-Advertising/I-want-answers-to-issue-Re-3-8187000002180/m-p/278355#M14740 I value your response. D Leichtweis
How can I discover how many of my pages have been indexed by Google?
I am currently in the process of trying to produce a report for my corporation, and this is a metric that I cannot seem to find in Open Site Explorer. Could anyone help?
Help: it seems like Google is punishing our site and I don't understand why
I would love some help. My name is Suzanne Diamond, and our website is www.thefutonshop.com. We have been online for years, with 10 brick-and-mortar stores, and had a strong ecommerce site, but in the past 4 months we have been dropping like a leaf. Organic search traffic has been dropping since November 2012, and now total traffic seems to be slowing too. We are working on our SEO, but I feel it has to be something more direct, that we are being punished by Google. Perhaps it's obvious, but not to me. Can anyone help me, please, or give me a direction? suzanne@thefutonshop.com Thanks in advance for any direction.
Google Alert not working - anyone else have this problem?
I have a Google Alert that has stopped pulling in recent results even though a web search indicates that the pages are being indexed. None of the alert settings have changed. Anyone else have this happen recently and know how to remedy this problem?
Are WordPress sites being dinged by Google? I read a few articles claiming so.
I read a couple of "SEO"-related articles claiming that sites built in WordPress are going to be dinged by Google, because Google sees WordPress sites as simple to make and with a higher potential to be "spammy". Is there any truth to this? Your thoughts? I do give "thumbs up" and "best answer" marks, and appreciate receiving thumbs up myself. Thanks
Google Search Quality Team - Commission Based Reviews
I have been busy this past week writing articles for various sources about the recent Google update. A number of people contacted me about the analysis and report I was doing; some were members of the Google Search Quality Team. I knew manual reviews were done before, but after the documents they showed me regarding the reports they do and the compensation for doing them, I am left pretty shocked. Maybe I have been naive all these years, but I didn't realize that: Google outsourced review and reconsideration requests to individual reviewers for compensation; and Google's checking of these "reviewers'" qualifications and experience was insufficient at best. The three contacts I spoke to who had done reports had very little training or experience. I went through the GSQT REVIEWERS PDF (a very long and thorough document) that I was sent, together with them. We went through some sites I wanted them to review, and the comments that came back were quite astounding, to say the least, and would have made many of you Mozzers laugh. Obviously I don't want to post said document online here, BUT I wanted to know: a) have any Mozzers ever been part of such a group, the GSQT? b) Have you had any dealings with them, in terms of having your website reviewed and knowing about it? I knew about this group way back, like in 2005 or 2006 or sometime around then; I was told at the time it had stopped and Google had stopped paying these subcontractor reviewers. Please don't get me wrong here: I am totally on board with manual reviews. I would just prefer them done by a trained team, whether a professional company that maintains high-quality review testing and standards, or Google employees who are trained. I am just a little unsure of reviews being done by individual subcontractors who get paid by the amount they do. What if that subcontractor has some skin in the game for a particular keyword?
What if their knowledge about certain aspects isn't up to par, or isn't tested on a regular basis? This space is always changing, and as you guys and girls on this forum know, it can change pretty quickly. I just want all websites to be judged fairly and equally, by a group trained EQUALLY and to the same standards. I don't care if this is a Google team or not; I just want it to be a team that is trained equally and continuously, as opposed to paying outside people based on the number of reviews done. When the livelihood of a small business is in the balance, I don't want a commission-hungry toe rag with one year's experience being the gatekeeper for me or any of our clients. Carlos
How good are guest posts after the Penguin update?
Hello 🙂 You may agree or not, but 90% of guest posting is done for link building; we have seen several blogs on SEOmoz and other trusted SEO sites that talk about guest blogging. Everyone recommends guest posts with good content (unique, of course), valuable content, and links to related websites. I have been doing guest posting for many of my clients, and over the last 2 months we have had good success too (most of our content was approved). Our writers did a great job writing quality content on authoritative subjects, providing do's and don'ts, tips, pros and cons, etc. I have been trying to use our direct URL, longer phrases, or the name of our business as anchor text instead of our main keywords, just to be on the safe side with Google. The sites where we posted had a minimum home page PageRank of 2 and a SEOmoz domain authority of 30+, so we know that our guest posts are not on junk sites or blogs. To be honest, we have placed around 35 guest posts in one month (all unique content, never spun), linked as described above. Other than that, we have not done anything too fancy for link building: we paid attention to our social media, brand building, and the inbound marketing recommended by Rand Fishkin (http://goo.gl/64VB5), and we never bought links, guest posts, Facebook likes, etc. Still, our rankings have not improved; in fact, they have dropped back. We have been doing SEO for the last 4 years, but we never did black hat. Our website content is 10 times better than the websites in front of us; our titles are unique and written for humans, but the competitors ranking ahead of us write for engines. Our website has been up and running since 1993. That is old, and our service is top class. We ranked well for the most competitive terms for years, but new websites came in and started to outrank us. Now the situation is that the websites ahead of us are brands or microsites. I understand Google favors brands, but that doesn't mean we are doing anything against Google.
We have been publishing guest posts for years, but the frequency was lower. Now the question is: are guest posts worth it? They are done for links (with good content, of course), and Google says not to work for links. This is a big dilemma. We make good content and we need credit for that, and a link is the credit everyone wants. After the Penguin update, things have gotten worse than before. So the modified question is: are guest posts worth it after the Penguin update? Thank you for taking the time to read this, and I look forward to an ongoing discussion; I hope leading industry people will participate here, especially those who talk and post about guest blogging. -Sunny.
What is the best method for getting pure JavaScript/AJAX pages indexed by Google for SEO?
I am in the process of researching this further, and wanted to share some of what I have found below. Anyone who can confirm or deny these assumptions, or add some insight, would be appreciated. Option 1: If you're starting from scratch, a good approach is to build your site's structure and navigation using only HTML. Then, once you have the site's pages, links, and content in place, you can spice up the appearance and interface with AJAX. Googlebot will be happy looking at the HTML, while users with modern browsers can enjoy your AJAX bonuses. You can use Hijax to help AJAX and HTML links coexist, and meta nofollow tags, etc., to prevent crawlers from accessing the JavaScript versions of a page. Currently, webmasters create a "parallel universe" of content: users of JavaScript-enabled browsers see content that is created dynamically, whereas users of non-JavaScript-enabled browsers, as well as crawlers, see content that is static and created offline. In current practice, "progressive enhancement" in the form of Hijax links is often used. Option 2:
In order to make your AJAX application crawlable, your site needs to abide by a new agreement, which rests on the following: The site adopts the AJAX crawling scheme. For each URL that has dynamically produced content, your server provides an HTML snapshot, which is the content a user (with a browser) sees. Often, such URLs will be AJAX URLs, that is, URLs containing a hash fragment, for example www.example.com/index.html#key=value, where #key=value is the hash fragment. An HTML snapshot is all the content that appears on the page after the JavaScript has been executed. The search engine indexes the HTML snapshot and serves your original AJAX URLs in search results. In order to make this work, the application must use a specific syntax in the AJAX URLs (call them "pretty URLs"; you'll see why shortly). The search engine crawler temporarily modifies these "pretty URLs" into "ugly URLs" and requests those from your server. A request for an "ugly URL" tells the server that it should not return the regular web page it would give to a browser, but an HTML snapshot instead. When the crawler has obtained the content for the modified ugly URL, it indexes that content, then displays the original pretty URL in the search results. In other words, end users always see the pretty URL containing a hash fragment.
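The pretty-to-ugly mapping described above can be sketched in a few lines. One caveat: the scheme as Google documented it keyed on the "#!" hashbang rather than a bare "#" (and Google later deprecated the whole scheme). The function names here are illustrative, not part of any API:

```python
from urllib.parse import quote, urlparse, parse_qs

def pretty_to_ugly(url):
    """Map a "pretty" hashbang URL to the "ugly" URL the crawler requests."""
    if "#!" not in url:
        return url  # not an AJAX-crawling-scheme URL; leave it alone
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    # The crawler percent-encodes the fragment and appends it as
    # the _escaped_fragment_ query parameter.
    return base + sep + "_escaped_fragment_=" + quote(fragment, safe="")

def wants_snapshot(request_url):
    """Server side: return the decoded fragment if the crawler asked for
    an HTML snapshot, else None (serve the normal JavaScript shell)."""
    qs = parse_qs(urlparse(request_url).query)
    frag = qs.get("_escaped_fragment_")
    return frag[0] if frag else None

ugly = pretty_to_ugly("http://www.example.com/index.html#!key=value")
print(ugly)  # http://www.example.com/index.html?_escaped_fragment_=key%3Dvalue
print(wants_snapshot(ugly))  # key=value
```

The server-side branch is where the "agreement" lives: seeing `_escaped_fragment_` in the query string is the signal to return the pre-rendered HTML snapshot instead of the JavaScript page.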
See more in the Getting Started Guide. Make sure you avoid this:
http://www.google.com/support/webmasters/bin/answer.py?answer=66355
Here are a few example pages that are mostly JavaScript/AJAX: http://catchfree.com/listen-to-music#&tab=top-free-apps-tab https://www.pivotaltracker.com/public_projects This is what the spiders see: view-source:http://catchfree.com/listen-to-music#&tab=top-free-apps-tab These are the best resources I have found regarding Google and JavaScript: http://code.google.com/web/ajaxcrawling/ - step-by-step instructions.
http://www.google.com/support/webmasters/bin/answer.py?answer=81766
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
Some additional Resources: http://googlewebmastercentral.blogspot.com/2009/10/proposal-for-making-ajax-crawlable.html
http://www.seomoz.org/blog/how-to-allow-google-to-crawl-ajax-content
http://www.google.com/support/webmasters/bin/answer.py?answer=357690