Penguin 3.0 has rolled out within the last hour - who has been affected?
-
Hi
I am hearing that Penguin 3.0 has rolled out within the last hour or so. Can anyone confirm this, and secondly, has anyone been affected?
My rankings don't seem to have changed, and neither have my main competitors', but it might just be too early to tell.
Thanks
Andy
-
Thanks, Marie. That post is really helpful.
Just to clarify one point: I've read that Google may take as long as six months to recrawl a website. Does that mean that many of the links on our disavow list uploaded in Sept might not have actually had the (invisible) nofollow tag applied yet, and in that case may still be harming our website as far as Penguin is concerned? When I read that Google was processing disavow requests with the Penguin update, I thought that meant that the usual wait wouldn't apply and that everything would be recrawled with the Penguin refresh, if that makes sense.
I'm trying to convince myself that our work on removing/disavowing links hasn't fully taken effect yet, and that we'll see a bounce in our rankings with the next penguin update, whenever that may be. I'd rather not take the lack of improvement we've seen this time around as a sign that we're never going to make a recovery. I certainly can't see how we can do much more work in terms of removing links. We were pretty thorough.
-
Hi mgane. It's not that Google stopped processing disavow requests; rather, the data that the Penguin algorithm needed in order to make its calculations was likely collected at that time (the middle of September). So whether or not your site looks trustworthy now in the eyes of Penguin depends on what your link profile looked like then.
If you filed a disavow in early September, those links are not considered disavowed until Google revisits them. Once they're recrawled, Google applies the invisible nofollow to those links pointing to your site.
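For anyone following along, a disavow file is just a plain text file uploaded to Google's disavow tool. A minimal sketch (the domains below are made up for illustration):

```text
# Comment lines start with a hash.
# Disavow every link from an entire domain:
domain:spammy-directory.example
# Disavow a single page:
http://link-network.example/paid-links.html
```

Each entry only takes effect, as described above, once Google recrawls the link in question.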
Regarding the HTTPS switch: I get asked that a lot, so I wrote a post about it: http://www.hiswebmarketing.com/switching-https-happens-disavow/. The TL;DR version is that if you had a disavow file pointing to your non-HTTPS site, it's not as though you start over again with the HTTPS version.
-
I thought I'd post this here, rather than start a new thread (let me know if this is the wrong place)
I've read in recent reports about Penguin 3.0 that Google said they had stopped processing new disavow requests around the middle of Sept. I'm worried that this might have prevented our site from escaping Penguin this time (we've seen no changes in our rankings). I submitted a disavow request for our website in early Sept, before the 'deadline', so to speak. However, we moved our website to HTTPS about two weeks later, following the new Google guidelines, and it only occurred to me recently that I needed to add this as a new site in Webmaster Tools and re-upload the disavow request, which I did a week ago. Will our disavow request have been ignored for this reason, or would Google simply carry over the disavow list from the HTTP to the HTTPS version of the site automatically?
I sort of hope we have missed the boat this time, as otherwise the fact that our rankings haven't changed, in spite of all the work we did cleaning up our backlinks, is pretty grim news.
-
On 17th October we saw a dramatic increase in Google crawl activity, though not as strong as during the last update: about half as many pages were crawled this time as during the last main update.
Bruce
-
Thanks EGOL.
I really think that Penguin is continuing to roll out. I have a number of sites that I am monitoring that should see recovery but have not. But I've seen several sites make fantastic recoveries as well, which is awesome.
This is just a theory, but it seems to me like the majority of reported recoveries have been outside of the US.
The next few days will tell us more!
-
If you're wondering when we, at Moz, will be posting something: Dr. Pete will be digging into the data tomorrow and have something up soon. (He tweeted that today he's looking at some actual penguins at the zoo with his family.)
-
Our GWT Crawl goes wild when it reaches the UK, so nothing yet.
Looking forward to seeing the impact.
All this very hard work that we all put in makes us look forward to Penguin dropping by on those who ride on the crest of others' work!
Have a great weekend all
Bruce
-
Yeah, and most worrying is that people are claiming to be affected by negative SEO, which, if true, is a whole new SEO concern: protecting yourself as well as trying to improve.
-
Marie Haynes thinks that there was an update. https://twitter.com/Marie_Haynes
She sent an email out to subscribers...
Hi! I woke up this morning to an inbox full of people asking whether Penguin had updated. Twitter is ablaze with people talking about Penguin as well. While we haven't had an official announcement from Google yet, it really does appear that Penguin is in the process of updating.
If you haven't seen any changes in your rankings, don't despair. It looks like the update has only affected some people. I have seen people in the UK, Australia and the US who have seen changes, but other people in those same regions have not. Some of my clients have seen nice jumps back to page one, but I have other clients who have not budged. Also, several people on Twitter who do a lot of Penguin work are not yet seeing any changes either.
I have also heard several reports of sites dropping several pages in rankings today. Most were sites owned by people who admittedly built unnatural links.
-
Related Questions
-
Google only indexing the top 2/3 of my page?
Hi, I have a page that is about 5000 lines of code in total. I was having difficulty figuring out why the addition of a lot of targeted, quality content to the bottom of the page was not helping with rankings. Then, when fetching as Google, I noticed that only about 3300 lines were getting indexed for some reason. So naturally, that content wasn't going to have any effect if Google is not seeing it. Has anyone seen this before? Thoughts on what may be happening? I'm not seeing any errors being thrown by the page, and I'm not aware of a limit on the lines of code Google will crawl. Pages load in under 5 seconds, so loading speed shouldn't be the issue. Thanks, Kevin
Intermediate & Advanced SEO | yandl1
-
Domain remains the same, IP address is changing on the same server (only the last 3 digits change). Will this affect rankings?
Dear All, We have taken on a product called webacelator from our hosting provider UKFast, and our IP address is changing. UKFast asked us to point DNS to a different IP in order to route the traffic through webacelator, which will enhance browsing speed. I am concerned: will this change affect our rankings? Your responses are highly appreciated.
Intermediate & Advanced SEO | tigersohelll0
-
Content question about 3 sites targeted at 3 different countries
I am new here, and this is my first question. I was hoping to get help with the following scenario: I am looking to launch 3 sites in 3 different countries, using 3 different domains. For example, the .com for the USA, the .co.uk for the UK, and a slightly different .com for Australia, as I could not purchase the .com.au since I am not a registered business in Australia. I am looking to set the geographic target in Google Webmaster Tools. So, for example, I will set the .com to the USA only, with the .co.uk I won't need to set anything, and I will set the other Australian .com to Australia.

Now, initially the 3 sites will be "brochure" websites explaining the service that we offer. I fear that at the beginning they will most likely have almost identical content. However, in the long term I am looking to publish unique content for each site, almost on a weekly basis, so over time they would have different content from each other. These are small sites to begin with: each site in its "brochure" form will have around 10 pages, and over time it will have hundreds of pages.

My question, or my worry, is: will Google look negatively at the fact that I have the same content across 3 sites, even though they are specifically targeted to different countries? Will it penalise my sites?
Intermediate & Advanced SEO | ryanetc0
-
Is my domain scorched earth from Penguin?
http://pisoftware.com was never a huge driver of traffic, but it ranked top 5 for my money keyphrases and was bringing consistent, quality visitors. As traction went up, that traffic just became more valuable. I was happy. Then Penguin came along and made me sad: a 60% loss in traffic.

I stayed calm. I disavowed. I sent emails asking for links to come down. I atoned for my sins (of the distant, distant past; I know better now) and waited. Never a hard penalty, never an email from Google, just rankings that got hammered: from #3 for my best keyphrase to #25 today.

I write content, and I try to write it better all the time. I try to make it valuable. I leverage social media to the extent that I can. I do outreach. I'm trying to be patient, but it's hard when the software is awesome and so few people see it. I'm considering starting over, or maybe even just creating another domain to use if this one never comes back. I wonder what the experts think. At MozCon I talked to a lot of people in the same boat, and it seems we are all taking similar steps. So the questions:

1. Should I start over? Or stay the course?
2. What has worked for others? What seems to have been the most valuable in getting back on the rise?
3. Thoughts on the site as it is now? I've worked lately on speed, mobile rendering, etc., and it seems responsive and solid to me.

Thanks in advance, you crazy bunch of Mozzers you. Kelly
Intermediate & Advanced SEO | Kellster0
-
Recent Penguin Update
Hi SEOmoz, Today www.carrentalbuddy.com.au was hit pretty hard by the Penguin 2.0 update (I believe). We had some pretty strong rankings for multiple search terms, and we believe we have done everything by the book for Google. We can't figure out why our rankings have dropped so dramatically recently, and were hoping that some SEOmozzers could take a quick look to help us fix this problem. Kindest Regards, Chris
Intermediate & Advanced SEO | kymodo0
-
301 redirects within same domain
If I 301-redirect all URLs from http://domain.com/folder/keyword to http://domain.com/folder/keyword.htm, are the new URLs likely to keep most of the link juice from the source URLs and maintain their rankings in the SERPs?
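For anyone implementing this kind of redirect, a hypothetical Apache mod_rewrite sketch (assuming an .htaccess setup; adjust the pattern and paths to your server):

```apache
# Sketch only: 301-redirect extensionless URLs under /folder/ to .htm versions.
RewriteEngine On
# Avoid redirecting URLs that already end in .htm
RewriteCond %{REQUEST_URI} !\.htm$
RewriteRule ^folder/([^/]+)$ /folder/$1.htm [R=301,L]
```

A 301 is the right signal here, since it tells search engines the move is permanent.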
Intermediate & Advanced SEO | Bull1350
-
Recovery Steps For Panda 3.5 (Rel. Apr. 19, 2012)?
I'm asking people who have recovered from Panda to share what criteria they used, especially on sites that are not large-scale ecommerce sites.

My blog site was hit by Panda 3.5. The blog has approximately 250 posts. Some of the posts are the most thorough on the subject and regained traffic despite a Penguin mauling a few days after the Panda attack. (The site has probably regained 80% of the traffic it lost since Penguin hit, without any link removal or link building and with minimal new content.) Bounce rate is 80% and average time on page is 2:00 min. (Even my most productive pages tend to have very high bounce rates, BUT those pages maintain time on page in the 4 to 12 minute range.)

The Panda discussions I've read on these boards seem to focus on ecommerce sites with extremely thin content. I assume that Google views much of my content as "thin" too. But my site seems to need a pruning, instead of just combining the blue model, white model, red model, etc., all on one page like most of the ecommerce sites we've discussed.

So I'm asking people who have recovered from Panda to share what criteria they used to decide whether to combine a page, prune a page, etc. After I combine any series articles into one long post (driving the time on page to nice levels), I plan to prune the remaining pages that have poor time on page and/or bounce rates. Regardless of the analytics, I plan to keep the "thin" pages that are essential for readers to understand the subject matter of the blog. (I'll work on fleshing out the content or producing videos for those pages.)

How deep should I prune on the first cut? 5%? 10%? Even more? Should I focus on the pages with the worst bounce rates, the worst time on page, or try some of both? If I post unique and informative video content (hosted on site using Wistia), what should I expect for a range of decrease in bounce rate? Thanks for reading this long post.
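One way to make that first cut systematic is a simple triage over your analytics export. This is only a sketch; the thresholds (80% bounce, 2 minutes) mirror the figures mentioned above but are assumptions, not Google guidance, and the whitelist stands in for the "essential" pages you want to keep regardless of metrics:

```python
# Hypothetical first-cut pruning triage over an analytics export.
# Thresholds are illustrative assumptions, not recommendations from Google.

def prune_candidates(pages, bounce_threshold=0.80, min_time_on_page=120, keep=()):
    """Return URLs that fail BOTH metrics and are not whitelisted as essential."""
    candidates = []
    for page in pages:
        if page["url"] in keep:
            continue  # essential pages stay regardless of analytics
        if page["bounce_rate"] >= bounce_threshold and page["time_on_page"] < min_time_on_page:
            candidates.append(page["url"])
    return candidates

pages = [
    {"url": "/post-a", "bounce_rate": 0.85, "time_on_page": 90},   # fails both metrics
    {"url": "/post-b", "bounce_rate": 0.85, "time_on_page": 300},  # good dwell time
    {"url": "/post-c", "bounce_rate": 0.90, "time_on_page": 30},   # essential, whitelisted
]
print(prune_candidates(pages, keep=("/post-c",)))  # → ['/post-a']
```

Requiring a page to fail on both metrics, rather than either one, keeps the first cut conservative, which fits the 5-10% starting point you mention.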
Intermediate & Advanced SEO | JustDucky0
-
How to prevent duplicate content within this complex website?
I have a complex SEO issue I've been wrestling with and I'd appreciate your views on it very much. I have a sports website, and most visitors are looking for the games that are played in the current week (I've studied this; it's true). We're creating a new website from scratch and I want to do this as well as possible, in the most elegant way. We do not want to use workarounds such as iframes, hiding text using AJAX, etc. We need a solid solution for both users and search engines. Therefore I have written down three options:

1. Using a canonical URL;
2. Using 301-redirects;
3. Using 302-redirects.

Introduction

The page 'website.com/competition/season/week-8' shows the soccer games that are played in game week 8 of the season. The next week, users are interested in the games that are played in that week (game week 9). So the content a visitor is interested in is constantly shifting, because of the way competitions and tournaments are organized. After a season, the same goes for the season page, of course. The website we're building has the following structure:

- Competition (e.g. 'premier league')
- Season (e.g. '2011-2012')
- Playweek (e.g. 'week 8')
- Game (e.g. 'Manchester United - Arsenal')

This is the most logical structure one can think of, and it is what users expect. Now we're facing the following challenge: when a user goes to http://website.com/premier-league, he expects to see a) the games that are played in the current week and b) the current standings. When someone goes to http://website.com/premier-league/2011-2012/, he expects to see the same. When someone goes to http://website.com/premier-league/2011-2012/week-8/, he expects the same again: the games that are played in the current week and the current standings. So essentially there are three places, within every active season within a competition, where logically the same information has to be shown.

To deal with this from a UX and SEO perspective, we have the following options.

Option A - Use a canonical URL

Using a canonical URL could solve this problem. You could point a canonical URL from the current week page and the season page to the competition page. So:

- the page at 'website.com/$competition/$season/playweek-8' would have a canonical tag that points to 'website.com/$competition/'
- the page at 'website.com/$competition/$season/' would have a canonical tag that points to 'website.com/$competition/'

The next week, however, you want the canonical tag on 'website.com/$competition/$season/playweek-9', and the canonical tag on 'website.com/$competition/$season/playweek-8' should be removed. So then you have:

- the page at 'website.com/$competition/$season/playweek-9' with a canonical tag that points to 'website.com/$competition/'
- the page at 'website.com/$competition/$season/' still with a canonical tag that points to 'website.com/$competition/'

In essence, the canonical tag is constantly traveling through the pages.

Advantages:
- UX: for a user this is a very neat solution. Wherever a user goes, he sees the information he expects.
- SEO: the search engines get very clear guidelines as to how the website functions, and we prevent duplicate content.

Disadvantages:
- I have some concerns about the weekly changing canonical tag from an SEO perspective. Every week, within every competition, the canonical tags are updated. How often do search engines update their index for canonical tags? Say it takes a search engine a week to visit a page, crawl it, and process a canonical tag correctly; then the search engine will be a week behind in figuring out the actual structure of the hierarchy. On top of that, what do the changing canonical URLs do to the 'quality' of the website? In theory this should all work, but I have some reservations.
- If there is a canonical tag on 'website.com/$competition/$season/week-8', what does this do to the indexation and ranking of its subpages (the actual match pages)?

Option B - Use 301-redirects

Using 301-redirects, the user and the search engine are essentially treated the same: when the season page or competition page is requested, both are redirected to the game week page. The same applies here as for the canonical URL: every week there are changes in the redirects. So in game week 8:

- 'website.com/$competition/' would 301-redirect to 'website.com/$competition/$season/week-8'
- 'website.com/$competition/$season' would 301-redirect to 'website.com/$competition/$season/week-8'

A week goes by, so then you have:

- 'website.com/$competition/' 301-redirecting to 'website.com/$competition/$season/week-9'
- 'website.com/$competition/$season' 301-redirecting to 'website.com/$competition/$season/week-9'

Advantages:
- There is no loss of link authority.

Disadvantages:
- Before a playweek starts, the playweek page can be indexed. During the current playweek, however, the playweek page 301-redirects to the competition page, and after that week the 301-redirect is removed and the page is indexable again. What do all these changing 301-redirects do to the overall quality of the website for search engines (and users)?

Option C - Use 302-redirects

Most SEOs will refrain from using 302-redirects. However, a 302-redirect can be put to good use: serving a temporary redirect. Within my website, the content that's most important to users (and therefore search engines) is constantly moving; in most cases, after a week a different piece of the website is the most interesting for a user. So let's take our example above: we're in playweek 8. If you want 'website.com/$competition/' to redirect to 'website.com/$competition/$season/week-8/', you can use a 302-redirect, because the redirect is temporary. The next week, the 302-redirect on 'website.com/$competition/' will be adjusted to point to 'website.com/$competition/$season/week-9'.

Advantages:
- We're putting the 302-redirect to its actual, intended use.
- The pages that 302-redirect (for instance 'website.com/$competition' and 'website.com/$competition/$season') remain indexed.

Disadvantages:
- Not quite sure how Google will handle this; they're not very clear on how exactly they handle a 302-redirect and in which cases it might be useful. In most cases they advise webmasters not to use it.

I'd very much like your opinion on this. Thanks in advance, guys and gals!
Intermediate & Advanced SEO | StevenvanVessum0