Has Panda 2.5 Hit?
-
I'm sure a few people have been asking this direct question throughout the forums, but most of them are masked by indirect questions like "my traffic has dipped" and the like. Does anyone have firm confirmation that Panda has hit?
My indication that it has hit is that I'm experiencing a ~30% increase in traffic to my e-commerce client from organic searches after this past weekend. We haven't made any significant changes to content besides daily postings, but even that doesn't account for a 30% spike that has held steady for 3 days straight.
So again, what have you guys experienced? Anything to support this?
-
The trick is, Google isn't going to issue a release saying that they DIDN'T do an update, unless everyone seems convinced they did. Even then, they aren't consistent. It used to be that we didn't get official notifications of algo updates at all - that's a pretty recent development.
The last official roll-out was the global release on August 12th, which some called Panda 2.4 (although I think of it as just 2.3 rolled out to a broader audience).
Given the timing of Panda data updates, it is possible to see the impact of a Panda release a couple of weeks after it happens, but a 30% traffic increase doesn't sound like Panda unless your competitors got hit. Has your ranking changed? Which keywords have more volume? Are there seasonal trends in play? You need to dig deep into the data, but it's not pointing to Panda from what I can tell.
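As an aside, a quick way to sanity-check whether a spike like this is broad or driven by a handful of terms is to compare average daily visits per keyword before and after the suspected date. A minimal sketch, assuming you can export (date, keyword, visits) rows from your analytics tool; the sample data and cutoff date below are made-up placeholders:

```python
from collections import defaultdict
from datetime import date

def keyword_change(rows, cutoff):
    """Percent change in average daily visits per keyword across a cutoff date.

    rows: iterable of (date, keyword, visits) tuples, e.g. from an analytics export.
    """
    before, after = defaultdict(int), defaultdict(int)
    days_before, days_after = set(), set()
    for day, keyword, visits in rows:
        if day < cutoff:
            before[keyword] += visits
            days_before.add(day)
        else:
            after[keyword] += visits
            days_after.add(day)
    changes = {}
    for kw in set(before) | set(after):
        avg_b = before[kw] / max(len(days_before), 1)
        avg_a = after[kw] / max(len(days_after), 1)
        if avg_b:  # skip keywords with a zero baseline; no percent change is computable
            changes[kw] = round(100 * (avg_a - avg_b) / avg_b, 1)
    return changes

# Example: a keyword whose average daily visits rose 30% after the cutoff
sample = [
    (date(2011, 9, 20), "widgets", 100),
    (date(2011, 9, 21), "widgets", 100),
    (date(2011, 9, 26), "widgets", 130),
    (date(2011, 9, 27), "widgets", 130),
]
print(keyword_change(sample, date(2011, 9, 25)))  # {'widgets': 30.0}
```

If nearly every keyword moved by a similar amount, that points to an algorithmic or seasonal shift; if one or two keywords account for the whole spike, look at those SERPs first.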
-
We're not seeing it anywhere...
-
I hear you, but if Barry writes that Google told him no update is running, I trust him. I don't think he was referring to the bathroom cleaners at the Google office when he said "Google".
Mine is just a reminder that, before looking at major algo changes, it's always better to (re)check all the other potential causes of a ranking change.
-
I've seen Barry's post; however, he doesn't cite an actual source or release. While I do enjoy Barry's site, I don't really trust information from an unnamed, unofficial source. Does anyone have a release saying it?
-
No... Panda 2.5 is not here. At least that's what Google itself told Barry Schwartz of SERoundtable.com here:
-
Why? To improve search results.
When? It's impossible to say.
It could happen on Christmas Day for all we know.
-
I'm glad my rankings are not suffering, but it still begs the question: why? And what day?
-
Actually, I have quite often read around the forums that people's rankings are suffering, so possibly. However, I don't think there's anything concrete at the moment; I think it's just a case of seeing more reports than usual.
Related Questions
-
E-Commerce Panda Question
I'm torn. Many of our 'niche' ecommerce products rank well, but I'm concerned that duplicate content is negatively affecting our overall rankings via the Panda algo. Here is an example that can be found across quite a few products on the site. This sub-category page (http://www.ledsupply.com/buckblock-constant-current-led-drivers) in our 'led drivers' --> 'luxdrive drivers' section has three products that are virtually identical, with much of the same content on each page, except for their 'output current' - sort of like a shirt sold in different size attributes: S, M, L and XL. I could realistically condense 44 product pages (similar to the example above) down to 13 within this sub-category section alone (http://www.ledsupply.com/luxdrive-constant-current-led-drivers). Again, we sell many of these products and rank OK for them, but given the outline of how Panda works, I believe this structure could be compromising our overall Panda 'quality score', consequently keeping our traffic from increasing. Has anyone had similar issues and found that it's worth the risk to condense product pages by adding attributes? If so, do I make the new pages and just 301 all the old URLs, or is there a better way?
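For what it's worth, one lower-risk alternative to consolidating the variant pages outright is rel="canonical": keep the near-duplicate product pages live for shoppers, but point them all at one chosen primary version so only one URL competes in search. A sketch; the primary URL below is a made-up placeholder, not a real page on the site:

```html
<!-- In the <head> of each near-duplicate variant page (e.g., the different
     output-current versions), pointing at whichever page you pick as primary: -->
<link rel="canonical" href="http://www.ledsupply.com/example-primary-buckblock-driver" />
```

Note that canonical is a hint rather than a directive, but it avoids the 301-and-rebuild work while you test whether consolidation helps.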
Intermediate & Advanced SEO | saultienut0
-
Website gone from PR 2 to PR 0
Hi guys! We're looking at a site in the trade industry here in Australia, and it appears that around the Panda/Penguin updates in September and October last year, its Google PageRank was wiped out back to zero. I know we shouldn't be focusing too much on PR these days, but I can't help but wonder what caused this. It's a local business website that isn't selling links etc. I'm thinking backlinks pointing at the site that were giving them a boost have been discounted; however, they still have quite a number of quality links coming in. Would love to pick your brains! Regards.
Intermediate & Advanced SEO | WCR0
-
2.3 million 404s in GWT - learn to live with 'em?
So I’m working on optimizing a directory site. Total size: 12.5 million pages in the XML sitemap. This is orders of magnitude larger than any site I’ve ever worked on – heck, every other site I’ve ever worked on combined would be a rounding error compared to this. Before I was hired, the company brought in an outside consultant to iron out some of the technical issues on the site. To his credit, he was worth the money: indexation and organic Google traffic have steadily increased over the last six months. However, some issues remain. The company has access to a quality (i.e. paid) source of data for directory listing pages, but the last time the data was refreshed some months back, it threw 1.8 million 404s in GWT. That has since started to grow progressively higher; now we have 2.3 million 404s in GWT. Based on what I’ve been able to determine, links on this particular site relative to the data feed are broken generally due to one of two reasons: the page just doesn’t exist anymore (i.e. wasn’t found in the data refresh, so the page was simply deleted), or the URL had to change due to some technical issue (page still exists, just now under a different link). With other sites I’ve worked on, 404s aren’t that big a deal: set up a 301 redirect in htaccess and problem solved. In this instance, setting up that many 301 redirects, even if it could somehow be automated, just isn’t an option due to the potential bloat in the htaccess file. Based on what I’ve read here and here, 404s in and of themselves don’t really hurt the site indexation or ranking. And the more I consider it, the really big sites – the Amazons and eBays of the world – have to contend with broken links all the time due to product pages coming and going. 
Bottom line, it looks like if we really want to refresh the data on the site on a regular basis - and I believe that is priority one if we want the bot to come back more frequently - we'll just have to put up with broken links on the site on a more regular basis. So here's where my thought process is leading:
1. Go ahead and refresh the data. Make sure the XML sitemaps are refreshed as well - hopefully this will help the site stay current in the index.
2. Keep an eye on broken links in GWT. Implement 301s for really important pages (i.e. content-rich stuff that is really mission-critical). Otherwise, just learn to live with a certain number of 404s being reported in GWT on a more or less ongoing basis.
3. Watch the overall trend of 404s in GWT. At least make sure they don't increase. Hopefully, if we can make sure that the sitemap is updated when we refresh the data, the 404s reported will decrease over time.
We do have an issue with the site creating some weird pages with content that lives within tabs on specific pages. Once we can clamp down on those and a few other technical issues, I think keeping the data refreshed should help with our indexation and crawl rates. Thoughts? If you think I'm off base, please set me straight. 🙂
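On the htaccess-bloat concern specifically: Apache can serve very large redirect sets from a hashed lookup file via RewriteMap, so the rewrite rules stay constant-size no matter how many URLs change per data refresh. This only works in the main server or vhost config (RewriteMap is not permitted in .htaccess), so it assumes you have that level of access; the file paths below are placeholders:

```apache
# httpd.conf / vhost config -- RewriteMap cannot be used in .htaccess
RewriteEngine On

# redirects.map is a DBM hash built from a plain text file of "old-path new-path"
# pairs, regenerated on each data refresh:
#   httxt2dbm -i redirects.txt -o redirects.map
RewriteMap redirects "dbm:/etc/apache2/redirects.map"

# 301 only when the requested path has an entry in the map
RewriteCond ${redirects:%{REQUEST_URI}|NONE} !=NONE
RewriteRule ^ ${redirects:%{REQUEST_URI}} [R=301,L]
```

Because the map is a hash lookup rather than millions of individual RewriteRule lines, the per-request cost stays roughly constant even with a multi-million-entry redirect table.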
Intermediate & Advanced SEO | ufmedia0
-
We have been hit by Penguin 2.0 - what to do?
Hi, Last week we got hit by Penguin 2.0. Our sites dropped an average of 10 places on most keywords. We had held a steady position for 2 to 3 years. We have site-wide links in the top of our websites to the other websites (about 9 e-commerce sites). Today I have put rel="nofollow" tags on all these links (except on the homepages) to prevent spammy links. Is there anything else we can do? URL: www.klokkenpaleis.nl; most important keyword = klokken (previous position: 2nd place); search engine = google netherlands. Thanks a lot for your help.
Intermediate & Advanced SEO | GTGshops0
-
What About Google Panda Update 22?
Maybe I haven't found the threads or whatever, but I haven't seen many posts about the latest Google Panda update from November 21-22 on SEOmoz. Panda 22 is not even listed here: http://www.seomoz.org/google-algorithm-change Until November 21st, Google killed 3 of my 5 websites through their Panda updates (I never got hit by Penguin updates, as I have only original content), accounting for about 25% of my income. Fortunately, the 2 remaining websites gained more traffic throughout the summer of 2012, so my income got almost back to 100%, even though I got the "Unnatural Links" warning in Google Webmaster Tools in July. Since then, I did a huge link cleanup, and according to the Link Detox Tool (from another SEO service), the number of "toxic links" went from about 350 to 50. The backlink report is as follows: 8% (52) toxic links; 57% (382) suspicious links; 35% (235) healthy links. Of the 382 suspicious links, most are coming from the same domain, and they are all directories to which my website has been submitted automatically (not using any specific keyword anchor). By contrast, the healthy links are coming from different domains, so I like to think they have a stronger impact than the suspicious ones. That said, my two remaining websites were still doing well until November 21, when they got hit by Panda. Now traffic has dropped by 55% and income has dropped by 75% (yes, I'll have to look for a job within a year if I don't fix this). (I want to add that none of my websites are "thin websites". One has over 1,500 pages of content and the other has about 500 pages. All websites have content added 3 to 5 times a week.) What I don't get is that all my "money keywords" are still ranked in the top 10 results on Google according to the multiple tools/services I use, yet impressions dropped by 50% to 75% for those keywords?!? I have a feeling that this time it's not only a drop in ranking. There's a drop in impressions caused by something else.
Is it caused by an emphasis on local search? Are they showing more ads and fewer organic results? But here's the "funny" part: for the last 5 years, I was never able to advertise my website on Google AdWords. Each time, I got a quality score of about 4/10, only to see it drop to 1/10 within a few hours of launching the campaign. On November 22nd, I built new PPC campaigns based on the exact same PPC campaigns I had in the past (same keywords, same ads, same landing pages). Guess what? Now the quality score is between 7/10 and 10/10 (most of them have 10/10) for the exact same PPC campaign! What a "coincidence", huh?
Intermediate & Advanced SEO | sbrault740
-
Hit by Penguin - can I move the content from the old site to a new domain and start again with the same content, which is high quality?
I need some advice please. My website got the "unnatural links detected" message and was hit by Penguin... hard. Can I move the content from the current domain to a new domain and start again, or does the content need to be redone as well? I will obviously turn off the old domain once it's moved. The other option is to try to identify the bad links and change my anchor profile, which is a hit-and-miss task in my opinion. Would it not be easier just to identify the good links pointing to the old domain and get those changed to point to the new domain with better anchors? Thanks, Warren
Intermediate & Advanced SEO | warren0071
-
3 Sites Covering Similar Topics & Panda
My question will take a bit of explaining, so here goes: I have 3 blogs on the same server: 1. personal finance blog; 2. credit card blog; 3. prepaid credit card blog. The personal finance blog is my flagship site started in 2007, which feeds my family and pays the mortgage. By contrast, the other two sites (started in 2008 and 2010) I would gladly kill if the result would help my personal finance blog. In the fall of 2010 (before Panda) the prepaid card blog was penalized by Google. This has been confirmed by Google in response to a reconsideration request. Of course, they don't say why. I've tried a number of things and resubmitted the site, but with no luck. Both the personal finance blog and credit card blog were hit by Panda 2 (April 11, 2011) and have not recovered. While the personal finance site covers many topics (e.g., investing, credit, debt, money management), its income comes largely from credit cards. We review individual credit cards and have pages that list cards by category (e.g., balance transfer, cash back, travel). The credit card blog does the same thing, but of course covers credit cards in more depth. There is a similar overlap between the prepaid card blog on the one hand, and the credit card blog and personal finance blog on the other. However, all content is unique. I do not currently link between the sites, although until a few months ago I had blogroll links between the sites and a few (less than 10) content links. If you've made it this far (and I hope you have), here are my questions: 1. Could the existence of the credit card and prepaid credit card sites be hurting my personal finance blog's rankings in Google, whether via Panda or otherwise? 2. If there is a reasonable chance that the answer to question 1 is yes, what would you suggest I do? Of course, I could just take down the sites, but I wonder if there are other options. 
One thought I had was to deindex the two card sites (I assume I can do this by disallowing googlebot via robots.txt) and give it time. Would Google treat this as if the sites did not exist? Both sites get a fair amount of traffic from Bing and Yahoo, so this option appeals to me. Of course, for all I know, the existence of the two card sites is hurting my personal finance blog's rankings in Bing and Yahoo, too. I thought about selling the sites, but if they are hurting my personal finance site, I grow concerned about how Google distinguishes between a site being sold and a webmaster just trying to make the sites look like they are owned by different people. In this regard, I've never tried to hide the common ownership of the sites and have no intention of doing that now. If I kill the sites, should I redirect them to my personal finance site? For the penalized prepaid card site, this seems both risky and unhelpful. But perhaps redirecting the credit card site is an option. Given that the personal finance site is my livelihood, I greatly appreciate your thoughts on my dilemma.
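One caution on the robots.txt idea above: a Disallow rule stops Google from crawling pages, but it does not reliably remove already-indexed URLs, which can linger as URL-only listings. The standard way to deindex is a robots meta noindex tag, and it requires the pages to remain crawlable so the bot can actually see the tag:

```html
<!-- In the <head> of every page on the site you want removed from the index.
     robots.txt must NOT block these pages, or the crawler will never see this tag. -->
<meta name="robots" content="noindex, follow">
```

Once the pages have dropped out of Google's index, you could then add the robots.txt block if you also want to stop crawling; doing both at the same time defeats the noindex.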
Intermediate & Advanced SEO | Bergerlaw0
-
Our site has been up almost 2 months and no rankings yet?
Our site is www.AkinsSeptic.com. We spent a lot of time on the site and have not received any rankings yet. Can you advise what we can do to get this site ranked? I know it needs a lot of SEO work and some links; however, it should rank at least mildly given the low competition of the keywords we are targeting. Thanks in advance.
Intermediate & Advanced SEO | Tormz0