Is Panda as aggressive as Penguin in terms of how hard it is to escape its clutches?
-
Hi,
Is being hit by Panda as hard to get out of as being hit by Penguin?
Or, if you clean up all your content, should you get out of it relatively quickly?
I have a very old (11 years) and established site, but also a very neglected one that I'm looking to relaunch. It's on an ancient shopping cart platform that never allowed for Google Analytics and GWT integration, so I can't see any messages in GWT or look at traffic figures to correlate a drop with any Panda updates.
The reason I ask is that I want to relaunch the site after bringing it up to date on a modern e-commerce platform. I originally launched the site in early 2002 and it was received well by Google, achieving first-field-of-view SERPs for all targeted keywords however competitive, including 'iPod accessories', 'data storage' and so on. Those top positions (and the resulting sales) lasted until about 2007, when the site was overtaken by bigger-brand competitors with more advanced, Google-friendlier e-commerce platforms (and big SEO budgets).
I originally used the manufacturers' descriptions, editing them slightly but probably not enough to avoid being considered duplicate content, although I still managed to obtain good rankings for those pages for a very long time, even ranking ahead of Amazon in most cases. The site is still ranking well for some keywords relating to products that still have manufacturer-copied descriptions, so I actually don't think I have been hit by Panda.
So my question is: is there any way of finding out for sure whether the site has even been hit by Panda at all, without looking at analytics and GWT?
And once I find out whether it has or not:
- Is it best to relaunch on the same domain to take advantage of the 11-year-old domain history/authority? So long as I make sure all product descriptions etc. are unique, should the site escape Panda's clutches quite quickly if it has been hit?
**OR**
- Is Panda as aggressive as Penguin, in which case is it best to start again on a new domain?
Many Thanks
Dan
-
Thanks for taking the time to respond, Egol.
OK, great. Panda is theoretically escapable in a few weeks, then.
Cheers
Dan
-
Penguin... If you have crap links, you must address them. It can be very difficult to cut what you spent a lot of money on and considered to be assets of your business. The challenge is a psychological mindset change, plus a lot of time to address the problem. It probably pays to have the links reviewed by an objective person who has experience recovering sites from Penguin and unnatural-links penalties.
Panda... If you have a site with lots of thin content, duplicate content or low-value content, then you must remove that content or replace it with high-value content. Again, you have a psychological challenge. You must also be willing to spend money to acquire valuable unique content, or spend time to create it. You must be willing to chop off your feet to save your ass. Keep in mind that some Panda problems can be caused by technology glitches. Before doing major surgery or making a major content investment, it is probably a good idea to get a person familiar with recovering from Panda problems to review your site and your plan of action.
I had two sites hit by Panda. On one site I had published lots of .gov and .edu press releases, some at their request, some at my decision. I removed a lot of that content and noindex,followed the rest. That site recovered in a few weeks. On another site I had pages of .pdf content, used to control the printing of graphics. These were causing a duplicate content problem. We applied rel=canonical with .htaccess and the site recovered a few weeks later.
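For anyone wanting to copy that .pdf fix, here is a minimal .htaccess sketch of the rel=canonical approach described above, assuming an Apache server with mod_headers enabled; the filename and URL are hypothetical placeholders, not the actual setup:

```apache
# Hypothetical example: point a duplicate PDF at its canonical HTML page.
# Requires mod_headers; repeat (or pattern-match) for each PDF that mirrors a page.
<Files "product-guide.pdf">
    Header set Link "<https://www.example.com/product-guide/>; rel=\"canonical\""
</Files>
```

Google treats a rel="canonical" sent in the Link HTTP header the same way it treats the tag in HTML, so the duplicate-content signal consolidates onto the HTML version of the page.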
Good luck.
Related Questions
-
Payday Loan or Panda? Any thoughts?
So this is a very old problem, but I'm finally getting around to trying to figure it out. My site experienced a dramatic organic traffic drop from Google (-40%) on May 17th 2014. It then dropped another 30% on May 19th 2014. See graphs. According to Moz, these two dates correlate with Payday Loan 2.0 and Panda 4.0. Panda makes complete sense, as this site (www.ausedcar.com) has a large amount of content that is syndicated across other sites (used car inventory is essentially the same everywhere on the Internet). Payday Loan, on the other hand, which seems to be the primary traffic drop, doesn't make any sense at all. Is it possible I started getting hit by Panda on the 17th and then it completed on the 19th? I know the dates for algorithm changes are not perfect. Next, assuming it is Panda, what are some things you guys have done to help with this? As I mentioned, this content is duplicated all over the Internet, so it seems like Google arbitrarily picks winners and losers (my site is twenty years old!). I know I need unique content, but I'm not sure how exactly to do that besides rewriting words so it doesn't appear duplicate.
Technical SEO | Catbelly
-
Panda Cleanup - Removing Old Blog Posts, Let Them 404 or 301 to Main Blog Page?
tl;dr... Removing old blog posts that may be affected by Panda, should we let them 404 or 301 to the Blog? We have been managing a corporate blog since 2011. The content is OK but we've recently hired a new blogger who is doing an outstanding job, creating content that is very useful to site visitors and is just on a higher level than what we've had previously. The old posts mostly have no comments and don't get much user engagement. I know Google recommends creating great new content rather than removing old content due to Panda concerns but I'm confident we're doing the former and I still want to purge the old stuff that's not doing anyone any good. So let's just pretend we're being dinged by Panda for having a large amount of content that doesn't get much user engagement (not sure if that's actually the case, rankings remain good though we have been passed on a couple key rankings recently). I've gone through Analytics and noted any blog posts that have generated at least 1 lead or had at least 20 unique visits all time. I think that's a pretty low barrier and everything else really can be safely removed. So for the remaining posts (I'm guessing there are hundreds of them but haven't compiled the specific list yet), should we just let them 404 or do we 301 redirect them to the main blog page? The underlying question is, if our primary purpose is cleaning things up for Panda specifically, does placing a 301 make sense or would Google see those "low quality" pages being redirected to a new place and pass on some of that "low quality" signal to the new page? Is it better for that content just to go away completely (404)?
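If it helps to visualise the two options being weighed here, this is a minimal Apache .htaccess sketch; the blog paths are hypothetical placeholders:

```apache
# Option A (hypothetical path): 301 a retired post to the main blog page (mod_alias).
Redirect 301 /blog/2011/old-post-title/ /blog/

# Option B: let the post drop out entirely; "gone" returns a 410 rather than a 404.
Redirect gone /blog/2011/old-post-title/
```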
Technical SEO | eBoost-Consulting
-
Noindex large product pages on webshop to counter Panda
A Dutch webshop with 10,000 product pages is experiencing lower rankings and indexation. Problems started last October, a little while after the Panda and Penguin updates. One of the problems diagnosed is a lack of unique content. Many of the product pages lack a description and some are variants of each other (colour, size, etc.). So a solution could be to write unique descriptions and use rel=canonical to consolidate colour/size variations onto one product page. There is, however, no capacity to do this on short notice. So now I'm wondering if the following is effective. Exclude all product pages via noindex and robots.txt, in the same way as you can do with search pages. The only pages left for indexation are the homepage and 200-300 category pages. We then write unique content and work on the ranking of the category pages. When this works, the product pages are rewritten and slowly re-included, category by category. My worry is the loss of rankings for the product pages, although their rankings are minimal currently. My second worry is the high number of links on category pages that lead to product pages that will be excluded from Google. Thirdly, I am wondering if this works at all: using noindex on 10,000 product pages consumes crawl budget and dilutes the internal link structure. What do you think?
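One mechanical note on that plan: if the product pages are blocked in robots.txt, Google cannot crawl them and so cannot see a noindex on them, so the noindex route means leaving them crawlable. A minimal sketch of the noindex, follow option, assuming an Apache 2.4+ server and a hypothetical /product/ URL pattern:

```apache
# Hypothetical URL pattern: send "noindex, follow" for product pages only (mod_headers).
# The pages must stay crawlable (not blocked in robots.txt) for Google to see this header.
<If "%{REQUEST_URI} =~ m#^/product/#">
    Header set X-Robots-Tag "noindex, follow"
</If>
```

Using "follow" at least asks Google to keep following the internal links on those pages to the category pages while the content is being rewritten.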
Technical SEO | oeroek
-
I think I got hit by the latest Panda update
Hi everyone, I think one of my sites got hit with Panda. On Sept 18th the site dipped to "not in top 50" for almost all keywords. I checked GWT for the manual action email but my inbox is empty! The lesser of two evils, I guess. They had major server issues that week as well, so it is hard to identify what caused the site to dip. My client has original content on the website but almost all content on the blog is copied. Do you recommend deleting the non-original content? Could the problem be elsewhere? Thanks
Technical SEO | Carla_Dawson
-
Panda recovery timeframe question
Site was hit by Panda on Aug. 22nd. Lost 90% of Google traffic. I know 🙂 We think we found a reason and made a few changes to the landing page structure. Updated sitemaps submitted. When can we expect an effect (if any): within a few days, or after the next Panda data refresh? Thank you! P.S. What is also interesting: a similar traffic loss from Bing/Yahoo happened on exactly the same date. Does that mean Bing is "stealing" search results from Google when it can't provide its own relevant results? 🙂
Technical SEO | LocalLocal
-
Google Reconsideration Request (Penguin) - Will Google give links to remove?
When Penguin v1 hit, our site took a hit for a single phrase (i.e. "widgets") due to the techniques our SEO company was using (network). We've since had those links cleaned up, and our rankings have not recovered. Our SEO company said they submitted a reconsideration request on our behalf, and that Google denied it and didn't provide which links we needed removed. Does Google list links that need removing if they are still not happy with your link profile?
Technical SEO | crucialx
-
Am I able to point an old domain name to part of my site?
Hi, I have a domain name for a site that I own, but I am thinking of closing that site down and transferring it to a site that is popular. I want to know if I am able to point the domain name to a section of my site and, if so, would it benefit my site, or should I just continue running the other site and make it better? Would the links to that site be transferred to my site? Any advice would be great.
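If you do point the old domain at a section of the main site, the usual mechanism is a blanket 301. A minimal sketch, assuming Apache with mod_rewrite and hypothetical domain/section names:

```apache
# Hypothetical domains/paths: redirect every URL on the old domain
# to the matching URL inside one section of the main site, with a 301.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-site\.com$ [NC]
RewriteRule ^(.*)$ https://www.main-site.com/old-site-section/$1 [R=301,L]
```

Redirecting page-to-page (rather than sending everything to a single URL) gives the old links the best chance of passing value to the new section.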
Technical SEO | ClaireH-184886
-
UK and US subdomains. Can both rank for the same keyword terms?
One of my clients has one root domain, http://www.website.com, and there are two versions, the US and the UK. So there are two subdomains: uk.website.com and us.website.com. Both subdomains contain similar content/landing pages and are going after the same keywords. One site is supposedly crawled by UK crawlers but still shows up in US-based SERPs. Will Google take into account that both subdomains are going for the same keyword terms and only rank one of them? How is this kind of thing handled?
Technical SEO | C-Style