Is Panda as aggressive as Penguin in terms of being able to escape its clutches?
-
Hi,
Is being hit by Panda as hard to get out of as being hit by Penguin?
Or if you clean up all your content, should you get out of it relatively quickly?
I have a very old (11 years) and established site (but also a very neglected one that I'm looking to relaunch). It's on an ancient shopping cart platform that never allowed for Google Analytics and GWT integration, so I can't see any messages in GWT or look at traffic figures to correlate a drop with any Panda updates.
The reason I ask is that I want to relaunch the site after bringing it up to date on a modern e-commerce platform. I originally launched the site in early 2002, and it was received well by Google, achieving first-field-of-view SERPs for all targeted keywords, however competitive, including 'ipod accessories', 'data storage', etc. These top positions (and the resulting sales) lasted until about 2007, when the site was overtaken by bigger-brand competitors with more advanced, Google-friendlier e-commerce platforms (and big SEO budgets).
I originally used the manufacturers' descriptions, editing them slightly but probably not enough to avoid being considered duplicate content, although I still managed to obtain good rankings for these pages for a very long time, even ranking ahead of Amazon in most cases. The site still ranks well for some keywords relating to products that still have manufacturer-copied descriptions, so I actually don't think I have been hit by Panda.
So my question is: is there any way of finding out for sure whether the site has even been hit by Panda at all, without looking at analytics and GWT?
And once I find out whether it has or not:
- Is it best to relaunch on the same domain, to take advantage of the 11-year-old domain history/authority? So long as I make sure all product descriptions etc. are unique, should the site escape Panda's clutches quite quickly if it has been hit?
**OR**
- Is Panda as aggressive as Penguin, in which case is it best to start again on a new domain?
Many Thanks
Dan
-
Thanks for taking the time to respond, EGOL.
OK, great. Panda is theoretically escapable in a few weeks then.
Cheers
Dan
-
Penguin... If you have crap links, you must address them. It can be very difficult to cut what you spent a lot of money on and considered to be assets of your business. The challenge is a psychological mind change and a lot of time to address the problem. It probably pays to have the links reviewed by an objective person who has experience recovering sites from Penguin penalties and unnatural-links penalties.
Panda... If you have a site with lots of thin content, duplicate content, or low-value content, then you must remove that content or replace it with high-value content. Again, you have a psychological challenge. You must also be willing to spend money to acquire valuable unique content, or spend time to create it. You must be willing to chop off your feet to save your ass. Keep in mind that some Panda problems can be caused by technology glitches. Before doing major surgery or making a major content investment, it is probably a good idea to get a person familiar with recovering from Panda problems to review your site and your plan of action.
I had two sites hit by Panda. On one site I had published lots of .gov and .edu press releases, some at their request, some at my decision. I removed a lot of that content and noindex/followed the rest. That site recovered in a few weeks. On another site I had .pdf versions of pages to control the printing of graphics. These were causing a duplicate content problem. We applied rel=canonical via .htaccess and the site recovered a few weeks later.
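For anyone wondering what those two .htaccess fixes might look like, here is a minimal sketch. The domain, file names, and URL patterns are placeholders, not the actual sites described above, and the directives assume Apache with mod_headers enabled:

```apache
# Hypothetical sketch of the two fixes described above (requires mod_headers).
# Domain, paths, and file patterns are placeholders - adapt them to your site.

# 1. Keep press-release pages out of the index while still letting
#    crawlers follow their links and pass link value through them.
<FilesMatch "press-release-.*\.html$">
    Header set X-Robots-Tag "noindex, follow"
</FilesMatch>

# 2. A PDF can't carry an in-page rel=canonical tag, but the equivalent
#    HTTP Link header can point the duplicate PDF at its HTML original.
<Files "product-guide.pdf">
    Header set Link '<http://www.example.com/product-guide.html>; rel="canonical"'
</Files>
```

Note that each PDF needs its own mapping to its HTML counterpart, so on a large site you would repeat (or script) the second block per file rather than use a single static header.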
Good luck.
Related Questions
-
Can anyone tell me - in layman's terms - any SEO implications of a NetScaler redirect?
We are in the midst of exploring the best options for developing a "microsite" experience for a client and how we manage the site - subdomain vs. subdirectory... NetScaler redirect vs. DNS change. We understand that a subdirectory is best for SEO purposes; however, we anticipate technical limitations when integrating the different hosting platforms and capabilities into the existing site. The proposed solutions that were provided are a NetScaler redirect and/or DNS changes. Any experience with these solutions?
Technical SEO | jgrammer
Is it a Panda/Penguin hit? Or is it just a natural ranking drop?
My traffic comes from Google. This is the traffic profile. Does it look like a Panda or Penguin hit? I have a hard time determining it myself. Thanks.
Technical SEO | ChelseaP
Is it Panda, Penguin, or an ad penalty?
I'm trying to figure out why my Google traffic is going down... I see that back in Feb and then March 2011 it started to drop, which I assume was Penguin. I saw a gradual comeback in traffic until March 2012, and I assume the second drop was another Penguin update. The decline continued gradually until I saw a big drop in October 2012, and traffic has dropped off completely in the past month. I recreated my website on WordPress, improving content and removing Google ads. I relaunched a few weeks ago and still see a big drop. Any idea what happened? I only got a message from Google about a large traffic drop in March 2012, and about a 404-error increase recently when I launched the new site, which I fixed with 301s and by removing indexed media attachment pages that gave a 404. One concern is that I have no idea if I have a problem with Penguin. Could I have a problem with too many links coming from my blog or social network? What's an acceptable number of backlinks so as not to be spam? If you add pages in the blogroll, is this considered spam by Penguin? Website: http://www.dashinfashion.com Thanks for your help!
Technical SEO | dashinfashion
Term for how content or data is structured
There is a term for how data or content is structured, and for the life of me I can't figure it out. The following is the best way I know to explain it: Magnolia is of Seattle. Seattle is of Washington. Washington is of the US. The US is of North America. North America is of Earth. Etc. Any help is much appreciated. I'm trying to use the term to communicate its application to SEO, in that Google analyzes how information is structured to understand the breadth and depth of your site's content...
Technical SEO | BonsaiMediaGroup
Number of links you should have on a taxonomy term?
According to SEOmoz, my taxonomy terms contain more than 100 links (links to articles, in my case), and it tells me that I should reduce that number. I have seen a video by Matt Cutts, the Google software engineer, in which he said that Google's engine has dramatically improved since then and 100 is no longer the limit. What do you guys think is the best practice here? To clarify the subject even more: I want to understand this from a link juice perspective. Does it affect how link juice is distributed? Let's say I have 5 taxonomy terms, all of them have 200 articles, and these 5 terms are listed on the home page of a PR7 website. In this case some of the PR will be passed to these 5 taxonomy terms. However, if I increase the taxonomy terms to 10, then I will reduce the links to 100 each, but the PR will be distributed even more. This means each taxonomy term will have even less PR value. Am I wrong? Any ideas?
Technical SEO | mertsevinc
Panda recovery timeframe question
Our site was hit by Panda Aug. 22nd. It lost 90% of its Google traffic. I know 🙂 We think we found a reason and made a few changes to the landing page structure. Updated sitemaps were submitted. When can we expect an effect (if any): in a few days, or after the next Panda data refresh? Thank you! P.S. What is also interesting: a similar traffic loss from Bing/Yahoo happened on exactly the same date. Does that mean Bing is "stealing" search results from Google when it can't provide its own relevant results? 🙂
Technical SEO | LocalLocal
Slapped by the Penguin
We had a client's website hit hard by the Penguin update, particularly on the 24th. Sitewide, each keyword lost 10-20 positions. It had been at #1 or #2 for the past couple of years. We optimize all of our websites' on-page features well and within the white-hat realm. Since this was the only website affected out of 50+ other sites, I am guessing the penalty came directly from the backlink profile, which was quite bad. The client had bought two directory link package deals about 4 years ago, and all of the incoming directory links have the exact same anchor text. I warned him this was completely unnatural, and we only went after "natural-looking" links from then on. Keep in mind these links are from 4+ years ago and did very little for rankings once we came into the picture. Out of 143 root domain links, around 45 use the same anchor text. We started with about 50 links total 2 years ago and have since built a very good quality profile, or so I thought. I was almost certain there was enough varied anchor text to dilute it. I'm wondering if any of your websites that have been hit have a high amount of exact-match anchor text. I can't believe Google would penalize just for link building, because it seems to be an easy way to attack competitors, but all my data is pointing that way. Let me know your thoughts if any of your sites have been hit. Thanks
Technical SEO | seoninja20
How to publish duplicate content legitimately without Panda problems
Let's imagine that you own a successful website that publishes a lot of syndicated news articles and syndicated columnists. Your visitors love these articles and columns, but the search engines see them as duplicate content. You worry about being viewed as a "content farm" because of this duplicate content and getting the Panda penalty. So, you decide to continue publishing the content and use... <meta name="robots" content="noindex, follow"> This allows you to display the content for your visitors, but it should stop the search engines from indexing any pages with this code. It should also allow robots to spider the pages and pass link value through them. I have two questions: 1. If you use "noindex", will that be enough to prevent your site from being considered a content farm? 2. Is there a better way to continue publishing syndicated content while protecting the site from duplicate content problems?
Technical SEO | EGOL