Panda 3.7 recovery?
-
In June '12 we had a bunch of our keyword rankings whacked by Panda 3.7 and haven't recovered. This is really frustrating, since we had been hit by the original Panda and spent months fixing our site, finally recovering in December '11.
I've read what I can find about this update to see if there is something specific about it that would have knocked us down again, but I can't find anything.
Does anyone know of any specific issues that this update supposedly "fixed"?
-
Hi Ian,
From a quick glance I can see that your internal linking structure is extreme: http://www.seomoz.org/blog/smarter-internal-linking-whiteboard-friday
Watch this video; it should help you get some ideas.
Also read this: http://www.seomoz.org/blog/internal-linking-strategies-for-2012-and-beyond
-
Ian, here is some info from Dr. Pete at SEOmoz. Dr. Pete is my go-to guy when it comes to digging deep into the weeds of a particular Google update. He dissects Google's algorithm updates like I eat Pop-Tarts for breakfast. Hopefully some of his findings will help you refine your search for clues. Here is his post on the subject. Make sure you follow him.
http://www.seomoz.org/blog/the-bigfoot-update-aka-dr-pete-goes-crazy
I would make one other suggestion. Remember, Panda is about content. Instead of only taking a reactive stance toward Panda's algorithm updates, work through your existing content and your new content strategy, and make sure you are generating relevant content that isn't over-optimized. The drop in specific keywords tells you where to start, but stay ahead of the curve and proactively look at other keywords that weren't affected but where you might have similar content issues.
Good luck. Hope this helps.
Mark
-
Link to our site. Where did you get the info on what Panda 3.7 was about?
-
Panda 3.7 was not only a duplicate content issue; it also targeted over-optimization. Having 5,000 words on a page while keeping your keywords at a 10% density is over-optimizing, and having tons of internal links in your footer pointing to pages you are trying to rank for can be over-optimizing too. In other words, if you are creating your pages for a search engine, you will not rank in the search engine. Unfortunately, without your URL it is hard to give an exact answer to your question.
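If you want a rough way to check whether a page is leaning in that direction, a quick keyword-density script can help. This is only a minimal sketch in Python; the URL, keyword, and the 5% threshold are placeholder assumptions, not anything Google has published:

```python
import re
from collections import Counter

import requests  # assumes the requests library is available


def keyword_density(url: str, keyword: str) -> float:
    """Very rough share of on-page words taken up by a single keyword."""
    html = requests.get(url, timeout=10).text
    # Crude tag stripping; a real audit would use an HTML parser.
    text = re.sub(r"<[^>]+>", " ", html).lower()
    words = re.findall(r"[a-z0-9']+", text)
    return Counter(words)[keyword.lower()] / max(len(words), 1)


# Hypothetical usage: the page and keyword are made up for illustration.
density = keyword_density("http://www.example.com/some-page", "widgets")
if density > 0.05:
    print(f"Possible over-optimization: {density:.1%} keyword density")
```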
Related Questions
-
Panda, rankings, and other nonsense issues
Hello everyone, I have a problem here. My website has been hit by Panda several times in the past: the first time back in 2011 (the first Panda ever), then another couple of times since then, and, most recently, in June 2016 (either Panda or Phantom, not clear yet). In other words, it looks like my website is very prone to "quality" updates from big G: http://www.virtualsheetmusic.com/

I'm still trying to understand how to get rid of Panda-related issues once and for all after so many years of tweaking and cleaning my website of possible duplicate or thin content (301 redirects, noindexed pages, canonicals, etc.), and I have tried everything, believe me. You name it. We recovered several times, but once in a while we are still hit by that damn animal. It really looks like we are in the so-called "grey" area of Panda, where we are "randomly" hit by it once in a while.

Interestingly enough, some of our competitors live joyful lives at the top of the rankings without caring at all about Panda and such, and I can't really make sense of it. Take for example this competitor of ours: http://8notes.com They have a much smaller catalog than ours, worse quality of offered music, thousands of duplicate pages, ads everywhere, and yet... they are able to rank 1st on the 1st page of Google for most of our keywords. And by most, I mean 99.99% of them. Take for example "violin sheet music", "piano sheet music", "classical sheet music", "free sheet music", etc... they are always first.

As I said, they have a much smaller website than ours, with a much smaller offering than ours, their content quality is questionable (not curated by professional musicians, and sloppily done content as well as design), and yet they have over 480,000 pages indexed on Google, mostly duplicate pages. They don't care about canonicals to avoid duplicate content, 301s, noindex, robots tags, etc., nor about adding text or user reviews to avoid "thin content" penalties... they really don't care about any of that, and yet they rank 1st.

So... to all the experts out there, my question is: why is that? What's the sense or the logic behind that? And please, don't tell me they have stronger domain authority, linking root domains, etc., because given the duplicate and thin content issues I see on that site, nothing can justify their positions in my opinion. Mostly, I can't find a reason why we are so heavily penalized by Panda and these kinds of "quality" updates when they are released, whereas websites like 8notes.com rank 1st, making fun of the mighty Panda all year round. Thoughts???!!!
Intermediate & Advanced SEO | fablau0 -
E-Commerce Panda Question
I'm torn. Many of our 'niche' ecommerce products rank well; however, I'm concerned that duplicate content is negatively affecting our overall rankings via the Panda algorithm. Here is an example that can be found across quite a few products on the site. This sub-category page (http://www.ledsupply.com/buckblock-constant-current-led-drivers) in our 'led drivers' --> 'luxdrive drivers' section has three products that are virtually identical, with much of the same content on each page, except for their 'output current' - sort of like a shirt selling in different size attributes: S, M, L and XL. I could realistically condense 44 product pages (similar to the example above) down to 13 within this sub-category section alone (http://www.ledsupply.com/luxdrive-constant-current-led-drivers). Again, we sell many of these products and rank OK for them, but given the outline of how Panda works, I believe this structure could be compromising our overall Panda 'quality score', consequently keeping our traffic from increasing. Has anyone had similar issues and found that it's worth the risk to condense product pages by adding attributes? If so, do I make the new pages and just 301 all the old URLs, or is there a better way?
Intermediate & Advanced SEO | saultienut0 -
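On the mechanics in the e-commerce question above (condensing near-identical product pages and 301-redirecting the old URLs), one straightforward approach is to keep a mapping of old URLs to the consolidated page and generate the redirect rules from it. A minimal sketch, with made-up paths, assuming an Apache-style setup where `Redirect 301` directives in .htaccess are available:

```python
# Hypothetical mapping of old attribute-specific product URLs to the
# consolidated product page that replaces them.
redirect_map = {
    "/buckblock-350ma-led-driver": "/buckblock-constant-current-led-drivers",
    "/buckblock-700ma-led-driver": "/buckblock-constant-current-led-drivers",
    "/buckblock-1000ma-led-driver": "/buckblock-constant-current-led-drivers",
}

# Emit Apache-style rules that could be pasted into an .htaccess file.
for old_path, new_path in sorted(redirect_map.items()):
    print(f"Redirect 301 {old_path} {new_path}")
```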
Ever had a case where publication of products & descriptions on eBay or Amazon caused a Panda penalty?
One of our shops got a Panda penalty back in September. We sell all our items with the same product name and the same product description on amazon.com, amazon.co.uk, ebay.com and ebay.co.uk. Have you ever had a case where such multichannel sales caused a Panda penalty?
Intermediate & Advanced SEO | lcourse0 -
3 results for a site on page one?!?
Hi, I've never seen a website rank on page 1 in positions 2, 3 and 4 for one query - completely separate results as well. I thought they limited the number of results from a single website on each page?
Intermediate & Advanced SEO | activitysuper0 -
NOINDEX listing pages: Page 2, Page 3... etc?
Would it be beneficial to NOINDEX category listing pages except for the first page? For example, on this site: http://flyawaysimulation.com/downloads/101/fsx-missions/ there are lots of paginated pages such as Page 2, Page 3, Page 4... etc.: http://www.google.com/search?q=site%3Aflyawaysimulation.com+fsx+missions Would there be any SEO benefit of NOINDEX on these pages? Of course, FOLLOW is the default, so links would still be followed and juice applied. Your thoughts and suggestions are much appreciated.
Intermediate & Advanced SEO | Peter2640 -
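For the pagination question above, the usual implementation is to emit the robots meta tag conditionally in the listing template: page 1 stays indexable, while deeper pages get noindex,follow so their links still pass equity. A minimal sketch of that logic; the function name and usage are illustrative and not tied to the site mentioned:

```python
def robots_meta_for_listing(page_number: int) -> str:
    """Robots meta tag for a paginated category listing page."""
    if page_number <= 1:
        # First page of the category: leave it indexable.
        return '<meta name="robots" content="index,follow">'
    # Deeper pages: keep them out of the index but still crawl their links.
    return '<meta name="robots" content="noindex,follow">'


print(robots_meta_for_listing(1))  # index,follow
print(robots_meta_for_listing(4))  # noindex,follow
```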
With Panda, which is more important, traffic or quantity?
If you were to prioritize how to fix a site, would you focus on traffic or on the quantity of URLs? So, for example, if 10% of a site had thin content but accounted for 50% of the traffic, and 50% of the site had a different type of thin content but only accounted for 5% of organic traffic, which would you work on first? I realize both need to be fixed, but I'm unsure which to tackle first (this is an extremely large site). Also, I am wondering if simply the presence of thin content on a domain can affect a site even if it isn't receiving any traffic.
Intermediate & Advanced SEO | nicole.healthline0 -
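One way to make the "which group first" call in the question above less of a gut feel is to score each thin-content group by the traffic at stake relative to the share of URLs that need fixing. A minimal sketch using the made-up percentages from the question; the scoring formula is just one reasonable choice, not a standard metric:

```python
# Hypothetical thin-content groups: share of site URLs vs. share of organic traffic.
groups = [
    {"name": "thin content A", "url_share": 0.10, "traffic_share": 0.50},
    {"name": "thin content B", "url_share": 0.50, "traffic_share": 0.05},
]

# Simple priority score: traffic at stake per unit of cleanup work.
for g in groups:
    g["priority"] = g["traffic_share"] / g["url_share"]

for g in sorted(groups, key=lambda x: x["priority"], reverse=True):
    print(f'{g["name"]}: priority score {g["priority"]:.2f}')
```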
Panda Prevention Plan (PPP)
Hi SEOmozers, I'm planning to prepare for Panda by creating a checklist of things to do in SEO to prevent a mass traffic loss. I would like to share these ideas with the SEOmoz community and the SEOmoz staff in order to build help resources for other marketers. Here are some ideas for a content website: the main one is to block duplicate content (robots.txt, noindex tag, or the appropriate canonical, depending on the case), and do the same for very low quality content (questions/answers, forums) by adding a canonical or noindex on threads with few answers.
Intermediate & Advanced SEO | Palbertus1 -
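For the thin-thread item in the checklist above (noindex or canonical on Q&A/forum threads with few answers), that decision usually gets wired into the thread template. A minimal sketch; the two-answer threshold and example URLs are purely illustrative:

```python
def head_tag_for_thread(thread_url: str, answer_count: int) -> str:
    """Head tag a forum/Q&A thread page should carry, based on answer count."""
    if answer_count < 2:  # threshold is arbitrary; tune per site
        # Thin thread: keep it out of the index but let its links be crawled.
        return '<meta name="robots" content="noindex,follow">'
    # Healthy thread: self-referencing canonical to consolidate URL variants.
    return f'<link rel="canonical" href="{thread_url}">'


print(head_tag_for_thread("http://example.com/forum/thread-123", 0))
print(head_tag_for_thread("http://example.com/forum/thread-456", 7))
```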
Recovery during domain migration
On average, how long does it take to recover 80% of the rankings if two high-authority domains are combined without changing any content? I totally understand that each domain is different and search engines can treat them differently, but if all the steps are followed to a T, what are the chances?
Intermediate & Advanced SEO | ninjamarketer1