Recovery Steps For Panda 3.5 (Rel. Apr. 19, 2012)?
-
I'm asking people who have recovered from Panda to share what criteria they used - especially on sites that are not large-scale e-commerce sites.
Blog site hit by Panda 3.5. The blog has approximately 250 posts. Some of the posts are among the most thorough available on their subjects and regained traffic despite a Penguin mauling a few days after the Panda attack. (The site has probably regained 80% of the traffic it lost since Penguin hit, without any link removal or link building, and with minimal new content.)
Bounce rate is 80% and average time on page is 2:00 min. (Even my most productive pages tend to have very high bounce rates BUT those pages maintain time on page in the 4 to 12 minute range.)
The Panda discussions I've read on these boards seem to focus on e-commerce sites with extremely thin content. I assume that Google views much of my content as "thin" too. But my site seems to need a pruning instead of just combining the blue-model, white-model, and red-model pages into one page, like most of the e-commerce sites we've discussed.
So, again, for those who have recovered from Panda: what criteria did you use to decide whether to combine a page, prune a page, etc.?
After I combine any series of articles into one long post (which should drive time on page to healthier levels), I plan to prune the remaining pages that have poor time on page and/or bounce rates. Regardless of the analytics, I plan to keep the "thin" pages that are essential for readers to understand the subject matter of the blog. (I'll work on fleshing out the content or producing videos for those pages.)
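For what it's worth, here is roughly how I've been scoring prune candidates (a minimal sketch against a Google Analytics export; the file name, column names, and thresholds are placeholders of my own, not anything official):

```python
import csv

# Placeholder thresholds -- not a recommendation, just my starting guesses.
BOUNCE_THRESHOLD = 85.0   # percent; worse than my 80% site-wide average
TIME_THRESHOLD = 60.0     # seconds; under a minute suggests nobody reads it

# Assumed columns in the export: page, bounce_rate, avg_time_on_page.
with open("ga_export.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Flag pages that fail on BOTH metrics as first-cut prune candidates.
candidates = [
    r for r in rows
    if float(r["bounce_rate"]) > BOUNCE_THRESHOLD
    and float(r["avg_time_on_page"]) < TIME_THRESHOLD
]

# Worst offenders first: highest bounce, then lowest time on page.
candidates.sort(key=lambda r: (-float(r["bounce_rate"]),
                               float(r["avg_time_on_page"])))

for r in candidates:
    print(r["page"], r["bounce_rate"], r["avg_time_on_page"])

print(f"Flagged {len(candidates)} of {len(rows)} pages "
      f"({100 * len(candidates) / len(rows):.1f}%)")
```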
How deep should I prune on the first cut? 5%? 10%? Even more? Should I focus on the pages with the worst bounce rates, the worst time on page, or try some of both?
If I post unique and informative video content (hosted on-site using Wistia), what should I expect for a range of decrease in bounce rate?
Thanks for reading this long post.
-
Alan: Thanks for sharing your experience in such detail.
-
After almost two years of Panda destruction and constant work on my site with no recovery whatsoever, I don't know if I have anything useful to contribute yet, so take this as some input.
Large site with over 2.2 million pages.
Deleted around 1.5 million pages.
Removed all duplicate titles (removed or fixed; the check is sketched below).
Removed all duplicate descriptions (removed or fixed).
Removed all problem pages (extremely short, damaged content, empty).
Removed all pages with duplicate body content.
Prevented the addition of any new duplicates; any that slip past are fixed within 24 hours.
Also checked incoming links, discovered some problem sites pointing in, and had those fixed or removed.
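For anyone attempting the same cleanup, the duplicate-title/description pass boiled down to logic like this (a simplified sketch, assuming a crawl-export CSV with url, title, and description columns; the real tooling was messier):

```python
import csv
from collections import defaultdict

def find_duplicates(rows, field):
    """Group URLs by normalized field value; keep only groups of 2+."""
    groups = defaultdict(list)
    for row in rows:
        key = row[field].strip().lower()
        if key:  # empties are "problem pages", handled separately
            groups[key].append(row["url"])
    return {value: urls for value, urls in groups.items() if len(urls) > 1}

with open("crawl_export.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for field in ("title", "description"):
    dupes = find_duplicates(rows, field)
    print(f"{len(dupes)} duplicate {field} groups found")
    for value, urls in sorted(dupes.items(), key=lambda kv: -len(kv[1])):
        print(f"  '{value[:60]}' on {len(urls)} URLs")
```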
The RESULT after almost two years? Zero improvement.
Almost ready to slash wrists, but about to try subdomaining first.
It would be funny if it weren't so sad.
Related Questions
-
We are writing 5,000-word long-form content that is relevant and engaging. Is it too long?
We are writing a series of relevant and informative "power pages" for our site. In the past these have been 2,000 to 3,000 words, our audience has proven to be highly engaged with them, and they converted well. We have decided to expand our new pages to capture more relevant keywords/topics, and the result is that they are a bit over 5,000 words. Is there a point where long content, even if highly relevant and engaging, is too long to benefit SEO? Is there any reason we should limit ourselves to 2,000-ish-word long-form content? I ask because I have read multiple blog posts suggesting that long-form content that has ranked well in Google ranges between 2,000 and 3,000 words.
Intermediate & Advanced SEO | Cutopia
-
Recovery after recent Google update
Hi guys. This is somewhat a continuation of this topic: https://moz.com/community/q/january-2016-massive-rankings-fluctuations. After that update, several of our clients' sites, as well as our own, went through a period of high ranking fluctuations, which ended in huge drops of 10-20 spots. This, obviously, made everyone unhappy. Does anybody know what exactly the change was about? What should we fix, take another look at, or analyze again? We aren't using any shady techniques or black hat. Everything is honest. All metrics (number of backlinks, etc.) are going up, and no major changes have been made recently. Please help!
Intermediate & Advanced SEO | DmitriiK
-
New website: what is the best way to recover the authority of the old domain name?
How do I recover the authority of the old domain name? I got some advice on this in another post here on Moz; based on that, I need a few answers. To summarize: my client got some REALLY bad advice when they got their new website. They ended up changing the domain name and just redirecting everything from the old domain and old website to the front page of the new domain and new website. As the new domain is not optimized for SEO, they of course are no longer ranking on anything in Google. QUESTION 1: According to my client, they used to rank well on keywords for the old domain and got a lot of organic traffic. They don't have access to their old Google Analytics account and don't have any reports on their rankings. Can anyone suggest how I can find out what keywords they were ranking on? QUESTION 2: I will change the domain name back to the old domain name (the client actually prefers the old domain name). But how do I get back as much page authority as possible? For information: titles, descriptions, and content have all been rewritten. A - Redirect: I will try to match the old URLs with the new ones (see the sketch below). B - Recreate site structure: make the URL structure of the new website look like the old URL structure. E.g., the old structure used to be olddomain.com/our-destinations/cambadia.html (old) vs. newdomain.com/destinations/Cambodia (new), or olddomain.com/private-tours.html (old) vs. newdomain.com/tailor-made (new). Does the .html in the old URLs need any attention when recreating the permalinks on the new website? Looking forward to hearing your thoughts on this, thanks!
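For A, my working plan is to build the old-to-new mapping by hand in a spreadsheet and then generate the redirect rules from it. A sketch (the mapping-file layout and the Apache "Redirect 301" syntax are my assumptions; adapt to your server):

```python
import csv

# Hypothetical mapping file with two columns: old_path, new_path, e.g.
#   /our-destinations/cambadia.html -> /destinations/cambodia
with open("url_mapping.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Emit one permanent redirect per mapped URL (Apache syntax assumed).
        print(f"Redirect 301 {row['old_path']} {row['new_path']}")
```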
Intermediate & Advanced SEO | nm1977
-
Penguin 2.0 recovery - has the Penguin update been rerun yet or not?
I was hit by the Penguin 2.0 update some five months back. I believe an algorithmic penalty has been applied to my sites. While the cleanup work etc. has been done, there has been no recovery whatsoever. I also notice a lack of recovery stories. In fact, I suspect anyone affected cannot recover until a recalculation happens. Does anyone think that a recalculation of the Penguin 2.0 penalties has happened? If so, why do you think that?
Intermediate & Advanced SEO | Jurnii
-
PR dropped from 3 to 0. Why?
Hi, I have an important question and I hope you'll be able to help me find the answer. My site http://www.pokeronlineitalia.com had a PR of 3 (three). Then I had it restructured to make it look more appealing, in order to increase the conversion rate. The problem is that after it was redesigned, the PR suddenly dropped from 3 to 0. This is really bad, as it took me over three years to reach that point. Could you please analyze the site and find out what happened? I used SEOmoz's research tools and noticed the following message: "Accessible to Engines - Easy fix. Crawl status - Status Code: 200; meta-robots: noindex, follow; meta-refresh: None; X-Robots: None. Explanation: Pages that can't be crawled or indexed have no opportunity to rank in the results. Before tweaking keyword targeting or leveraging other optimization techniques, it's essential to make sure this page is accessible. Recommendation: Ensure the URL returns the HTTP code 200 and is not blocked with robots.txt, meta robots or x-robots protocol (and does not meta refresh to another URL)." Basically, the message says that search engines cannot access the homepage (http://www.pokeronlineitalia.com). Could this be the reason the PR dropped? What do I have to do to solve this problem? Is there a chance I can reach a PR of 3 again? Thank you very much for your help. It'd be great if you could help my site regain its SEO strength.
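A quick way to check whether the redesigned homepage still carries the noindex flagged in that report (a rough standard-library sketch; a proper audit would use an HTML parser instead of a regex):

```python
import re
import urllib.request

url = "http://www.pokeronlineitalia.com"
html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")

# Look for a meta robots tag; this simple regex assumes name= comes before
# content=, which is usually but not always true.
tag = re.search(r'<meta[^>]*name=["\']robots["\'][^>]*>', html, re.IGNORECASE)
if tag and "noindex" in tag.group(0).lower():
    print("noindex found - search engines are told to drop this page.")
elif tag:
    print("meta robots tag present, no noindex:", tag.group(0))
else:
    print("No meta robots tag found.")
```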
Intermediate & Advanced SEO | salvyy
-
Rel canonical issues on WordPress posts
Our site has 500 rel canonical issues. This is the way I understand the issues: all our blog posts automatically include a rel=canonical tag pointing to themselves, e.g. a blog post about content marketing has a canonical pointing to its own URL. Should this tag point to one of the main pages instead, so the link juice is sent back to our home page?
Intermediate & Advanced SEO | acs111
-
How to compete with duplicate content in a post-Panda world?
I want to fix the duplicate content issues on my e-commerce website. I have read a very valuable blog post on SEOmoz regarding duplicate content in the post-Panda world and applied all of its strategies to my website. Let me give one example: http://www.vistastores.com/outdoor-umbrellas. Non-WWW version: http://vistastores.com/outdoor-umbrellas redirects to the home page. For HTTPS pages (https://www.vistastores.com/outdoor-umbrellas), I have created a robots.txt file covering all HTTPS pages (https://www.vistastores.com/robots.txt) and set rel=canonical to the HTTP page (http://www.vistastores.com/outdoor-umbrellas). Narrow-by-search: my website has narrow-by-search navigation that generates pages with the same meta info, e.g. http://www.vistastores.com/outdoor-umbrellas?cat=7, http://www.vistastores.com/outdoor-umbrellas?manufacturer=Bond+MFG, and http://www.vistastores.com/outdoor-umbrellas?finish_search=Aluminum. I have restricted all of these dynamic pages via robots.txt (http://www.vistastores.com/robots.txt) and set rel=canonical to the base URL on each dynamic page. Order-by pages (http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name): restricted via robots.txt, with rel=canonical set to the base URL. Pagination pages (http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name&p=2): restricted via robots.txt, with rel=next and rel=prev on all paginated pages and rel=canonical set to the base URL. I have applied all of these SEO suggestions, but Google is still crawling and indexing 21K+ pages, while my website has only 9K product pages. Google search result: https://www.google.com/search?num=100&hl=en&safe=off&pws=0&gl=US&q=site:www.vistastores.com&biw=1366&bih=520. In the last 7 days, my website's impressions and CTR have dropped by 75%. I have explained my question at this length because I want to recover my traffic and perform as well as before, as soon as possible.
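One thing I am now auditing: pages blocked in robots.txt generally can't be crawled at all, so the rel=canonical (and rel=next/prev) on them may never be seen, which could explain why Google still indexes 21K+ pages. A quick check with Python's standard robots.txt parser, using URLs from the question:

```python
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser("http://www.vistastores.com/robots.txt")
rp.read()

# Parameter URLs from the question that are restricted via robots.txt.
test_urls = [
    "http://www.vistastores.com/outdoor-umbrellas?cat=7",
    "http://www.vistastores.com/outdoor-umbrellas?manufacturer=Bond+MFG",
    "http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name&p=2",
]

for url in test_urls:
    if not rp.can_fetch("Googlebot", url):
        # Blocked: Googlebot can't fetch the page, so the canonical tag,
        # rel=next/prev, and meta robots on it go unread.
        print(f"BLOCKED   {url}")
    else:
        print(f"crawlable {url}")
```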
Intermediate & Advanced SEO | CommercePundit
-
How can scraper sites be successful post-Panda?
I read this article on SEJ: http://www.searchenginejournal.com/scrapers-and-the-panda-update/34192/ and I'm a bit confused as to how a scraper site can be successful post-Panda. Didn't Panda specifically target sites with duplicate content, and shouldn't scraper sites actually be suffering?
Intermediate & Advanced SEO | nicole.healthline