Panda 3.7 recovery?
-
In June '12 a bunch of our keyword rankings were whacked by Panda 3.7, and we haven't recovered. This is really frustrating, since we were hit by the original Panda and spent months fixing our site until we recovered in December '11.
I've read what I can find about this update, hoping to learn whether something specific about it would have knocked us down again, and I can't find anything.
Does anyone know of any specific issues that this update supposedly "fixed"?
-
Hi Ian,
From a quick glance I can see that your internal linking structure is extreme: http://www.seomoz.org/blog/smarter-internal-linking-whiteboard-friday
Watch this video; it should give you some ideas.
Also read this: http://www.seomoz.org/blog/internal-linking-strategies-for-2012-and-beyond
-
Ian, here is some info from Dr. Pete at SEOmoz. Dr. Pete is my go-to guy when it comes to digging deep into the weeds of a particular Google update; he dissects Google's algorithm updates like I eat Pop-Tarts for breakfast. Hopefully some of his findings will help you refine your search for clues. Here is his post on the subject; make sure you follow him:
http://www.seomoz.org/blog/the-bigfoot-update-aka-dr-pete-goes-crazy
I would make one other suggestion. Remember, Panda is about content. Instead of only reacting to Panda's algorithm updates, work through your existing content and your new-content strategy, and make sure you are generating relevant content that isn't over-optimized. The drop in specific keywords tells you where to start, but stay ahead of the curve: proactively review other keywords that weren't affected but where you might have similar content issues.
Good luck. Hope this helps.
Mark
-
Link to our site. Where did you get the info on what Panda 3.7 was about?
-
Panda 3.7 was not only about duplicate content; it also targeted over-optimization. Having 5,000 words on a page while keeping your keywords at a 10% ratio is over-optimizing; so is having tons of internal links in your footer to pages you are trying to rank for. In other words, if you are creating your pages for a search engine, you will not rank in the search engine. Unfortunately, without your URL it is hard to give an exact answer to your question.
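That 10% keyword ratio is easy to sanity-check yourself. A rough sketch in Python; the density formula (keyword words as a share of total words) is a simplification, and the sample text is made up:

```python
import re

def keyword_density(text, keyword):
    """Percentage of the page's words taken up by occurrences of the
    (possibly multi-word) keyword. Rough sanity check, not a real analyzer."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    # Count non-overlapping-agnostic phrase matches across the word list.
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == kw)
    return 100.0 * hits * n / len(words)

sample = "blue widgets are great blue widgets for every blue widget fan"
print(round(keyword_density(sample, "blue widgets"), 1))  # → 36.4
```

Anything near 10% on a long page is far beyond what reads naturally; well-written copy usually sits in the low single digits.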
Related Questions
-
Panda, rankings and other nonsense issues
Hello everyone, I have a problem here. My website has been hit by Panda several times in the past: the first time back in 2011 (the first Panda ever), then another couple of times since, and most recently in June 2016 (either Panda or Phantom, not clear yet). In other words, it looks like my website is very prone to "quality" updates by big G: http://www.virtualsheetmusic.com/ I am still trying to understand how to get rid of Panda-related issues once and for all after so many years of tweaking and cleaning my website of possible duplicate or thin content (301 redirects, noindexed pages, canonicals, etc.), and I have tried everything, believe me. You name it. We recovered several times, but once in a while we are still hit by that damn animal. It really looks like we are in the so-called "grey area" of Panda, where we are "randomly" hit by it every so often. Interestingly enough, some of our competitors live joyful lives at the top of the rankings without caring at all about Panda and such, and I can't really make sense of it. Take for example this competitor of ours: http://8notes.com They have a much smaller catalog than ours, worse quality of offered music, thousands of duplicate pages, ads everywhere, and yet they are able to rank 1st on the 1st page of Google for most of our keywords. And by most, I mean 99.99% of them. Take for example "violin sheet music", "piano sheet music", "classical sheet music", "free sheet music", etc.: they are always first. As I said, they have a much smaller website than ours, with a much smaller offering, and their content quality is questionable (not curated by professional musicians, with sloppily done content and design), yet they have over 480,000 pages indexed on Google, mostly duplicates. They don't bother with canonicals to avoid duplicate content, 301s, noindex, robots tags, etc., nor with adding text or user reviews to avoid "thin content" penalties;
they really don't care about any of that, and yet they rank 1st. So, to all the experts out there, my question is: why? What is the sense or the logic behind that? And please don't tell me they have stronger domain authority, more linking root domains, etc., because given the duplicate and thin-content issues I see on that site, nothing can justify their positions in my opinion. Mostly, I can't find a reason why we are so heavily penalized by Panda and similar "quality" updates when they are released, whereas websites like 8notes.com rank 1st, mocking the mighty Panda all year round. Thoughts?!
Intermediate & Advanced SEO | fablau
-
How do I know if my website was hit by Panda or Penguin?
My website's traffic and keyword rankings are dropping day by day. How can I tell whether the site was hit by Panda or Penguin? The website is 24hourpassportandvisas. com
Intermediate & Advanced SEO | bondhoward
-
Panda Recovery Question
Dear friends, one of my customers was hit by Panda. We worked on improving the thin content on several pages, and the remaining pages were: 1. set to NOINDEX/FOLLOW; 2. removed from sitemap.xml; 3. unlinked from the site (no page on the site links to the poor content). In conclusion, we can't see any improvement. My question is: should I remove the poor-content pages entirely (404)? What is your recommendation? Thank you for your time. Claudio
Intermediate & Advanced SEO | SharewarePros
-
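For reference, the NOINDEX/FOLLOW in step 1 of that question is normally expressed as a meta robots tag in each affected page's head; this is the standard syntax, not anything specific to the site in question:

```html
<!-- Keep the page crawlable and let its links pass equity,
     but keep the page itself out of the index. -->
<meta name="robots" content="noindex,follow">
```

Note that the tag only works if crawlers can actually fetch the page, so such pages must not also be blocked in robots.txt.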
3 WordPress sites & 1 Tumblr site coming under 1 domain (4 subdomains) on WPMU: proper redirect?
Hey guys, witnessSF.org (WP), witnessLA.org (Tumblr), witnessTO.com (WP), witnessHK.com (WP), and witnessSEOUL.com (new site, no redirects needed) are being moved over to sf.ourwitness.com, la.ourwitness.com, and so forth, all under one large WordPress MU instance. Some have hundreds of articles/links, others a bit less. What is the best method to take? I understand there are easy blanket redirects and the complete, fully manual, one-link-at-a-time approach. Even WP to WP, the permalinks are changing from domain.com/date/post-name to domain.com/post-name. Here are some options: 1) Just redirect all previous witnessla.org/* to la.ourwitness.com/ (automatically pointing all pages at the home page; easiest, not the best). 2) Pull the top-trafficked URLs from Google Analytics (about 50 URLs have significant rankings and traffic in LA's sample) and redirect just those to custom destinations (the most bang for the buck: the articles that rank get manually mapped to the correct place). 3) Is the best of both worlds possible, perhaps automated? I prefer working with .htaccess over a redirect plugin for speed reasons. Please advise. Thanks guys!
Intermediate & Advanced SEO | vmialik
-
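The hybrid approach in that question (pattern rules plus hand-picked mappings, then a catch-all) fits naturally in the old domain's .htaccess. A hypothetical sketch for witnessla.org: the hostnames come from the question, but the slugs and the date pattern are assumptions and would need adjusting to the real permalink structure:

```apache
RewriteEngine On

# Hand-picked high-traffic URLs whose slugs changed (hypothetical slugs):
RewriteRule ^old-slug/?$ http://la.ourwitness.com/new-slug/ [R=301,L]

# Dated permalinks map to the new date-less structure:
#   /2012/06/some-post/ -> http://la.ourwitness.com/some-post/
RewriteRule ^\d{4}/\d{2}/(.+)$ http://la.ourwitness.com/$1 [R=301,L]

# Everything else falls back to the new home page.
RewriteRule ^ http://la.ourwitness.com/ [R=301,L]
```

Order matters: the specific rules must come before the catch-all, and the `L` flag stops processing once a rule matches.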
Backlinking 3 sites from the same domain, and backlinking the main site too
Hello, we have 4 sites: 1 main site and 3 niche sites. All 3 niche sites have dofollow links to the main site from their home pages. We got a high-quality backlink, which all 3 niche sites received from the same domain. Is it worth adding a backlink from that domain to the main site too, despite the fact that the 3 niche sites have already received it and they all link to the main site? Many thanks.
Intermediate & Advanced SEO | Modi
-
Fastest Way To Remove Footer Links? (post-Panda)
Hello, I have a website with 1k+ links pointing directly to an inner page and the home page from blogspot domains. There are 3 links in the footer that point to different locations: the 1st anchor text points to the person who designed the page template and links to their website (this doesn't affect us); the 2nd anchor text uses a direct keyword that I am trying to rank for and links to the inner page; the 3rd anchor text uses my website name and links to the home page. I know these are not good links, and the content of those pages is irrelevant to my own website. The links are embedded into the template footer and are site-wide. I have already contacted the designer and had the links removed, but those who downloaded the template still have the footer link. What would be the best way to remove all these footer links? Trying to contact each individual person who is using the template is not working out, as most have not responded and some of the websites have not seen an update in years! Any thoughts? If you need additional information, feel free to send me a direct message so I can send you an exact link.
Intermediate & Advanced SEO | Shawn124
-
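Since chasing each template user by email isn't working, the 1k+ blogspot domains could at least be triaged automatically: fetch each page and check whether the footer link is still present, then prioritize (or disavow) the ones that still carry it. A minimal sketch of the check itself; the domain and markup below are made up for illustration:

```python
import re

def has_link_to(html, target_domain):
    """Return True if any anchor's href mentions the target domain.
    Quick triage only; a real crawl should use an HTML parser."""
    hrefs = re.findall(r'href=["\']([^"\']+)["\']', html, re.IGNORECASE)
    return any(target_domain in href for href in hrefs)

# Hypothetical footer snippet from one of the blogspot templates:
footer = '<div id="footer"><a href="http://example-mysite.com/page">keyword</a></div>'
print(has_link_to(footer, "example-mysite.com"))  # True
```

For the domains that won't remove the link, Google's disavow tool accepts a plain-text list of `domain:` lines as a last resort.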
Can't seem to get traffic back post-Panda/Penguin. Why?
I have done, and am doing, everything I can think of to bring back the traffic we lost to the late-2012 Google updates. It just is not working. We had some issues with our out-of-house web developers, who messed up our site in 2012, and after taking development in house we have been doing damage control for months now. We think we have fixed pretty much everything: the URL structure; filling the site with good unique content (under way, lots still to do); better category descriptions; a redesigned homepage; updated product pages (the CMS is holding things back, otherwise they would be better; a new CMS is under construction); more link building (a real weak spot in our SEO, as far as I can see); auditing bad links from dodgy irrelevant sites; hiring writers to create content and link-bait articles; and making high-quality videos for both YouTube (brand awareness and viral reach) and on-site hosting (link building and conversions; in the pipeline, not online yet). We have also flattened the site architecture and optimised internal link flow (got this wrong by using nofollows; now working on a better approach by reducing unwanted nav links on the page). I realise it's not all done, but I have been working ever since the traffic dropped and I'm just seeing no increase at all. I have been asking a few questions on here for the past few days but still can't put my finger on the issue. Am I just impatient and need to wait for the traffic, since I am doing all the correct things? Or have I missed something and need to fix it? If anyone would like to have a quick look at my site and see if there is an obvious issue I have missed, it would be great, as I have been tearing my hair out trying to find the problem. It's www.centralsaddlery.co.uk. Criticism would be much appreciated.
Intermediate & Advanced SEO | mark_baird
-
How to compete with duplicate content in a post-Panda world?
I want to fix the duplicate-content issues on my eCommerce website. I have read a very valuable SEOmoz blog post about duplicate content in the post-Panda world and applied every strategy to my website. Let me give one example: http://www.vistastores.com/outdoor-umbrellas
Non-WWW version: http://vistastores.com/outdoor-umbrellas redirects to the home page.
HTTPS pages (https://www.vistastores.com/outdoor-umbrellas): I have blocked all HTTPS pages via robots.txt (https://www.vistastores.com/robots.txt) and set rel=canonical to the HTTP page: http://www.vistastores.com/outdoor-umbrellas
Narrow-by-search: my website has narrow-by-search pages that share the same meta info, such as http://www.vistastores.com/outdoor-umbrellas?cat=7 http://www.vistastores.com/outdoor-umbrellas?manufacturer=Bond+MFG http://www.vistastores.com/outdoor-umbrellas?finish_search=Aluminum
I have blocked all the dynamic pages generated by narrow-by-search via robots.txt (http://www.vistastores.com/robots.txt) and set rel=canonical to the base URL on each dynamic page.
Order-by pages (http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name): blocked with robots.txt, with rel=canonical set to the base URL.
Pagination pages (http://www.vistastores.com/outdoor-umbrellas?dir=asc&order=name&p=2): blocked with robots.txt, with rel=next & rel=prev on all paginated pages and rel=canonical set to the base URL.
I have done and applied all these SEO suggestions, but Google is crawling and indexing 21K+ pages while my website has only 9K product pages. Google search result: https://www.google.com/search?num=100&hl=en&safe=off&pws=0&gl=US&q=site:www.vistastores.com&biw=1366&bih=520
Over the last 7 days my website's impressions & CTR have dropped by 75%. I want to recover and perform as well as before.
I have explained my question at length because I want to recover my traffic as soon as possible.
Intermediate & Advanced SEO | CommercePundit
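One widely documented interaction worth noting for that last question: URLs disallowed in robots.txt cannot be crawled, so Google never sees the rel=canonical or rel=prev/next tags placed on them, and blocked URLs can still be indexed from links alone. That combination would be consistent with 21K+ indexed pages. A sketch of what the second pagination page's head could carry once the robots.txt block is lifted, using URLs from the question (query parameters simplified):

```html
<!-- <head> of http://www.vistastores.com/outdoor-umbrellas?...&p=2 -->
<link rel="canonical" href="http://www.vistastores.com/outdoor-umbrellas" />
<link rel="prev" href="http://www.vistastores.com/outdoor-umbrellas" />
<link rel="next" href="http://www.vistastores.com/outdoor-umbrellas?p=3" />
```

In short: pick one mechanism per URL. Canonical tags consolidate duplicates only on pages crawlers are allowed to fetch.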