Panda 3.7 recovery?
-
In June '12 a bunch of our keyword rankings got whacked by Panda 3.7, and we haven't recovered. This is really frustrating, since we had been hit by the original Panda and spent months fixing our site, finally recovering in December '11.
I've read what I can find about this update to see if there is something specific about it that would have knocked us down again, and I can't find anything.
Does anyone know of any specific issues that this update supposedly "fixed"?
-
Hi Ian,
From a quick glance, I can see that your internal linking structure is extreme: http://www.seomoz.org/blog/smarter-internal-linking-whiteboard-friday
Watch that video; it should give you some ideas.
Also read this: http://www.seomoz.org/blog/internal-linking-strategies-for-2012-and-beyond
-
Ian, here is some info from Dr. Pete at SEOmoz. Dr. Pete is my go-to guy when it comes to digging deep into the weeds of a particular Google update. He dissects Google's algorithm updates like I eat Pop-Tarts for breakfast. Hopefully some of his findings will help you refine your search for clues. Here is his post on the subject; make sure you follow him:
http://www.seomoz.org/blog/the-bigfoot-update-aka-dr-pete-goes-crazy
I would make one other suggestion. Remember, Panda is about content. Instead of only reacting to Panda's algorithm updates, start working through your existing content and your new-content strategy, and make sure you are generating relevant content that isn't over-optimized. The drop in specific keywords tells you where to start, but stay ahead of the curve: proactively look at other keywords that weren't affected but where you might have similar content issues.
Good luck. Hope this helps..
Mark
-
Link to our site. Where did you get the info on what Panda 3.7 was about?
-
Panda 3.7 was not only about duplicate content; it also targeted over-optimization. Having 5,000 words on a page while holding your keyword at a 10% density is over-optimizing, and so is stuffing your footer with internal links to the pages you are trying to rank. In other words, if you are creating your pages for a search engine, you will not rank in the search engine. Unfortunately, without your URL it is hard to give an exact answer to your question.
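The 10% keyword ratio mentioned above is a community rule of thumb, not an official Google threshold, but it is easy to sanity-check your own pages against it. Here is a minimal sketch (the sample copy and phrase are made up for illustration):

```python
import re

def keyword_density(text, phrase):
    """Return the share of the page's words accounted for by `phrase` occurrences."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count exact phrase matches with a sliding window over the word list.
    hits = sum(
        1
        for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return (hits * n) / len(words)

copy = "quilt patterns for every quilter: browse quilt patterns and more quilt patterns"
print(round(keyword_density(copy, "quilt patterns"), 2))  # → 0.5, i.e. 50% of words
```

Anything approaching double digits on real body copy is a sign the page was written for the engine rather than the reader.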
Related Questions
-
Help, no organic traffic recovery after new site launch (it's been 6 months)!
I worked with a team of developers to launch a new site back in March. I was (and still am) in charge of SEO for the site, including combining 4 sites into 1. I made sure 301 redirects were in place to merge the sites, and applied pretty much every SEO tactic I could think of to maintain rankings after launch. However, here we are 6 months later, and YoY organic traffic is down 70% on average. Would anyone mind taking a look at http://www.guestguidepublications.com and seeing if there's a glaring mistake I'm missing? Thanks ahead of time!
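When auditing a 4-into-1 merge like this, one common gap is legacy URLs that never got a redirect target. A sketch of a host-based mapping you can generate redirect rules from and spot-check against your old sitemaps (the domain names here are hypothetical placeholders, not the poster's actual sites):

```python
from urllib.parse import urlsplit

# Hypothetical mapping from each legacy domain to its section on the merged site.
LEGACY_SECTIONS = {
    "magazine.example.com": "/magazine",
    "guides.example.com": "/guides",
    "maps.example.com": "/maps",
    "deals.example.com": "/deals",
}

def new_url(old_url, new_root="https://www.example.com"):
    """Translate a legacy URL into its 301 target on the merged site."""
    parts = urlsplit(old_url)
    section = LEGACY_SECTIONS.get(parts.netloc)
    if section is None:
        return None  # unknown host: flag for manual review instead of guessing
    return f"{new_root}{section}{parts.path}"

print(new_url("http://guides.example.com/austin/hotels"))
```

Running every URL from the old XML sitemaps through a function like this, then checking each target actually returns 200, catches the orphaned pages that quietly bleed 70% of the traffic.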
Intermediate & Advanced SEO | Annapurna-Digital1
2.3 million 404s in GWT - learn to live with 'em?
So I’m working on optimizing a directory site. Total size: 12.5 million pages in the XML sitemap. This is orders of magnitude larger than any site I’ve ever worked on – heck, every other site I’ve ever worked on combined would be a rounding error compared to this. Before I was hired, the company brought in an outside consultant to iron out some of the technical issues on the site. To his credit, he was worth the money: indexation and organic Google traffic have steadily increased over the last six months. However, some issues remain. The company has access to a quality (i.e. paid) source of data for directory listing pages, but the last time the data was refreshed some months back, it threw 1.8 million 404s in GWT. That has since started to grow progressively higher; now we have 2.3 million 404s in GWT. Based on what I’ve been able to determine, links on this particular site relative to the data feed are broken generally due to one of two reasons: the page just doesn’t exist anymore (i.e. wasn’t found in the data refresh, so the page was simply deleted), or the URL had to change due to some technical issue (page still exists, just now under a different link). With other sites I’ve worked on, 404s aren’t that big a deal: set up a 301 redirect in htaccess and problem solved. In this instance, setting up that many 301 redirects, even if it could somehow be automated, just isn’t an option due to the potential bloat in the htaccess file. Based on what I’ve read here and here, 404s in and of themselves don’t really hurt the site indexation or ranking. And the more I consider it, the really big sites – the Amazons and eBays of the world – have to contend with broken links all the time due to product pages coming and going. 
Bottom line, it looks like if we really want to refresh the data on the site on a regular basis – and I believe that is priority one if we want the bot to come back more frequently – we'll just have to put up with broken links on the site on a more regular basis. So here's where my thought process is leading:
1. Go ahead and refresh the data. Make sure the XML sitemaps are refreshed as well – hopefully this will help the site stay current in the index.
2. Keep an eye on broken links in GWT. Implement 301s for really important pages (i.e. content-rich stuff that is really mission-critical). Otherwise, just learn to live with a certain number of 404s being reported in GWT on more or less an ongoing basis.
3. Watch the overall trend of 404s in GWT. At least make sure they don't increase. Hopefully, if we can make sure that the sitemap is updated when we refresh the data, the 404s reported will decrease over time.
We do have an issue with the site creating some weird pages with content that lives within tabs on specific pages. Once we can clamp down on those and a few other technical issues, I think keeping the data refreshed should help with our indexation and crawl rates. Thoughts? If you think I'm off base, please set me straight. 🙂
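One note on the htaccess-bloat worry above: at this scale, Apache's RewriteMap (configured in the server config, not .htaccess) can serve millions of 301s from a key/value file without bloating anything. A sketch of generating that map from the data-refresh diff (the listing paths are hypothetical examples):

```python
# Hypothetical moved-page data from the data refresh: old path -> new path.
moved = {
    "/listing/old-acme-co": "/listing/acme-company",
    "/listing/widget-world-tx": "/listing/widget-world-texas",
}

def write_rewrite_map(moved, path="redirects.txt"):
    """Emit a plain key/value file that Apache can consume via RewriteMap."""
    with open(path, "w") as f:
        for old, new in sorted(moved.items()):
            f.write(f"{old} {new}\n")
    return path

write_rewrite_map(moved)
print(open("redirects.txt").read(), end="")
```

The matching Apache config would look something along these lines (verify against the mod_rewrite docs for your Apache version; `httxt2dbm` converts the text map to a faster DBM file): `RewriteMap redirmap dbm:/etc/apache2/redirects.dbm`, then `RewriteCond ${redirmap:$1} !^$` and `RewriteRule ^(.*)$ ${redirmap:$1} [R=301,L]`. Pages that genuinely no longer exist can stay 404 (or 410), which Google treats as normal for large sites.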
Intermediate & Advanced SEO | ufmedia0
Better SEO Option, 1 Site 3 Subdomains or 4 Separate Sites?
Hey Mozzers, I'm working with a client who wants to redo their web presence. They have a main website for the umbrella brand, and 3 divisions that each have their own website as well. My question: is it better to keep the main site on the main domain and make the 3 division sites subdomains? Or to run 4 different domains with a linking structure tying them all together? As I understand it, option 1 would consolidate all the traffic on one domain, while option 2 would build Page Authority by having 4 different sites linking to each other. My guess is option 2 – but only once all 4 sites gain enough relevant authority to make the links valuable; right out of the gate, option 1 might be more beneficial. A little advice/clarification would be great!
Intermediate & Advanced SEO | MonsterWeb280
Our quilting site was hit by Panda/Penguin...should we start a second "traffic" site?
I built a website for my wife, who is a quilter: LearnHowToMakeQuilts.com. However, it has been hit by Panda or Penguin (I'm not quite sure which), and I'm scared to tell her to go ahead and keep building the site up. She really wants to post on her blog on LearnHowToMakeQuilts.com, but I'm afraid it will be in vain as far as Google's search engine is concerned. Yahoo and Bing still rank us well. I don't want her to produce good content that will never rank if the whole site is penalized in some way. I over-optimized by linking heavily with the exact keyword "how to make a quilt" – mainly to the home page – and I think that is one of the main reasons we are incurring some kind of penalty. First main question: from looking at the attached Google Analytics image, can anyone tell whether it was Panda or Penguin that hit us? And what can be done about it? (We originally wanted to build a nice content website, but were lured in by a get-rich-quick personality into making the home page a "squeeze page" and forcing everyone through it to get to the really good content. Thus our average time on site is terrible and pages per visit is low: 1.2. We really want to improve that some day.) She has a local business website, CustomCareQuilts.com, that did not get hit. Second question: should we start a second site rather than invest the time in trying to repair the damage from my bad link building and article marketing? We do need to keep the site up and running because it hosts her online quilting course for beginner quilters. We host the videos through Amazon S3 and were selling at least one course every other day, but since the Google drop we are lucky to sell one quilting course per month. If we start a second site, we could build it into a big content site that introduces people to LearnHowToMakeQuilts.com and Martha's quilting course.
So, should we go ahead and start a fresh site rather than repair the damage done by my over-optimizing? (We've already picked out a great website name that would work really well with her personal Facebook page.) Or, a second option: use her local business website, CustomCareQuilts.com. She created it in 2003 and has had it ever since; it is only PR 1. Would this be an option? Either way, I'm looking for guidance on whether we should repair the damage, start a second fresh site, or use an existing site to create new content (to get new quilters to eventually purchase her course). Brad & Martha Novacek
Intermediate & Advanced SEO | BradNovi0
Penguin: Recovery from Algorithm
Hi Mozzers, A quick question regarding Google Penguin recovery. A domain I own was hit by Penguin, and we got a message in Webmaster Tools. We went to work fixing the links we thought were most harmful, documented the evidence, and filed a reconsideration request. Google's reply was "No manual spam actions found". If reconsideration requests aren't the way to go, then of course I will continue to build good natural links. But if I remove more links that I consider harmful, will I ever know whether the penalty is removed? Is there a point at which the algorithm would lift the penalty and inform me? Thanks!
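For links you cannot get removed after outreach, one option (assuming Google's disavow tool is available to you) is to file a disavow file; its documented format is one `domain:` or URL entry per line, with `#` comments. A minimal sketch of generating one (the link URLs here are hypothetical examples, not real domains from this thread):

```python
from urllib.parse import urlsplit

# Hypothetical link sources we requested removal of but got no response on.
bad_links = [
    "http://spammy-directory.example/widgets/page1.html",
    "http://spammy-directory.example/widgets/page2.html",
    "http://article-farm.example/seo-article",
]

def build_disavow(urls, whole_domains=True):
    """Build the text of a disavow file from a list of bad link URLs."""
    lines = ["# Links we requested removal of but received no response"]
    if whole_domains:
        # Disavowing at the domain level catches every page on the spammy site.
        domains = sorted({urlsplit(u).netloc for u in urls})
        lines += [f"domain:{d}" for d in domains]
    else:
        lines += urls
    return "\n".join(lines) + "\n"

print(build_disavow(bad_links))
```

To the algorithmic question itself: Penguin is algorithmic, so there is no notification when it lifts – you only see it in rankings after the filter reprocesses your link data.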
Intermediate & Advanced SEO | panini0
E-Commerce Site Selling Air Filters. Only 3 Quality Options in 50 Sizes, 150 Items Total. HELP!!!
My online store sells furnace air filters. We only have 3 different filters (standard, mid-range quality, and high quality), and each filter is available in 50 different sizes. That's a TOTAL of 150 products, or 3 products with 50 options!!! My store is set up with the "150 products" option. MY PROBLEM: all the page titles are the same; only the size changes. For example:
10x20x1 furnace filters - shop at furnace filters canada
12x20x1 furnace filters - shop at furnace filters canada
14x20x1 furnace filters - shop at furnace filters canada
etc. It's the same with the meta descriptions and the product descriptions – all identical except for the size. Coming up with 150 different page titles, metas, and product descriptions is almost impossible. And you know as well as I do, most shoppers will include their filter size in their search term or phrase. YES, I have duplicate content all over my store. Is there a solution to this? This is my online store: http://www.furnacefilterscanada.com/ Thank you, BigBlaze
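Templating can at least break the near-duplication without hand-writing 150 pages: vary the copy on attributes other than the size, such as which furnace types each depth fits. A rough sketch (the use-case text and naming are invented for illustration – real pages still benefit from hand-written copy on the top-selling sizes, or a canonical pointing at 3 option-based pages):

```python
# Hypothetical copy fragments keyed on filter depth, so pages differ by more
# than the bare size string.
USE_CASES = {
    "1": "standard furnaces and return-air grilles",
    "2": "media cabinets needing a 2-inch filter",
}

def page_copy(width, height, depth, quality):
    """Generate a distinct title and meta description for one size/quality combo."""
    size = f"{width}x{height}x{depth}"
    title = f"{size} {quality} Furnace Filter | Furnace Filters Canada"
    meta = (
        f"Order the {size} {quality.lower()} pleated filter, sized for "
        f"{USE_CASES.get(depth, 'standard furnaces')}. Ships across Canada."
    )
    return title, meta

title, meta = page_copy(10, 20, "1", "High Quality")
print(title)  # → 10x20x1 High Quality Furnace Filter | Furnace Filters Canada
```

Generating all 150 combinations is then a triple loop over sizes and qualities, and every page gets a title and meta that differ in more than one token.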
Intermediate & Advanced SEO | BigBlaze2050
Panda/Penguin & Ecommerce Sites in similar niches
Hello, We have a few online stores that are in similar niches. How do we make sure we don't get penalized for this (Panda/Penguin)? We have the sites interlinked, but our newest one is not going to be linked to the others. Also, will rewriting the descriptions help when a product appears on more than one site? Thanks!
Intermediate & Advanced SEO | BobGW0
Are widgets dangerous after the Panda update?
My site provides widgets (online polls) that were developed so that each one embeds a dofollow text link back to the customer's website. With Panda's unnatural-link algorithm now in place, should I modify these links to nofollow and give up on this strategy, or alternatively just set the anchor text to my site's domain name? The only other option I could think of was to only embed followed links where the customer's site has a certain PageRank or above. Any thoughts?
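If the nofollow route is chosen, the change is small: the widget's embed generator just adds `rel="nofollow"` to the credit link. A minimal sketch (the function name, URL scheme, and poll ID are hypothetical, not the poster's actual widget code):

```python
from html import escape

def widget_embed(poll_id, site_name, base="https://polls.example.com"):
    """Build an embed snippet whose credit link carries rel="nofollow"."""
    url = f"{base}/embed/{poll_id}"
    return (
        f'<iframe src="{url}" width="300" height="250"></iframe>\n'
        # Branded anchor text plus nofollow keeps the credit link useful to
        # visitors without passing manipulative link equity.
        f'<a href="{base}" rel="nofollow">Poll by {escape(site_name)}</a>'
    )

print(widget_embed("abc123", "Blendfish Polls"))
```

Keeping the anchor text as the brand or domain name (rather than a money keyword) is the other half of de-risking widget links, whether or not they stay followed.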
Intermediate & Advanced SEO | Blendfish1