Panda penalty removal advice
-
Hi everyone! I'm after a second (or third, or fourth!) opinion here!
I'm working on the website www.workingvoices.com, which has a Panda penalty dating from the late March 2012 update. I have made a number of changes to remove potential Panda issues, but haven't seen any ranking movement in the last seven weeks and am wondering if I've missed something...
The main issues I identified and fixed were:
- Keyword-stuffed, near-duplicate title tags - fixed with relevant, unique title tags (a quick way to re-check these is sketched after this list)
- Copies of the website on other domains creating duplicate content issues - fixed by taking these offline
- Thin content - fixed by adding content to some pages, and noindexing other thin/tag/category pages.
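For anyone wanting to run the title-tag check themselves, here's a minimal sketch - the URL list is a placeholder; in practice you'd feed in your sitemap or a crawl export:

```python
# Minimal duplicate-title check: fetch each URL, extract its <title>,
# and group URLs that share the same (normalised) title text.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

# Placeholder list - in practice, pull these from your sitemap or a crawl export.
urls = [
    "http://www.workingvoices.com/",
    "http://www.workingvoices.com/courses/",
]

titles = defaultdict(list)
for url in urls:
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    title = soup.title.get_text(strip=True).lower() if soup.title else "(missing)"
    titles[title].append(url)

for title, pages in titles.items():
    if len(pages) > 1 or title == "(missing)":
        print(f"{title!r} appears on {len(pages)} page(s): {pages}")
```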
Any thoughts on other areas of the site that might still be setting off the mighty Panda are appreciated!
Cheers
Damon.
-
Our site was scraped by a former employee who started a competing business using our trade secrets, client list and designs. As they launched, they immediately tried to put us out of business. They:
A. Hired hacks to hit us with tons of spammy links, mixed in with porn and virus-injection sites.
B. Hired hacks from the same cesspool to submit our images to the same kinds of bad sites, which would take the customer somewhere else.
C. Signed up email addresses to our newsletter so that when we sent out an email it would trigger a chain reaction to zombie computers and launch a DDoS attack on our site, making our own email campaigns stop sales and trashing the confidence of the rest of the customers on the mailing list.
D. Gave out every known email address in our company to spammers, to the point of making it difficult to send or receive emails from customers.
E. Submitted our phone number to every robocall and junk-call list possible, tying up our phones and filling our voicemail.
"Regarding Panda timing - the site took the big hit three years ago." We too had this exact timing happen to us, on top of everything else, because we were too busy defending ourselves to keep up with the Google changes.
Regardless of all the horrifying past events, we have completely rebuilt the business from the inside out, migrated from our custom site to a BigCommerce website, and added five social media platforms. BUT... "having to wait for Google" to reindex and give us another chance is killing us, and we are concerned that we may never get back into the good graces of this SE titan.
Although we have survived the battle, we still may lose the war! Even with continuing efforts to optimize our site to death, and with only a fraction of the traffic, orders and income, we have to wonder:
A. What else is wrong - for example, is there duplicate content on sites out there that we are unaware of?
B. We are seriously considering dumping our domain (owned since 2000) and moving to a new domain that would be indexed fresh, with content hopefully optimized per Google's requirements, and taking our chances.
Input on considerations A & B would be appreciated, as we are pretty worn out after three years of working at this.
-
In this case it was easy as they had created the duplicate domains themselves and they had control over them, so it was just a case of getting them taken down.
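If you don't control the domains doing the duplicating, the classic way to find scraped copies is to search for an exact sentence from your pages in quotes. Here's a rough sketch that pulls the most distinctive (longest) sentences from a page to paste into Google/Bing - the URL is just a placeholder:

```python
# Rough helper: pull the longest sentences from a page so you can search
# for them (in quotes) on a search engine and spot scraped copies elsewhere.
import re

import requests
from bs4 import BeautifulSoup

url = "http://www.workingvoices.com/"  # placeholder - page to sample
resp = requests.get(url, timeout=10)
soup = BeautifulSoup(resp.text, "html.parser")

# Strip scripts/styles, then split the visible text into rough sentences.
for tag in soup(["script", "style"]):
    tag.decompose()
text = " ".join(soup.get_text(separator=" ").split())
sentences = re.split(r"(?<=[.!?])\s+", text)

# The longest sentences are usually the most distinctive search phrases.
for sentence in sorted(sentences, key=len, reverse=True)[:5]:
    print(f'"{sentence}"')
```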
-
How did you find the "copies of the website on other domains creating duplicate content issues"?
How did you get them "taken offline"?
We have been dealing with the same issues but did not think of the above, and would like to find out whether we have the same "duplicate" issues.
-
Yes, we do have Bing Webmaster Tools set up - I agree, even though Bing is limited in terms of traffic volume, Bing Webmaster Tools does give a slightly different take on things compared to Search Console.
Damon.
-
I'm also curious to know whether you've monitored Bing/Yahoo value over the course of your work. While it's rarely anywhere near Google's potential volume, I've seen good value gained from those as clients have implemented recommendations, even when Panda was a prime issue (and the subsequent Panda refresh was a problem).
Overall it does sound like you're on the right track though.
-
Hello again Alan!
Agree with you 100% that this is an ongoing process. I asked the question with regard to getting the new hosting set up ASAP - if it wasn't going to be taken into account for the latest Panda update, we would have a little more time.
As you say, having to wait almost a year for Google to rerun Panda is really difficult for everyone (not just us). It's a real pity that we didn't pick this up earlier, when Panda was running more regularly.
I've just run another crawl and we have 79 URLs returning 3xx redirects and 26 returning 4xx errors, most of which are thumbnail JPGs and category pages (which are noindexed anyway). As stated above, I'll get these fixed this week.
We completed a competitor content analysis and redeveloped our main landing pages around it. Together with our backlink profile, we think we've got a good chance of hitting the top ten SERP results - we are targeting some quite specific keywords with not particularly strong competition, and have gained some excellent backlinks over the last few months.
Once again, thanks for your insight and help!
Damon.
-
Regarding the 404/301 issues: the numbers I gave were for a small partial crawl of a hundred URLs, so a full Screaming Frog crawl would help to determine whether it's worse. Even if it's not, think of a site that has a dozen core problems plus twenty problems that by themselves might seem insignificant - at a certain point, something becomes the straw that breaks the camel's back.
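If you don't have a Screaming Frog licence handy, a bare-bones status check over a URL list can surface the same 301/404 issues - a minimal sketch (the URLs are placeholders; feed in every internal URL from your crawl or sitemap):

```python
# Bare-bones status check: request each URL without following redirects
# and report anything that redirects (3xx) or errors out (4xx/5xx).
import requests

# Placeholder list - in practice, use your full internal URL list.
urls = [
    "http://www.workingvoices.com/",
    "http://www.workingvoices.com/courses/",
]

for url in urls:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    if 300 <= resp.status_code < 400:
        print(f"{resp.status_code} {url} -> {resp.headers.get('Location')}")
    elif resp.status_code >= 400:
        print(f"{resp.status_code} {url}")
```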
Regarding content - how many of the courses offered are up against competitors with entire site sections devoted to a topic that gets just a single course page on this site? How many are up against entire sites devoted to that topic? Understanding content depth requires understanding the scale of real and perceived competition. And even if a course page isn't a "main" landing page, it's important in its own right.
Regarding Panda timing - the site took the big hit three years ago. Waiting and hoping that the next update is the one that will magically reflect whatever you've done to that point isn't, in my experience, a wise perspective.
It's true that once Google has locked a data set for a specific algorithmic update, nothing you change afterwards will be reflected until the next run. Even so, not taking action at a high enough level, and with enough consistency, is gambling. Since true best-practices marketing as a whole needs to be ongoing, efforts to strengthen on-site signals and signal relationships also need to be ongoing - because even if Panda weren't a factor, the competitive landscape is ever marching forward.
-
Hi Alan
Thanks for your comprehensive response - you make some very good points.
1. Host: The client is currently changing host, as the current host is very entry-level and we were aware that we had a problem - having said that, the response times are a lot slower than when I last looked, so we'll get in touch with the current host to see what they can do now.
2. 404/301 pages: Again, these are on the list for the team to pick up. I didn't actually think there were enough to cause a problem - I can imagine that hundreds might be an issue, but I would have thought 20 or so would be OK? I'll chase to get these fixed in any case.
3. Content: I guess this is the gray area between a page not ranking due to poor page quality and a whole website being "algorithmically adjusted" because of poor page quality. We've worked on all our main landing pages to make them more comprehensive, and from the research we have done we felt that we had done enough. We did consider noindexing the blog as well, but felt that as it is unique, while not particularly comprehensive, it shouldn't be causing any Panda problems.
Quick question - is it your experience that once Panda starts running it is too late to make changes to your website? I've read in a few places that it is, but not in others. I guess when it was running monthly it wasn't such an issue.
Once again, thank you very much for having a look - it's great to get a fresh set of eyes on the site.
Best
Damon.
-
Damon,
To start, let's be clear - Panda isn't a "penalty" - it's an algorithmic adjustment based on quality, uniqueness, relevance and trust signals.
Having audited many sites hit by the range of Panda updates, I have a pretty good understanding of what it usually takes. So, with that said, I took a quick look at the site. While Andy may be correct that you may only need to wait and hope the next or some future Panda update acknowledges the changes you've made to this point, that very well may not be enough.
First obvious problem: your site's response times are toxic. A crawl using Screaming Frog shows that many of the pages have a response time of between 3 and 7 seconds. That's a major red flag - response time is the amount of time it takes to get to each URL. If it takes more than 2 seconds, that's typically an indicator that crawl efficiency is very weak. Crawl efficiency is a cornerstone of Panda because it reflects what is almost certainly a larger overall page processing time problem. Since Google sets a standard "ideal" page processing time of between one and three seconds, if it takes more than that just to ping the URL, the total processing time is likely going to be significantly worse.
While it's not required to always get a one to three second total process time, if too many pages are too slow across enough connection types for your visitors, that will definitely harm your site from a quality perspective.
And if too many pages have severely slow response times, Google will often abandon site crawl, which is another problem.
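You can sanity-check response times yourself without a full crawler - a minimal sketch, again with placeholder URLs, flagging anything over the 2-second mark discussed above:

```python
# Quick response-time check: time the round trip to each URL and flag
# anything slower than a 2-second threshold.
import requests

urls = [
    "http://www.workingvoices.com/",
    "http://www.workingvoices.com/courses/",
]  # placeholders - use your full URL list

SLOW = 2.0  # seconds; the rough threshold discussed above

for url in urls:
    resp = requests.get(url, timeout=30)
    # requests measures elapsed as time from sending the request
    # until the response headers arrive.
    elapsed = resp.elapsed.total_seconds()
    flag = "  <-- SLOW" if elapsed > SLOW else ""
    print(f"{elapsed:5.2f}s  {url}{flag}")
```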
Next, I checked Google PageSpeed Insights. Your home page scored a dismal 68 out of a possible 100 points for desktop users (85 is generally considered a good passing grade). That reinforces my concern about crawl inefficiency and poor page processing. It was even worse for mobile, scoring only 53 out of 100. In a second test, I got 63/100 for desktop and 49 for mobile - the difference between the two runs simply reflects the fact that server speeds vary over time.
Just one of the issues GPSI lists is server response time (which confirms the very poor response times I saw in Screaming Frog).
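If you want to re-run these checks in bulk, PageSpeed Insights is also queryable via its API - this sketch assumes the v5 endpoint and its Lighthouse-style response keys, and runs without an API key (Google may throttle unkeyed requests):

```python
# Hedged sketch: query the PageSpeed Insights API (v5 endpoint) for a URL
# and print the performance score for desktop and mobile.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
page = "http://www.workingvoices.com/"  # placeholder - page to test

for strategy in ("desktop", "mobile"):
    resp = requests.get(API, params={"url": page, "strategy": strategy}, timeout=60)
    data = resp.json()
    # Lighthouse reports performance as 0-1; scale to the familiar 0-100.
    score = data["lighthouseResult"]["categories"]["performance"]["score"] * 100
    print(f"{strategy}: {score:.0f}/100")
```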
Next, a partial Screaming Frog crawl turned up 20 URLs returning 404 (not found) status - meaning you have internal links on your site pointing to dead ends, another quality hit - and 25 internal URLs that redirect via 301, further reinforcing crawl inefficiency. Since this was only a partial crawl, those problems could be even bigger scaled across the site.
Then I poked around the site itself. http://www.workingvoices.com/courses/presentation-skills-training/keynote-speaker/ is indexed in Google, as it's one of your courses. That page is potentially problematic because there is hardly any content on it. So while you may think you've dealt with thin content already, I don't think you've fully grasped the need for strong, robust depth of content specific to each topic you consider important.
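A crude but useful way to flag pages like that is a visible word count - a quick sketch (the threshold is a judgment call for your niche, not a Google rule):

```python
# Crude thin-content flag: count the visible words on each page.
import requests
from bs4 import BeautifulSoup

urls = [
    "http://www.workingvoices.com/courses/presentation-skills-training/keynote-speaker/",
]  # placeholder - use your full URL list

THIN = 300  # words; an illustrative threshold, tune for your niche

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # Drop non-content elements before counting.
    for tag in soup(["script", "style", "nav", "footer", "header"]):
        tag.decompose()
    words = len(soup.get_text(separator=" ").split())
    flag = "  <-- THIN" if words < THIN else ""
    print(f"{words:5d} words  {url}{flag}")
```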
That's nowhere near a full audit; however, the above are all examples of issues that absolutely relate to working toward a highly trusted site from Google's algorithmic perspective.
-
Hi
It looks like you have done everything correctly, but you might have to wait for the next big Panda update before you start seeing any movement.
Thanks
Andy
-
Related Questions
-
Manual Penalty Reconsideration Request Help
Hi All, I'm currently in the process of creating a reconsideration request for an 'Impact Links' manual penalty. So far I have downloaded all LIVE backlinks from multiple sources and audited them into groups:
- Domains that I'm keeping (good quality, natural links).
- Domains that I'm changing to nofollow (relevant, good quality links that are good for the user but may be affiliated with my company, so I'm changing the links to nofollow rather than removing them).
- Domains that I'm getting rid of (poor quality sites with optimised anchor text, directories, article sites etc.).
One of my next steps is to review every historical backlink to my website that is NO LONGER LIVE. To be thorough, I have planned to go through every domain that has previously linked (even if it's no longer linking to my site) and straight up disavow the domain (if it's poor quality). But I want to first check whether this is completely necessary for a successful reconsideration request? My concerns are that it's extremely time-consuming (as I'm going through the domains to avoid disavowing a good quality domain that might link back to me in future, and also because the historical list is the largest list of them all!), and there is also some risk involved, as some good domains might get caught in the disavowing crossfire. Therefore I only really want to carry this out if it's completely necessary for the success of the reconsideration request. Obviously I understand that reconsideration requests are meant to be time-consuming, as I'm repenting for previous SEO sin (and believe me, I've already spent weeks getting to the stage I'm at right now)... But as an in-house digital marketer with many other digital avenues to look after for my company too, I can't justify spending such a long time on something if it's not 100% necessary. So overall - with a manual penalty request, would you bother sifting through domains that either don't exist anymore or no longer link to your site, and disavow them for a thorough reconsideration request? Is this a necessary requirement to revoke the penalty, or is Google only interested in links that are currently or recently live? All responses, thoughts and ideas are appreciated 🙂 Kind Regards Sam
Intermediate & Advanced SEO | Sandicliffe
-
Anchor text penalties and indexed links
Hi! I'm working on a site that got hit by a manual penalty some time ago. I got that removed, cleaned up a bunch of links and disavowed the rest. That was about six months ago. Rankings improved, but the big money terms still aren't doing great. I recently ran a Searchmetrics anchor text report, though, and it said that direct-match anchors still made up the largest part of the overall portfolio. However, when I started looking at individual links with direct anchors, nearly every one had been removed or disavowed. My question is: could an anchor text penalty still be in place because these removed links have not been reindexed? If so, what are my options? We've waited for this to happen naturally, but it hasn't occurred after quite a few months. I could ping them - could this have any impact? Thanks!
Intermediate & Advanced SEO | Blink-SEO
-
Making a non-responsive site responsive - should I expect any ranking penalties?
Hello, I have a website made with ASP.NET that ranks quite well for a number of competitive keywords - in Google's top 10 results for more than a dozen of them. Recently, for a better user experience, I am having it redeveloped so it is fully responsive across all screen resolutions. Outwardly, all the design elements, site text, colour scheme and layout will remain the same, but internally this will change everything - all the CSS and page HTML (tables converted to divs) etc. Now my questions:
1. Will bots consider this a complete site overhaul, and will rankings take a hit even if I stay with the current platform, i.e. ASP.NET?
2. While making the design responsive, I could also develop a WordPress theme, which would make the site easier to work with, as it doesn't require any programming. So if I also change the platform, from MS IIS/ASP to Apache/PHP, how will search engine bots take this?
3. If the above will in fact result in a ranking drop, how much time will it take for rankings to get back to normal?
Note that I use extensionless URLs, so the URLs will remain the same even if we convert from ASP to PHP. Sorry for the long details, but this question has been bugging me for weeks.
Intermediate & Advanced SEO | hpk
-
Disavow Links & Paid Link Removal (discussion)
Hey everyone, We've been talking about this issue a bit over the last week in our office, and I wanted to extend the idea out to the Moz community to see if anyone has some additional perspective. Let me break down the scenario: We're in the process of cleaning up the link profile for a new client, which contains many low-quality SEO-directory links placed by a previous vendor. Recently, we made a connection to a webmaster who controls a huge directory network. This person found 100+ links to our client's site on their network and wants $5/link to have them removed. The client was not hit with a manual penalty, so this clean-up could be considered proactive, but an algorithmic 'penalty' is suspected based on historical keyword rankings.
**The Issue:** We can pay this ninja $800+ to have him/her remove the links from his directory network, and hope it does the trick. When talking about scaling this tactic, we run into some ridiculously high numbers if we provide this service to multiple clients.
**The Silver Lining:** The disavow links file. I'm curious how effective building this around the 100+ directory links could be, especially since the client hasn't been slapped with a manual penalty.
**The Debate:** Is putting a disavow file together a better alternative to paying for crappy links to be removed? Are we actually solving the bad link problem by disavowing, or just patching it? Would choosing not to pay ridiculous fees and submitting a disavow file for these links be considered a "good faith effort" in Google's eyes (especially considering no manual penalty has been assessed)?
Intermediate & Advanced SEO | Etna
-
First attempt at manual penalty removal fails - all example links provided by Google not in Majestic, GWT, Ahrefs, LinkDetox, or OSE.
Hello all, I am trying to recover a site from a manual penalty. I already submitted once. Here's what we did: we took the link profile from Webmaster Tools, Majestic SEO, Ahrefs, Link Detox, and OSE, manually looked at every link to exclude the good ones, then used a tool to run the removal campaign, and submitted a disavow file and reconsideration request. Google came back with a denial. When I looked at the three example links that Google provided, they were definitely spammy (forum profile and comment spam). But none of them were in any of the original CSV downloads from GWT, Ahrefs, Majestic, OSE, or Link Detox. What can I do? Thanks in advance for any help.
Intermediate & Advanced SEO | NicoleDeLeon
-
Google Manual Penalty - Unnatural Links
Hi, We are in the process of trying to remove a partial manual penalty for unnatural links. I would like to do a complete link audit of our site - where can I get complete data on the sites linking to my website? Webmaster Tools only appears to show the top 1,000 domains. Thanks
Intermediate & Advanced SEO | halloranc
-
Ever had a case where publishing products & descriptions on eBay or Amazon caused a Panda penalty?
One of our shops got a Panda penalty back in September. We sell all our items with the same product names and descriptions on amazon.com, amazon.co.uk, ebay.com and ebay.co.uk. Have you ever seen a case where such multichannel sales caused a Panda penalty?
Intermediate & Advanced SEO | lcourse
-
Xml sitemap advice for website with over 100,000 articles
Hi, I have read numerous articles that support submitting multiple XML sitemaps for websites that have thousands of articles... in our case we have over 100,000. So, I was thinking I should submit one sitemap for each news category. My question is: how many page levels should each sitemap instruct the spiders to go? Would it not be enough to just submit the top-level URL for each category and then let the spiders follow the rest of the links organically? If so, and we have 12 categories, would the total number of URLs be just 12? If this is true, how do you suggest handling our home page, where the latest articles are displayed regardless of their category - i.e. the spiders will find links to a given article both on the home page and in the category it belongs to? We are using canonical tags. Thanks, Jarrett
Intermediate & Advanced SEO | jarrett.mackay