Panda penalty removal advice
-
Hi everyone! I'm after a second (or third, or fourth!) opinion here!
I'm working on the website www.workingvoices.com, which has a Panda penalty dating from the late March 2012 update. I have made a number of changes to remove potential Panda issues, but haven't seen any ranking movement in the last 7 weeks and was wondering if I've missed something...
The main issues I identified and fixed were:
- Keyword-stuffed, near-duplicate title tags - fixed with relevant, unique title tags
- Copies of the website on other domains creating duplicate content issues - fixed by taking these offline
- Thin content - fixed by adding content to some pages, and noindexing other thin/tag/category pages (a quick way to double-check that the noindex is actually being served is sketched below).
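For reference, here's a rough sketch of how the noindex can be double-checked (Python with the requests library; the URLs are just placeholders). It looks at both the X-Robots-Tag header and, crudely, the meta robots tag:

```python
# Rough sketch: verify that pages meant to be noindexed actually serve a
# noindex directive, via either the X-Robots-Tag HTTP header or a
# <meta name="robots"> tag. The URLs below are hypothetical placeholders.
import requests

PAGES_THAT_SHOULD_BE_NOINDEXED = [
    "http://www.example.com/tag/some-tag/",       # hypothetical tag page
    "http://www.example.com/category/some-cat/",  # hypothetical category page
]

def is_noindexed(url):
    resp = requests.get(url, timeout=10)
    # HTTP header check first - noindex can be served as X-Robots-Tag.
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        return True
    # Crude fallback: look for a meta robots tag and a noindex token in the HTML.
    html = resp.text.lower()
    return '<meta name="robots"' in html and "noindex" in html

for url in PAGES_THAT_SHOULD_BE_NOINDEXED:
    print(url, "->", "noindexed" if is_noindexed(url) else "NOT noindexed")
```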
Any thoughts on other areas of the site that might still be setting off the mighty Panda are appreciated!
Cheers
Damon.
-
Our site was scraped by a former employee who started up a competing business with our inside trade secrets, client list and designs. As they launched, they immediately tried to put us out of business by:
A. Hiring hacks to hook us up with tons of spammy links, along with a heavy mix of porn and virus-injection sites.
B. Hiring hacks from the same cesspool and having them submit our images to the same kinds of bad sites, so they would take the customer somewhere else.
C. Signing up email addresses to our newsletter so that when we sent out an email it would trigger a chain reaction to zombie computers and launch a DDoS attack on our site, making our own email campaigns stop sales and trashing the confidence of the rest of the customers on the mailing list.
D. Giving out every known email address in our company to spammers, to the point of making it difficult to receive or send emails to customers.
E. Submitting our phone number to every robocall and junk-call site possible, tying up our phones and filling our voice mail.
"Regarding Panda timing - the site took the big hit three years ago." We too had this exact timing happen to us, on top of everything else, because we were too busy defending ourselves to keep up with the Google changes.
Regardless of all the horrifying past events, we have completely rebuilt the business from the inside out and migrated from our custom site to a BigCommerce website, plus added 5 social media platforms. BUT... "having to wait for Google" to reindex and give us another chance is killing us, and we are concerned that we may never get back into the good graces of this SE titan.
Although we have survived the battle, we may still lose the war! Even with continuing efforts to optimize our site to death, and with only a fraction of the traffic, orders and income, we have to wonder:
A. What else is wrong, such as trying to determine whether there is duplicate content out there on sites we are unaware of.
B. Whether we should seriously consider dumping our domain (owned since 2000) and going to a new domain that would have to be indexed and treated as fresh, with hopefully optimized content per the Google requirements, and take our chances.
Input on considerations A & B would be appreciated, as we are pretty worn out after 3 years of working at this.
-
In this case it was easy, as they had created the duplicate domains themselves and had control over them, so it was just a case of getting them taken down.
-
How did you find the ... "Copies of the website on other domains creating duplicate content issues"?
How did you do the ... "fixed by taking these offline" part?
We have been dealing with the same issues but did not think of the above and would like to find out if we have the same "duplicate" issues.
-
Yes, we do have Bing Webmaster Tools set up - I agree, even though Bing is limited in terms of traffic volume, Bing Webmaster Tools does give a slightly different take on things compared to Search Console.
Damon.
-
I'm also curious to know whether you've monitored Bing/Yahoo value over the course of your work. While it's rarely anywhere near Google's potential volume, I've seen good value gained from those as clients have implemented recommendations, even when Panda was a prime issue (and the subsequent Panda refresh was a problem).
Overall it does sound like you're on the right track though.
-
Hello again Alan!
Agree with you 100% that this is an ongoing process. I asked the question with regard to getting the new hosting set up ASAP - if it wasn't going to be taken into account for the latest Panda update, we would have a little more time.
As you say, having to wait for Google for almost a year to rerun Panda is really difficult for everyone (not just us). It's a real pity that we didn't pick this up earlier, when Panda was running more regularly.
I've just run another crawl and we have 79 3xx redirects and 26 4xx pages, most of which are thumbnail JPGs and category pages (which are noindexed anyway). As stated above, I'll get these fixed this week.
We completed a competitor content analysis and redeveloped our main landing pages around this, and, together with our backlink profile, we think we've got a good chance of hitting the top ten SERP results - we are targeting some quite specific keywords with not particularly strong competition and have gained some excellent backlinks over the last few months.
Once again, thanks for your insight and help!
Damon.
-
Regarding the 404/301 issues: the numbers I gave were for a small partial crawl of a hundred URLs, so a full Screaming Frog crawl would help to determine whether it's worse. Even if it's not, think of the concept where a site might have a dozen core problems and twenty problems that by themselves might seem insignificant. At a certain point, something becomes the straw that breaks the camel's back.
Regarding content - how many of the courses offered are up against competitors that have entire sections devoted to the topic that just a single course page covers on this site? How many are up against entire sites devoted to that topic? Understanding content depth requires understanding the scale of real and perceived competition. And even if a course page isn't a "main" landing page, it's important in its own right.
Regarding Panda timing - the site took the big hit three years ago. Waiting for, and hoping that, the next update will be the one that magically reflects whatever you've done to that point isn't, in my experience, a wise perspective.
While it's true that once Google has locked a data set to be applied to a specific algorithmic update, nothing you do afterwards will count toward that particular update, not taking action at a high enough level, and with enough consistency, is gambling. Since true best-practices marketing as a whole needs to be ongoing, efforts to strengthen on-site signals and signal relationships also need to be ongoing - because even if Panda weren't a factor, the competitive landscape is ever marching forward.
-
Hi Alan
Thanks for your comprehensive response - you make some very good points.
1. Host: The client is currently changing hosts, as the current host is very entry-level and we were aware that we had a problem - having said that, the response times are a lot slower than when I last looked, so we'll get in touch with the current host to see what they can do now.
2. 404/301 pages: Again, these are on the list for the team to pick up. I didn't actually think there were enough to cause a problem - I can imagine that if there were hundreds we might have an issue, but I would have thought 20 or so would be OK? I'll chase to get these fixed in any case.
3. Content: I guess this is the gray area between a page not ranking due to poor page quality and a website being "algorithmically adjusted" because of poor page quality. We've worked on all our main landing pages to make them more comprehensive, and from the research we have done we felt that we had done enough. We did consider noindexing the blog as well, but felt that as it is unique, while not particularly comprehensive, it shouldn't be causing any Panda problems.
Quick question - is it your experience that once Panda starts running it is too late to make changes to your website? I've read in a few places that it is, but not in others. I guess when it was running monthly it wasn't such an issue.
Once again, thank you very much for having a look - it's great to get a fresh set of eyes on the site.
Best
Damon.
-
Damon,
To start, let's be clear - Panda isn't a "penalty" - it's an algorithmic adjustment based on quality, uniqueness, relevance and trust signals.
Having audited many sites hit by the range of Panda updates, I have a pretty good understanding of what it usually takes. So, having said that, I took a quick look at the site. While Andy may be correct in that you may only need to wait and hope that the next or some future Panda update acknowledges the changes you've made to this point, that very well may not be enough.
First obvious problem - your site's response times are toxic. A crawl using Screaming Frog shows many of the pages have a response time of between 3 and 7 seconds. That's a major red flag - response time is the amount of time it takes to get to each URL. If it takes more than 2 seconds, that's typically an indicator that crawl efficiency is very weak. Crawl efficiency is a cornerstone of Panda because it reflects what is almost certainly a larger overall page processing time problem. Since Google sets a standard "ideal" page processing time of between one and three seconds, if it takes more than that just to ping the URL, the total processing time is likely going to be significantly worse.
While it's not required to always get a one to three second total process time, if too many pages are too slow across enough connection types for your visitors, that will definitely harm your site from a quality perspective.
And if too many pages have severely slow response times, Google will often abandon site crawl, which is another problem.
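If you want to sanity-check those response times yourself outside of Screaming Frog, a minimal sketch like the following will do it (Python with the requests library; the URL list is just an illustrative placeholder - swap in your own pages or sitemap URLs):

```python
# Minimal sketch: time how long each URL takes to respond, roughly comparable
# to the "response time" column in a crawler report. The URLs below are
# placeholders - substitute your own list.
import time
import requests

urls = [
    "http://www.example.com/",
    "http://www.example.com/courses/",
]

for url in urls:
    start = time.time()
    resp = requests.get(url, timeout=30)
    elapsed = time.time() - start
    flag = "SLOW" if elapsed > 2.0 else "ok"  # >2s is the rough threshold discussed above
    print(f"{resp.status_code}  {elapsed:.2f}s  {flag}  {url}")
```

Note that this times the full download rather than just the first byte, so it will read slightly slower than a crawler's response-time column.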
Next, I checked Google PageSpeed Insights. Your home page scored a dismal 68 out of a possible 100 points for desktop users (85 is generally considered a good passing grade). That reinforces my concern about crawl inefficiency and poor page processing. It was even worse for mobile - scoring only 53 out of 100 points. In my second test, I got 63/100 for desktop and 49 for mobile. The difference between the two tests is simply that page speed fluctuates, being worse at some times than at others.
Just one of the issues GPSI lists is server response time (which confirms the very poor response times I saw in Screaming Frog).
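If you'd rather pull those scores programmatically than retest by hand, a rough sketch against the public PageSpeed Insights API looks like this (it assumes the current v5 endpoint and its Lighthouse-style JSON; older versions of the API returned a different structure, and an API key is advisable for anything beyond a handful of requests):

```python
# Rough sketch: query the PageSpeed Insights API for a performance score.
# Assumes the v5 endpoint and its Lighthouse-style response; an API key is
# optional for very low volumes but recommended.
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def psi_score(url, strategy="desktop", api_key=None):
    params = {"url": url, "strategy": strategy}
    if api_key:
        params["key"] = api_key
    data = requests.get(PSI_ENDPOINT, params=params, timeout=60).json()
    # Lighthouse reports performance as a 0-1 score; scale it to 0-100.
    return round(data["lighthouseResult"]["categories"]["performance"]["score"] * 100)

for strategy in ("desktop", "mobile"):
    print(strategy, psi_score("http://www.workingvoices.com/", strategy=strategy))
```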
Next, a partial crawl using Screaming Frog crawled 20 URLs that resulted in 404 (not found) status, which means you have internal links on your site pointing to dead ends - another quality hit. And SF found 25 internal URLs that redirect via 301 - further reinforcing crawl inefficiency. Since this was a partial crawl, those problems could be even bigger scaled across the site.
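For a quick look at how widespread the 404/301 problem is without waiting on another full crawl, a quick-and-dirty sketch along these lines will list the internal links on a page that 404 or 301 (Python, assuming the requests and beautifulsoup4 packages; the start URL here is just the home page):

```python
# Quick-and-dirty sketch: fetch a page, collect its internal links, and report
# any that return 404 or redirect via 301. Assumes requests and beautifulsoup4.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

start_url = "http://www.workingvoices.com/"
domain = urlparse(start_url).netloc

html = requests.get(start_url, timeout=30).text
links = {
    urljoin(start_url, a["href"])
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True)
}
internal = sorted(link for link in links if urlparse(link).netloc == domain)

for link in internal:
    # allow_redirects=False so we see the 301 itself rather than its target;
    # some servers handle HEAD oddly, in which case swap in requests.get.
    resp = requests.head(link, allow_redirects=False, timeout=30)
    if resp.status_code in (301, 404):
        print(resp.status_code, link)
```

Run it against each key template (home page, a course page, a blog post) to get a feel for whether those partial-crawl numbers scale.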
Then I poked around the site itself. http://www.workingvoices.com/courses/presentation-skills-training/keynote-speaker/ is indexed in Google, as it's one of your courses. That page is possibly problematic because there is hardly any content on it overall. So while you may think you've dealt with thin content already, I don't think you've fully grasped the need for strong, robust depth of content specific to each topic you consider important.
That's nowhere near a full audit, however the above are all examples of issues that absolutely relate to working toward a highly trusted site from Google's algorithmic perspective.
-
Hi
It looks like you have done everything correctly, but you might have to wait for the next big Panda update before you start seeing any movement.
Thanks
Andy