Panda penalty removal advice
-
Hi everyone! I'm after a second (or third, or fourth!) opinion here!
I'm working on the website www.workingvoices.com, which has a Panda penalty dating from the late March 2012 update. I have made a number of changes to remove potential Panda issues but haven't seen any rankings movement in the last 7 weeks, and I was wondering if I've missed something...
The main issues I identified and fixed were:
- Keyword-stuffed, near-duplicate title tags - fixed with relevant, unique title tags
- Copies of the website on other domains creating duplicate content issues - fixed by taking these offline
- Thin content - fixed by adding content to some pages, and noindexing other thin/tag/category pages
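For anyone wanting to verify that a noindex fix like the one above actually shipped, here's a minimal stdlib-only sketch (the helper names are my own, not anything from the post) that checks a page's HTML for a robots noindex directive:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots"> tags in a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.directives.append(attrs.get("content", "").lower())

def is_noindexed(html: str) -> bool:
    """True if the page carries a robots noindex directive."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)
```

Running this over the HTML of each thin/tag/category page confirms whether the directive is actually present on the live pages.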
Any thoughts on other areas of the site that might still be setting off the mighty Panda are appreciated!
Cheers
Damon.
-
Our site was scraped by a past employee who started up a competing business with our inside trade secrets, client list, and designs. As they launched, they immediately tried to put us out of business by:
A. Hiring hacks to hook us up with tons of spammy links, along with a heavy mix of porn and virus-injection sites.
B. Hiring hacks from the same cesspool to submit our images to the same bad types of sites, designed to take the customer somewhere else.
C. Signing up email addresses to our newsletter so that when we sent out an email it would trigger a chain reaction on zombie computers and launch a DDoS attack on our site, making our own email campaigns kill sales and trash the confidence of the rest of the customers on the mailing list.
D. Giving out every known email address in our company to spammers, to the point of making it difficult to send or receive emails from customers.
E. Submitting our phone number to every robo-call and junk-call site possible, tying up our phones and filling our voicemail.
Quoting the note above: "Regarding Panda timing - the site took the big hit three years ago." We too had this exact timing happen to us, on top of everything else, because we were too busy defending ourselves to keep up with the Google changes.
Regardless of all the horrifying past events, we have completely rebuilt the business from the inside out, migrated from our custom site to a BigCommerce website, and added 5 social media platforms. BUT "having to wait for Google" to reindex and give us another chance is killing us, and we are concerned that we may never get back into the good graces of this SE titan.
Although we have survived the battle, we still may lose the war! Even with continuing efforts to optimize our site to death, and with only a fraction of the traffic, orders, and income, we have to wonder:
A. What else is wrong - for example, whether there is duplicate content on sites out there that we are unaware of.
B. Whether to dump our domain (owned since 2000) and move to a new domain that would have to be reindexed and treated as fresh, with content hopefully optimized per Google's requirements, and take our chances.
Input on considerations A and B would be appreciated, as we are pretty worn out after 3 years of working at this.
-
In this case it was easy as they had created the duplicate domains themselves and they had control over them, so it was just a case of getting them taken down.
-
How did you find the "copies of the website on other domains creating duplicate content issues"?
How did you get them "fixed by taking these offline"?
We have been dealing with the same issues but did not think of the above and would like to find out if we have the same "duplicate" issues.
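One rough, code-level way to check whether another domain is carrying near-duplicates of your pages, assuming you've already fetched and text-stripped both pages (this is a sketch with made-up function names, not a production tool): compare word shingles with Jaccard similarity.

```python
def shingles(text: str, k: int = 5) -> set:
    """The set of k-word shingles of a lowercased text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a: str, b: str, k: int = 5) -> float:
    """Jaccard similarity of the two texts' shingle sets, from 0.0 to 1.0."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)
```

A score close to 1.0 between your page and a page on another domain is a strong duplicate signal; a quoted-phrase Google search (as mentioned elsewhere in this thread) is the low-tech way to find candidate domains to compare in the first place.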
-
Yes, we do have Bing Webmaster Tools set up - I agree, even though Bing is limited in terms of traffic volume, Bing Webmaster Tools does give a slightly different take on things compared to Search Console.
Damon.
-
I'm also curious to know whether you've monitored Bing/Yahoo value over the course of your work. While it's rarely anywhere near Google's potential volume, I've seen good value gained from those as clients have implemented recommendations, even when Panda was a prime issue (and the subsequent panda refresh was a problem).
Overall it does sound like you're on the right track though.
-
Hello again Alan!
Agree with you 100% that this is an ongoing process. I asked the question with regard to getting the new hosting set up ASAP - if it wasn't going to be taken into account for the latest Panda update, we would have a little more time.
As you say, having to wait for Google for almost a year to rerun Panda is really difficult for everyone (not just us). It's a real pity that we didn't pick this up earlier, when Panda was running more regularly.
I've just run another crawl and we have 79 3xx redirects and 26 4xx pages, most of which are thumbnail JPGs and category pages (which are noindexed anyway). As stated above, I'll get these fixed this week.
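For anyone wanting to reproduce that kind of tally from a crawl export, a minimal sketch (the row format of (url, status_code) pairs is my assumption; real crawler exports will need a CSV-parsing step first):

```python
from collections import Counter

def status_buckets(crawl_rows):
    """Bucket crawled URLs by status class ('2xx', '3xx', '4xx', ...)."""
    counts = Counter()
    for url, status in crawl_rows:
        counts[f"{status // 100}xx"] += 1
    return dict(counts)
```

Running it over a full-site export (rather than a partial crawl) gives the totals to prioritise: 3xx counts show how much internal linking needs updating to point at final URLs, and 4xx counts show dead internal links to fix.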
We completed a competitor content analysis and redeveloped our main landing pages around this, and, together with our backlink profile, we think we've got a good chance of hitting the top ten SERP results - we are targeting some quite specific keywords with not particularly strong competition and have gained some excellent backlinks over the last few months.
Once again, thanks for your insight and help!
Damon.
-
Regarding 404/301 issues: the numbers I gave were for a small partial crawl of a hundred URLs, so a full Screaming Frog crawl would help determine whether it's worse. Even if it's not, think of the concept where a site might have a dozen core problems and twenty problems that by themselves might seem insignificant. At a certain point, something becomes the straw that breaks the camel's back.
Regarding content - how many of the courses offered are up against competitors that devote entire sections to a topic this site covers with just a single course page? How many are up against entire sites devoted to that topic? Understanding content depth requires understanding the scale of real and perceived competition. And even if a course page isn't a "main" landing page, it's still important in its own right.
Regarding panda timing - the site took the big hit three years ago. Waiting for, and hoping that the next update is the one that will magically reflect whatever you've done to that point isn't, in my experience, a wise perspective.
While it's true that once Google has locked a data set to be applied to a specific algorithmic update, changes made after that point won't count until the next one, not taking action at a high enough level, and with enough consistency, is gambling. Since true best-practices marketing as a whole needs to be ongoing, efforts to strengthen on-site signals and signal relationships also need to be ongoing. Because even if Panda weren't a factor, the competitive landscape is ever marching forward.
-
Hi Alan
Thanks for your comprehensive response - you make some very good points.
1. Host: The client is currently changing hosts, as the current host is very entry-level and we were aware that we had a problem. Having said that, the response times are a lot slower than when I last looked, so we'll get in touch with the current host to see what they can do now.
2. 404/301 pages: Again, these are on the list for the team to pick up. I didn't actually think there were enough to cause a problem - I can imagine an issue if there were hundreds, but I would have thought 20 or so would be OK? I'll chase to get these fixed in any case.
3. Content: I guess this is the gray area between a page not ranking due to poor page quality and a website being "algorithmically adjusted" because of poor page quality. We've worked on all our main landing pages to make them more comprehensive, and from the research we have done we felt we had done enough. We did consider noindexing the blog as well, but felt that, as it is unique (while not particularly comprehensive), it shouldn't be causing any Panda problems.
Quick question - is it your experience that once Panda starts running it is too late to make changes to your website? I've read that it is in a few places, but not in others. I guess when it was running monthly it wasn't such an issue.
Once again, thank you very much for having a look - it's great to get a fresh set of eyes on the site.
Best
Damon.
-
Damon,
To start, let's be clear - Panda isn't a "penalty" - it's an algorithmic adjustment based on quality, uniqueness, relevance and trust signals.
Having audited many sites hit by the range of Panda updates, I have a pretty good understanding of what it usually takes. So, having said that, I took a quick look at the site. While Andy may be correct that you may only need to wait and hope the next or some future Panda update acknowledges the changes you've made to this point, that very well may not be enough.
1st obvious problem - your site's response times are toxic. A crawl using Screaming Frog shows many of the pages have a response time of between 3 and 7 seconds. That's a major red flag - response time is the amount of time it takes to reach each URL. If it takes more than 2 seconds, that's typically an indicator that crawl efficiency is very weak. Crawl efficiency is a cornerstone of Panda because it almost certainly reflects a larger overall page processing time problem. Since Google sets a standard "ideal" page processing time of between one and three seconds, if it takes more than that just to ping the URL, the total processing time is likely going to be significantly worse.
While it's not required to always get a one to three second total process time, if too many pages are too slow across enough connection types for your visitors, that will definitely harm your site from a quality perspective.
And if too many pages have severely slow response times, Google will often abandon site crawl, which is another problem.
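As a rough triage sketch for the 2-second threshold described above (the dict-of-timings input format and function names are my own assumptions; in practice you'd export response times from a crawler first):

```python
def slow_pages(timings, threshold=2.0):
    """URLs whose response time (seconds) exceeds the threshold, worst first."""
    offenders = [(url, t) for url, t in timings.items() if t > threshold]
    return sorted(offenders, key=lambda item: item[1], reverse=True)
```

Sorting worst-first gives a fix-priority list, and `len(offenders) / len(timings)` gives a quick sense of how widespread the problem is across the site.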
Next, I checked Google PageSpeed Insights. Your home page scored a dismal 68 out of a possible 100 points for desktop users (85 is generally considered a good passing grade). That reinforces my concern about crawl inefficiency and poor page processing. It was even worse for mobile - scoring only 53 out of 100 points. In my second test, I got 63/100 for desktop and 49 for mobile. The different results between the two tests come down to speeds varying from one time to another.
Just one of the issues GPSI lists is server response time (which confirms the very poor response times I saw in Screaming Frog).
Next, a partial crawl using Screaming Frog found 20 URLs that returned a 404 (not found) status, which means you have internal links on your site pointing to dead ends - another quality hit. And SF found 25 internal URLs that redirect via 301 - further reinforcing crawl inefficiency. Since this was a partial crawl, those problems could be even bigger when scaled across the full site.
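Internal 301s get worse when they chain (A redirects to B, which redirects to C). A quick way to spot chains worth flattening, assuming you've built a {source: target} map from a crawl's redirect report (the structure and names here are my assumption):

```python
def resolve_chain(url, redirects, max_hops=10):
    """Follow a URL through a {source: target} redirect map.

    Returns (final_url, hops); hops > 1 signals a chain worth
    flattening, and the loop guard stops on circular redirects.
    """
    seen = []
    current = url
    while current in redirects and len(seen) < max_hops:
        seen.append(current)
        current = redirects[current]
        if current in seen:  # circular redirect - bail out
            break
    return current, len(seen)
```

Updating internal links to point straight at the final URL (hops == 0) removes the extra round trips that hurt crawl efficiency.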
Then I poked around the site itself. http://www.workingvoices.com/courses/presentation-skills-training/keynote-speaker/ is indexed in Google, as it's one of your courses. That page is potentially problematic because there is hardly any content on it. So while you may think you've dealt with thin content already, I don't think you fully grasp the need for strong, robust depth of content specific to each topic you consider important.
That's nowhere near a full audit; however, the above are all examples of issues that absolutely relate to working toward a highly trusted site from Google's algorithmic perspective.
-
Hi
It looks like you have done everything correctly, but you might have to wait for the next big Panda update before you start seeing any movement.
Thanks
Andy