Panda penalty removal advice
-
Hi everyone! I'm after a second (or third, or fourth!) opinion here!
I'm working on the website www.workingvoices.com, which has a Panda penalty dating from the late March 2012 update. I have made a number of changes to remove potential Panda issues, but haven't seen any rankings movement in the last 7 weeks and was wondering if I've missed something...
The main issues I identified and fixed were:
- Keyword-stuffed, near-duplicate title tags - fixed with relevant, unique title tags (a quick sitewide check is sketched below this list)
- Copies of the website on other domains creating duplicate content issues - fixed by taking these offline
- Thin content - fixed by adding content to some pages, and noindexing other thin/tag/category pages.
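If it helps anyone double-check the same title tag fix, here's a rough sketch of the kind of script I'd use to flag duplicates - assuming Python 3 with the third-party requests library installed; the URL list is just a placeholder for your real indexable pages:

```python
# Rough sketch: flag duplicate <title> tags across a list of URLs.
# Assumes Python 3 with the third-party "requests" library installed;
# the URL list below is a placeholder, not the site's real page list.
import re
from collections import defaultdict

import requests

urls = [
    "http://www.workingvoices.com/",
    "http://www.workingvoices.com/courses/",
    # ...add the rest of your indexable URLs
]

titles = defaultdict(list)
for url in urls:
    html = requests.get(url, timeout=10).text
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    titles[match.group(1).strip() if match else "(no title)"].append(url)

for title, pages in titles.items():
    if len(pages) > 1:
        print(f"Duplicate title {title!r} on:")
        for page in pages:
            print(f"  {page}")
```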
Any thoughts on other areas of the site that might still be setting off the mighty Panda are appreciated!
Cheers
Damon.
-
Our site was scraped by a past employee who started up a competing business using our inside trade secrets, client list and designs. As they launched, they immediately tried to put us out of business by:
A. Hiring hacks to point tons of spammy links at us, with a heavy mix of porn and virus-injection sites.
B. Hiring hacks from the same cesspool to submit our images to the same bad kinds of sites, which would take the customer somewhere else.
C. Signing up email addresses to our newsletter, so that when we sent out an email it would trigger a chain reaction on zombie computers, launch a DDoS attack on our site, kill sales from our own email campaigns and trash the confidence of the rest of the customers on the mailing list.
D. Giving out every known email address in our company to spammers, to the point of making it difficult to receive or send emails to customers.
E. Submitting our phone number to every robocall and junk-call site possible, tying up our phones and filling our voicemail.
"Regarding Panda timing - the site took the big hit three years ago." We too had this exact timing happen to us, on top of everything else, because we were too busy defending ourselves to keep up with the Google changes.
Regardless of all the horrifying past events, we have completely rebuilt the business from the inside out, migrated from our custom site to a BigCommerce website, and added 5 social media platforms. BUT..."having to wait for Google" to reindex and give us another chance is killing us, and we are concerned that we may never get back into the good graces of this SE titan.
Although we have survived the battle, we still may lose the war! Even with continuing efforts to optimize our site to death, and with only a fraction of the traffic, orders and income, we have to wonder:
A. What else is wrong - for instance, is there duplicate content on sites out there that we are unaware of?
B. Should we seriously consider dumping our domain (owned since 2000) and going to a new domain that would be reindexed and treated as fresh, with hopefully optimized content per the Google requirements, and take our chances?
Input on considerations of A & B would be appreciated, as we are pretty worn out after 3 years working at this.
-
In this case it was easy: they had created the duplicate domains themselves and had control over them, so it was just a case of getting them taken down.
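If the copies hadn't been under our control, the usual trick is to search Google for a long, distinctive sentence from your pages in quotes (services like Copyscape automate this at scale). For scripted spot-checks of specific suspect domains, something like this rough sketch would work - Python 3 with the requests library, and the suspect domains are placeholders:

```python
# Rough sketch: compare your page's visible text against the same page
# on suspect domains to spot near-duplicate copies. Assumes Python 3
# with the "requests" library; the suspect domains are placeholders.
import re
from difflib import SequenceMatcher

import requests

def visible_text(url):
    """Fetch a URL and crudely strip scripts/styles/tags for comparison."""
    html = requests.get(url, timeout=10).text
    html = re.sub(r"(?s)<(script|style).*?</\1>", " ", html)
    text = re.sub(r"<[^>]+>", " ", html)
    return re.sub(r"\s+", " ", text).strip().lower()

original = visible_text("http://www.workingvoices.com/")
suspects = ["http://suspect-copy-one.example/", "http://suspect-copy-two.example/"]

for domain in suspects:
    # SequenceMatcher is slow on big pages but fine for spot checks.
    ratio = SequenceMatcher(None, original, visible_text(domain)).ratio()
    print(f"{domain}: {ratio:.0%} similar")  # very high ratios deserve a manual look
```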
-
How did you find the ..."copies of the website on other domains creating duplicate content issues"?
How did you ..."- fixed by taking these offline"?
We have been dealing with the same issues but did not think of the above and would like to find out if we have the same "duplicate" issues.
-
Yes, we do have Bing Webmaster Tools set up - I agree, even though Bing is limited in terms of traffic volume, Bing Webmaster Tools does give a slightly different take on things compared to Search Console.
Damon.
-
I'm also curious to know whether you've monitored Bing/Yahoo value over the course of your work. While it's rarely anywhere near Google's potential volume, I've seen good value gained from both as clients have implemented recommendations, even when Panda was a prime issue (and the subsequent Panda refresh was a problem).
Overall it does sound like you're on the right track though.
-
Hello again Alan!
Agree with you 100% that this is an ongoing process. I asked the question with regard to getting the new hosting set up ASAP - if it wasn't going to be taken into account for the latest Panda update, we would have a little more time.
As you say, having to wait almost a year for Google to rerun Panda is really difficult for everyone (not just us). It's a real pity that we didn't pick this up earlier, when Panda was running more regularly.
I've just run another crawl and we have 79 3xx redirects and 26 4xx pages, most of which are thumbnail JPGs and category pages (which are noindexed anyway). As stated above, I'll get these fixed this week.
We completed a competitor content analysis and redeveloped our main landing pages around it. Together with our backlink profile, we think we've got a good chance of hitting the top ten SERP results - we are targeting some quite specific keywords with not particularly strong competition and have gained some excellent backlinks over the last few months.
Once again, thanks for your insight and help!
Damon.
-
Regarding the 404/301 issues: the numbers I gave were from a small partial crawl of a hundred URLs, so a full Screaming Frog crawl would help determine whether it's worse. Even if it's not, think of the concept where a site might have a dozen core problems and twenty problems that by themselves might seem insignificant. At a certain point, something becomes the straw that breaks the camel's back.
Regarding content - how many of the courses offered are up against competitors that devote entire sections to a topic that gets just a single course page on this site? How many are up against entire sites devoted to that topic? Understanding content depth requires understanding the scale of real and perceived competition. And if it's a course page, it may not be a "main" landing page, yet it's important in its own right.
Regarding Panda timing - the site took the big hit three years ago. Waiting and hoping that the next update is the one that will magically reflect whatever you've done to that point isn't, in my experience, a wise perspective.
It's true that once Google has locked a data set for a specific algorithmic update, changes made afterward won't count until the next run - but that's exactly why not taking action at a high enough level, and with enough consistency, is gambling. Since true best-practices marketing as a whole needs to be ongoing, efforts to strengthen on-site signals and signal relationships also need to be ongoing. Even if Panda weren't a factor, the competitive landscape is ever marching forward.
-
Hi Alan
Thanks for your comprehensive response - you make some very good points.
1. Host: The client is currently changing hosts, as the current one is very entry-level and we were aware we had a problem - having said that, the response times are a lot slower than when I last looked, so we'll get in touch with the current host to see what they can do now.
2. 404/301 pages: Again, these are on the list for the team to pick up. I didn't actually think there were enough to cause a problem - I can imagine that if there were hundreds we might have an issue, but I would have thought 20 or so would be OK? I'll chase to get these fixed in any case.
3. Content: I guess this is the gray area between a page not ranking due to poor page quality and a website being "algorithmically adjusted" because of poor page quality. We've worked on all our main landing pages to make them more comprehensive, and from the research we've done we felt we had done enough. We did consider noindexing the blog as well, but felt that as it was unique, while not particularly comprehensive, it shouldn't be causing any Panda problems.
Quick question - is it your experience that once Panda starts running it is too late to make changes to your website? I've read in a few places that it is, but not in others. I guess when it was running monthly it wasn't such an issue.
Once again, thank you very much for having a look - it's great to get a fresh set of eyes on the site.
Best
Damon.
-
Damon,
To start, let's be clear - Panda isn't a "penalty" - it's an algorithmic adjustment based on quality, uniqueness, relevance and trust signals.
Having audited many sites hit by the range of Panda updates, I have a pretty good understanding of what it usually takes. So, having said that, I took a quick look at the site. While Andy may be correct that you may only need to wait and hope that the next or some future Panda update acknowledges the changes you've made to this point, that very well may not be enough.
First obvious problem - your site's response times are toxic. A crawl using Screaming Frog shows many of the pages have a response time of between 3 and 7 seconds. That's a major red flag - response time is the amount of time it takes just to reach each URL. If it takes more than 2 seconds, that's typically an indicator that crawl efficiency is very weak. Crawl efficiency is a cornerstone of Panda because it reflects what is almost certainly a larger overall page processing time problem. Since Google sets a standard "ideal" page processing time of between one and three seconds, if it takes more than that just to ping the URL, the total processing time is likely to be significantly worse.
While it's not required to always get a one to three second total process time, if too many pages are too slow across enough connection types for your visitors, that will definitely harm your site from a quality perspective.
And if too many pages have severely slow response times, Google will often abandon site crawl, which is another problem.
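If you want to spot-check response times yourself without a full crawler, timing a handful of URLs is enough to see the pattern. A minimal sketch, assuming Python 3 with the requests library (the URL list is a placeholder):

```python
# Rough sketch: time the server response for a handful of URLs.
# Assumes Python 3 with the "requests" library; URLs are placeholders.
import time

import requests

urls = [
    "http://www.workingvoices.com/",
    "http://www.workingvoices.com/courses/",
]

for url in urls:
    start = time.monotonic()
    # stream=True returns once headers arrive, approximating time-to-first-byte
    response = requests.get(url, timeout=30, stream=True)
    elapsed = time.monotonic() - start
    flag = "  <-- over the 2-second red line" if elapsed > 2 else ""
    print(f"{response.status_code}  {elapsed:.2f}s  {url}{flag}")
    response.close()
```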
Next, I checked Google PageSpeed Insights. Your home page scored a dismal 68 out of a possible 100 points for desktop users (85 is generally considered a good passing grade). That reinforces my concern about crawl inefficiency and poor page processing. It was even worse for mobile - scoring only 53 out of 100 points. In a second test, I got 63/100 for desktop and 49 for mobile. The different results between the two tests come down to speeds being worse at some times than others.
Just one of the issues GPSI lists is server response time (which confirms the very poor response times I saw in Screaming Frog).
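As an aside, if you want to track those scores over time rather than re-running the web form, PageSpeed Insights has a public API. The sketch below assumes the current v5 endpoint and Python 3 with requests, so treat it as illustrative rather than exact; an API key is optional for light use:

```python
# Rough sketch: query the PageSpeed Insights API for desktop and mobile
# scores. Assumes the current v5 endpoint and Python 3 with "requests";
# an API key is optional for light use.
import requests

ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

for strategy in ("desktop", "mobile"):
    data = requests.get(
        ENDPOINT,
        params={"url": "http://www.workingvoices.com/", "strategy": strategy},
        timeout=60,
    ).json()
    # v5 reports the Lighthouse performance score as 0-1; scale to 0-100.
    score = data["lighthouseResult"]["categories"]["performance"]["score"] * 100
    print(f"{strategy}: {score:.0f}/100")
```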
Next, a partial Screaming Frog crawl turned up 20 URLs returning 404 (not found) status, which means you have internal links on your site pointing to dead ends - another quality hit. SF also found 25 internal URLs that redirect via 301, further reinforcing crawl inefficiency. Since this was a partial crawl, those problems could be even bigger scaled across the site.
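For readers without Screaming Frog, even a crude single-page check will surface the same kind of thing: pull the internal links off a page and request each one with redirects disabled. A minimal sketch, assuming Python 3 with the requests and beautifulsoup4 libraries:

```python
# Rough sketch: list internal links on one page and report which ones
# 301/302-redirect or 404. Assumes Python 3 with the "requests" and
# "beautifulsoup4" libraries installed.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

start = "http://www.workingvoices.com/"
host = urlparse(start).netloc

soup = BeautifulSoup(requests.get(start, timeout=10).text, "html.parser")
links = {urljoin(start, a["href"]) for a in soup.find_all("a", href=True)}

for link in sorted(links):
    if urlparse(link).netloc != host:
        continue  # skip external links
    status = requests.get(link, timeout=10, allow_redirects=False).status_code
    if status in (301, 302, 404):
        print(f"{status}  {link}")
```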
Then I poked around the site itself. http://www.workingvoices.com/courses/presentation-skills-training/keynote-speaker/ is indexed in Google, as it's one of your courses. That page is potentially problematic because there is hardly any content on it overall. So while you may think you've dealt with thin content already, I don't think you've fully grasped the need for strong, robust depth of content specific to each topic you consider important.
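A rough way to hunt for other pages like that one is a simple word count of the visible copy on each URL - it won't judge quality, only quantity, but it's a useful first filter. A minimal sketch, again Python 3 with requests and beautifulsoup4; the 300-word threshold is an arbitrary placeholder you'd calibrate against your real competitors:

```python
# Rough sketch: flag pages whose visible body copy falls under an
# arbitrary word-count threshold, as a first pass at finding thin
# content. Assumes Python 3 with "requests" and "beautifulsoup4".
import requests
from bs4 import BeautifulSoup

THIN_THRESHOLD = 300  # placeholder; calibrate against your competitors

urls = [
    "http://www.workingvoices.com/courses/presentation-skills-training/keynote-speaker/",
    # ...the rest of the course pages
]

for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    for tag in soup(["script", "style", "nav", "header", "footer"]):
        tag.decompose()  # strip boilerplate elements before counting
    words = len(soup.get_text(separator=" ").split())
    if words < THIN_THRESHOLD:
        print(f"{words:4d} words  {url}")
```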
That's nowhere near a full audit; however, the above are all examples of issues that absolutely relate to working toward a highly trusted site from Google's algorithmic perspective.
-
Hi
It looks like you have done everything correctly, but you might have to wait for the next big Panda update before you start seeing any movement.
Thanks
Andy