Panda penalty removal advice
-
Hi everyone! I'm after a second (or third, or fourth!) opinion here!
I'm working on the website www.workingvoices.com that has a Panda penalty dating from the late March 2012 update. I have made a number of changes to remove potential Panda issues but haven't seen any rankings movement in the last 7 weeks and was wondering if I've missed something...
The main issues I identified and fixed were:
- Keyword stuffed near duplicate title tags - fixed with relevant unique title tags
- Copies of the website on other domains creating duplicate content issues - fixed by taking these offline
- Thin content - fixed by adding content to some pages, and noindexing other thin/tag/category pages.
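In case it's useful to anyone checking the same thing, a minimal sketch of how one might verify that the noindexed pages actually serve the directive (Python; the two URLs are placeholders, not real pages from the site):

    # Crude spot-check that pages meant to be noindexed actually serve the
    # directive, either in an X-Robots-Tag header or a robots meta tag.
    import re
    import requests

    urls = [
        "http://www.workingvoices.com/tag/example/",       # placeholder
        "http://www.workingvoices.com/category/example/",  # placeholder
    ]

    meta_robots = re.compile(r'<meta[^>]+name=["\']robots["\'][^>]*>', re.I)

    for url in urls:
        resp = requests.get(url, timeout=10)
        in_header = "noindex" in resp.headers.get("X-Robots-Tag", "").lower()
        in_meta = any("noindex" in tag.lower() for tag in meta_robots.findall(resp.text))
        print(f"{url} -> noindex in header: {in_header}, in meta tag: {in_meta}")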
Any thoughts on other areas of the site that might still be setting off the mighty Panda are appreciated!
Cheers
Damon.
-
Our site was scraped by a former employee who started a competing business with our inside trade secrets, client list and designs. As they launched, they immediately tried to put us out of business by:
A. Hiring hacks to hook us up with tons of spammy links, along with a heavy mix of porn and virus-injection sites.
B. Hiring hacks from the same cesspool and having them submit our images to the same bad types of sites, which would take the customer somewhere else.
C. Signing up email addresses to our newsletter so that when we sent out an email it would trigger a chain reaction of zombie computers and launch a DDoS attack on our site, making our own email campaigns stop sales and trashing the confidence of the rest of the customers on the mailing list.
D. Giving out every known email address in our company to spammers, to the point of making it difficult to receive or send emails to customers.
E. Submitting our phone number to every robocall and junk-call site possible, tying up our phones and filling our voicemail.
"Regarding Panda timing - the site took the big hit three years ago." We too had this exact timing happen to us, on top of everything else, because we were too busy defending ourselves to keep up with the Google changes.
Regardless of all the horrifying past events, we have completely rebuilt the business from the inside out and migrated to a BigCommerce website from our custom site, plus added 5 social media platforms. BUT... "having to wait for Google" to reindex and give us another chance is killing us, and we are concerned that we may never get back into the good graces of this SE titan.
Although we have survived the battle, we may still lose the war! Even with continuing efforts to optimize our site to death, and with only a fraction of the traffic, orders and income, we have to wonder:
A. What else is wrong? For example, trying to determine whether there is duplicate content out there on sites we are unaware of.
B. Whether we should seriously consider dumping our domain (owned since 2000) and going to a new domain that would have to be reindexed and treated as fresh, with hopefully optimized content per the Google requirements, and take our chances.
Input on considerations A & B would be appreciated, as we are pretty worn out after 3 years of working at this.
-
In this case it was easy as they had created the duplicate domains themselves and they had control over them, so it was just a case of getting them taken down.
-
How did you find the "copies of the website on other domains creating duplicate content issues"?
How did you get them "fixed by taking these offline"?
We have been dealing with the same issues but did not think of the above and would like to find out if we have the same "duplicate" issues.
-
Yes, we do have Bing Webmaster Tools set up - I agree, even though Bing is limited in terms of traffic volume, Bing Webmaster Tools does give a slightly different take on things compared to Search Console.
Damon.
-
I'm also curious to know whether you've monitored Bing/Yahoo value over the course of your work. While it's rarely anywhere near Google's potential volume, I've seen good value gained from those as clients have implemented recommendations, even when Panda was a prime issue (and the subsequent panda refresh was a problem).
Overall it does sound like you're on the right track though.
-
Hello again Alan!
Agree with you 100% that this is an ongoing process. I asked the question with regard to getting the new hosting set up ASAP - if it wasn't going to be taken into account for the latest Panda update, we would have a little more time.
As you say, having to wait for Google for almost a year to rerun Panda is really difficult for everyone (not just us). It's a real pity that we didn't pick this up earlier, when Panda was running more regularly.
I've just run another crawl and we have 79x 30* redirects and 26x 40* pages, most of which are thumbnail jpgs and category pages (which are noindexed anyway). As stated above, I'll get these fixed this week.
We completed a competitor content analysis and redeveloped our main landing pages around this, and, together with our backlink profile, we think we've got a good chance of hitting the top ten SERP results - we are targeting some quite specific keywords with not particularly strong competition and have gained some excellent backlinks over the last few months.
Once again, thanks for your insight and help!
Damon.
-
Regarding 404/301 issues: the numbers I gave were for a small partial crawl of a hundred URLs, so a full Screaming Frog crawl would help to determine whether it's worse. Even if it's not, think of the concept where a site might have a dozen core problems and twenty problems that, by themselves, might seem insignificant. At a certain point, something becomes the straw that breaks the camel's back.
Regarding content - how many of the courses offered are actually up against competitors that have entire sections devoted to the topic that just a single course page covers on your site? How many are up against entire sites devoted to that topic? Understanding content depth requires understanding the scale of real and perceived competition. And even if it's a course page rather than a "main" landing page, it's important in its own right.
Regarding Panda timing - the site took the big hit three years ago. Waiting for, and hoping that, the next update is the one that will magically reflect whatever you've done to that point isn't, in my experience, a wise perspective.
While it's true that once Google has locked a data set to be applied to a specific algorithmic update, changes made after that point won't be reflected until a later one, not taking action at a high enough level, and with enough consistency, is gambling. Since true best-practices marketing as a whole needs to be ongoing, efforts to strengthen on-site signals and signal relationships also need to be ongoing. Because even if Panda weren't a factor, the competitive landscape is ever marching forward.
-
Hi Alan
Thanks for your comprehensive response - you make some very good points.
1. Host: The client is currently changing host, as the current one is very entry-level and we were aware that we had a problem. Having said that, the response times are a lot slower than when I last looked, so we'll get in touch with the current host to see what they can do now.
2. 404/301 pages: Again these are on the list for the team to pick up on. I didn't actually think that there were enough to cause a problem - I can imagine if there were hundreds we might have an issue, but I would have thought 20 or so would have been OK? I'll chase to get these fixed in any case.
3. Content: I guess this is the gray area between a page not ranking due to poor page quality and a website being "algorithmically adjusted" because of poor page quality. We've worked on all our main landing pages to make them more comprehensive, and from the research we have done we felt that we had done enough. We did consider noindexing the blog as well, but felt that as it was unique, while not particularly comprehensive, it shouldn't be causing any Panda problems.
Quick question - is it your experience that once Panda starts running it is too late to make changes to your website? I've read in a few places that it is, but not in others. I guess when it was running monthly it wasn't such an issue.
Once again, thank you very much for having a look - it's great to get a fresh set of eyes on the site.
Best
Damon.
-
Damon,
To start, let's be clear - Panda isn't a "penalty" - it's an algorithmic adjustment based on quality, uniqueness, relevance and trust signals.
Having audited many sites hit by the range of Panda updates, I have a pretty good understanding of what it usually takes. So, having said that, I took a quick look at the site. While Andy may be correct that you may only need to wait and hope the next or some future Panda update acknowledges the changes you've made to this point, that very well may not be enough.
First obvious problem - your site's response times are toxic. A crawl using Screaming Frog shows many of the pages have a response time of between 3 and 7 seconds. That's a major red flag - response time is the amount of time it takes to get to each URL. If it takes more than 2 seconds, that's typically an indicator that crawl efficiency is very weak. Crawl efficiency is a cornerstone of Panda because it reflects what is almost certainly a larger overall page processing time problem. Since Google sets a standard "ideal" page processing time of between one and three seconds, if it takes more than that just to ping the URL, the total processing time is likely to be significantly worse.
While it's not required to always get a one to three second total process time, if too many pages are too slow across enough connection types for your visitors, that will definitely harm your site from a quality perspective.
And if too many pages have severely slow response times, Google will often abandon site crawl, which is another problem.
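If you don't have Screaming Frog handy, a rough spot-check of response times is easy to script. A minimal sketch (Python; the second URL is just a placeholder path, not a page I checked):

    # Rough response-time spot-check. requests' elapsed attribute measures the
    # time from sending the request until the response headers arrive, which is
    # close to the "response time" a crawler reports.
    import requests

    urls = [
        "http://www.workingvoices.com/",
        "http://www.workingvoices.com/courses/",  # placeholder path for illustration
    ]

    for url in urls:
        resp = requests.get(url, timeout=30)
        print(f"{url}: {resp.elapsed.total_seconds():.2f}s (status {resp.status_code})")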
Next, I checked Google PageSpeed Insights. Your home page scored a dismal 68 out of a possible 100 points for desktop users (85 is generally considered a good passing grade). That reinforces my concern about crawl inefficiency and poor page processing. It was even worse for mobile - scoring only 53 out of 100 points. In my second test, I got 63/100 for desktop and 49 for mobile. The different results across the two tests come down to the fact that speeds are worse at some times than at others.
Just one of the issues GPSI lists is server response time (which confirms the very poor response times I saw in Screaming Frog).
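For anyone who wants to script that check rather than use the web UI, a minimal sketch against what I believe is the current PageSpeed Insights API is below; the v5 runPagespeed endpoint and the 0-1 score field are assumptions on my part, not something verified in this thread.

    # Hedged sketch: query the PageSpeed Insights API (assumed v5 endpoint)
    # for desktop and mobile performance scores. The API is assumed to return
    # a score between 0 and 1, so it is multiplied by 100 to compare with the UI.
    import requests

    API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    page = "http://www.workingvoices.com/"

    for strategy in ("desktop", "mobile"):
        data = requests.get(API, params={"url": page, "strategy": strategy}, timeout=60).json()
        score = data["lighthouseResult"]["categories"]["performance"]["score"]
        print(f"{strategy}: {round(score * 100)} / 100")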
Next, a partial crawl using Screaming Frog turned up 20 URLs that returned a 404 (not found) status, which means you have internal links on your site pointing to dead ends - another quality hit. And SF found 25 internal URLs that redirect via 301 - further reinforcing crawl inefficiency. Since this was a partial crawl, those problems could be even bigger scaled across the site.
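To gauge whether those 404s and 301s are more widespread without waiting for a full crawl, a quick spot-check of the internal links on a single page could be sketched like this (Python; not a substitute for a full Screaming Frog crawl):

    # Minimal internal-link status check: pull the links from one page and
    # report any that return a 301 or 404.
    from urllib.parse import urljoin, urlparse
    from html.parser import HTMLParser
    import requests

    START = "http://www.workingvoices.com/"

    class LinkParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                href = dict(attrs).get("href")
                if href:
                    self.links.append(href)

    parser = LinkParser()
    parser.feed(requests.get(START, timeout=15).text)

    site = urlparse(START).netloc
    for href in sorted(set(parser.links)):
        url = urljoin(START, href)
        if urlparse(url).netloc != site:
            continue  # skip external links
        status = requests.head(url, allow_redirects=False, timeout=15).status_code
        if status in (301, 404):
            print(status, url)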
Then I poked around the site itself. http://www.workingvoices.com/courses/presentation-skills-training/keynote-speaker/ is indexed in Google, as it's one of your courses. That page is potentially problematic because there is hardly any content on it. So while you may think you've dealt with thin content already, I don't think you've fully grasped the need for strong, robust depth of content specific to each topic you consider important.
That's nowhere near a full audit; however, the above are all examples of issues that absolutely relate to working toward a highly trusted site from Google's algorithmic perspective.
-
Hi
It looks like you have done everything correctly, but you might have to wait for the next big Panda update before you start seeing any movement.
Thanks
Andy
Related Questions
-
Doorway page penalty
Has Google changed their interpretation of doorway pages? We do not sell widgets, but allow me to use widgets for this example. If we sold 25 very different widgets, an online vendor would typically have one "mother" website with 25 different inner pages, each page explaining one type of widget they sell. However, for the past 9 years our approach has been to have 25 different websites, one for each widget. With these 25 sites we concentrated on ranking the home page only. All these sites link back to our (noindexed) "mother" site via nofollow links, where we have our shopping cart and terms of business. We did this partly to avoid having 25 separate shopping carts and to avoid having to change our terms 25 times each time that became necessary. But yes, we also did this because it was so much easier to rank each different type of widget in the SERPs. Also, we think it's a better user experience, as in our business buyers of yellow widgets will not be interested in blue widgets.
We have been reading for years that Google does not like doorway pages, but we were not 100% certain whether they might regard our sites as such. This is because our approach worked great for nine years - that is, until December last year, when 95% of our sites fell dramatically in the SERPs, usually from page 1 to page 2 or 3. The first thing we did was go through all our sites and search for the obvious: toxic links, duplicate content, keyword density, HTTPS issues, mobility issues, anchor text, etc., and of course content. We found no obvious problems that could affect 95% of the sites at the same time, but we ordered new homepage content for most of our sites from expert SEO writers. However, after putting this new content on 3-4 weeks ago, our sites have not moved up the SERPs at all. So we are left with the inescapable conclusion that our problem is that Google sees and devalues our sites as doorway pages, especially as 95% of our sites were affected at the same time. Would any SEO experts on this forum agree, or be able to offer an opinion?
If so, what might be the solution going forward? We have 2 solutions under consideration:
1) Remove all links from each of our 25 sites to our "mother" site and put a shopping cart and our TOS on each of the 25 sites, so they are all truly independent, stand-alone websites.
2) Create 25 inner pages on our mother site (after removing the noindex), one for each of the 25 widgets we sell, then 301 each of the 25 individual sites' home pages to its inner page on the mother site. I think this might be the best solution, partly because almost all of our higher-ranking competitors are ranking their inner pages, not their homepages. But I worry whether these 25 sites will really pass much link juice if they have been devalued by Google.
Any advice will be gratefully received.
Intermediate & Advanced SEO | | apcsilver90 -
6 .htaccess Rewrites: Remove index.html, Remove .html, Force non-www, Force Trailing Slash
I need to give some information about my website environment:
1. I have static webpages in the root.
2. WordPress is installed in the subdirectory www.domain.com/blog/
3. I have two .htaccess files, one in the root and one in the WordPress folder.

I want to:
- Redirect www to non-www on all URLs
- Remove index.html from the URL
- Remove the .html extension / 301 redirect to the URL without .html
- Add a trailing slash to the static webpages / 301 redirect from the non-trailing-slash version
- Force a trailing slash on the WordPress pages / 301 redirect from the non-trailing-slash version

Some examples:
domain.tld/index.html >> domain.tld/
domain.tld/file.html >> domain.tld/file/
domain.tld/file.html/ >> domain.tld/file/
domain.tld/wordpress/post-name >> domain.tld/wordpress/post-name/

My code in the ROOT .htaccess is:

    <IfModule mod_rewrite.c>
    Options +FollowSymLinks -MultiViews
    RewriteEngine On
    RewriteBase /

    # remove the trailing slash (only when the request is not a directory)
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^(.*)/$ $1 [R=301,L]

    # www to non-www
    RewriteCond %{HTTP_HOST} ^www\.(([a-z0-9_]+\.)?domain\.com)$ [NC]
    RewriteRule .? http://%1%{REQUEST_URI} [R=301,L]

    # internally map extensionless URLs back to the .html file
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^([^.]+)$ $1.html [NC,L]

    # redirect requests for /index.html to the root
    RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP/
    RewriteRule ^index\.html$ http://domain.com/ [R=301,L]

    # strip the .html extension from requested URLs
    RewriteCond %{THE_REQUEST} \.html
    RewriteRule ^(.*)\.html$ /$1 [R=301,L]
    </IfModule>

The above code does:
1. Redirect www to non-www
2. Remove the trailing slash at the end (if it exists)
3. Remove index.html
4. Remove all .html extensions
5. Redirect 301 to the filename, but doesn't add a trailing slash at the end

Intermediate & Advanced SEO | | NeatIT
-
Interstitial Penalty?
We have an ecommerce website, and we show a popup for first time visitors to our desktop site to join our email list. Google has cached pages with the popup. Can I assume that this is a problem?
Intermediate & Advanced SEO | | AMHC0 -
Change domain whilst under a partial manual links penalty
Hi there. We're currently under a manual penalty for some unnatural links to our domain and have been working on fixing that, but we had our first reconsideration request rejected, so we're doing a second round of link removals. The issue we have is that we were planning to change our domain before the SSL certificate expires in a couple of weeks and renew the certificate with the new domain, but we are unsure whether to stop working on the reconsideration request, change the domain, and wait until the manual penalty moves to the new domain before continuing the link removal. Alternatively, we could try to use the domain change to select which links are 301'd to the new site and leave behind the bad links, in the hope that the manual penalty wouldn't be applied to the new domain. Any thoughts or advice would be appreciated.
Intermediate & Advanced SEO | | Ham19790 -
Advice on URL structure for competing against EMDs of a hot keyword
Here is the question, illustrated with an example: A law client focuses on personal injury. Their domain is nondescript. The question comes down to the URL structure for an article section of the site (I think I know what most people here will say, but want to raise this anyway). This section will have several hundred 'personal injury' articles at launch, with 100+ added each month by writers. Most articles do not mention 'personal injury' in the titles or in the content, but focus on the many areas in which people can hurt themselves :-). Spreading a single keyword emphasis across many pages/posts is considered poor form by many, but the counter-argument is that hundreds of articles, all with 'personal injury' in the URL, could increase the overall authority of the site for that term (and may compete more strongly with EMD competitors). For instance, let's say Competitor A has this article: www.acmepersonalinjury.com/articles/tips-if-in-car-accident
And we had the following options:
Option A: www.baddomain.com/articles/tips-if-in-car-accident
Option B: www.baddomain.com/personal-injury-articles/tips-if-in-car-accident
Of course, for the term "car accident", Option A seems on equal footing with the ACME competitor. But what about the overall performance of the "personal injury" keyword (a HOT keyword in this space)? Would ACME always have an advantage (however slight) due to its domain? Would Option B help in this regard? The downside, of course, is that this pushes "car accident" further down in the URL string, making all articles perhaps less competitive on their individual keywords.
Intermediate & Advanced SEO | | warpsmith0 -
Redirect advice
My website has two versions of the homepage:
http://www.nile-cruises-4u.co.uk/
http://www.nile-cruises-4u.co.uk/index.cfm
I wondered if I could set up a 301 redirect in the .htaccess file so that only the http://www.nile-cruises-4u.co.uk page was returned as the homepage?
Colin
Intermediate & Advanced SEO | | NileCruises0 -
Best practice for removing indexed internal search pages from Google?
Hi Mozzers
I know that it's best practice to block Google from indexing internal search pages, but what's best practice when "the damage is done"? I have a project where a substantial part of our visitors and income lands on an internal search page, because Google has indexed them (about 3%). I would like to block Google from indexing the search pages via the meta noindex,follow tag because:
- Google Guidelines: "Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines." http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
- Bad user experience
- The search pages are (probably) stealing rankings from our real landing pages
- Webmaster Notification: "Googlebot found an extremely high number of URLs on your site" with links to our internal search results
I want to use the meta tag to keep the link juice flowing. Do you recommend using robots.txt instead? If yes, why? Should we just go dark on the internal search pages, or how shall we proceed with blocking them? I'm looking forward to your answer!
Edit: Google has currently indexed several million of our internal search pages.
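A minimal sketch (Python, placeholder domain and search path) of the trade-off in question: if robots.txt disallows a URL, Googlebot never fetches the page, so a noindex tag on that page would never be seen.

    # Check both signals for a sample internal-search URL: is it disallowed in
    # robots.txt, and does the page itself serve a noindex directive?
    # The domain and path below are placeholders, not the poster's real site.
    from urllib import robotparser
    import requests

    site = "http://www.example-shop.com"       # placeholder domain
    search_url = site + "/search?q=widgets"    # placeholder internal-search URL

    rp = robotparser.RobotFileParser(site + "/robots.txt")
    rp.read()
    blocked = not rp.can_fetch("Googlebot", search_url)

    resp = requests.get(search_url, timeout=10)
    has_noindex = ("noindex" in resp.headers.get("X-Robots-Tag", "").lower()
                   or "noindex" in resp.text.lower())

    print(f"robots.txt blocks Googlebot: {blocked}")
    print(f"page serves a noindex signal: {has_noindex}")
    # If both are true, the noindex may never be crawled and so never honoured.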
Intermediate & Advanced SEO | | HrThomsen0 -
What is next from Google Panda and Google Penguin?
Does anyone know what we can expect next from Google Panda/Penguin? We did prepare for this latest update and so far so good.
Intermediate & Advanced SEO | | jjgonza0