Panda recovery. Is it possible?
-
Dear all,
To begin, English is not my native language, so I'm very sorry if I make some mistakes.
On 23 March 2012, Panda hit my website (a coupon website), Avec-Reduction (dot com for the URL). Since that date, I have lost more than 70% of my traffic.
The structure of the website was like an e-commerce site: categories -> merchant page -> coupon page. The content was too thin for Google; I agree with that.
So, in May, I made a new version. Here are the most important modifications:
- A smaller header (100px less in height).
- A 2-column layout (the old website had 3 columns).
- I deleted the category menu with the list of all categories, and the alphabetical menu.
- Fewer ads on the website (a few days ago I also deleted the 2 AdSense blocks).
- The coupons used to be promoted with the merchants' thumbnails on the home listing. Now I have a few top lists in text only.
- I deleted all the category pages (one page per merchant category, listing all the merchants in that category). Now I have only one page for this. Same thing with the alphabetical pages. All the deleted pages have a 301 redirect. The 2 new pages (category page and alphabetical page) are set to noindex.
- I deleted all the promo code pages; all the coupons are now on the merchant page (301 redirects used).
- I created an anti-spam system for the code review forms (I had a lot of spam through these forms, even though I cleaned them every day or two). Now I have no spam.
- Visitors now have the possibility to leave a rating and a review for each merchant. This functionality is new, so there are not a lot of reviews for the moment.
- All the merchant pages without promo codes have noindex in the robots meta tag.
- Since July, I have been able to customize the "title" of each promo code. At the same time, to have more content, I can add sales or great promos for each merchant, not only promo codes.
- Affiliate links are created in JS and open a new window (a redirect page with noindex).
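For example, the redirect page and the affiliate link look roughly like this (the merchant name and URL below are made up, just to illustrate; this is a sketch, not my exact code):

```html
<!-- Redirect page (e.g. /go/example-merchant): kept out of the index -->
<meta name="robots" content="noindex">

<!-- On the merchant page: the affiliate link is opened via JS in a new window,
     so the crawler sees no direct href to the affiliate URL -->
<a href="#" onclick="window.open('/go/example-merchant'); return false;">See the offer</a>
```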
Those are the most important changes to my website. I also have better page speed (halved since July) because I optimized my images, CSS, JS...
At the end of July, I had health problems and the website was not updated until the first days of October. Now the website is updated every day, but between July and October there was no Panda recovery.
I have no duplicate content, and I try to add as much content as I can, so I don't understand why Google Panda penalizes me again and again. Some of my competitors have a lot of keyword stuffing (4, 5, 6, even 10 lines of it on each merchant page). Some of them have only affiliated merchants, automatic scripts to put coupons on their websites, several near-identical websites...
Less than 30% of my merchants are affiliated, I validate all the coupons and promos manually, and I personalize all my coupons... So I don't understand what to do.
I would appreciate any help. If you see problems on my website, or if you know tips for a Panda recovery, I will be very happy to have that information.
Many thanks to all.
Sincerely,
Florent
-
Dear Edward,
You are right. I have seen some US/UK coupon websites with information about the company or the website (payment methods, shipping methods...). I think it's a good idea for better content. In France, there are not a lot of coupon websites with this type of information. The market in our country is significant, but smaller than in countries like the USA. So a lot of websites are made automatically; in a few days some webmasters build several coupon websites, and they make money because they have a powerful network.
I think I will use this option in the near future. With 2,600 merchants it will take a long time to add this information, but if it can lead to a Panda recovery, I think it's not hard work, just necessary.
Thanks for your help.
Sincerely,
F.
-
I've worked with a few coupon/promo code sites since the launch of Penguin, with some success and some failure. The biggest issues I've found across coupon sites are a lack of truly original content and very thin content, with pages frequently saying the same thing as other pages but slightly re-worded. Duplicate content issues are usually common as well.
Ex: "Check out our coupon codes for [company/website]...[more filler text here]."
One strategy that seems to be fairly effective (and logical) for such sites is filling the retailer coupon pages with information relevant to the coupons (which obviously vary) as well as the company. Ex: company history, background, etc. -- content that's truly unique from page to page.
-
Dear eyepaq,
Many thanks for your reply, that's great.
As you say, I'm sure my changes are good for the future of the site, even with regard to the Panda filter (sorry for having used the word "penalty"; in France we use the word "filtre", and it's difficult to write in another language :p).
"You don't have to be good to stay safe or recover from Panda; you need to be better than the rest of the sites that are covering the same niche."
I'm OK with that; it's the only point where I have no idea how to do better. I have looked at all the most important US coupon websites for ideas, but they are too big for me. Technically they are better, with better design, and I think they have a lot of people working on the website every day. In France, there are fewer competitors: 5 big ones, and all the others are very simple websites like mine.
"Panda is only about content (site speed, affiliate format, links, etc. are not taken into account; those have other filters and penalties)."
I know, but having better speed is good for visitors. I also think it's good to show Google that page speed is important to the website.
"Spot the ones that are still up and try to improve your content in order to be better than them, or take a different approach to the same niche/area in order to claim one of the diversity spots. That way you will get back."
I will work on this :).
"First assess the situation and see if you can be better, in terms of content, than those that are still on top in visibility and that target more or less the same content as you do. If you can beat them, change your content strategy and approach the same content with a different format and flow in order to make it into the top results as one of the sites that are part of the diversity results."
OK, but it's very difficult. There are 2 websites in my niche with better traffic than mine. Why these 2? Only because they have a simple design and not a lot of options on the merchant pages. These 2 websites have problems that I don't have, yet no Panda filter and better traffic. The reason? They are older than mine and they have a lot of links (one has more than 1 million). It's not very clean, but they rank well.
One last question. Do you think it would be better for me to "crypt" the coupon codes (in the HTML source)? Why? Because Google can see that we all have the same codes. If I use an encoded code, perhaps it will be easier to say "I have unique content"? Do you think that's a good idea?
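To show what I mean by "crypt" (this is only a sketch of one possible scheme, Base64; it is not something I have implemented, and "SAVE20" is a made-up example code): the coupon would be stored encoded in the HTML and decoded in JS for the visitor. In Python, the encoding step looks like this:

```python
import base64

def encode_coupon(code: str) -> str:
    # Store this encoded string in the HTML instead of the plain coupon code.
    return base64.b64encode(code.encode("utf-8")).decode("ascii")

def decode_coupon(encoded: str) -> str:
    # In practice the decoding would happen client-side in JS (e.g. with atob()).
    return base64.b64decode(encoded.encode("ascii")).decode("utf-8")

print(encode_coupon("SAVE20"))
print(decode_coupon(encode_coupon("SAVE20")))
```

Of course, the encoded string would still be identical across sites using the same scheme, so this only hides the literal text; it does not make the content unique.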
Once again, many thanks for your post. You have been very clear, and you have given me another way of looking at my problem :).
Sincerely,
Florent
-
Hi Florent,
All the changes you have made are very good and will certainly help your site, but not necessarily with the Panda issue.
When talking about Panda, you need to consider a few things:
- Panda is not a penalty; it's a filter (a very important difference).
- You don't have to be good to stay safe or recover from Panda; you need to be better than the rest of the sites that are covering the same niche.
- Panda is only about content (site speed, affiliate format, links, etc. are not taken into account; those have other filters and penalties).
So, if you are 100% sure that Panda is to blame for your drop in rankings, you need to compare yourself with the competition first and see how you can be better than them.
Just put yourself in Google's shoes: if you have 10 sites in the same niche with more or less the same content, you want to keep 1 or 2, populate the rest of the results with diverse results, and move everything else down 50 spots or whatever.
If you are not in that set of 1 or 2, then you are one of the ones that just got moved back... way back (down).
Spot the ones that are still up and try to improve your content in order to be better than them, or take a different approach to the same niche/area in order to claim one of the diversity spots. That way you will get back.
First assess the situation and see if you can be better, in terms of content, than those that are still on top in visibility and that target more or less the same content as you do. If you can beat them, change your content strategy and approach the same content with a different format and flow in order to make it into the top results as one of the sites that are part of the diversity results.
Hope it helps. Is it clear, or am I beating around the bush?
Cheers !