Panda recovery: is it possible?
-
Dear all,
To begin, English is not my native language, so I'm very sorry if I make some mistakes.
On 23 March 2012, Panda hit my website, Avec-Reduction (dot com for the URL), a coupon website. Since that date, I have lost more than 70% of my traffic.
The structure of the website was like an e-commerce site: categories -> merchant pages -> coupon pages. The content was too thin for Google, I agree with that.
So, in May, I released a new version. Here are the most important modifications:
- A smaller header (100px shorter).
- A 2-column layout (the old website had 3 columns).
- I deleted the category menu (the list of all categories) and the alphabetical menu.
- Fewer ads on the website (a few days ago I also deleted the 2 AdSense blocks).
- Coupons used to be promoted with merchant thumbnails in the home page listing; now I have a few text-only top lists.
- I deleted all the category pages (one page per merchant category, listing all the merchants in that category) and replaced them with a single page. I did the same with the alphabetical pages. All the deleted pages have a 301 redirect, and the 2 new pages (category page and alphabetical page) are set to noindex.
- I deleted all the promo code pages; all the coupons are now on the merchant pages (301 redirects used).
- I created an anti-spam system for the coupon review forms (I had a lot of spam on these forms, even though I cleaned them every day or two). Now I have no spam.
- Visitors can now leave a rating and a review for each merchant. This functionality is new, so there are not many reviews yet.
- All merchant pages without promo codes have a noindex in the robots meta tag.
- Since July, I can set a "title" for each promo code, so I can personalize them. At the same time, to have more content, I can now add sales and special promotions for each merchant, not only promo codes.
- Affiliate links are generated with JavaScript and open in a new window (via a redirect page set to noindex).
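To show what I mean by the last point, here is a minimal sketch of the affiliate-link pattern (the `/go/` path, the `data-merchant` attribute, and the function names are just assumptions for illustration, not my actual code):

```javascript
// Hypothetical sketch of the JS affiliate-link setup described above.
// The raw affiliate URL never appears in the crawled HTML: the visible
// link points to an internal redirect page that carries
// <meta name="robots" content="noindex, nofollow">.

// Build the URL of the noindexed redirect page for a merchant slug.
function redirectUrl(merchantSlug) {
  return '/go/' + encodeURIComponent(merchantSlug);
}

// Attach a click handler that opens the redirect page in a new window
// instead of exposing the affiliate URL in the markup.
function attachAffiliateHandler(link) {
  link.addEventListener('click', function (event) {
    event.preventDefault();
    window.open(redirectUrl(link.dataset.merchant), '_blank');
  });
}
```

The redirect page itself then does the server-side hop to the affiliate network.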
Those are the most important changes to my website. Page speed is also better (load times halved since July) because I optimized my images, CSS, JS...
At the end of July I had health problems, and the website was not updated until the first days of October. Now the website is updated every day, but between July and October there was no Panda recovery.
I have no duplicate content, and I try to add as much content as I can, so I don't understand why Google Panda keeps hitting me again and again. Some of my competitors have a lot of keyword stuffing (4, 5, 6, even 10 lines of it on each merchant page). Some of them have only affiliated merchants, automatic scripts to put coupons on their websites, several near-identical websites...
Less than 30% of my merchants are affiliated, I validate all the coupons and promos manually, and I personalize all my coupons... So I don't understand what to do.
I would appreciate any help. If you see problems on my website, or if you know tips for a Panda recovery, I would be very happy to hear them.
Many thanks to you all.
Sincerely,
Florent
-
Dear Edward,
You are right. I have seen some US/UK coupon websites with information about the company or the store (payment methods, shipping methods...). I think it's a good way to have better content. In France, there are not many coupon websites with this type of information. The market in our country is significant, but smaller than in countries like the USA. So a lot of websites are built automatically; in a few days some webmasters can launch several coupon websites, and they make money because they have a powerful network behind them.
I think I will take this approach in the near future. With 2,600 merchants it will take a long time to add all this information, but if it makes a Panda recovery possible, it's not hard work, just necessary.
Thanks for your help.
Sincerely,
F.
-
I've worked with a few coupon/promo-code sites since the launch of Penguin, with some success and some failure. The biggest issue I've found across coupon sites is a lack of truly original content and very thin content, with pages frequently saying the same thing as other pages, just slightly reworded. Duplicate content issues are common as well.
Ex: "Check out our coupon codes for [company/website]...[more filler text here]."
One strategy that seems to be fairly effective (and logical) for such sites is filling the retailer coupon pages with information relevant to the coupons (which obviously vary) as well as the company. Ex: company history, background, etc. -- content that's truly unique from page to page.
-
Dear eyepaq,
Many thanks for your reply, that's great.
Like you say, I'm sure my changes are good for the future, even if they don't directly affect the Panda filter (sorry for having used the word "penalty"; in France we use the word "filtre", and it's difficult to write in another language :p).
"You don't have to be good to stay safe or recover from Panda; you need to be better than the rest of the sites covering the same niche."
I'm OK with that; it's the only point where I have no idea how to do better. I have looked at all the most important US coupon websites for inspiration, but they are too big for me. Technically they are better, with better design, and I think they have many people working on the website every day. In France there are fewer competitors: 5 big ones, and all the others are very simple websites like mine.
"Panda is only about the content (site speed, affiliate format, links etc. are not taken into account - for those there are other filters and penalties)."
I know, but having better speed is good for visitors, and I think it's good to show Google that page speed is important to us.
"Spot the ones that are still up and try to improve your content to be better than them, or take a different approach to the same niche/area in order to claim one of the diversity spots. That way you will get back."
I will work on this :).
"First assess the situation and see if you can be better - in terms of content - than those that are still on top in visibility and are targeting more or less the same content as you. If you can't beat them, change your content strategy and approach the same content in a different format and flow, in order to make it into the top results as one of the sites that are part of the diversity results."
OK, but it's very difficult. I follow 2 websites in my niche that have better traffic than mine. Why these 2? Only because they have a simple design and not many options on the merchant pages. These 2 websites have problems that I don't have, yet no Panda filter and better traffic. The reason? They are older than mine and they have a lot of links (one has more than 1 million). So it's not very clean, but they rank well.
One last question: do you think it would be better for me to "crypt" the coupon codes (in the HTML source)? Why? Because Google can see that we all have the same codes. If I obfuscate each code, perhaps it will be easier to say "I have unique content"? Do you think that's a good idea?
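To illustrate the idea (just a hypothetical sketch, not something I have implemented): the coupon string would be stored base64-encoded in the HTML and only decoded by JavaScript when the visitor clicks, so the raw code never appears verbatim in the crawled source:

```javascript
// Hypothetical sketch: keep the coupon code base64-encoded in a data
// attribute and decode it on click, so the literal code string is not
// visible in the HTML that Google crawls. Note this only hides the
// string; it does not make the page's content genuinely unique.

// Server-side encoding step (btoa(code) in the browser).
function encodeCoupon(code) {
  return Buffer.from(code, 'utf8').toString('base64');
}

// Client-side decoding step (atob(encoded) in the browser).
function decodeCoupon(encoded) {
  return Buffer.from(encoded, 'base64').toString('utf8');
}
```

Of course, whether Google would treat an obfuscated code as "unique content" is exactly my question.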
Once again, many thanks for your post. You are very clear, and you give me another way of looking at my problem :).
Sincerely,
Florent
-
Hi Florent,
All the changes that you made are very good and they will certainly help your site - but not necessarily with the Panda issue.
When talking about Panda, you need to consider a few things:
- Panda is not a penalty - it's a filter (a very important difference).
- You don't have to be good to stay safe or recover from Panda; you need to be better than the rest of the sites covering the same niche.
- Panda is only about the content (site speed, affiliate format, links etc. are not taken into account - for those there are other filters and penalties).
So, if you are 100% sure that Panda is to blame for your drop in rankings, you need to compare yourself with the competition first and see how you can be better than them.
Just put yourself in Google's shoes: if there are 10 sites in the same niche with more or less the same content, you want to keep 1 or 2, fill the rest of the results with diverse sites, and push everything else down 50 spots or whatever.
If you are not in that set of 1 or 2, then you are one of the ones that just got moved back - way back (down).
Spot the ones that are still up and try to improve your content to be better than them, or take a different approach to the same niche/area in order to claim one of the diversity spots. That way you will get back.
First assess the situation and see if you can be better - in terms of content - than those that are still on top in visibility and are targeting more or less the same content as you. If you can't beat them, change your content strategy and approach the same content in a different format and flow, in order to make it into the top results as one of the sites that are part of the diversity results.
Hope it helps. Is it clear, or am I beating around the bush?
Cheers!