Panda recovery. Is it possible?
-
Dear all,
To begin, English is not my native language, so I'm sorry if I make some mistakes.
On 23 March 2012, Panda hit my website, Avec-Reduction (dot com for the URL), a coupon website. Since that date, I have lost more than 70% of my traffic.
The structure of the website was like an e-commerce site: categories -> merchant page -> coupon page. The content was too thin for Google, I agree with that.
So, in May, I made a new version. Here are the most important modifications:
- A smaller header (100px less in height).
- A 2-column layout (the old website had 3 columns).
- I deleted the category menu listing all the categories, as well as the alphabetical menu.
- Fewer ads on the website (a few days ago I also removed the 2 AdSense blocks).
- Coupons used to be promoted with merchant thumbnails in the home-page listing; now I have a few text-only top lists.
- I deleted all the category pages (one page per merchant category, listing all the merchants in that category). Now I have only one page for this. Same thing for the alphabetical pages. All the deleted pages have a 301 redirect. The 2 new pages (category page and alphabetical page) are set to noindex.
- I deleted all the promo code pages; all the coupons are now on the merchant page (301 redirects used).
- I created an anti-spam system for the code review forms (I had a lot of spam on these forms, even though I cleaned them every day or two). Now I have no spam.
- Visitors can now leave a rating and a review for each merchant. This functionality is new, so there are not many reviews for the moment.
- All merchant pages without promo codes have noindex in the robots meta tag.
- Since July, I can use the "title" of each promo code and personalize it. At the same time, to have more content, I can add sales or great promos for each merchant, not only promo codes.
- Affiliate links are created with JS and open a new window (via a redirect page that has noindex); a rough sketch of this is just below the list.
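To show what I mean by that last point, here is a simplified sketch of the JS affiliate-link handling (the attribute name and the "/go/" URLs are only illustrative examples, not my exact code):

```typescript
// Each coupon link carries a data attribute pointing to an internal redirect
// page (e.g. "/go/1234"). That page is served with a noindex robots tag and
// then forwards the visitor to the merchant's affiliate URL, so the affiliate
// destination never appears directly in the page's HTML.
document.querySelectorAll<HTMLElement>("[data-go]").forEach((el) => {
  el.addEventListener("click", (event) => {
    event.preventDefault();
    const target = el.dataset.go; // internal redirect page, not the raw affiliate link
    if (target) {
      window.open(target, "_blank", "noopener"); // open the offer in a new window
    }
  });
});
```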
Those are the most important changes on my website. Page speed is also better (cut in half since July) because I have optimized my images, CSS, JS...
At the end of July I had a health problem and the website was not updated until the first days of October. Now the website is updated every day, but between July and October I saw no Panda recovery.
I have no duplicate content and I try to add as much content as I can, so I don't understand why Google Panda keeps filtering me. Some of my competitors do a lot of keyword stuffing (4, 5, 6, even 10 lines of it on each merchant page). Some of them only list affiliate merchants, use automatic scripts to push coupons onto their websites, and run several near-identical sites...
Less than 30% of my merchants are affiliated, I validate all the coupons and promos manually, and I personalize all my coupons... So I don't understand what to do.
I would appreciate any help. If you see problems on my website, or if you know tips for a Panda recovery, I would be very happy to hear them.
Many thanks to all.
Sincerely,
Florent
-
Dear Edward,
You are right. I have seen some US/UK coupon websites with information about the company or the website (payment methods, shipping methods...). I think it's a good idea for building better content. In France there are not many coupon websites with this type of information. The market in our country is significant, but smaller than in a country like the USA. So a lot of websites are built automatically: in a few days some webmasters launch several coupon websites and make money because they have a powerful network behind them.
I think I will use this approach in the near future. With 2,600 merchants it will take a long time to add this information, but if it can lead to a Panda recovery, I think it's not hard work, just necessary work.
Thx for your help.
Sincerely,
F.
-
I've worked with a few coupon/promo code sites since the launch of Penguin, with some success and some failure. The biggest issues I've found across coupon sites are a lack of truly original content and very thin content, with pages frequently saying the same thing as other pages but slightly reworded. Duplicate content issues are usually common as well.
Ex: "Check out our coupon codes for [company/website]...[more filler text here]."
One strategy that seems to be fairly effective (and logical) for such sites is filling the retailer coupon pages with information relevant to the coupons (which obviously vary) as well as the company. Ex: company history, background, etc. -- content that's truly unique from page to page.
-
Dear eyepaq,
Many thanks for your reply, that's great.
As you say, I'm sure my changes are good for the future, so they should help when the Panda filter refreshes (sorry for having used the word "penalty"; in France we use the word "filtre", and it's difficult to write in another language :p).
"You don't have to be good to stay safe or recover from Panda; you need to be better than the rest of the sites covering the same niche."
I'm OK with that; it's the only point where I don't know how to be better. I have looked at all the biggest US coupon websites for ideas, but they are too big for me. Technically they are better, with better designs, and I think they have a lot of people working on the website every day. In France there are fewer competitors: 5 big ones, and all the others are very simple websites like mine.
"Panda is only about content (site speed, affiliate format, links, etc. are not taken into account; for those there are other filters and penalties)."
I know, but better speed is good for visitors, and I think it's good to show Google that page speed matters for the website.
"Spot the ones that are still up and try to improve your content so it is better than theirs, or take a different approach to the same niche/area in order to win one of the 'diversity' spots. That way you will get back."
I will work on this :).
"First assess the situation and see if you can be better, in terms of content, than those that are still on top in visibility and that target more or less the same content as you do. If you can't beat them, change your content strategy and approach the same content with a different format and flow, so you can make it into the top results as one of the sites that are part of the diversity results."
OK, but it's very difficult. I can point to 2 websites in my niche that have better traffic than me. Why these 2 websites? Only because they have a simple design and not a lot of options on the merchant pages. These 2 websites have problems that I don't have, yet no Panda filter and better traffic. The reason? They are older than mine and they have a lot of links (one has more than 1 million links). So it's not very clean, but they rank well.
One last question. Do you think it would be better for me to "encrypt" the coupon codes (in the HTML source)? Why? Because Google can see that we all have the same codes. If I use an encoded code, perhaps it would look more like "I have unique content"? Do you think that's a good idea? (A rough sketch of what I mean is below.)
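To be clear about what I mean by "encrypting" the code, here is a rough illustration of the kind of pattern I am thinking of (purely an example, with made-up class and attribute names, not something I have implemented):

```typescript
// The plain-text coupon code never appears in the static HTML: it is stored
// Base64-encoded in a data attribute and only decoded in the browser when the
// visitor clicks a "show code" button.
function revealCode(button: HTMLElement): void {
  const encoded = button.dataset.code; // e.g. "UFJPTU8xMA==" for "PROMO10"
  if (!encoded) return;
  button.textContent = atob(encoded); // decode and display the real code
}

document.querySelectorAll<HTMLElement>(".reveal-code").forEach((btn) => {
  btn.addEventListener("click", () => revealCode(btn));
});
```

Google would only see the encoded string in the HTML, while the visitor would still get the real code after one click.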
Once again, many thanks for your post. You are very clear, and you have given me another way of looking at my problem :).
Sincerely,
Florent
-
Hi Florent,
All the changes you made are very good and will certainly help your site - but not necessarily with the Panda issue.
When talking about Panda you need to consider a few things:
- Panda is not a penalty; it's a filter (a very important difference).
- You don't have to be good to stay safe or recover from Panda; you need to be better than the rest of the sites covering the same niche.
- Panda is only about content (site speed, affiliate format, links, etc. are not taken into account; for those there are other filters and penalties).
So, if you are 100% sure that Panda is to blame for your drop in rankings, you need to compare yourself with the competition first and see how you can be better than them.
Just put yourself in Google's shoes: if there are 10 sites in the same niche with more or less the same content, you want to keep 1 or 2, populate the rest of the results with diverse results, and push everything else down 50 positions or whatever.
If you are not in that set of 1 or 2, then you are one of the ones that just got moved back - way back (down).
Spot the ones that are still up and try to improve your content so it is better than theirs, or take a different approach to the same niche/area in order to win one of the 'diversity' spots. That way you will get back.
First assess the situation and see if you can be better, in terms of content, than those that are still on top in visibility and that target more or less the same content as you do. If you can't beat them, change your content strategy and approach the same content with a different format and flow, so you can make it into the top results as one of the sites that are part of the diversity results.
Hope it helps. Is it clear, or am I beating around the bush?
Cheers!