Ongoing problems with Panda
-
Hey all,
I’ve been following Moz Q&A for quite a few years. As we are now facing some huge problems, I think it is time to write my first post.
Our French website http://bit.ly/1l0efCC has been struggling with Google's Penguin and Panda algorithms for years now.
Most recently, we were hit by Panda in May of this year.
Over the last few weeks we have made several changes, both on-page and off-page. Unfortunately, none of them has had any visible impact yet.
I would therefore be very grateful for any help you can provide. A recommendation for a good agency with a lot of SEO experience in the French market would also be very helpful.
Our on-page problems and what we have done so far:
- Overoptimization: During our analysis we realized that our site is strongly overoptimized. We therefore:
- Altered the titles and descriptions of all SEO pages to make them more unique and less stuffed with keywords
- Changed the URLs of our landing pages
- Changed our internal anchor texts
- Content quality: Some of our landing pages contain very little unique content and the text quality is low. To improve our content quality, we:
- Wrote new, fresh, longer and much better texts for a few hundred pages
- Set up additional landing pages for relevant keywords with unique content
- Moved our blog from a subdomain to a folder
- Irrelevant content: Our system creates many pages with irrelevant content. To reduce the number of irrelevant pages indexed by Google, we aggregated our customer feedback pages and set up canonicals on thousands of single-customer-feedback pages that point to the aggregated pages.
What we did off-page:
- We invested weeks analyzing each and every backlink to our site, and in the end we had to remove a huge number of those links.
All of this doesn't seem to be enough, though.
There are some other things we are working on right now, but we are running out of ideas:
- We want to gather all indexable pages and compare them to the pages that have received any organic traffic at all in the past 3 months, to identify more useless pages. We will then remove, deindex or revise them.
- We are still improving the content on our existing pages.
- The structure of our rendered pages differs from the source order: we placed the most SEO-relevant parts of the page at the top of the source code and then repositioned them on the rendered page via CSS (see the sketch after this list). Do you think this might send a spam signal to Google?
- We are trying to improve our design, but as we need to run a bunch of A/B tests before we can relaunch the site, this will take some time.
- We will change our internal link structure so that every page has fewer links and a stronger thematic connection between the linked pages.
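For clarity, here is a minimal sketch of what I mean by the CSS repositioning (the class names and copy are placeholders, not our real markup):

```html
<!-- Hypothetical example: the SEO copy comes first in the source,
     but flexbox ordering displays it below the booking form. -->
<div style="display: flex; flex-direction: column;">
  <div class="seo-copy" style="order: 2;">Unique landing page text about car rental…</div>
  <div class="booking-form" style="order: 1;">Search and booking widget…</div>
</div>
```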
I’m looking forward to your tips and ideas!
Regards,
Sandra
-
Hi Sandra,
I apologize it has taken me this long to get back to you about this post.
It looks like you are in good hands and it is all settled.
Sincerely,
Thomas
-
Thank you for your help!
We are currently auditing our pages and have found thousands of pages that should be noindexed or improved. We will tackle that issue right away.
LG
Sandra
-
Hi Sandra,
Greetings from Austria! (I see you are based in Germany.)
Just a few comments related to your post:
1. The organic visibility does indeed show some drops that overlap with both Panda and Penguin. However, you should also cross-reference the visibility data with GA data to see what you rank for but get no traffic for, as this is a strong Panda signal. Organic visibility: http://screencast.com/t/0T0OdPBR As far as ranking distribution goes, it looks bad, but I've seen much worse, so it's not a lost cause. ( http://screencast.com/t/z1wfh0vEjNbK )
2. With Panda you need to identify the pages with issues. Improving some pages ("new, fresh, longer and much better texts for a few hundred pages") is fine, but it might just not do the trick. First you need to slice the site up and split it into chunks by page type (thin content, idle pages, low CTR with very high impressions, ratio of boilerplate vs. content, downstream traffic, etc.). Then improve, delete or noindex those pages. After that, consolidate the rest as much as you can. New landing pages are good, as they can change the threshold for Panda...
3. Over-optimization is not a Panda issue. You can work on it if it makes sense for the site overall, and it may overlap with some Panda signals, but my personal opinion is that it should not be the main focus right now.
4. Panda is "mainly" on-page: by mainly I don't mean links; I mean it's a bigger picture, as it's about the SERPs and about competitors. It's about being better than the other sites targeting the same terms - it's not about being perfect. Penguin, on the other hand, is about spam: on-page and off-page (unnatural links). If you do clean your link profile, make sure you clean it based on some good metrics. Also include the links you remove in the disavow file. The disavow file will only be processed when Google rolls out Penguin again. Until then, don't expect any changes (unless you also have a manual action; when that is processed, the disavow will also be processed).
I would say Panda is easy to fix in your case, as it doesn't look like a "kill shot" anyway. Penguin, on the other hand, will probably be harder, but since we'll see another update soon, you should hurry and hope you catch this train; otherwise you might need to wait for months and months...
LG
***I bet some US folks would think that those (LG) are my initials
-
Hey all,
is there anyone who has further ideas or tips? Has anyone lifted a Panda penalty recently who could share some experiences?
-
Hi Travis,
thank you for your quick reply.
We have already flagged the session (?) parameters in GWT.
As for the feedback pages:
http://www.locationdevoiture.fr/location-voiture-Luxembourg-LU-evaluation-4.html contains a canonical pointing to http://www.locationdevoiture.fr/location-voiture-LU-evaluation.html
-
I probably should have crawled the site this morning. I thought it would probably be a while before some problems became evident via crawler. I was incorrect.
There are at least a couple more problems on-site.
First, session IDs in the URL. Yes, they're redirected - but to the lower-case version of the URL, still with the session ID. The session-ID URLs are set to index, follow. There isn't a canonical link element on the instances I have seen. This happens on a number of the URLs I have crawled thus far. Google doesn't appear to be indexing them, but that doesn't mean they aren't counting them against the site.
This is likely happening thousands of times.
Further, you can't be certain that the site is getting fully indexed regularly. So much crawl budget is being wasted crawling these duplicate content redirects to more duplicate content.
- http://www.locationdevoiture.fr/Camping-Car-nouvelle-zelande.html?PHPSESSID=1c7622jkmeoe0hqko8nv2lo584
- http://www.locationdevoiture.fr/camping-car-nouvelle-zelande.html?PHPSESSID=1c7622jkmeoe0hqko8nv2lo584
Recommendation: Immediately set noindex, follow on URLs with session IDs. Point a canonical tag at the rightful page. Check Google Analytics for traffic/referrers to pages with session IDs. Referrers with session IDs indicate pages you will want to 301 to the canonical version because of backlinks (e.g. www.locationdevoiture.fr/camping-car-nouvelle-zelande.html). Anything that doesn't get traffic/referrals/backlinks - set it to a 410 status code.
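As a rough sketch of the first two steps (assuming you can inject tags into the head of the session-ID URLs), the markup would look something like this:

```html
<!-- Sketch for a session-ID URL such as
     /camping-car-nouvelle-zelande.html?PHPSESSID=...
     Keep it out of the index, let crawlers follow its links,
     and point the canonical at the clean URL. -->
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="http://www.locationdevoiture.fr/camping-car-nouvelle-zelande.html">
```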
You're going to want something like Deep Crawl for a site of this size. Sure, you can do it with Screaming Frog - but you will need to customize its RAM usage. If you're not comfortable with that - you will probably save a lot of time using Deep Crawl.
You may find it advantageous to ask webmasters to correct the links. You can also tell Google via Google Webmaster Tools to ignore the query (?) parameter. But I'm not sure whether that will affect inbound links to pages with session IDs, especially at this scale. So you would likely want to wait until a thorough page-by-page link audit/correction campaign has been done, just to be safe.
Second: Many of the URLs have been changed. It seems like "location voiture city" was fairly close to what you want - even though those are competitive queries. Now a lot of URLs are "louer city". In my opinion, losing voiture from the slug was probably not the best move. Even if one were relying on voiture in the root domain, I still think having a keyword in the slug is not only okay, but quite necessary.
Still "louer voiture" appears to be even more competitive - with ~13.4 million results, while "location voiture" has ~2.6 million. Either way, you're still competing with Avis and all the others. It would make sense to populate the lesser set of results with higher query volume.
Third: It appears some of the review pages are mostly duplicate content. Have you considered an iframe for reviews on some pages (see the sketch below the examples)? That would further reduce the amount of duplicate content on the page.
Examples:
- http://www.locationdevoiture.fr/location-voiture-LU-evaluation.html
- http://www.locationdevoiture.fr/location-voiture-Luxembourg-LU-evaluation-4.html
Do you actually need both pages anyway?
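If both pages do need to show the reviews, one rough way to do it would be to keep the reviews on a single page and merely embed them elsewhere, so the text isn't repeated in each page's own HTML (dimensions below are placeholders):

```html
<!-- Sketch: embed the canonical reviews page instead of duplicating its text. -->
<iframe src="http://www.locationdevoiture.fr/location-voiture-LU-evaluation.html"
        title="Customer reviews - Luxembourg"
        width="100%" height="400"></iframe>
```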
It also appears I'm having problems getting the site to return a 404. It would be a good idea to let that happen, and give the user some search options on the error page that are properly blocked from search engines.
Finally, I don't think the loss of the links around April was the root cause. It may have contributed slightly, but your Panda hunch seems to be the culprit. So apologies for the misdirection. And if it's in the budget - consider on-page videos with proper semantic markup and transcripts, where applicable.
However, I have attached a screenshot - just to show you the settings of Majestic SEO. I like using it for cursory examination - then I dive deeper with multiple backlink data sources.
I think my recommendations will help. But if anyone feels I'm incorrect - feel free to lash me with a wet noodle.
-
Hi,
thank you very much for your tips.
@Travis: This text link generator produces links for affiliates. These links are nofollow. Where did you get the information that we lost a lot of links? We can't see that in our tools.
@MoosaHemani: The problem isn't that we don't know what hit us. We were hit by Panda in May and we were also hit by Penguin before. We did several things to get out of those filters, but none of our ideas has been successful yet.
-
That's odd, since the Silver Tours GmbH site (billiger-mietwagen.de) seems to be doing better than ever on Google.de. They even have a page that refers to automated text link generation... hmmm.
"Mit unserem Textlink-Generator erstellen Sie Textlinks, die Ihre Nutzer direkt zu unseren Mietwagen-Angeboten in einem bestimmten Land oder einer bestimmten Stadt führen." (Roughly: "With our text link generator you create text links that lead your users directly to our rental car offers in a specific country or city.")
Your site also has a site-wide link from this improbable powerhouse. What happened to cheaper-mietwagen.de? That would be an interesting thing to know. It seems like the parent company owns a lot of domains. The problem may be bigger than your site.
Whois information can also be used to judge the quality of a site, and the parent company does appear to own a large number of domains, so I would take a look there as well. I just don't have the time this morning.
Also, you may want to consider what the site lost prior to the drop. Around March/April the site lost thousands of links. Granted, a lot of the links were comment/forum spam, but thousands of links are thousands of links. But it was probably for the best in the long run.
I really wish I could have given this more time - but now I'm running late. It would be nice if my German language skills were better. The same goes for French, I suppose.
-
Hello Sandra,
If I were in your place, I would go through the following process.
Step 1: Go through Google Analytics and check on which dates the traffic drops dramatically, then compare those dates with the algorithm update timeline to see exactly which update hit the website.
If it is Panda, the problem is most probably the content of the website; in the case of Penguin, the problem is the backlinks pointing to the website.
In the case of a Penguin update:
Collect all the backlinks from Google Webmaster Tools, Open Site Explorer, Open Link Profiler, Ahrefs.com and Majestic SEO, then run them through Link Detox to see how many of them are bad links (to complete the list of bad links, you also need to do a manual audit).
Once you have the list of bad links, remove as many as possible by manually reaching out to the linking sites, and then update the link disavow file accordingly.
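For reference, the disavow file is just a plain text file uploaded in Google Webmaster Tools; a minimal sketch with made-up example domains and URLs:

```
# Contacted the webmaster twice, no response - disavowing the whole domain
domain:spammy-directory-example.com

# Single comment-spam URL we could not get removed
http://spam-forum-example.net/thread/1234
```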
In the case of a Panda update:
If you have been hit hard by a Panda update, chances are you need to completely redevelop the content of the website, as Panda usually strikes when you have low-quality or thin content on the site.
What you should ideally be doing:
- Recheck which update you are facing at the moment
- In the case of Penguin, follow the process given above
- Build quality links on a continuous basis so that Google gets a signal that you are working on quality stuff.
- In the case of a Panda update, redevelop the content of the website to make it user-friendly and unique
- Try to get as many social shares of your website content as possible.
- Try to plan and execute a strategy that gives Google a hint of your quality work.
If you have a manual penalty, you should probably work on getting it lifted as step 1, and later work on aggressive promotion of your website and on getting quality, relevant links to regain Google's attention and rankings for the desired key phrases.
This might take a few months, but you will get the rankings and traffic back.
Hope this helps!