Ongoing problems with Panda
-
Hey all,
I’ve been following Moz Q&A for quite a few years. As we are now fighting some huge problems, I think it is time to write my first post.
Our French website http://bit.ly/1l0efCC has been struggling with Google’s Penguin and Panda algorithms for years now.
Most recently, we were hit by Panda in May this year.
Over the last weeks we have made several changes, both onpage and offpage. Unfortunately, none of them has shown any impact yet.
I would therefore be very grateful for any help you can provide. A recommendation for a good agency with extensive SEO experience in the French market would also be very helpful.
Our onpage problems and what we did so far:
- Overoptimization: During our analysis we realized that our site is strongly overoptimized. We therefore:
- Altered the titles and descriptions of all SEO pages to make them more unique and less stuffed with keywords
- Changed the URLs of our landing pages
- Changed our internal anchor texts
- Content quality: Some of our landing pages contain little unique content, and the text quality is low. To improve our content quality, we
- Wrote new, fresh, longer, and much better texts for a few hundred pages
- We set up additional landing pages for relevant keywords with unique content
- We moved our blog from a subdomain to a folder
- Irrelevant content: Our system creates many pages with irrelevant content. To reduce the number of irrelevant pages indexed by Google, we aggregated our customer feedback pages and added canonical tags to thousands of single-customer-feedback pages, pointing them at the aggregated pages.
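A minimal sketch of the canonical mapping described above, assuming the per-city feedback-page URL pattern shown elsewhere in this thread (the real URL scheme and rules may differ):

```python
import re

def canonical_for_feedback(url: str) -> str:
    """Map a paginated per-city feedback URL to its aggregated canonical
    target. The pattern is inferred from the two example URLs quoted in
    this thread; the real URL scheme may differ."""
    return re.sub(
        r"location-voiture-\w+-([A-Z]{2})-evaluation-\d+\.html$",
        r"location-voiture-\1-evaluation.html",
        url,
    )

print(canonical_for_feedback(
    "http://www.locationdevoiture.fr/location-voiture-Luxembourg-LU-evaluation-4.html"
))
```

Each single-feedback page would then carry a `<link rel="canonical" href="...">` element pointing at the URL this helper returns.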
What we did offpage:
- We spent weeks analyzing each and every backlink to our site, and in the end we had to remove a huge number of these links.
All this doesn’t seem to be enough, though.
There are some other things we are working on right now, but we are running out of ideas:
- We want to gather all indexable pages and compare them to the pages that have had any organic traffic in the past 3 months, to identify more useless pages. We will then remove, deindex, or revise them.
- We are still doing content improvements on our existing pages
- The structure of our rendered site differs from the source structure: we placed the most SEO-relevant parts of the page at the top of the source code and then repositioned them on the rendered page via CSS. Do you think this might send a spam signal to Google?
- We try to improve our design. But as we need to do a bunch of A/B tests before we can relaunch our site, this will take some time.
- We will change our internal link structure to have fewer links on every page and a stronger thematic connection between them.
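The audit in the first bullet above (indexable pages vs. pages with organic traffic) boils down to a set difference; a hedged sketch with made-up URLs:

```python
def pages_without_traffic(indexable_pages, pages_with_organic_traffic):
    """Return indexable URLs that got no organic traffic at all in the
    reporting window -- candidates to remove, noindex, or revise."""
    return sorted(set(indexable_pages) - set(pages_with_organic_traffic))

# Illustrative data only; the real lists would come from a crawler
# export and a Google Analytics organic-landing-page report.
crawl = ["/location-voiture-paris.html", "/vieille-page.html"]
ga = ["/location-voiture-paris.html"]
print(pages_without_traffic(crawl, ga))  # ['/vieille-page.html']
```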
I’m looking forward to your tips and ideas!
Regards,
Sandra
-
Hi Sandra,
I apologize it has taken me this long to get back to you about this post.
It looks like you are in good hands and it is all settled.
Sincerely,
Thomas
-
Thank you for your help!
We are currently auditing our pages and have found thousands of pages that should be noindexed or improved. We will go right after that issue.
LG
Sandra
-
Hi Sandra,
Greetings from Austria! I see you are in Germany.
Just a few comments related to your post:
1. The organic visibility does indeed show some drops that overlap with both Panda and Penguin. However, you should also cross-reference the visibility data with your GA data to see what you rank for but get no traffic from, as this is a strong Panda signal. Organic visibility: http://screencast.com/t/0T0OdPBR As for the ranking distribution: it looks bad, but I've seen much worse, so it's not a lost cause. ( http://screencast.com/t/z1wfh0vEjNbK )
2. With Panda you need to identify the pages with issues. Improving some pages ("Made new, fresh, longer and much better texts for a few hundred pages") is fine, but it might just not do the trick. First you need to slice the site into chunks by page type (thin content, idle pages, low CTR with very high impressions, ratio of boilerplate to content, downstream traffic, etc.). Then improve, delete, or noindex those pages. After that, consolidate the rest as much as you can. New landing pages are good, as they can change the threshold for Panda...
3. Over-optimization is not a Panda issue. You can work on it if it makes sense to improve the site overall, and you might overlap with some Panda signals, but in my personal opinion it is not the main focus now.
4. Panda is "mainly" on-page. By mainly I don't mean links; I mean it's a bigger picture: it's about SERPs, it's about competitors. It's about being better than the other sites targeting the same terms, not about being perfect. Penguin, on the other hand, is about spam: on-page and off-page (unnatural links). If you do clean your link profile, make sure you clean it based on some good metrics. Also include the links you remove in the disavow file. The disavow file will only be processed when Google rolls out Penguin again. Until then, don't expect any changes (unless you also have a manual action; when that is processed, the disavow will also be processed).
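For reference, the disavow file mentioned above is a plain text file uploaded through Google Webmaster Tools, with one entry per line; a minimal sketch with hypothetical domains:

```text
# Links we asked webmasters to remove but could not get taken down
# (hypothetical entries for illustration)
domain:spam-directory.example
domain:link-farm.example
http://forum.example/thread/123#comment-9
```

`domain:` lines disavow every link from that host; bare URLs disavow a single page.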
I would say Panda is easy to fix in your case, as it doesn’t look like a "kill shot" anyway. Penguin, on the other hand, will probably be harder, but since we'll soon see another update, you should hurry and hope you catch this train; otherwise you might need to wait for months and months...
LG
***I bet some US folks would think that those (LG) are my initials
-
Hey all,
does anyone have further ideas or tips? Has anyone lifted a Panda penalty recently and can maybe share some experiences?
-
Hi Travis,
thank you for your quick reply.
We have already marked the ? session parameters in GWT.
As for the feedback pages:
http://www.locationdevoiture.fr/location-voiture-Luxembourg-LU-evaluation-4.html contains a canonical pointing to http://www.locationdevoiture.fr/location-voiture-LU-evaluation.html
-
I probably should have crawled the site this morning. I thought it would probably be a while before some problems became evident via crawler. I was incorrect.
There are at least a couple more problems that occur on-site.
First, session IDs in the URL. Yes, they're redirected, but to the lower-case version of the URL with the session ID intact. The session ID URLs are set to index, follow. There isn't a canonical link element in the instances I have seen. This happens to a number of the URLs I have crawled so far. Google doesn't appear to be indexing them, but that doesn't mean they aren't counting it against the site.
This is likely happening thousands of times.
Further, you can't be certain that the site is getting fully indexed regularly. So much crawl budget is being wasted crawling these duplicate content redirects to more duplicate content.
- http://www.locationdevoiture.fr/Camping-Car-nouvelle-zelande.html?PHPSESSID=1c7622jkmeoe0hqko8nv2lo584
- http://www.locationdevoiture.fr/camping-car-nouvelle-zelande.html?PHPSESSID=1c7622jkmeoe0hqko8nv2lo584
Recommendation: Immediately set noindex, follow on URLs with session IDs. Point a canonical tag at the rightful page. Check Google Analytics for traffic/referrers to pages with session IDs. Referrers with session IDs indicate pages you will want to 301 to the canonical because of backlinks (e.g. www.locationdevoiture.fr/camping-car-nouvelle-zelande.html). Anything that doesn't get traffic/referrals/backlinks: set it to a 410 status code.
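Computing the canonical target for those session-ID URLs is mechanical; a small sketch, assuming PHPSESSID is the only offending parameter (as in the example URLs above):

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def strip_session_id(url: str) -> str:
    """Return the canonical form of a URL with the PHPSESSID query
    parameter removed; any other query parameters are preserved."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != "PHPSESSID"]
    return urlunsplit(
        (parts.scheme, parts.netloc, parts.path, urlencode(query), parts.fragment)
    )

print(strip_session_id(
    "http://www.locationdevoiture.fr/camping-car-nouvelle-zelande.html"
    "?PHPSESSID=1c7622jkmeoe0hqko8nv2lo584"
))
```

The output URL is what the canonical link element (or the 301 target, for pages with backlinks) should point to.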
You're going to want something like DeepCrawl for a site of this size. Sure, you can do it with Screaming Frog, but you will need to customize its RAM usage. If you're not comfortable with that, you will probably save a lot of time using DeepCrawl.
You may find it advantageous to ask webmasters to correct the links. You can also tell Google via Google Webmaster Tools to ignore the query (?) parameter. But I'm not sure if that will affect inbound links to pages with session IDs, especially at this scale. So you would likely want to wait until a thorough page by page link audit/correction campaign has been done, just to be safe.
Second: Many of the URLs have been changed. It seems like "location voiture city" was fairly close to what you want - even though those are competitive queries. Now a lot of URLs are "louer city". In my opinion, losing voiture from the slug was probably not the best move. Even if one was relying upon voiture in the root domain, I still think having a keyword in the slug is not only okay; but quite necessary.
Still "louer voiture" appears to be even more competitive - with ~13.4 million results, while "location voiture" has ~2.6 million. Either way, you're still competing with Avis and all the others. It would make sense to populate the lesser set of results with higher query volume.
Third: It appears some of the reviews pages are mostly duplicate content. Have you considered an iframe for reviews on some pages? That would further reduce the amount of duplicate content on the page.
Examples:
- http://www.locationdevoiture.fr/location-voiture-LU-evaluation.html
- http://www.locationdevoiture.fr/location-voiture-Luxembourg-LU-evaluation-4.html
Do you actually need both pages anyway?
It also appears I'm having trouble getting the site to return a 404. It would be a good idea to let that happen. Give the user some search options on an error page that is properly blocked from search engines.
Finally, I don't think the loss of the links around April was the root cause. It may have contributed slightly, but your Panda hunch seems to be the culprit. So apologies for the misdirection. And if it's in the budget - consider videos for on-page with proper semantic markup and transcripts, where applicable.
However, I have attached a screenshot - just to show you the settings of Majestic SEO. I like using it for cursory examination - then I dive deeper with multiple backlink data sources.
I think my recommendations will help. But if anyone feels I'm incorrect - feel free to lash me with a wet noodle.
-
Hi,
thank you very much for your tips.
@Travis: This text link generator produces links for affiliates. These links are nofollow. Where did you get the information that we lost a lot of links? We can't see that in our tools.
@MoosaHemani: The problem isn't that we don't know what hit us. We were hit by Panda in May, and we were also hit by Penguin before that. We did several things to get out of those filters, but none of our ideas has been successful yet.
-
That's odd, since the Silver Tours GmbH site (billiger-mietwagen.de) seems to be doing better than ever, on Google.de. They even have a page that refers to automated text link generation... hmmm.
"Mit unserem Textlink-Generator erstellen Sie Textlinks, die Ihre Nutzer direkt zu unseren Mietwagen-Angeboten in einem bestimmten Land oder einer bestimmten Stadt führen." (Roughly: "With our text link generator you create text links that lead your users directly to our rental car offers in a specific country or city.")
Your site also has a site-wide link from this improbable powerhouse. What happened to cheaper-mietwagen.de? That would be an interesting thing to know. It seems like the parent company owns a lot of domains. The problem may be bigger than your site.
Whois information can also be used to judge the quality of a site, so I would take a look there as well. I just don't have the time this morning.
Also, you may want to consider what the site lost prior to the drop. Around March/April the site lost thousands of links. Granted, a lot of the links were comment/forum spam, but thousands of links are thousands of links. But it was probably for the best in the long run.
I really wish I could have given this more time - but now I'm running late. It would be nice if my German language skills were better. The same goes for French, I suppose.
-
Hello Sandra,
If I were in your place, I would go through the following process.
Step 1: Go through Google Analytics and check on which dates traffic dropped dramatically, then compare those dates with the algorithm updates to see exactly which update hit the website.
If it is Panda, most probably the problem is with the content of the website; in the case of Penguin, the problem is with the backlinks pointing back to the website.
In case of Penguin Update:
Collect all the backlinks from Google Webmaster Tools, Open Site Explorer, OpenLinkProfiler, Ahrefs.com, and Majestic SEO, then run them through Link Detox to see how many of them are bad links (to complete the list of bad links, you need to do a manual audit as well).
Once you have the bad links, remove as many of them as possible by manually reaching out to the site owners, and then update the disavow file accordingly.
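Merging the exports from the tools listed above into one audit list can be sketched as follows (export formats vary; this assumes each tool yields a plain list of linking URLs):

```python
from urllib.parse import urlsplit

def linking_domains(*exports):
    """Merge backlink URL lists exported from several tools into one
    de-duplicated, sorted list of linking domains for the manual audit."""
    domains = set()
    for export in exports:
        for url in export:
            host = urlsplit(url).netloc.lower()
            if host.startswith("www."):
                host = host[4:]  # treat www and non-www as the same site
            domains.add(host)
    return sorted(domains)

# Hypothetical exports from two tools; the same domain should appear once.
print(linking_domains(
    ["http://Spam-Dir.example/p1", "http://www.spam-dir.example/p2"],
    ["http://forum.example/t3"],
))
```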
In case of Panda Update:
If you have been hit hard by the Panda update, chances are you need to completely redevelop the content of your website, as Panda usually strikes when you have low-quality or thin content on the website.
What you ideally should be doing:
- Recheck which update you are facing at the moment
- In the case of Penguin, follow the process given above
- Build quality links on a continuous basis so that Google can see that you are working on quality stuff.
- In the case of a Panda update, redevelop the content of the website to make it user-friendly and unique
- Try to get as many social shares of your website content as possible.
- Try to plan and execute a strategy that gives Google a hint of your quality work.
If you have a manual penalty, you should probably work on lifting it first, and then work on aggressive promotion of your website, getting quality and relevant links to regain Google’s attention and rankings for the desired key phrases.
This might take a few months, but you will get your rankings and traffic back.
Hope this helps!