Ongoing problems with Panda
-
Hey all,
I’ve been following Moz Q&A for quite a few years. As we are now fighting some serious problems, I think it is time to write my first post.
Our French website http://bit.ly/1l0efCC has been struggling with Google’s Penguin and Panda algorithms for years now.
Most recently, we were hit by Panda in May this year.
Over the last weeks we made several changes, both onpage and offpage. Unfortunately none of this has shown any impact yet.
I would therefore be very grateful for any help you can provide. A recommendation for a good agency with extensive SEO experience in the French market would also be very helpful.
Our onpage problems and what we did so far:
- Overoptimization: During our analysis we realized that our site is strongly overoptimized. We therefore:
- Altered the titles and descriptions of all SEO pages to make them more unique and less stuffed with keywords
- Changed the URLs of our landing pages
- Changed our internal anchor texts
- Content quality: Some of our landing pages contain little unique content, and the text quality is low. To improve our content quality, we
- Wrote new, fresh, longer, and much better texts for a few hundred pages
- We set up additional landing pages for relevant keywords with unique content
- We moved our blog from a subdomain to a folder
- Irrelevant content: Our system creates many pages with irrelevant content. To reduce the number of irrelevant pages indexed by Google, we aggregated our customer feedback pages and set up canonicals on thousands of single-feedback pages that point to the aggregated pages.
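For illustration, the canonical mapping for the aggregated feedback pages could be derived like this. A minimal sketch; the URL pattern is an assumption based on the example URLs elsewhere in the thread, not the site's actual rule:

```python
import re

def aggregated_canonical(url: str) -> str:
    """Map a single-feedback URL to its aggregated feedback page,
    e.g. .../location-voiture-Luxembourg-LU-evaluation-4.html
      -> .../location-voiture-LU-evaluation.html
    (hypothetical pattern for illustration)."""
    return re.sub(
        r"location-voiture-[^-]+-([A-Z]{2})-evaluation-\d+\.html$",
        r"location-voiture-\1-evaluation.html",
        url,
    )
```

The emitted canonical link element on each single-feedback page would then point at the URL this function returns.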
What we did offpage:
- We invested weeks analyzing each and every backlink to our site, and in the end we had to remove a huge number of these links.
All this doesn’t seem to be enough though.
There are some other things we are working on right now, but we are running out of ideas:
- We want to gather all indexable pages and compare them to the pages that have had any organic traffic in the past 3 months, to identify more useless pages. We will then remove, deindex, or revise them.
- We are still doing content improvements on our existing pages
- The structure of our rendered site differs from the source-code structure. We placed the most SEO-relevant parts of the page at the top of the source code and then repositioned them on the rendered page via CSS. Do you think this might send a spam signal to Google?
- We try to improve our design. But as we need to do a bunch of A/B tests before we can relaunch our site, this will take some time.
- We will change our internal link structure so every page has fewer links and a stronger thematic connection between them.
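The page-audit idea in the first bullet above (all indexable pages vs. pages with recent organic traffic) boils down to a set difference. A minimal sketch, assuming both lists can be exported from a crawler and from Google Analytics:

```python
def pages_to_review(indexable: set[str], pages_with_traffic: set[str]) -> set[str]:
    """Indexable pages that received no organic traffic in the window:
    candidates to remove, deindex, or revise."""
    return indexable - pages_with_traffic
```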
I’m looking forward to your tips and ideas!
Regards,
Sandra
-
Hi Sandra,
I apologize it has taken me this long to get back to you about this post.
It looks like you are in good hands and it is all settled.
Sincerely,
Thomas
-
Thank you for your help!
We are currently auditing our pages and found thousands of pages that should be noindexed or improved. We will tackle that issue right away.
LG
Sandra
-
Hi Sandra,
Greetings from Austria! As I see, you are in Germany.
Just a few comments related to your post:
1. The organic visibility does indeed show some drops that overlap with both Panda and Penguin. However, you should also cross-reference the visibility with GA data to see what you rank for but get no traffic from, as this is a strong Panda signal. Organic visibility: http://screencast.com/t/0T0OdPBR As far as the ranking distribution goes, it looks bad, but I've seen much worse, so it's not a lost cause. ( http://screencast.com/t/z1wfh0vEjNbK )
2. With Panda you need to identify the pages with issues. Improving some pages ("new, fresh, longer and much better texts for a few hundred pages") is fine, but it might just not do the trick. First you need to slice the site down and split it into chunks by page type (thin content, idle pages, low CTR with very high impressions, ratio of boilerplate to content, downstream traffic, etc.). Then improve, delete, or noindex those. After that, consolidate the rest as much as you can. New landing pages are good, as they can change the threshold for Panda...
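The slicing step could be sketched roughly like this. The thresholds are illustrative assumptions, not recommended values:

```python
def classify_page(word_count: int, impressions: int, clicks: int) -> str:
    """Bucket a page by the kinds of metrics mentioned above (thin
    content, idle pages, low CTR with very high impressions).
    Thresholds are made up for illustration."""
    ctr = clicks / impressions if impressions else 0.0
    if word_count < 150:
        return "thin"
    if impressions == 0:
        return "idle"
    if impressions > 1000 and ctr < 0.01:
        return "low-ctr"
    return "keep"
```

Each bucket then gets its own treatment: improve, delete, or noindex.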
3. Over-optimization is not a Panda issue. You can work on it if it makes sense to improve the site overall, and it may overlap with some Panda signals, but my personal opinion is that it is not the main focus now.
4. Panda is "mainly" on page: by "mainly" I don't mean links; I mean it's a bigger picture, as it's about SERPs and about competitors. It's about being better than the other sites that are targeting the same terms - it's not about being perfect. Penguin, on the other hand, is about spam: on page and off page (unnatural links). If you do clean your link profile, make sure you clean it based on some good metrics. Also include the links you remove in the disavow file. The disavow file will only be processed when Google rolls out Penguin again. Until then, don't expect any changes (unless you also have a manual action; when that is processed, the disavow will also be processed).
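As a side note, the disavow file itself is just a plain-text list. A minimal sketch for generating it from a list of bad-link domains (the `domain:` prefix is Google's documented disavow syntax):

```python
def disavow_lines(bad_domains: list[str]) -> str:
    """Build the body of a disavow file from bad-link domains,
    deduplicated and sorted for readability."""
    return "\n".join(f"domain:{d}" for d in sorted(set(bad_domains)))
```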
I would say Panda is easy to fix in your case, as it doesn’t look like a "kill shot" anyway. Penguin, on the other hand, will probably be harder, but since we'll soon see another update, you should hurry and hope you catch this train; otherwise you might have to wait for months and months...
LG
***I bet some US folks would think that those (LG) are my initials
-
Hey all,
is there anyone of you who has further ideas or tips? Has any of you lifted a Panda penalty recently and can maybe share some experiences?
-
Hi Travis,
thank you for your quick reply.
We have already marked the session (?) parameters in GWT.
As for the feedback pages:
http://www.locationdevoiture.fr/location-voiture-Luxembourg-LU-evaluation-4.html contains a canonical pointing to http://www.locationdevoiture.fr/location-voiture-LU-evaluation.html
-
I probably should have crawled the site this morning. I thought it would be a while before some problems became evident via crawler. I was incorrect.
There are at least a couple more problems that occur on-site.
First, session IDs in the URL. Yes, they're redirected - but to the lower-case version of the URL, still carrying the session ID. The session-ID URLs are set to index,follow. There isn't a canonical link element in the instances I have seen. This happens to a number of the URLs I have crawled thus far. Google doesn't appear to be indexing them, but that doesn't mean they aren't counting them against the site.
This is likely happening thousands of times.
Further, you can't be certain that the site is getting fully indexed regularly. So much crawl budget is being wasted crawling these duplicate content redirects to more duplicate content.
- http://www.locationdevoiture.fr/Camping-Car-nouvelle-zelande.html?PHPSESSID=1c7622jkmeoe0hqko8nv2lo584
- http://www.locationdevoiture.fr/camping-car-nouvelle-zelande.html?PHPSESSID=1c7622jkmeoe0hqko8nv2lo584
Recommendation: Immediately set noindex,follow on URLs with session IDs. Point a canonical tag at the rightful page. Check Google Analytics for traffic/referrers to pages with session IDs. Referrers with session IDs indicate pages you will want to 301 to the canonical because of backlinks (e.g. www.locationdevoiture.fr/camping-car-nouvelle-zelande.html). Anything that doesn't get traffic/referrals/backlinks: set a 410 status code.
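Deriving the canonical target from a session-ID URL is mechanical. A minimal sketch, assuming PHPSESSID is the only session parameter and that lower-casing the path matches what the site's existing redirects already do:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonical_target(url: str) -> str:
    """Strip the PHPSESSID parameter and lower-case the path to get
    the URL the canonical tag (or 301) should point at."""
    parts = urlsplit(url)
    query = [(k, v) for k, v in parse_qsl(parts.query) if k != "PHPSESSID"]
    return urlunsplit((parts.scheme, parts.netloc, parts.path.lower(),
                       urlencode(query), parts.fragment))
```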
You're going to want something like Deep Crawl for a site of this size. Sure, you can do it with Screaming Frog - but you will need to customize its RAM usage. If you're not comfortable with that, you will probably save a lot of time using Deep Crawl.
You may find it advantageous to ask webmasters to correct the links. You can also tell Google via Google Webmaster Tools to ignore the query (?) parameter. But I'm not sure if that will affect inbound links to pages with session IDs, especially at this scale. So you would likely want to wait until a thorough page by page link audit/correction campaign has been done, just to be safe.
Second: Many of the URLs have been changed. It seems like "location voiture city" was fairly close to what you want - even though those are competitive queries. Now a lot of URLs are "louer city". In my opinion, losing voiture from the slug was probably not the best move. Even if one were relying upon voiture in the root domain, I still think having a keyword in the slug is not only okay but quite necessary.
Still "louer voiture" appears to be even more competitive - with ~13.4 million results, while "location voiture" has ~2.6 million. Either way, you're still competing with Avis and all the others. It would make sense to populate the lesser set of results with higher query volume.
Third: It appears some of the reviews pages are mostly duplicate content. Have you considered an iframe for reviews on some pages? That would further reduce the amount of duplicate content on the page.
Examples:
- http://www.locationdevoiture.fr/location-voiture-LU-evaluation.html
- http://www.locationdevoiture.fr/location-voiture-Luxembourg-LU-evaluation-4.html
Do you actually need both pages anyway?
It also appears I'm having trouble getting the site to return a 404. It would be a good idea to let that happen, and give the user some search options on an error page that is properly blocked from search engines.
Finally, I don't think the loss of the links around April was the root cause. It may have contributed slightly, but your Panda hunch seems to be the culprit. So apologies for the misdirection. And if it's in the budget - consider videos for on-page with proper semantic markup and transcripts, where applicable.
However, I have attached a screenshot - just to show you the settings of Majestic SEO. I like using it for cursory examination - then I dive deeper with multiple backlink data sources.
I think my recommendations will help. But if anyone feels I'm incorrect - feel free to lash me with a wet noodle.
-
Hi,
thank you very much for your tipps.
@Travis: This text link generator produces links for affiliates. These links are nofollow. Where did you get the information that we lost a lot of links? We can't see that in our tools.
@MoosaHemani: The problem isn't that we don't know what hit us. We were hit by Panda in May, and we were also hit by Penguin before. We did several things to get out of those filters, but none of our ideas has been successful yet.
-
That's odd, since the Silver Tours GmbH site (billiger-mietwagen.de) seems to be doing better than ever, on Google.de. They even have a page that refers to automated text link generation... hmmm.
"Mit unserem Textlink-Generator erstellen Sie Textlinks, die Ihre Nutzer direkt zu unseren Mietwagen-Angeboten in einem bestimmten Land oder einer bestimmten Stadt führen."
Your site also has a site-wide link from this improbable powerhouse. What happened to cheaper-mietwagen.de? That would be an interesting thing to know. The problem may be bigger than your site.
Whois information can also be used to judge the quality of a site. It looks like the parent company owns a large number of domains, so I would take a look there as well. I just don't have the time this morning.
Also, you may want to consider what the site lost prior to the drop. Around March/April the site lost thousands of links. Granted, a lot of them were comment/forum spam, but thousands of links are thousands of links. It was probably for the best in the long run, though.
I really wish I could have given this more time - but now I'm running late. It would be nice if my German language skills were better. The same goes for French, I suppose.
-
Hello Sandra,
If I were in your place, I would go through the following process.
Step 1: Go through Google Analytics and check on which dates the traffic dropped dramatically, then compare those dates with the algorithm updates to see exactly which update hit the website.
If it is Panda, the problem is most probably with the content of the website; in the case of Penguin, the problem is with the backlinks that are pointing back to the website.
In case of Penguin Update:
Collect all the backlinks from Google Webmaster Tools, Open Site Explorer, OpenLinkProfiler, Ahrefs.com, and Majestic SEO, then put them into Link Detox to see how many of them are bad links. (To complete the list of bad links, you need to do a manual audit as well.)
Once you have the bad links, remove as many of them as possible by manually reaching out to the site owners, then update the link disavow file accordingly.
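The merge step across the different tools' exports is a simple set union. A sketch, assuming each tool's export can be read into a plain list of URLs:

```python
def merge_backlink_exports(*exports: list[str]) -> list[str]:
    """Combine backlink exports (GWT, Open Site Explorer, Ahrefs,
    Majestic, ...) into one deduplicated, sorted list for the audit."""
    merged = set()
    for export in exports:
        merged.update(url.strip() for url in export)
    return sorted(merged)
```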
In case of Panda Update:
If you were hit mainly by the Panda update, chances are you need to completely redevelop the content of the website, as Panda usually strikes when you have low-quality or thin content on the website.
What you ideally should be doing:
- Recheck what update you are facing at the moment
- In case of Penguin follow the process given above
- Build quality links on a continuous basis so that Google can see that you are working on quality stuff.
- In case of Panda update, redevelop the content of the website to make it user friendly and unique
- Try to get as many social shares of your website content as possible.
- Try to plan and execute a strategy that gives Google a hint of your quality work.
If you have a manual penalty, you should probably work on lifting it in step 1, and later work on aggressive promotion of your website, getting quality and relevant links to regain Google’s attention and rankings for the desired key phrases.
This might take a few months, but you will get the rankings and traffic back.
Hope this helps!