Panda Update - Challenge!
-
I met with a new client last week. They were very negatively impacted by the Panda update. Initially I thought the reason was pretty straightforward and had to do with duplicate content. After my meeting with the developer, I'm stumped, and I'd appreciate any ideas.
Here are a few details to give you some background.
The site is a very nice-looking (Web 2.0) website with good content. Basically, they sell fonts. That's why I thought there could be some duplicate content issues. The developer assured me that the product detail pages are unique and that he has the rel=canonical tag properly in place.
I don't see any issues with the code, the content is good (not shallow), there's no advertising on the site, the XML sitemap is up to date, and Google Webmaster Tools indicates that the site is getting crawled with no issues.
The only thing I can come up with is that it is either:
Something off-page related to links or
Related to the font descriptions - maybe they are getting copied and pasted from other sites...and they don't look like unique content to Google.
If anyone has ideas or would like more info to help please send me a message.
I greatly appreciate any feedback.
Thank you, friends!
LHC
-
Mmm... yeah, it's hard to guess without looking at the site. From my own experience/research, these are some of the issues I've found in many of the sites affected by Panda:
• Intrusive advertising, excessive use of AdSense, sites created only for AdSense or solely to promote a product
• High amounts of duplicate content / scraped content
• Bad user interface / "ugly" design
• Usage data - low click-through rate, low time on site, 100% bounce rate
• Content analysis - content that isn't usable/readable/easily consumable
• Excessive internal linking to only one or two pages
And I don't mean to say your site is spammy, but in some cases, like news sites with advertising, articles go out with just a couple of paragraphs of content, so that single page becomes more advertising than content.
Consider posting your site; it would be nice to take a look.
And there is also one last possible reason: your site is innocent and just got hit by mistake. It happens.
-
Hi, Andrés-
They weren't running any AdSense - no advertising at all and the site isn't spammy.
-
Hi Lisa, besides what was already mentioned, one of the reasons many sites were affected is related to ads. If you have AdSense or any other kind of ads in an excessive way, where on some pages you have more ads than content, that could be a signal of low quality.
-
We're seeing massive changes in rankings now: not so many drops, but far fewer rises than before, plus a few drops. This seems to be happening more recently, not immediately after the update.
We've guesstimated that it is down to the update, despite the delay in effects.
We've come to the conclusion that a lot of the links (both existing links and ones we've been building since) are not carrying anywhere near as much weight as they once were, especially links that were "easier to come by", i.e. blog comments, articles, etc.
Because the sites those links were (and are) on have been hit themselves, it's logical to assume those links are now devalued. Lots of article sites and other "low value" sites were hit. Thankfully not all our links were from such sites, but some were, which I think explains the drops.
-
Hi Lisa, it will be a little tricky without actually looking at the website, but my starting point would be what you have done: duplicate content elimination. If rel=canonical is in place, I would also double-check that any duplicate URLs that should redirect do so with a 301 redirect. I would certainly take a look at the content to see if it is duplicated on third-party websites; some SEO firms just copy and paste segments of a website and add links. Another concern I would have is the status of the sites hosting your backlinks, i.e. a website which seemed appropriate at the time may itself have been hit by the Panda update and is now classed as spam. I would run a backlink checker and search for the content to see if it's duplicated. Let me know how you get on with this; it will be interesting to see what the culprit is.
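As a quick sanity check on the rel=canonical claim, here is a small Python sketch (standard library only; the markup and example.com URL are hypothetical) that extracts the canonical URL a page actually declares:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> seen in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        # Self-closing <link ... /> tags also route through here via
        # HTMLParser's default handle_startendtag.
        if tag == "link" and self.canonical is None:
            attr = dict(attrs)
            if (attr.get("rel") or "").lower() == "canonical":
                self.canonical = attr.get("href")

# Hypothetical product-detail markup; in practice you would fetch the live page.
page = (
    '<html><head>'
    '<link rel="canonical" href="https://www.example.com/fonts/avenir/" />'
    '</head><body>Font detail page</body></html>'
)

finder = CanonicalFinder()
finder.feed(page)
print(finder.canonical)  # -> https://www.example.com/fonts/avenir/
```

Comparing the extracted URL against the URL you actually crawled, across a sample of product pages, is a quick way to verify the developer's claim before ruling duplicate content out.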
Related Questions
-
Google Disavow File Format and Moz Spam Score Updates
Hi, is there a defined file format for the Google disavow file name? Does it have to be disavowlinks.txt, or can we name it something like domain-name-date.txt? Also, since Google does not share their data with Moz, how does Moz update its Spam Score after we disavow the bad links? Do we need to connect Google Search Console with Moz?
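For reference, a minimal sketch of the disavow file as Google documents it: a plain .txt file (the name itself doesn't matter, as far as I know) with one entry per line, either a full URL or a whole domain, plus optional comments:

```text
# Disavow file - any .txt name works, e.g. domain-name-2015-06-01.txt
# Lines starting with "#" are comments and are ignored.

# Disavow a single page:
http://spam.example.net/bad-links-page.html

# Disavow an entire domain:
domain:low-quality-directory.example.org
```

The URLs above are made up for illustration; the `domain:` prefix is the documented way to disavow every link from a site at once.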
Intermediate & Advanced SEO | Sunil-Gupta
-
During a major update, do ranking updates seem to be on pause?
Hello, I have read in the past that during a major update Google puts all its resources into the update, and it seems that they don't update search results in the meantime. Has anyone else noticed that? How long does it take for an update to be rolled out fully and for everything to get back to normal? Thank you,
Intermediate & Advanced SEO | seoanalytics
-
Insane loss of traffic and indexed pages after the June Core Update, what can I do to bring it back?
Hello everybody! After the June Core Update was released, we saw an insane drop in traffic/revenue and indexed pages in GSC (image attached below).

The biggest problem here: the pages that fell out of the index were shown as "Blocked by robots.txt", and when we run the "Fetch as Google" tool, it says "Crawl Anomaly". Even so, our robots.txt is completely clean (without any disallow or noindex rules), so I strongly believe that this error pattern is showing up because of the June Core Update.

I've come up with some solutions, but none of them seems to work:

1. Add hreflang on the domain: We have other sites in other countries, and ours seems to be the only one without this tag. The June update was primarily made to limit the SERPs to two results per domain (or more if Google thinks it's relevant). Maybe other sites have "taken our spot" in the SERPs; our domain is considerably newer than the ones in the other countries.

2. Manually index all the important pages that were lost: The idea was to renew the content on each page (title, meta description, paragraphs, and so on) and use the manual GSC index tool. But none of that seems to work either; all it says is "Crawl Anomaly".

3. Create a new domain: If nothing works, this should. We would look for a new domain name and treat it as a whole new site. (But frankly, there should be some other way out; this is for an EXTREME case and only if nobody can help us.)

I'm open to ideas, and as the days have gone by, our organic revenue and traffic don't seem to be coming back. I'm desperate for a solution.
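One thing worth double-checking, as a hedged aside: "Blocked by robots.txt" reports can also appear when the robots.txt file itself fails to serve (e.g. the server returns a 5xx for it), even when its rules are clean. For a hypothetical domain, a fully permissive file looks like this; if the live file matches but the report persists, check how the file is being served rather than what it says:

```text
# Fully permissive robots.txt: an empty Disallow blocks nothing
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```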
Intermediate & Advanced SEO | muriloacct
-
Mobile Site Panda 4.2 Penalty
We are an ecommerce company. We outsource our mobile site to a service, and it is served at m.ourdomain.com. We pass Google's mobile-friendly test. Our product page content on the mobile site is woefully thin (typically less than 100 words), and it appears that we got hit with Panda 4.2 on the mobile site. Starting at the end of July, our mobile rankings dropped, and our mobile traffic is now about half of what it was in July. We are working to correct the content issue, but that obviously takes time. So here's my question: if our mobile site got hit with Panda 4.2, could that have a negative effect on our desktop site?
Intermediate & Advanced SEO | AMHC
-
Noindexing Thin News Content for Panda
We've been suffering under a Panda penalty since October 2014. We've completely revamped the site, but with this new "slow rollout" nonsense it's incredibly hard to know at what point you have to accept that you haven't done enough yet.

We have thousands of news stories going back to 2001, some of which are probably thin and some of which are probably close to other news stories on the internet, being articles based on press releases. I'm considering noindexing everything older than a year just in case; however, that seems like overkill.

The question is: if I mine the log files and only deindex stuff that Google has sent no traffic to in over a year, could this be seen as trying to game the algorithm or similar? Also, if the articles are noindexed but still exist, is that enough to escape a Panda penalty, or does the page need to be physically gone?
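For what it's worth, a minimal sketch of noindexing a thin story without deleting it, via a robots meta tag in the page template:

```html
<!-- In the <head> of each thin article: keep the page live for readers,
     but ask search engines not to index it; links on it can still be followed -->
<meta name="robots" content="noindex, follow">
```

The same directive can also be sent as an HTTP response header (`X-Robots-Tag: noindex`) where editing the template is awkward, e.g. for an entire archive section at the server level.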
Intermediate & Advanced SEO | AlfredPennyworth
-
Is Google rolling out a huge update this week?
I am seeing some huge shifts in the SERPs at the moment for some keywords, such as "web design". Nearly every single result on the first page is a different company than two weeks ago. We are seeing some clients make huge jumps in rankings, but some are also dropping. It seems like a big Panda/Penguin-style update could be rolling out.
Intermediate & Advanced SEO | tempowebdesign
-
Recovery Steps For Panda 3.5 (Rel. Apr. 19, 2012)?
I'm asking people who have recovered from Panda to share what criteria they used, especially on sites that are not large-scale ecommerce sites.

The site is a blog hit by Panda 3.5, with approximately 250 posts. Some of the posts are the most thorough on their subject and regained traffic despite a Penguin mauling a few days after the Panda attack. (The site has probably regained 80% of the traffic it lost since Penguin hit, without any link removal or link building and with minimal new content.) Bounce rate is 80% and average time on page is 2:00 minutes. (Even my most productive pages tend to have very high bounce rates, BUT those pages maintain time on page in the 4- to 12-minute range.)

The Panda discussions I've read on these boards seem to focus on ecommerce sites with extremely thin content. I assume that Google views much of my content as "thin" too. But my site seems to need a pruning, instead of just combining the blue model, white model, red model, and white model all on one page like most of the ecommerce sites we've discussed.

So, I'm asking people who have recovered from Panda to share what criteria they used to decide whether to combine a page, prune a page, etc. After I combine any series of articles into one long post (driving time on page to nice levels), I plan to prune the remaining pages that have poor time on page and/or bounce rates. Regardless of the analytics, I plan to keep the "thin" pages that are essential for readers to understand the subject matter of the blog. (I'll work on fleshing out the content or producing videos for those pages.)

How deep should I prune on the first cut? 5%? 10%? Even more? Should I focus on the pages with the worst bounce rates, the worst time on page, or some of both? If I post unique and informative video content (hosted on-site using Wistia), what range of decrease in bounce rate should I expect?

Thanks for reading this long post.
Intermediate & Advanced SEO | JustDucky
-
Article Submissions - Still Worth it After Panda Update?
Are article submissions still relevant after the Panda update? Many of these article sites (e.g. EzineArticles) are still suffering from the Panda update.
Intermediate & Advanced SEO | qlkasdjfw