Panda Update - Challenge!
-
I met with a new client last week. They were very negatively impacted by the Panda update. Initially I thought the reason was pretty straightforward and had to do with duplicate content. After my meeting with the developer, I'm stumped, and I'd appreciate any ideas.
Here are a few details to give you some background.
The site is a very nice-looking (Web 2.0) website with good content. Basically they sell fonts. That's why I thought there could be some duplicate content issues. The developer assured me that the product detail pages are unique and that he has the rel=canonical tag properly in place.
I don't see any issues with the code, the content is good (not shallow), there's no advertising on the site, the XML sitemap is up to date, and Google Webmaster Tools indicates that the site is getting crawled with no issues.
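One way to verify the developer's claim yourself is to fetch a handful of product pages and confirm each emits exactly one rel=canonical tag pointing at itself. A minimal sketch using only the Python standard library (the markup and URLs below are hypothetical):

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of every <link rel="canonical"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonicals.append(a.get("href"))

def check_canonical(html, expected_url):
    """A page passes only if it has exactly one canonical tag pointing at expected_url."""
    finder = CanonicalFinder()
    finder.feed(html)
    return finder.canonicals == [expected_url]

# Hypothetical product-detail page:
page = '<html><head><link rel="canonical" href="https://example.com/fonts/garamond"></head><body></body></html>'
print(check_canonical(page, "https://example.com/fonts/garamond"))  # True
```

Run this against a sample of real product URLs; more than one canonical tag, or one pointing somewhere unexpected, is worth a closer look.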
The only thing I can come up with is that it is either:
Something off-page related to links or
Related to the font descriptions - maybe they are getting copied and pasted from other sites...and they don't look like unique content to Google.
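If the second theory is worth testing, a cheap first pass is to compare each font description against text found on other sites; near-identical strings are a red flag. A minimal sketch using only the standard library (the two descriptions are hypothetical):

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Ratio in [0, 1]; values near 1.0 suggest copied-and-pasted text."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

# Hypothetical: our product description vs. text found on another font site.
ours = "A classic grotesque sans-serif with tight spacing and wide apertures."
theirs = "A classic grotesque sans serif with tight spacing and wide apertures."
print(round(similarity(ours, theirs), 2))
```

Anything scoring above roughly 0.9 is close enough to look copied; the exact threshold is a judgment call.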
If anyone has ideas or would like more info to help please send me a message.
I greatly appreciate any feedback.
Thank you, friends!
LHC
-
Mmm... yeah, it's hard to guess without looking at the site. In my own experience/research, these are some of the issues I've found in many of the sites affected by Panda:
• Intrusive advertising, excessive use of AdSense, sites created only for AdSense or solely to promote a product
• High amounts of duplicate content / scraped content
• Bad user interface / "ugly" design
• Usage data: low click-through rate, low time on site, 100% bounce rate
• Content analysis: content that isn't usable/readable/easily consumable
• Excessive internal linking to only one or two pages
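To make the usage-data point concrete, bounce rate is simple to compute from per-session pageview counts (the session data below is hypothetical):

```python
def bounce_rate(sessions):
    """Bounce rate = share of sessions with exactly one pageview."""
    return sum(1 for pageviews in sessions if pageviews == 1) / len(sessions)

# Hypothetical pageview counts for four sessions:
print(bounce_rate([1, 1, 3, 1]))  # 0.75
```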
I don't mean to say your site is spammy, but in some cases - news sites with advertising, for example - articles go out with just a couple of paragraphs of content, so that single page becomes more advertising than content.
Consider posting your site; it would be nice to take a look. And there is one last possibility: your site is innocent and just got hit by mistake. It happens.
-
Hi, Andrés-
They weren't running any AdSense - no advertising at all and the site isn't spammy.
-
Hi Lisa, besides what was already mentioned, one reason many sites were affected relates to ads. If you have AdSense or any other kind of ads in an excessive way, where some pages have more ads than content, that could be a signal of low quality.
-
We're seeing massive changes in rankings now - not so much drops as far fewer rises, plus a few drops. This seems to be happening more recently, not immediately after the update.
We've guesstimated that it is down to the update, despite the delay in effects.
We've come to the conclusion that a lot of the links (both existing links and ones we've been building since) are not holding anywhere near as much weight as they once were. Especially links that were "easier to come by" i.e. blog comments, articles, etc...
Because the sites those links were (and are) on have been hit themselves, it's logical to assume the links are now devalued. Lots of article sites and other "low value" sites were hit. Thankfully not all our links were from such sites, but some were, which I think explains the drops.
-
Hi Lisa, it will be a little tricky without actually looking at the website, but my starting point would be what you have done: duplicate content elimination. If the rel=canonical is in place, I would double-check that any 301 redirects are set up correctly too. I would certainly take a look at the content to see if it is duplicated on third-party websites; some SEO firms just copy and paste segments of a website and add links. Another concern I would have is the status of the backlink hosts, e.g. a website which seemed appropriate at the time has also been hit by the Panda update and is now classed as spam. I would run a backlink checker and search for the content to see if it's duplicated. Let me know how you get on with this; it will be interesting to see what the culprit is.
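The "search for the content to see if it's duped" step can be semi-automated by turning each longer sentence into an exact-match (quoted) Google query you can click through by hand. A sketch, with a hypothetical product description:

```python
from urllib.parse import quote_plus

def exact_match_queries(text, min_words=8):
    """Turn each reasonably long sentence into a quoted Google query,
    to check by hand whether the text appears verbatim elsewhere."""
    sentences = [s.strip() for s in text.split(".") if len(s.split()) >= min_words]
    return ["https://www.google.com/search?q=" + quote_plus(f'"{s}"') for s in sentences]

# Hypothetical product description:
desc = "Designed in 1957, this grotesque sans-serif remains a staple of Swiss typography. Great font."
for url in exact_match_queries(desc):
    print(url)
```

Short fragments are skipped on purpose: quoted searches on very short strings match too many unrelated pages to be useful.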
Related Questions
-
Does the Moz website update its PA and DA algorithm?
I see tons of websites building redirect backlinks from Google to themselves, and they have gotten high rankings in Moz. Why doesn't Moz prevent these methods?
-
Chrome 79 Update and PDFs
I've been taking precautions for the Chrome 79 and 80 updates, which will be stricter about serving mixed content. Quick question: will this impact http:// PDFs on https:// pages?
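A quick way to audit pages for such references is to flag every http:// URL in href/src attributes. A minimal sketch (the sample markup is hypothetical, and whether Chrome actually blocks a given reference depends on how it is used - subresource vs. navigation/download):

```python
import re

def find_insecure_refs(html):
    """Every http:// URL referenced via href/src on a page; on an https://
    page these are candidates for mixed-content warnings or blocked downloads."""
    return re.findall(r'(?:href|src)="(http://[^"]+)"', html)

# Hypothetical page served over https://
page = '<a href="http://example.com/guide.pdf">Guide</a> <img src="https://example.com/logo.png">'
print(find_insecure_refs(page))  # ['http://example.com/guide.pdf']
```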
-
Ecommerce catalog update: 301 redirects?
Hello mozers,

We run an ecommerce store and are planning a massive catalog update this month. Essentially, 100% of our product listings will be deleted, and an all-new catalog will be uploaded. The new catalog contains mostly new products, but some products already exist in the old catalog as well. The new catalog has a bunch of improvements to the product pages, including optimized meta titles and descriptions, multiple languages, optimized URLs, and more.

My question is the following: when we delete the existing catalog, all indexed URLs will return 404 errors. Setting up 301 redirects from old to new products (for products which existed previously) is not feasible given the number of products. Also, many products are simply being removed entirely.

So should we go ahead and delete all products, upload the new catalog, update the sitemap, resubmit it for crawling, and live with a bunch of 404 errors until these URLs get dropped from Google? The alternative I see is setting 301 redirects to the home page, but I am not sure this would be a correct use of 301 redirects.

Thanks for your input.
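For what it's worth, the redirect work may be smaller than it looks if redirects are generated only for products that survive the swap, keyed by SKU. A hedged sketch (the SKUs and URL patterns are hypothetical):

```python
def build_redirect_map(old_urls, new_urls):
    """301-redirect only the products that survive the catalog swap, keyed by SKU;
    products dropped entirely are left to return 404 (or 410)."""
    return {old: new_urls[sku]
            for sku, old in old_urls.items()
            if sku in new_urls and new_urls[sku] != old}

# Hypothetical SKU-to-URL maps:
old = {"A1": "/products/widget-a1", "B2": "/products/widget-b2"}
new = {"A1": "/en/widgets/a1"}  # B2 was dropped from the catalog
print(build_redirect_map(old, new))  # {'/products/widget-a1': '/en/widgets/a1'}
```

The resulting map could be exported as server rewrite rules, so the "too many products" objection only applies to the overlap, not the whole old catalog.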
-
Noindexing Thin News Content for Panda
We've been suffering under a Panda penalty since Oct 2014. We've completely revamped the site, but with this new "slow roll out" nonsense it's incredibly hard to know at what point you have to accept that you haven't done enough yet.

We have thousands of news stories going back to 2001, some of which are probably thin and some of which are probably close to other news stories on the internet, being articles based on press releases. I'm considering noindexing everything older than a year just in case; however, that seems like overkill.

The question is: if I mine the log files and only deindex stuff that Google has sent no traffic to in the past year, could this be seen as trying to game the algorithm or similar? Also, if the articles are noindexed but still exist, is that enough to escape a Panda penalty, or does the page need to be physically gone?
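Mining the log files as described could look something like this - a minimal sketch assuming combined-format access logs (the sample line and URL are hypothetical):

```python
import re
from collections import Counter

def google_entry_counts(log_lines):
    """Tally Google-referred visits per URL from combined-format access logs."""
    counts = Counter()
    for line in log_lines:
        m = re.search(r'"GET (\S+) [^"]*" \d+ \d+ "([^"]*)"', line)
        if m:
            url, referrer = m.groups()
            counts[url] += "google." in referrer
    return counts

def no_google_traffic(all_urls, counts):
    """Articles Google never sends anyone to - candidates for noindexing."""
    return [u for u in all_urls if counts.get(u, 0) == 0]

# Hypothetical log line (combined format):
sample = ['203.0.113.9 - - [02/Feb/2015:10:00:00 +0000] "GET /news/2003/old-story HTTP/1.1" 200 5120 "-" "Mozilla/5.0"']
print(no_google_traffic(["/news/2003/old-story"], google_entry_counts(sample)))  # ['/news/2003/old-story']
```

A referrer-based tally like this undercounts (HTTPS searches often send no referrer), so treat the result as a shortlist to review, not an automatic noindex list.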
-
Blog content and panda?
If we want to create a blog to keep in front of our customers (via email and posting on social) and the posts will be around 300-1000 words, like this site http://www.solopress.com/blog/, are we going to be asking for a Panda slap? The issue would be very few shares and little traction after the first day or two. Also, would Panda only affect the blog posts that are poor if we mix in a couple of really good posts, or would it affect those as well, and possibly even the site? Any help would be appreciated.
-
Product Pages & Panda 4.0
Greetings MOZ Community:

I operate a real estate website in New York City (www.nyc-officespace-leader.com). Of the 600 pages, about 350 of the URLs are product pages written about specific listings. The content on these pages is quite short, sometimes only 20 words. My ranking has dropped very much since mid-May, around the time of the new Panda update. I suspect it has something to do with the very short product pages, the 350 listing pages. What is the best way to deal with these pages so as to recover ranking? I am considering these options:

1. Setting them to "noindex". But I am concerned that removing product pages sends the wrong message to Google.
2. Enhancing the content and making certain that each page has at least 150-200 words. Rewriting 350 listings would be a real project, but if necessary to recover I will bite the bullet.

What is the best way to address this issue? I am very surprised that Google does not understand that product URLs can be very brief and yet have useful content. Information about a potential office rental that lists location, size, and price per square foot is valuable to the visitor but can be very brief, especially for listings that change frequently. So I am surprised by the penalty.

Would I be better off not having separate URLs for the listings and, for instance, adding them as posts within building pages? Is having separate URLs for product pages with minimal content a bad idea from an SEO perspective? Does anyone have any suggestions as to how I can recover from this latest Panda penalty?

Thanks, Alan
-
Panda/Penguin & more than one services site in niche
Hello, My friend has a personal development training site. I have been advised not to make separate personal coaching sites for the owners of the training sites. In your experience, could Panda/Penguin penalize separate sites in a similar niche? Do you need any more info to give a good response? Thank you.
-
How to determine the correct number of ad units post-Panda
What guidelines are you using to determine the correct number of ad units? Also, is it the number of units per page that matters, or the visual size of the ads? Any additional guidance or links you can point me to regarding ads in a post-Panda world would be helpful.
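For the units-versus-size question, one rough way to reason about it is the share of the above-the-fold viewport covered by ads, rather than the raw count of units. A sketch (the viewport and ad dimensions are hypothetical, and the "top heavy" page-layout concern is related to, but distinct from, Panda):

```python
def ad_area_ratio(ad_boxes, viewport=(1280, 800)):
    """Fraction of a hypothetical above-the-fold viewport covered by ad units;
    the concern is visual area, not just the count of units."""
    return sum(w * h for w, h in ad_boxes) / (viewport[0] * viewport[1])

# Hypothetical: one leaderboard (728x90) and one medium rectangle (300x250).
print(round(ad_area_ratio([(728, 90), (300, 250)]), 3))  # 0.137
```

By this measure, two large units can easily cover more of the fold than four small ones, which is why counting units alone can mislead.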