Panda and Large Web Presence
-
I'm experiencing some recent significant drops in rankings across the board for a client of mine and I suspect that it's probably related to Panda. Their internet presence features completely unique, useful, well written content by certified industry experts. Further, all content is of proper length and again serves a core purpose, providing helpful information to their viewers. Where I think things potentially go wrong is that they have around 20 micro sites in operation, including multiple web 2.0 blogs. There are also multiple sites in operation that target more specific areas of the same city. Again all of the content is unique, but they all feature content that's of the same industry and broad topic.
Despite everything being 100% unique, I fear it's too excessive. Anyone know if Panda may target this type of approach even if the quality and uniqueness is appropriate?
-
Panda updates have hit microsites where content across the sites was either duplicated or "thin", although thin is often in the eye of the beholder. Keep in mind, and I mean this kindly, that "unique" is not always high-quality, and the quest for technical uniqueness can lead to practices where microsites are just spinning out versions of content with slightly different keyword concepts or ordering, etc. In other words, it's technically "unique", but most people wouldn't view it as valuable.
Early Panda updates did hit certain kinds of spun-off content hard, including geo-located content. In other words, if you spun out your plumbing services page for 5,000 cities and it only differed by city names and a few basic facts (even if technically unique), that's definitely something Panda came down hard on.
Truthfully, though, it's really tough to tell without specifics. I'm more on EGOL's side of the fence - my gut feeling is that 20 micro-sites is excessive and I'd strongly suspect quality issues.
Some questions that might help you pin things down:
(1) Has traffic dropped across the entire cluster of sites or just the main site?
(2) Can you pin traffic drops down to any given date, set of keywords, or pages? Drill down as far as you can - that's always the most important first step, IMO.
(3) Are some of your micro-sites essentially dead - no traffic or ROI? You might not have to go all-or-none here. Odds are that some small % of your micro-sites are creating a large % of your value (let's call it an 80/20 rule). It's likely you could kill 10-15 of them with very little harm - at least that's what I typically see. You don't have to drop all 20 cold-turkey.
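To put rough numbers on question (3), a quick sketch like the following ranks the sites by their share of total traffic and shows the cumulative contribution, which makes the 80/20 split obvious. The site names and session counts here are purely illustrative; you'd substitute real Analytics exports:

```python
def traffic_shares(sessions):
    """Return (site, share, cumulative_share) tuples, largest first."""
    total = sum(sessions.values())
    out, cum = [], 0.0
    for site, count in sorted(sessions.items(), key=lambda kv: kv[1], reverse=True):
        cum += count / total
        out.append((site, count / total, cum))
    return out

# Hypothetical numbers -- replace with real per-site session counts.
sessions = {"main-site.com": 42000, "micro-a.com": 3100, "micro-b.com": 900,
            "micro-c.com": 350, "micro-d.com": 120}
for site, share, cum in traffic_shares(sessions):
    print(f"{site:<16}{share:6.1%}  cumulative {cum:6.1%}")
```

Any site sitting in the long tail after the cumulative share passes ~95% is a candidate for retirement.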
-
Where I think things potentially go wrong is that they have around 20 micro sites in operation...
Did they build all of these outhouses because they thought they would be a source of "links"?
The first thing that I would do is to be sure that the content that is in use on their site today, right now, is unique content that originated with the company. If that is not the case, then it is time to throw things overboard or noindex the items that are not original and unique. If everything is original and unique then I would get into an "improvement & consolidation" mode, pulling good content out of the outhouses, improving it to the point of being Great Content, and posting it on the main site.
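As a rough first pass on "is this content actually unique", a similarity ratio between page texts can flag pages that are technically unique but substantially the same copy. A minimal sketch (the sample snippets are invented; real pages would need fetching and markup stripping first):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough 0..1 similarity between two blocks of page text, word by word."""
    return SequenceMatcher(None, a.lower().split(), b.lower().split()).ratio()

# Invented examples of "unique" microsite copy that says the same thing.
page_a = "Our certified plumbers serve Springfield with fast emergency repairs"
page_b = "Our certified plumbers serve Shelbyville with fast emergency repairs"
page_c = "How to winterize outdoor faucets before the first hard freeze"

print(similarity(page_a, page_b))  # near-duplicate despite being "unique"
print(similarity(page_a, page_c))  # genuinely different topic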
Keep in mind that problems related to Panda, Penguin, or other algos occur when you are crossways with one or more Google Principles. These can be really hard to diagnose and require a full site audit taking many hours, done by someone who really knows their stuff. What you will get here with a generalized question is not much more than kibitzin'.
-
Hi Jay,
Do you have any dates that you can refer to in Analytics that show a drop that might coincide with a penalty / algorithm update?
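If you can export daily organic sessions, a quick way to flag the drop date and compare it against published update dates is something like the sketch below. The sessions series is invented, and the update dates shown are the commonly cited ones; verify them independently before drawing conclusions:

```python
from datetime import date, timedelta

def find_drop(daily, start, threshold=0.7, window=7):
    """Return the first date whose sessions fall below threshold * the
    trailing `window`-day average, or None if there is no such drop."""
    for i in range(window, len(daily)):
        baseline = sum(daily[i - window:i]) / window
        if daily[i] < threshold * baseline:
            return start + timedelta(days=i)
    return None

# Invented daily organic sessions; replace with an Analytics export.
daily = [510, 495, 502, 515, 498, 505, 490, 500, 310, 295, 301, 288]
drop = find_drop(daily, start=date(2014, 9, 15))

# Approximate, commonly cited update dates -- double-check before relying on them.
updates = {date(2014, 5, 20): "Panda 4.0", date(2014, 9, 23): "Panda 4.1"}
print(drop, "->", updates.get(drop, "no known update on that day"))
```

A drop that lines up with a known update date is strong (though not conclusive) evidence of an algorithmic cause rather than a technical one.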
-Andy
-
Thank you everyone. I agree as well that it isn't the right approach. Moving forward though it would be extremely beneficial to pinpoint the exact cause of this recent decrease in ranking. It's peculiar to witness strong and reliable gains prior to a significant drop across the board on the heels of this update.
Let's say someone is creating multiple pages that target minor variations of the same keyword, using unique but essentially re-written content for all of them. If this were all hosted on the same site, it would be a clear Panda target.
"Does the site have duplicate, overlapping, or redundant articles on the same or similar topics with slightly different keyword variations?" - Amit
It would not be duplicate content but could be seen as redundant articles on similar topics.
However, if re-written content that's similar in scope is spread across multiple domains as opposed to being hosted on the same site, would it not fall into the same Panda category?
-
I agree with Andy, your description of the setup sounds pretty excessive. Plus, just because content is unique and professionally written doesn't mean that it's high quality. If the sites all say the same thing but in different ways, then none of them are contributing anything meaningful. And your branding is diffused across a zillion different sites to boot.
-
Hi Jay,
Anyone know if Panda may target this type of approach even if the quality and uniqueness is appropriate?
No, this doesn't sound to me like Panda at all.
You mention they have microsites and blogs in operation - presumably this has been done to try and rank for additional phrases? I can't see many other reasons why this would be done.
My opinion here is to pull both the microsites and blogs back in and just create a blog on their own site (if they don't already have one). I wouldn't bother 301ing any of the external sites / posts back to the main site, even for content they might want to re-publish there. You need to be advising them to start from scratch and ditch the chaff. If these external sites have all had a part to play in their current problems, then I would just distance them from those sites altogether.
...they all feature content that's of the same industry and broad topic
When looking at their own site, you need to also be advising them not to create blog posts for the sake of it. Rather than creating 4-5 articles a week, tell them to create just one or two really high-quality (and longer) articles weekly.
I hope this helps.
-Andy
-
Hi Jay,
It's a difficult question to answer; however, I can point you in a direction. John Mueller of Google Switzerland hosts a hangout on Fridays at his Google+ page below. You can pose the question to him there; at times, if he can't get to an answer right away, he will come back to you. Hope this helps.
https://plus.google.com/+JohnMueller/posts
https://sites.google.com/site/webmasterhelpforum/en/office-hours