ECommerce site being "filtered" by last Panda update, ideas and discussion
-
Hello, fellow internet-goers!
Just as a disclaimer, I have been following a number of discussions, articles, posts, etc. trying to find a solution to this problem, but have yet to get anything conclusive. So I am reaching out to the community for help.
Before I get into the questions I would like to provide some background:
I help a team manage and improve a number of med-large eCommerce websites. Traffic ranges anywhere from 2K to 12K+ visits per day, depending on the site. Back in March, one of our larger sites was "filtered" from Google's search results. I say "filtered" because we didn't receive any warnings and our domain was/is still listed in the first search position. About 2-3 weeks later another site was "filtered", and then 1-2 weeks after that, a third site.
We have around ten niche sites in total; about seven of them share a nearly identical code base (roughly an 80% match). This isn't that uncommon, since we use a CMS platform to manage all of our sites that holds hundreds of thousands of category and product pages. Needless to say, April was definitely a frantic month for us. Many meetings later, we attributed the "filter" to duplicate content stemming from our product database and written content (shared across all of our sites). We decided we would use rel="canonical" to address the problem. Exactly 30 days after being filtered, our first site bounced back (like it was never "filtered"); however, the other two sites remain "under the thumb" of Google.
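For anyone trying to put a number on how much content two sites actually share (like the "80% match" above), one common approach is to compare pages as sets of overlapping word windows ("shingles") and take the Jaccard similarity. This is a minimal sketch of that idea, not how Google measures it, and the sample page text is invented for illustration:

```python
import re

def shingles(text, k=5):
    """Return the set of k-word shingles (overlapping word windows) in text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def overlap_percent(text_a, text_b, k=5):
    """Jaccard similarity of the two pages' shingle sets, as a percentage."""
    a, b = shingles(text_a, k), shingles(text_b, k)
    if not (a or b):
        return 0.0
    return 100.0 * len(a & b) / len(a | b)

# Hypothetical near-duplicate product descriptions from two of the sites:
page_a = "This widget ships free and comes in ten colors for every niche."
page_b = "This widget ships free and comes in ten colors for every market."
print(round(overlap_percent(page_a, page_b), 1))
```

Running something like this across matched category/product pages on two domains gives a rough, defensible duplicate-content percentage you can track as you rewrite pages.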
Now for some questions:
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Is it a coincidence that it was an exact 30 day "filter"?
Why has only one site recovered?
-
Thanks for your responses.
@EGOL - I would agree that merging the sites would be ideal given that they share such a large database. Unfortunately, this isn't an option for our company at this point in time. Acquiring new content for our product pages has been tossed around, but it would be a HUGE undertaking, so it's on the "back burner" for the moment.
@Ben Fox - We came to the conclusion that it was content because it was the only clear "offender" on the list of potential problems. However, the fact that only 3 of our sites got penalized perplexes me as well. It would have made more sense had all of our sites suffered a penalty (luckily only 3 did). One response I got from another forum was: since Google filtered enough of the duplicate content (3 sites in our case), it deemed that the remaining sites were "original".
We didn't point canonicals to any one site (like 9 going to 1). We only added the rel=canonical to our manufacturer category pages (a small percentage of pages). Since some of our domains sell products that aren't "niche specific" we told these pages to send preference to their proper niche domain (hope that made sense).
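Since the fix hinged on which pages carry a rel="canonical" and where it points, it can help to audit that programmatically rather than spot-checking by hand. This is a minimal sketch using Python's stdlib html.parser; the domain and path in the sample page are hypothetical:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of every <link rel="canonical"> in a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel", "").lower() == "canonical" and "href" in a:
            self.canonicals.append(a["href"])

def canonical_of(html):
    """Return the first canonical URL in the page, or None if there isn't one."""
    finder = CanonicalFinder()
    finder.feed(finder.unescape(html) if False else html)  # feed raw HTML
    return finder.canonicals[0] if finder.canonicals else None

# Hypothetical category page on a general domain that defers to the
# same category on the niche-specific domain:
page = '<html><head><link rel="canonical" href="https://niche-site.example/widgets/"></head></html>'
print(canonical_of(page))
```

Fetching each manufacturer category page across the ten domains and comparing `canonical_of(page)` against the intended niche domain would quickly surface any pages where the preference signal is missing or points the wrong way.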
For discussion purposes, here is a response I got from another forum:
Why has only one site recovered?
I suspect/assume the other sites will bounce back the same way after their own 30-day penalties expire.

Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Maybe removing the first site allowed the scoring penalty applied to the other sites to shrink in size. As each site was removed, the penalty applied to the others correspondingly shrunk.

Is it a coincidence that it was an exact 30 day "filter"?
No. 30 days is a common penalty length.

Does anyone agree with these? I've heard of the 30-day penalty before. If this is the case, then a warning from Google would be nice.
-
Why would only 3 of our sites be affected by this "filter"/Panda if many of them share the same content?
Google can be slow to detect duplicate content and sometimes tolerates it.
Is it a coincidence that it was an exact 30 day "filter"?
Only Google knows.
Why has only one site recovered?
Only Google knows.
Google sees a lot of sites with the same content, and you say that these are "med-large" sites. If I were Google, I would say, "These are duplicate content; we aren't going to index all of them. Our searchers don't want to see ten sites with the same stuff."
If these were my sites I would merge all of them into one single site. If the content on that site was unique to me I would probably then put all of my efforts into promotion and informative content for the product lines.
If the content was on other sites that I don't own then my efforts would go mainly into making unique product content and informative content for the product lines.
Google has been squashing duplicate content for years. If you have it, and you place links between the sites, it is very likely that at least one of your sites will be demoted in Google or filtered - probably filtered. They don't want to spend their resources indexing ten duplicate sites. They would rather display unique sites to their searchers.
-
How did you decide that it was content causing the issue if only 3/10 of your sites were affected?
Also, when you added the rel=canonical, did 9 of your sites point to a primary site, and was this the site that recovered?