Panda Update: Need your expertise...
-
Hi all, after the Panda update our website lost about 45% of its traffic from Google. It wasn't an instant drop; it happened gradually over the last 5 months. Our keywords (all of them except the domain name) started to lose positions from the top 10 to 40+, and none of the recovery attempts we have made so far have really helped. At this point it would be great to get some advice from the top experts here.
What we have done so far: we have gone through all the pages and removed the duplicate/redundant ones, we have refreshed the content on the main pages, and all pages now have canonical tags. Our website is www.PrintCountry.com.
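For anyone doing a similar cleanup, one repeatable way to find duplicate/redundant pages is to normalize each page's visible text and hash it, so trivially different copies compare equal. This is a rough sketch; the URLs, helper names, and sample markup below are invented for illustration, not taken from the site:

```python
import hashlib
import re

def content_fingerprint(html_text: str) -> str:
    """Strip tags crudely, lowercase, collapse whitespace, then hash,
    so trivially different copies of the same page compare equal."""
    text = re.sub(r"<[^>]+>", " ", html_text)
    text = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def find_duplicates(pages: dict) -> dict:
    """Group URLs whose normalized text is identical."""
    groups = {}
    for url, html_text in pages.items():
        groups.setdefault(content_fingerprint(html_text), []).append(url)
    return {h: urls for h, urls in groups.items() if len(urls) > 1}

# Hypothetical pages: the first two differ only in case/whitespace.
pages = {
    "/cartridge-a": "<p>Brother Black Ribbon  Cartridge.</p>",
    "/cartridge-a-copy": "<p>Brother black ribbon cartridge.</p>",
    "/cartridge-b": "<p>Canon CL-41 Color Ink Cartridge.</p>",
}
dupes = find_duplicates(pages)
```

Exact-hash matching only catches literal copies; near-duplicates need a similarity measure, but this is a cheap first pass over a large catalog.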
Thank you very much in advance for your time.
-
Any other advice from the experts?
-
Yeah... beyond what I've already said, I don't really know what to do. A sort of similar question is here, and Thomas makes some good points. Adding some of the new schema.org markup could help (they have it for products, reviews, blogs, and more).
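To make the schema.org suggestion concrete: product markup can be emitted as a JSON-LD script block (microdata was the more common form when this thread was written, but the vocabulary is the same). The product fields below are invented for illustration, not actual PrintCountry catalog data:

```python
import json

# Illustrative schema.org Product data; name, sku, and price are made up.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Brother 1230 Black Ribbon Cartridge",
    "sku": "1230",
    "offers": {
        "@type": "Offer",
        "price": "9.99",
        "priceCurrency": "USD",
        "availability": "https://schema.org/InStock",
    },
}

# Wrap the JSON in the script tag that would go in the page <head> or body.
snippet = '<script type="application/ld+json">\n{}\n</script>'.format(
    json.dumps(product, indent=2)
)
```

schema.org also defines types for reviews and blog posts, which is why marking up the testimonials and blog mentioned later in this thread is a natural fit.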
-
Thanks again John. There are really good points here.
What would you suggest for the duplicate content? We originally created those pages, and our affiliates mostly took those descriptions and even the titles. It's a little bit out of our control who uses our content, but don't you think Google should know who the owner of the content is? Any advice on how to cope with this?
Another point is that it's very hard to create different content for all the products we have. For example, some of them are the same cartridges in different colors, so the descriptions and titles are very similar most of the time. When we look at the competitors, they have the same content issue for their products, but their rankings are still good.
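For the color-variant problem, one approach is to template descriptions so each variant varies in more than just the color word: page yield, use cases, and color-specific phrasing. A minimal sketch; the model name, field names, and copy below are all hypothetical:

```python
# Hypothetical variant data; model and field names are illustrative.
CARTRIDGE = {
    "model": "Brother LC51",
    "variants": [
        {"color": "black", "page_yield": 500},
        {"color": "cyan", "page_yield": 400},
        {"color": "magenta", "page_yield": 400},
    ],
}

def describe(model: str, variant: dict) -> str:
    """Vary more than the color word: mention yield and color-specific
    use cases so the variant pages are not near-identical."""
    color_notes = {
        "black": "suited to everyday text documents and invoices",
        "cyan": "used for blues and greens in photo printing",
        "magenta": "used for reds and skin tones in photo printing",
    }
    return (
        f"{model} {variant['color']} cartridge, rated for roughly "
        f"{variant['page_yield']} pages and {color_notes[variant['color']]}."
    )

descriptions = [describe(CARTRIDGE["model"], v) for v in CARTRIDGE["variants"]]
```

Templating alone won't make the pages unique if every site uses the same template, but it at least breaks the within-site near-duplication across color variants.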
In terms of content, we actually have a blog (http://goo.gl/i0uXS), an article site (http://goo.gl/lGtNl), testimonials, reviews, etc. There are some great resources here, and interestingly none of them were affected by the Panda update. It was just the main domain.
Thanks
-
I looked a bit more. If I had to guess why you're being penalized, it's because your site doesn't have much in the way of original content. For example, go to a product page, like this one for a Brother ribbon cartridge. The description reads "Brother Black Ribbon Cartridge. This brand new Brother 1230 black ribbon cartridge is manufactured by Original Equipment Manufacturer (OEM). 100% Manufacturer Guaranteed." If you stick that in Google, you'll see that you, along with every other printer site, use basically the same description, like printert.com, 123cheapink.com, inkcircus.com, etc.
You need to provide original content on your site that doesn't appear as duplicate content to Google. The blog is a good start. Having customer reviews, testimonials, and articles custom to your site are also good.
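One way to sanity-check whether a rewritten description is actually different from the shared boilerplate, rather than a lightly shuffled copy, is a quick word-shingle comparison. A rough sketch; the "rewritten" text below is an invented example, not a recommendation for the actual copy:

```python
def shingles(text: str, n: int = 3) -> set:
    """All n-word sequences in the text, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str) -> float:
    """Overlap of word shingles: 1.0 = identical, near 0 = unrelated."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

boilerplate = ("Brother Black Ribbon Cartridge. This brand new Brother 1230 "
               "black ribbon cartridge is manufactured by Original Equipment "
               "Manufacturer (OEM). 100% Manufacturer Guaranteed.")
# Hypothetical rewrite, for illustration only.
rewritten = ("Our Brother 1230 ribbon prints crisp black text on dot-matrix "
             "forms; we test every batch in-house and back it with our own "
             "one-year guarantee.")

score_same = jaccard(boilerplate, boilerplate)
score_new = jaccard(boilerplate, rewritten)
```

A high score against the manufacturer boilerplate means the page will still look like everyone else's to a search engine, no matter how many words were swapped.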
You might try taking a look at your competitors that are doing well in searches, and see what they're doing in terms of original content on their sites.
-
Thank you John for the reply.
Like you said, what is good for the user should be good for Google too, right? The reason there are three ways to navigate the website is mostly our visitors' behavior: some visitors like to use the selector on the right side, while others like to use the top navigation links.
For the h2 tags, I see your point. It's something relatively easy to fix, but do you think it could be the major reason for our penalty?
-
Usually, what's good for the user is good for Google. I'm a little confused by how I'm supposed to navigate your site, though. There are three ways on the page to navigate: left, top, and right, with a lot of redundancy between them. You could probably consolidate these into one form of navigation.
Also, the links to categories within the expanded top navigation are wrapped in <h2>s. For example, if you expand Ink & Toner, then Brother, Canon, Dell, etc. are all in <h2>s. This results in a lot of <h2> tags on your page. I'd remove those tags since those elements aren't headings. It looks spammy to me.
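A quick way to audit this kind of thing across a site is to parse each page and count heading tags inside the navigation container. A minimal sketch using the standard-library parser; the markup below is illustrative (the actual site's container element and classes will differ):

```python
from html.parser import HTMLParser

class NavHeadingCounter(HTMLParser):
    """Count <h2> tags that appear inside <nav> containers."""
    def __init__(self):
        super().__init__()
        self.in_nav = 0          # nesting depth of <nav> elements
        self.nav_h2_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "nav":
            self.in_nav += 1
        elif tag == "h2" and self.in_nav:
            self.nav_h2_count += 1

    def handle_endtag(self, tag):
        if tag == "nav" and self.in_nav:
            self.in_nav -= 1

# Illustrative page fragment: two category links wrongly marked up as
# headings inside the navigation, one legitimate heading outside it.
html = """
<nav>
  <h2><a href="/brother">Brother</a></h2>
  <h2><a href="/canon">Canon</a></h2>
</nav>
<h2>Customer Reviews</h2>
"""
counter = NavHeadingCounter()
counter.feed(html)
```

The fix itself is just swapping the nav <h2>s for list items or spans; headings should describe page sections, not wrap every menu link.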
Related Questions
-
Need only tens of pages to be indexed out of hundreds: Robots.txt is Okay for Google to proceed with?
Hi all, we have 2 subdomains with hundreds of pages, of which only 50 important pages need to be indexed. Unfortunately, the CMS for these subdomains is very old and does not support deploying a "noindex" tag at the page level. So we are planning to block the entire sites in robots.txt and allow only the 50 pages we need. But we are not sure if this is the right approach, since Google suggests relying on "noindex" rather than robots.txt. Please advise whether we can proceed with the robots.txt file. Thanks
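Two things worth noting here: robots.txt blocks crawling, not indexing (a blocked URL can still appear in results if it's linked, just without a snippet), and if you control the web server you can often send noindex via the X-Robots-Tag HTTP header even when the CMS can't emit meta tags. That said, the Allow-plus-Disallow pattern itself is straightforward; a sketch with hypothetical paths, checked with Python's built-in parser:

```python
from urllib import robotparser

# Hypothetical rules: allow two specific pages, block everything else.
# Google resolves conflicts by the most specific (longest) matching
# rule, so the Allow lines win for the listed pages.
robots_txt = """\
User-agent: *
Allow: /products/important-page-1
Allow: /products/important-page-2
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

allowed = rp.can_fetch("*", "https://sub.example.com/products/important-page-1")
blocked = rp.can_fetch("*", "https://sub.example.com/anything-else")
```

Python's parser applies rules in order rather than by longest match, so listing the Allow lines first keeps the file unambiguous under both interpretations.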
Algorithm Updates | vtmoz
-
Google Mobile Algorithm update
Hi there, on April 21st Google is apparently going to update their mobile algorithm, and I have a few questions about it. Our current mobile website is very mobile friendly. We block all mobile pages with a noindex, so the desktop pages have been indexed for mobile devices, and we redirect from the desktop page to the mobile page when someone hits a result on a mobile device. My gut tells me this is not April-21st-proof, so I'm thinking about updating to make the whole thing adaptive. By making it adaptive, our mobile pages will be indexed instead of the desktop pages. Two questions: 1. Will Google treat the mobile page as a 100% different page from the desktop page, or will it match the two because everything will tell Google they belong together? In other words: will the mobile page start with zero authority, and will pages lose good organic positions because of authority? 2. Which ranking factor will be stronger after April 21st for mobile pages: page authority or mobile friendliness? In other words: is it worth ignoring the April 21 update because the authority of the desktop pages matters more than making every page super mobile friendly? Hope to get some good advice! Marcel
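On the pairing question: with separate mobile URLs, Google's documented pattern is for the desktop page to declare its mobile alternate and for the mobile page to point its canonical back at the desktop page, so the two are treated as one page rather than the mobile URL starting from zero (which also means the noindex on the mobile pages should go). A sketch with invented URLs:

```python
# Hypothetical desktop -> mobile URL mapping.
pairs = {
    "https://example.com/product": "https://m.example.com/product",
}

def desktop_head(desktop_url: str) -> str:
    """Annotation for the desktop page's <head>: point at the mobile
    alternate so Google pairs the two URLs."""
    mobile_url = pairs[desktop_url]
    return ('<link rel="alternate" '
            'media="only screen and (max-width: 640px)" '
            f'href="{mobile_url}">')

def mobile_head(desktop_url: str) -> str:
    """Annotation for the mobile page's <head>: canonical back to the
    desktop URL so signals consolidate there."""
    return f'<link rel="canonical" href="{desktop_url}">'

d = desktop_head("https://example.com/product")
m = mobile_head("https://example.com/product")
```

With these annotations in place, the mobile page inherits the desktop page's standing instead of competing with it.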
Algorithm Updates | MarcelMoz
-
Need Advice - Google Still Not Ranking
Hi Team - I really need some expert-level advice on an issue I'm seeing with our site in Google. Here's the current status. We launched our website and app in the last week of November 2014 (soft launch): http://goo.gl/Wnrqrq

When we launched, we were not showing up for any targeted keywords, long-tailed included, even for the title of our site in quotes. We ranked for our name only, and even that wasn't #1. Over time we were able to build up some rankings, although they were very low (120-140). Yesterday, we were back to not ranking for any keywords.

Here's the history: while developing our app, and before I took over the site, the developer used a thin affiliate site to gather data and run a beta app over the course of 1-2 years. Upon taking over the site and moving to launch the new website/app, I discovered what had been run under the domain. Since then the old site has been completely removed and rebuilt, with all associated domains (.uk, .net, etc.) and subdomains shut down. I've allowed all the old spammy pages (thousands of them) to 404. We've disavowed the old domains (.net, .uk, which were sending a ton of links to this one), along with some links pointing to our domain that seemed a little spammy. There are no manual actions or messages in Google Webmaster Tools.

The new website uses SSL (https) for the entire site, it scores 98/100 for mobile usability (we beat our competitors on Google's PageSpeed tool), it has been moved to business-level hosting, 301s are correctly set up, we've added terms and conditions, linked all our social profiles, linked WMT/Analytics/YouTube, started some AdWords, and use rel="canonical" - all the SEO 101 stuff and more. When I run a page through the Moz tool for a specific keyword, we score an A. When I did a crawl test, everything came back looking good. We also pass using other tools. Google WMT shows no HTML issues. We rank well on Bing, Yahoo, and DuckDuckGo.
However, for some reason Google will not rank the site, and since there is no manual action I have no way to submit a reconsideration request. From an advanced standpoint, should we bail on this domain and move to the .co domain (which we own but which hasn't been used before)? If we 301 this domain over, since all our marketing points to the .com, will this issue follow us? I see a lot of conflicting information on whether algorithmic issues follow domains: some say they do, some say they don't, and some say they do because people often don't fix the underlying issue. However, this is a brand-new site, and we're following all of Google's rules. I suspect there is an algorithmic penalty (action) against the domain because of the old thin affiliate site that was used for the beta and data-gathering app. Are we stuck until Google does an update? What's the deal with moving us up, then removing us again? Thoughts, suggestions? I purposely used a short URL to leave out the company name; please respect that, since I don't want our issues to pop up in a web search. 🙂
Algorithm Updates | get4it
-
Google Algorithm Update... AuthorRank finally kicking in?
These past few days I've been seeing great movement, with my sites growing by 70-100% in traffic spikes. Somehow I think this has something to do with AuthorRank maybe kicking in now as more of a factor in rankings. Anyone have an idea what's going on?
Algorithm Updates | NikolasNikolaou
-
How does Google treat anchor tags on badges after the Penguin update?
We have a website builder that creates sites on subdomains (i.e. yoursite.breezi.com), and on every site we have included a badge that has anchor text and an image. My question: given that we will include this on many, if not most, of the sites created with our builder, how will Google treat backlinks with the same anchor text from non-relevant sites after the Penguin update? I am concerned about the backlinks from non-theme-related sites and their SEO implications. Any help is greatly appreciated.
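Worth noting that Google's guidance for widget and badge links is to nofollow them, precisely because thousands of identical keyword-anchored sitewide links can look like a link scheme post-Penguin. A sketch of what a defanged badge might look like; the URLs and alt text are illustrative, not the actual Breezi badge:

```python
# Hypothetical badge generator: the rel="nofollow" keeps the badge from
# passing (and being judged as) manipulative anchor-text link equity.
def badge_html(site_url: str, badge_img: str) -> str:
    return (f'<a href="{site_url}" rel="nofollow">'
            f'<img src="{badge_img}" alt="Built with Breezi"></a>')

html = badge_html("https://breezi.com", "https://breezi.com/badge.png")
```

The badge still sends referral traffic and brand visibility; it just stops being a mass-produced exact-match anchor link.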
Algorithm Updates | breezi
-
Lesser-visited but highly ranked landing pages dropped in rank on Google. Time for a content update?
I noticed that my page one ranked landing pages that don't get a lot of love from me have dropped in rank big time on Google this week. This is a site that has static (meaning, I can't freshen up the content easily) landing pages for products that we sell. The pages that dropped are the ones that have the fewest inbound links, and don't get much attention on the social media side. Our most important landing pages have also dropped, but just a few spots on page one. This is a first for me. Does anyone think that this is a "lack of freshness" penalty? We are still number one on page one for our brand search terms. Would fresh content give me a shot at getting the pages back up? I'm willing to update them slowly, but before I go crazy, I'm reaching out to the pros here.
Algorithm Updates | Ticket_King
-
SEO updates and rank changes
We have been updating page titles and meta descriptions for a client (not changing ANY links, and the content we are replacing is "fluff" with no major keywords or relevant information), yet in the past few weeks rankings have plummeted. I used the SEOmoz grader to make sure we have the keywords in the right places in the updated page source, and we're getting A's, yet for those same keywords the website is nowhere to be found. For example, for the phrase "organic t shirts" we get an A for this page: http://greenpromotionalitems.com/organic-t-shirts.htm, but when searching organic t shirts, no Green Promotional Items... Ideas?
Algorithm Updates | laidlawseo
-
Was Panda applied at sub-domain or root-domain level?
Does anyone have any case studies or examples of sites where a specific subdomain was hit by Panda while other subdomains were fine? What's the general consensus on whether Panda was applied at the subdomain or root-domain level? My thinking is that Google already knows broadly whether a "site" is a root domain (e.g. SEOmoz) or a subdomain (e.g. Tumblr), and that they use this logic when rolling out Panda. I'd love to hear your thoughts and opinions, though.
Algorithm Updates | TomCritchlow