Google Organic Ranking & Traffic Dropped
-
Hello,
We have been struggling to keep our website (http://goo.gl/vS37qA) ranking well in Google since April 30, 2015. For some reason at that time, there were around 15000 blocked pages (mainly Magento layered navigation pages) showing in Google's Search Console. We used canonical tags, and now all these pages have been removed from Google's index and Google Search Console. We didn't do anything that is against Google's Guidelines. Currently in Google Search Console we see:- Around 50 crawl errors- no malware- no blocked pages - no other error messages in both Webmasters tool.We have never practiced black hat SEO, paid for links, or used tactics that Google penalizes. We noticed in the last few months there are around 1000 Chinese/Russian/Japanese links points to our website, and we have used the disavow tool to notify Google of these attacks.Any help would be greatly appreciated in advance!
-
Hi Kristina,
Thank you very much for taking the time to review our site.
These 15,000 pages were basically layered navigation pages. We did not feel they were useful for Google, so we originally blocked them through robots.txt. However, Googlebot still got them into the index somehow. So in June 2015 we added canonical tags to our pages and removed the block from robots.txt, and Google eventually dropped those pages from the index because of the canonical tags. We have around 2,000 "real" pages on our website, so right now Google's index seems to be showing the right number of pages. This has been the case since September 2015, but our rankings have stayed low.
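For anyone following along, the fix described above typically boils down to one tag on each filtered layered-navigation page. This is only a sketch with a made-up URL, not the site's actual markup:

```html
<!-- On a filtered layered-navigation URL such as
     /cpap-masks?color=blue&price=10-20 (hypothetical example),
     point search engines at the unfiltered category page.
     Once Google recrawls, the filtered variants consolidate
     into the canonical URL and drop out of the index. -->
<link rel="canonical" href="https://www.example.com/cpap-masks" />
```

Note that for canonicals to work, the pages must not be blocked in robots.txt, since Googlebot has to crawl a page to see the tag, which matches the order of operations described above.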
-
Hi Nancy,
To be clear, when you say that there were 15,000 blocked pages, do you mean that Google reported that its crawler was blocked from crawling them? Or that Google was blocking them itself, thinking that they're malware or something like that?
The term "blocked" makes me think there's a technical issue here, not a penalty. Were these 15,000 pages important pages? I see that Google still has around 1,950 of your site's pages in its index. How many should it have?
Best,
Kristina
-
Thanks for your insight and help once again!
Yes, at that time we dropped because we were hit with tons of links from foreign sites. We reviewed all of our backlinks and used the disavow feature in Google Search Console.
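For readers who haven't used it, a disavow file is a plain-text list uploaded through Search Console. The domains below are placeholders for illustration, not the actual spam domains from this case:

```text
# Disavow file for example.com - spammy foreign-language link networks
# Disavow every link from an entire domain:
domain:spammy-directory.example
domain:link-farm.example
# Or disavow a single page:
http://blog.link-farm.example/spam-post.html
```

Lines starting with # are comments; each other line is either a `domain:` entry or a full URL. Keep in mind the disavow tool tells Google to ignore those links when assessing your site, but it can take weeks to months for the effect to show as the links are recrawled.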
-
Hi NancyH, I wouldn't use ebay.com as a comparison; with a DA of 95, any page speed issues are trumped by its backlink portfolio. I'm not seeing anything else that sticks out as an issue. I did notice that your first-page results peaked in March 2014, dipped most sharply during the summer of 2014, and have rebounded a bit since this past summer, sitting at about 769 first-page results. Do you remember if anything changed with the site in the spring/summer of 2014? Have you dug into your analytics to see if any of those metrics changed around the same time? That is about all of the insight I can give without doing a full audit of the site. Maybe someone else in the forums can offer additional insight.
-
Hi VERBInteractive,
Thanks for looking into our issue and responding. I agree with you, but improving our page speed on mobile devices is a challenge. Our development team is working on this issue. We noticed that many other ecommerce websites (for instance, ebay.com) are still doing well in search even though their page speed on mobile devices is poor. Could you look further and see if you notice any other issues on our website?
-
Hi Nancy,
Have you explored your page speed on mobile devices? I noticed a big drop in the spring of this year (although it does look like you are slowly regaining a lot of first-page results). This would have been right around Google's algorithm change to favor mobile-ready sites. If you look at https://developers.google.com/speed/pagespeed/insights/?url=http%3A%2F%2Fwww.thecpapshop.com%2F&tab=mobile you will see that although the user experience score is pretty good, the speed score is pretty low. See which of the suggested opportunities you can fix.
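If you want to track that score over time instead of checking the web page by hand, PageSpeed Insights also has a JSON API you can script against. This is a rough sketch; the v5 endpoint and the `lighthouseResult` response shape are assumptions based on Google's public API docs and may differ from the version current when you read this:

```python
# Sketch: querying the PageSpeed Insights API for a mobile performance score.
import json
import urllib.parse
import urllib.request

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_url(page_url: str, strategy: str = "mobile") -> str:
    """Build the PageSpeed Insights request URL for a page."""
    query = urllib.parse.urlencode({"url": page_url, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{query}"

def fetch_performance_score(page_url: str) -> float:
    """Fetch the 0-100 performance score for a page (requires network)."""
    with urllib.request.urlopen(build_psi_url(page_url)) as resp:
        data = json.load(resp)
    # Lighthouse reports category scores as 0.0-1.0; scale to 0-100.
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100
```

Running `fetch_performance_score("http://www.thecpapshop.com/")` weekly and logging the result would let you correlate speed changes with ranking movements.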
Good luck,