Number of indexed pages dropped. No manual action though?
-
I have a client whose WordPress site was hacked. At that point there was no message from Google in Webmaster Tools, and the search results for their pages still looked normal. They paid SiteLock to fix the site. This was all about a month ago.
Logging into Webmaster Tools now, there are still no messages from Google, nor anything on the manual actions page. Their organic traffic is essentially gone. Looking at the submitted sitemap, only 3 of their 121 submitted pages are indexed; before this, all of them were in the index. Looking at the index status report, I can see that the number of indexed pages dropped completely off the map.
We are sure that the site is free of malware, and this client has engaged in no fishy SEO practices. What can be done?
-
Hello Rodney,
From what you can see, have there been any significant changes in your website's rankings to go along with the traffic drop? I have a client who had some web development work done by their team and they accidentally deleted the GA tracking code, leading to an enormous drop in registered traffic but no difference in rankings.
If your rankings are still intact, try searching your homepage's source for the UA tracking ID to ensure it hasn't been tampered with. If it is still there and GA is accurately tracking your data, then you probably have a lingering hack that was sprung after the cleanup occurred. These things can come back to bite you if a security tech doesn't perform a code audit to ensure the site is properly cleaned.
The loss of indexed pages sounds like there might be an issue in your robots.txt file or with your sitemap.
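If you want to rule out a robots.txt block quickly, here's a minimal sketch using Python's standard-library robot parser. The robots.txt content below is hypothetical (a "lock everything down" rule a security cleanup might leave behind); substitute whatever your site actually serves at /robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt a cleanup service might have added;
# replace with the real contents of http://yoursite.com/robots.txt.
robots_txt = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# With "Disallow: /", Googlebot is blocked from every page.
print(parser.can_fetch("Googlebot", "http://www.example.com/some-page/"))
```

If this prints False for pages that used to rank, a blanket Disallow would fully explain pages dropping out of the index with no manual action message.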
The last part of this might be an automated Panda penalty, but this is probably the least likely option.
During the hack, was there any nefarious link-building conducted that you're aware of? What you're experiencing might be the lingering effects of an automated Penguin penalty (there can be several months of sluggish rankings following one of these).
Let me know about the above and I will see what else I can do to help!
Thanks and best of luck,
Rob
-
Thanks to Umar for asking for the domain.
I looked using a site: operator and found 13 pages and 21 PDFs that were indexed. I am not familiar with the company that handled the cleanup, so I would check to make sure they have not added any robots.txt rules, noindex tags, etc. in an attempt to keep it all safe. That would be my first look. To me this is the most logical explanation if you have submitted a new sitemap.
Let us know how it comes out,
Robert
-
More info:
The website: http://www.bouldertherapeutics.com/
Website hacked: End of June
Website cleaned and verified by SiteLock within days
Number of indexed pages dropped from the hundreds to just a few
Organic search visits dropped from 1600 in June to just over 400 in July
No messages in webmaster tools. No website changes. No link building done.
-
This sounds really scary! I would have to look at the URL before making any final comment on this.
In my opinion, there could be several reasons for this. You have to explore it from every angle. Are you sure the site has completely recovered? I have some reservations about that.
-
Related Questions
-
Do uncrawled but indexed pages affect SEO?
It's a well-known fact that too much thin content can hurt your SEO, but what about when you disallow Google from crawling some places and it indexes some of them anyway (no title, no description, just the link)? I am building a Shopify store, and it's impossible to change the robots.txt using Shopify; they disallow, for example, the cart: Disallow: /cart. But all my pages link there, so Google has the uncrawled cart in its index, along with many other uncrawled URLs. Can this hurt my SEO, or is trying to remove that from their index just a waste of time? I can't change anything in the robots.txt. I could try to nofollow those internal links. What do you think?
Intermediate & Advanced SEO | cuarto7150
How to 301 Redirect /page.php to /page, after a RewriteRule has already made /page.php accessible by /page (Getting errors)
A site has its URLs with .php extensions, like this: example.com/page.php. I used the following rewrite to remove the extension so that the page can now be accessed from example.com/page:
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [L]
It works great. I can access it via the example.com/page URL. However, the problem is the page can still be accessed from example.com/page.php. Because I have external links going to the page, I want to 301 redirect example.com/page.php to example.com/page. I've tried this a couple of ways, but I get redirect loops or 500 internal server errors. Is there a way to have both? Remove the extension and 301 the .php to no extension? By the way, if it matters, page.php is an actual file in the root directory (not created through another rewrite or URI routing). I'm hoping I can do this, and not just throw a example.com/page canonical tag on the page. Thanks!
Intermediate & Advanced SEO | rcseo
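For what it's worth, the usual way around the redirect loop is to key the external 301 off %{THE_REQUEST} (the raw request line from the browser), so the redirect only fires on URLs the visitor actually requested with .php, never on the internal rewrite. A sketch, untested against this exact setup:

```apache
RewriteEngine On

# Externally 301 direct requests for /page.php to /page.
# %{THE_REQUEST} holds the raw request line (e.g. "GET /page.php HTTP/1.1"),
# which the internal rewrite below never changes, so no loop occurs.
RewriteCond %{THE_REQUEST} \s/([^.\s?]+)\.php[\s?]
RewriteRule ^ /%1 [R=301,L]

# Internally rewrite the extensionless URL back to the .php file when it exists.
RewriteCond %{REQUEST_FILENAME}.php -f
RewriteRule ^(.*)$ $1.php [L]
```

The key design point is that REQUEST_URI and REQUEST_FILENAME are rewritten on each internal pass, while THE_REQUEST stays fixed for the lifetime of the request.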
How can I optimize pages in an index stack
I have created an index stack. My home page is http://www.southernwhitewater.com. My home page (if you look at it through the MozBar for Chrome) incorporates all the pages in the index. Is this bad? I would prefer to index each page separately, as per my site index in the footer. What is the best way to optimize all these pages individually and still have customers arrive at the top, with links directed to the home page (which is actually the 1st page)? I feel a rel=canonical might be needed somewhere. Any help would be great!!
Intermediate & Advanced SEO | VelocityWebsites
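On the rel=canonical point: each page that should rank on its own can declare its own clean URL as canonical, rather than pointing everything at the home page. A minimal illustration (the path shown is hypothetical):

```html
<!-- In the <head> of each individual page, pointing at that page's own
     preferred URL. The path here is illustrative only. -->
<link rel="canonical" href="http://www.southernwhitewater.com/rafting-trips/" />
```

A self-referencing canonical on each page signals that the page is its own preferred version, which helps when a stacked or aggregated view of the same content also exists.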
Can you noindex a page in WordPress from just Google News?
I'm trying to find a plugin for WordPress that enables you to noindex an individual page from Google News but not from Google search results. We want to remove some of our pages from Google News without hurting others.
Intermediate & Advanced SEO | uSw
Town and County pages taking months to index.
Hi,
At http://www.general-hypnotherapy-register.com/regional-hypnotherapy-directory/ we have a load of town and county pages for all of the hypnotherapists on the site.
a) I have checked all of these links and they are spiderable.
b) About a month back, after the site changes (not entirely sure why), I noticed the site was generating rogue pages, e.g. http://www.general-hypnotherapy-register.com/hypnotherapists/page/5/?town=barnsley instead of http://www.general-hypnotherapy-register.com/hypnotherapists/?town=barnsley. We added meta noindex, nofollow to these rogue pages around 4 weeks ago; however, these pages still have a Google cache date of Oct 4th, predating those meta changes.
c) There are examples of the pages we do want indexed, and ranking too on page 1, via site:www.general-hypnotherapy-register.com/hypnotherapists, e.g. http://www.general-hypnotherapy-register.com/hypnotherapists/?town=ockham. However, these pages are few and far between; they have a recent Google cache date of Nov 1.
d) The XML sitemap has all of the correct URLs, but in Webmaster Tools the number of pages indexed has been stubbornly flat at 2800 out of 4400 for 4 weeks now.
e) Query parameters for ?town and ?county in Webmaster Tools are set to Yes/Specifies.
Would love any suggestions. Thanks, Mark.
Intermediate & Advanced SEO | Advantec
Dynamic pages - ecommerce product pages
Hi guys, Before I dive into my question, let me give you some background. I manage an ecommerce site and we've got thousands of product pages. The pages contain dynamic blocks, and the information in these blocks is fed by another system. So in a nutshell, our product team enters the data in a software tool and, boom, the information is generated in these page blocks. But that's not all: these pages then redirect to a duplicate version with a custom URL. This is cached, and this is what the end user sees. This was done to speed up load; rather than have the system generate a dynamic page on the fly, the cached page is loaded and the user sees it super fast. Another benefit happened as well: after going live with the cached pages, they started getting indexed and ranking in Google. The problem is that the redirect to the duplicate cached page isn't a permanent one; it's a meta refresh, a 302 that happens in a second. So yeah, I've got 302s kicking about. The development team can set up a 301, but then there won't be any caching; pages will just load dynamically. Google records pages that are cached, but does it cache a dynamic page though? Without a cached page, I'm wondering if I would drop in traffic. The view source might just show a list of dynamic blocks, no content! How would you tackle this? I've already set up canonical tags on the cached pages but removing cache... Thanks
Intermediate & Advanced SEO | Bio-RadAbs
How important is the number of indexed pages?
I'm considering making a change to using AJAX filtered navigation on my e-commerce site. If I do this, the user experience will be significantly improved, but the number of pages that Google finds on my site will go down significantly (by tens of thousands). It feels to me like our filtered navigation has grown out of control, and we spend too much time worrying about its URL structure; in some ways it's paralyzing us. I'd like to be able to focus on pages that matter (explicit Category and Sub-Category pages) and then just let AJAX take care of filtering products below these levels. For customer usability this is smart. From the perspective of manageable code and long-term design this also seems very smart; we can't continue to worry so much about filtered navigation. My concern is that losing so many indexed pages will have a large negative effect (however, we will reduce duplicate content and be able to provide much better category and sub-category pages). We probably should have thought about this a year ago before Google indexed everything :-). Does anybody have any experience with this or insight on what to do? Thanks, -Jason
Intermediate & Advanced SEO | cre8
Number of Indexed Pages Is Continuously Going Down
I am working on online retail stores. Initially, Google had indexed 10K+ pages of my website. I checked the number of indexed pages a week ago, and it was 8K+. Today, the number of indexed pages is 7,680. I can't understand why this is happening or how I can fix it. I want to get the maximum number of pages of my website indexed.
Intermediate & Advanced SEO | CommercePundit