Google Dropping Pages After SEO Clean Up
-
I have been using SEOmoz to clear errors from a site. There were over 10,000 errors to start with, most of them duplicate content, duplicate titles, and too many links on a page. Most of the duplicate errors have now been cleared; this took two weeks (we're down to around 3,000 errors now). But instead of my rankings improving, pages that were on the second page of Google have started to drop out of the listings altogether. The pages that are dropping out are not related to the duplicate problems and get A grades when I run SEOmoz page reports. Can you clean up too much too quickly, or is there likely to be another reason for it?
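For the duplicate-title part of a cleanup like this, it helps to group crawled pages by title so the offenders are easy to list and fix in one pass. A minimal sketch, assuming you can export your crawl as (URL, title) pairs; the URLs and titles below are made up for illustration:

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Group page URLs by <title> text and return only the titles
    that appear on more than one URL."""
    by_title = defaultdict(list)
    for url, title in pages:
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Hypothetical crawl export: (url, title) pairs
crawl = [
    ("/widgets/red", "Widgets | Example Shop"),
    ("/widgets/blue", "Widgets | Example Shop"),
    ("/about", "About Us | Example Shop"),
]
for title, urls in find_duplicate_titles(crawl).items():
    print(title, "->", urls)
```

Any title that comes back with more than one URL is a candidate for a rewrite or a canonical tag.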
-
I totally agree with XNUMERIK. I took over a site after a "black hat" SEO had been working on it, and we dropped from #8 in Google to #212 after all my corrections. We're now #1 for that same keyword, and got there within 3 months.
There is no data to back up what I'm about to say, but it has worked nearly 100% of the time for me.
Here is what I think happens:
I corrected the technical (on-page) stuff first: site structure, internal linking, and things like that.
The drop occurred when Google crawled my site before I had a chance to clean up all the backlinks and 301s coming into the site.
So naturally I dropped, because I had basically blown my page authority.
Once I had all the off-page corrections in place to match the on-page work, I used Fetch as Google (in Google Webmaster Tools) and submitted all linked pages. (I only recommend doing this when you've made major changes, like you have.)
It usually takes 4 to 5 crawls and I'm right back up where I was, usually higher than before. With a few new links to the pages they rank even faster, especially if there is quality on-page SEO in place.
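The 301 cleanup step above can be sanity-checked offline: given your redirect map, every old URL should resolve in a single hop, since long chains and loops leak authority. A sketch under the assumption that you can export your redirects as old -> new pairs (the URLs here are invented):

```python
def resolve(url, redirects, max_hops=10):
    """Follow a redirect map until a non-redirecting URL is reached.
    Returns (final_url, hops); raises on loops or over-long chains."""
    hops = 0
    seen = {url}
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen or hops > max_hops:
            raise ValueError("redirect loop or chain too long at " + url)
        seen.add(url)
    return url, hops

# Hypothetical redirect map exported from the server config
redirects = {
    "/old-page": "/interim-page",  # a chain: worth flattening to one hop
    "/interim-page": "/new-page",
}
final, hops = resolve("/old-page", redirects)
print(final, hops)
```

Any URL that resolves with `hops > 1` is a chain worth pointing directly at its final destination.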
-
When Google is recalculating/reassessing your credentials (trust, popularity, etc.), typically after a major update or change to your website or page, it is not uncommon for a page to drop heavily before recovering to a much better position.
The best thing you can do is wait a few days and see what happens.
When you are doing "well" or "not so bad", you should always keep a backup before making any changes; that way you can go back to the older version much more quickly.
-
I am usually terrified to make any changes to "errors" on pages that are already performing well. As I think anyone will agree, no one, not even Google themselves, fully understands search results. Sometimes what an application perceives as an "error" could actually be an unintended asset. Just my two cents after getting burned by correcting "errors".
Related Questions
-
Pages with Duplicate Page Content Crawl Diagnostics
I have pages with duplicate page content in my Crawl Diagnostics. Can you tell me how to solve it, or suggest some helpful tools? Thanks
Technical SEO | nomyhot0 -
Can you noindex a page, but still index an image on that page?
If a blog is centered around visual images, and we have specific pages with high-quality content that we plan to index and use to drive our traffic, but we also have many pages with just our images, what is the best way to go about getting those images indexed? We want to noindex all the image-only pages because they are thin content. Can you noindex,follow a page but still index the images on that page? Please explain how to go about this.
Technical SEO | WebServiceConsulting.com0 -
Mysterious drop in the Number of Pages Crawled
The number of crawled pages on my campaign dashboard has been 90 for months. Approximately a week ago it dropped to 25 crawled pages, and many links went with it. I have checked with my webmaster, and he said no changes have been made that would cause this to happen. I am looking for suggestions on how to go about troubleshooting this issue, and possible solutions. Thanks in advance!
Technical SEO | GladdySEO0 -
Why is the Page Authority of my product pages so low?
My domain authority is 35 (homepage Page Authority = 45) and my website has been up for years: www.rainchainsdirect.com Most random pages on my site (like this one) have a Page Authority of around 20. However, the individual pages for my products rank exceptionally low as a whole. Like these: http://www.rainchainsdirect.com/products/copper-channel-link-rain-chain (Page Authority = 1) http://www.rainchainsdirect.com/collections/todays-deals/products/contempo-chain (Page Authority = 1) I was thinking that whatever the reason for such low authority, it may explain why these pages rank lower in Google for specific searches using my exact product name (in other words, other sites that are piggybacking off my unique products are ranking higher than the original product page on my site in a search for that specific product name). In any event, I'm trying to get some perspective on why these pages remain with the same non-existent Page Authority. Can anyone help shed some light on why, and what can be done about it? Thanks!
Technical SEO | csblev0 -
Duplicate Content on SEO Pages
I'm trying to create a bunch of content pages, and I want to know if the shortcut I took is going to penalize me for duplicate content. Some background: we are an airport ground transportation search engine (www.mozio.com), and we constructed several airport transportation pages listing the providers in a particular area. The problem is, sometimes several of the same providers serve the same places in a region. For instance, NYAS serves both JFK and LGA, and obviously SuperShuttle serves ~200 airports. This means every airport's page has the SuperShuttle box. All the provider info is stored in a database with tags for the airports they serve, and we dynamically generate each page. Good examples follow: http://www.mozio.com/lga_airport_transportation/ http://www.mozio.com/jfk_airport_transportation/ http://www.mozio.com/ewr_airport_transportation/ All 3 of those pages have a lot in common. Now, I'm not sure, but they started out performing decently, and as I added more and more pages their efficacy went down on the whole. Does what I've done qualify as "duplicate content", and would I be better off getting rid of some of the pages or somehow consolidating the info into a master page? Thanks!
Technical SEO | moziodavid0 -
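One way to put a number on how much two such templated pages overlap is a simple set-similarity measure over their visible text: pages made up mostly of the same shared provider blocks score close to 1.0. A rough sketch; the tokenization and the sample sentences are illustrative, not anything search engines publish:

```python
def jaccard(text_a, text_b):
    """Jaccard similarity of the word sets of two documents:
    1.0 means identical vocabularies, 0.0 means nothing in common."""
    a = set(text_a.lower().split())
    b = set(text_b.lower().split())
    if not a and not b:
        return 1.0
    return len(a & b) / len(a | b)

# Hypothetical one-line stand-ins for two airport pages' body text
lga = "SuperShuttle NYAS serve LGA airport shared ride transfers"
jfk = "SuperShuttle NYAS serve JFK airport shared ride transfers"
print(round(jaccard(lga, jfk), 2))  # → 0.78
```

Pages scoring very high against each other are good candidates for consolidation or for adding genuinely airport-specific copy.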
Will having a big list of cities for areas a client services help or damage SEO on a page?
We have a client we inherited that has a flat text list on their contact page of all the cities and counties they service. They service the entire Southeast, so the list just looks ridiculous. Example: South Carolina: Abbeville, Aiken, Allendale, Anderson, Bamberg, Barnwell, Beaufort, Berkeley, Calhoun, Charleston, Cherokee, etc. The question is, will this help or hinder their SEO for their very specific niche industry? Is this keyword spamming? It has an end-user purpose, so it technically isn't spam, but perhaps the engines may look at it otherwise. I couldn't find a definitive answer to the question; any help would be appreciated.
Technical SEO | Highforge0 -
When Is It Good To Redirect Pages on Your Site to Another Page?
Suppose you have a page on your site that discusses a topic similar to another page but targets a different keyword phrase. The page has medium-quality content, no inbound links, and attracts little traffic. Should you 301 redirect the page to a stronger page?
Technical SEO | ProjectLabs1 -
Does page speed affect what pages are in the index?
We have around 1.3m total pages; Google currently crawls 87k a day on average, and our average page load is 1.7 seconds. Out of those 1.3m pages (1.2m being "spun up"), Google has only indexed around 368k, and our SEO person is telling us that if we speed up the pages Google will crawl them more and thus index more of them. I personally don't believe this. At 87k pages a day, Google has crawled our entire site in 2 weeks, so they should have all of our pages in their DB by now. I think the pages are not indexed because they are poorly generated, and it has nothing to do with page speed. Am I correct? Would speeding up the pages make Google crawl them faster and thus get more pages indexed?
Technical SEO | upper2bits0
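The arithmetic in the question above is easy to reproduce: at the stated crawl rate, the whole site would be covered in about 15 days, which supports the poster's point that raw crawl capacity is unlikely to be the bottleneck. A quick sketch using only the numbers from the post:

```python
total_pages = 1_300_000   # pages on the site (from the post)
crawled_per_day = 87_000  # Googlebot's average daily crawl (from the post)
indexed = 368_000         # pages actually indexed (from the post)

days_to_full_crawl = total_pages / crawled_per_day
index_rate = indexed / total_pages

print(f"full crawl in ~{days_to_full_crawl:.0f} days")  # → ~15 days
print(f"indexed fraction: {index_rate:.0%}")            # → 28%
```

If everything has plausibly been crawled at least once, the roughly 28% index rate points at page quality rather than crawl speed.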