Google Dropping Pages After SEO Clean Up
-
I have been using SEOmoz to clear errors from a site. There were over 10,000 errors to start with, most of them duplicate content, duplicate titles, and too many links on a page. Most of the duplicate errors have now been cleared over the course of two weeks (we're down to around 3,000 errors now).
But instead of my rankings improving, pages that were on the second page of Google have started to drop out of the listings altogether. The pages that are dropping out are not related to the duplicate problems and get A grades when I run SEOmoz page reports.
Can you clean up too much too quickly, or is there likely to be another reason for it?
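For anyone wondering what I mean by clearing duplicate-title errors: the check itself just boils down to grouping pages that share the same <title>. Here is a rough stdlib Python sketch of that kind of audit (the URLs are placeholders, not my real site):

```python
import urllib.request
from collections import defaultdict
from html.parser import HTMLParser

class TitleParser(HTMLParser):
    """Collects the text inside the first <title> element of a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title" and not self.title:
            self.in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def find_duplicate_titles(urls):
    """Group URLs by their <title> text and return groups with more than one URL."""
    groups = defaultdict(list)
    for url in urls:
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue  # skip pages that fail to load
        parser = TitleParser()
        parser.feed(html)
        groups[parser.title.strip()].append(url)
    return {title: pages for title, pages in groups.items() if len(pages) > 1}

if __name__ == "__main__":
    # Placeholder URLs -- swap in pages from your own crawl export.
    pages = [
        "https://www.example.com/widgets",
        "https://www.example.com/widgets?sort=price",
    ]
    for title, dupes in find_duplicate_titles(pages).items():
        print(f"Duplicate title '{title}':")
        for page in dupes:
            print("  ", page)
```

Every group it prints is a set of pages that SEOmoz would flag as duplicate titles; those are the ones I have been consolidating or rewriting.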
-
I totally agree with XNUMERIK. I had a site I took over after a "black hat" SEO had been working on it, and we dropped from #8 in Google to #212 after all my corrections. Within 3 months we were #1 for that same keyword.
There is no data to back up what I'm about to say, but it has worked nearly 100% of the time for me.
Here is what I think happens:
I corrected the technical (on-page) stuff first: site structure, internal linking and things like that.
The drop occurred when Google crawled my site before I had a chance to clean up all the backlinks and 301s coming into the site.
So, naturally, I dropped because I had basically blown my page authority.
Once I had a good fix on all the off-page corrections to match the on-page work, I used Fetch as Google (in Google Webmaster Tools) and submitted all linked pages. (I only recommend doing this when you've made major changes like you have.)
It usually takes 4 to 5 crawls and I'm right back up where I was, and usually higher than before. With new links pointing at the pages it ranks even faster, especially if quality on-page SEO has been done.
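For what it's worth, before the Fetch as Google step I like to confirm that every old backlinked URL really returns a single clean 301 to its new home. Nothing official about this — just a rough stdlib Python sketch of the kind of check I mean, with placeholder URLs:

```python
import http.client
from urllib.parse import urlsplit

def check_redirect(old_url, expected_target):
    """Issue one request without following redirects and report the
    status code and Location header the server sends back."""
    parts = urlsplit(old_url)
    conn_cls = http.client.HTTPSConnection if parts.scheme == "https" else http.client.HTTPConnection
    conn = conn_cls(parts.netloc, timeout=10)
    path = parts.path or "/"
    if parts.query:
        path += "?" + parts.query
    conn.request("HEAD", path)
    resp = conn.getresponse()
    location = resp.getheader("Location", "")
    conn.close()
    ok = resp.status == 301 and location.rstrip("/") == expected_target.rstrip("/")
    return resp.status, location, ok

if __name__ == "__main__":
    # Placeholder URL pair -- swap in your old backlinked URLs and their new homes.
    status, location, ok = check_redirect(
        "https://www.example.com/old-page.php",
        "https://www.example.com/new-page/",
    )
    print(status, location, "OK" if ok else "needs fixing")
```

Run it over the list of URLs that still carry good backlinks and fix anything that comes back as a 302, a redirect chain, or a 404 before asking Google to re-crawl.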
-
When Google is recalculating/reassessing your credentials (trust, popularity...), typically after a major update or change to your website/page, it is not uncommon for a page to drop heavily before recovering to a much better position.
The best thing you can do is wait a few days and see what happens.
When you are doing "well" or "not so bad", you should always keep a backup before making any changes; that way you can go back to the older version much more quickly.
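If it helps, "keep a backup" can be as simple as a timestamped archive of the site files before each big change (the paths below are made up, and a database dump alongside it is a good idea too). A rough Python sketch:

```python
import os
import shutil
from datetime import datetime

# Placeholder paths -- point these at your own site files and backup location.
SITE_ROOT = "/var/www/mysite"
BACKUP_DIR = "/home/me/site-backups"

def snapshot(site_root=SITE_ROOT, backup_dir=BACKUP_DIR):
    """Write a timestamped .tar.gz of the site so you can roll back quickly."""
    os.makedirs(backup_dir, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    return shutil.make_archive(f"{backup_dir}/site-{stamp}", "gztar", root_dir=site_root)

if __name__ == "__main__":
    print("Wrote", snapshot())
```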
-
I am usually terrified to make any changes to "errors" on pages that are already performing well. As I think anyone will agree, no one, not even Google themselves, fully understands search results, etc. Sometimes what an application perceives as an "error" could actually be an unintended asset. Just my two cents after getting burned by correcting "errors".