Server migration + wrong robots.txt = lost SERP rankings
-
Over the weekend, we moved our store to a new server. Before the switch, we had a robots.txt file on the new server that blocked crawlers from its contents (we didn't want duplicate pages from both the old and new servers in the index).
When we finally made the switch, we somehow forgot to remove that robots.txt file, so the new pages weren't being crawled or indexed. We quickly put our good robots.txt in place and submitted a request for a re-crawl of the site.
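For anyone who hits the same problem: the blanket block was roughly two lines of robots.txt (a sketch - the actual file may be more specific, and example.com stands in for the real domain):

    User-agent: *
    Disallow: /

The corrected, crawl-everything version is the same file with an empty Disallow value, optionally with a pointer to the sitemap:

    User-agent: *
    Disallow:
    Sitemap: http://www.example.com/sitemap.xml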
The problem is that many of our search rankings have changed. We were ranking #2 for some keywords, and now we're not showing up at all. Is there anything we can do? Google Webmaster Tools says the next crawl could take weeks! Any suggestions would be much appreciated.
-
Dr. Pete,
I just ran across one of your webinars yesterday, and you brought up some great ideas. Earned a few points in my book.
Too often, SEOs see changes in the rankings and react to counteract the change. Most of the time these bounces are actually a GOOD sign: it means Google saw your changes and is adjusting to them. If your changes were positive, you should see positive results. I have rarely seen a case where someone made a positive change and got a negative result from Google. Patience is a virtue.
-
Thanks, everyone, for the help! Fortunately, we remedied the problem almost immediately, so it only took about a day to get our rankings back. I think the sitemap and the fixed robots.txt were the most important factors.
-
I agree - let Google re-index first, and then re-evaluate the situation.
-
I hate to say it, but @inhouseninja is right - there's not a lot you can do, and overreacting could be very dangerous. In other words, don't make a ton of changes just to offset this - Google will re-index.
A few minor moves that are safe:
(1) Re-submit your XML sitemap (see the example below)
(2) Build a few new links (authoritative ones, especially)
(3) Hit social media with your new URLs
All three are at least nudges toward a re-index. They aren't magic bullets, but you need to get Google's attention.
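On (1), re-submitting doesn't have to wait on the Webmaster Tools UI - at the time of writing, Google also accepts a plain HTTP "ping" with your sitemap URL, along these lines (example.com is a placeholder for your real domain):

    http://www.google.com/ping?sitemap=http://www.example.com/sitemap.xml

Fetching that URL in a browser (or with curl) just tells Google the sitemap has changed and is worth re-fetching; it doesn't guarantee a faster crawl - it's one more nudge.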
-
Remain calm - you should be just fine. It just takes time for Google to digest the new robots.txt. I wouldn't be concerned unless things still hadn't changed after 3-4 weeks. A good rule: don't freak out at Google until you've given a problem at least 14 days to resolve. Sometimes Google moves things around, and that's natural.
If you want Google to crawl your site faster, build some links and do some social media. That will encourage Google to speed it up.
-
If this is all that happened, the next crawl should fix it. Just sit tight, and your rankings should bounce back in a week or so.
-
That does not sound fun at all... So you just changed the server - a complete copy?
My first question would be: other than the server, did anything else change - the copy or the URLs?
My second question would be: is the old server still up and live on the internet?
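On that second question, one quick way to check is to request the site directly from the old machine's IP (a sketch - OLD_SERVER_IP and www.example.com are placeholders for your real values):

    curl -I -H "Host: www.example.com" http://OLD_SERVER_IP/

A 200 response with your page's headers means the old copy is still being served, which is worth shutting off so Google doesn't see two live copies of the site.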