New server update + wrong robots.txt = lost SERP rankings
-
Over the weekend, we moved our store to a new server. Before the switch, we had a robots.txt file on the new server that blocked crawlers from its contents (we didn't want duplicate pages from both the old and new servers getting indexed).
When we finally made the switch, we somehow forgot to remove that robots.txt file, so the new pages weren't being crawled or indexed. We quickly put our good robots.txt in place, and we submitted a request for a re-crawl of the site.
The problem is that many of our search rankings have changed. We were ranking #2 for some keywords, and now we're not showing up at all. Is there anything we can do? Google Webmaster Tools says the next crawl could take weeks! Any suggestions would be much appreciated.
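(For context: a server-wide block of the kind described above is typically just a two-line robots.txt that tells every crawler to stay away, while a permissive file either drops the Disallow rule or leaves its value empty and, ideally, points crawlers at the XML sitemap. The sitemap URL below is a placeholder, not our actual file.)
Blocking version:
User-agent: *
Disallow: /
Permissive version:
User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap.xml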
-
Dr. Pete,
I just ran across one of your webinars yesterday, and you brought up some great ideas. Earned a few points in my book.
Too often SEOs see changes in the rankings and react to counteract the change. Most of the time these bounces are actually a GOOD sign: it means Google saw your changes and is adjusting to them. If your changes were positive, you should see positive results. I have rarely seen a case where someone made a positive change and got a negative result from Google. Patience is a virtue.
-
Thanks everyone for the help! Fortunately we remedied the problem almost immediately, so it only took about a day to get our rankings back. I think the sitemap and fixed robots.txt were the most important factors.
-
I agree: let Google re-index first, then re-evaluate the situation.
-
I hate to say it, but @inhouseninja is right - there's not a lot you can do, and over-reacting could be very dangerous. In other words - don't make a ton of changes just to offset this - Google will re-index.
A few minor cues that are safe:
(1) Re-submit your XML sitemap
(2) Build a few new links (authoritative ones, especially)
(3) Hit social media with your new URLs
All 3 are at least nudges to re-index. They aren't magic bullets, but you need to get Google's attention.
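To illustrate (1): a bare-bones XML sitemap in the sitemaps.org format is just a list of URLs, something like the sketch below (the URLs and date are placeholders, not the store's real pages). Once the file is live, re-submit it under Sitemaps in Google Webmaster Tools so the re-crawl request has something concrete to work from.
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/category/widgets</loc>
  </url>
</urlset>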
-
Remain calm. You should be just fine. It just takes time for Google to digest the new robots.txt. I would only be concerned if nothing had changed after 3-4 weeks. Adopt a rule not to freak out at Google until you've given a problem at least 14 days to resolve itself. Sometimes Google moves things around, and that's natural.
If you want Google to crawl your site faster, build some links and do some social media. That will encourage Google to speed it up.
-
If this is all that happened, the next crawl should fix it. Just sit tight and your rankings should bounce back up in a week or so.
-
That does not sound fun at all... So you just changed the server - a complete copy?
My first question would be: other than the server, did anything else change - the copy or the URLs?
My second question would be: is the other server still up and live on the internet?
-
Related Questions
-
Why am I ranking better in Canada?
Hi, some of my high-volume keywords are ranking on the first page in Canada, but in the States I am the first result on the third page. What factors are contributing to this disparity, and what can be done in this case? Is it because of my links and TLD distribution, or some server location thing? What should I do to rank better in the US? I have a shared hosting server in Singapore.
-
Only the homepage has disappeared from SERPs
I have a website whose home page has been ranking really well (2nd position) for our most important keywords. A few days ago I noticed the home page was nowhere to be found in the SERPs. Not really knowing what to do, I republished my Rapidweaver site and got a message that there were two index files, an index.html and an index.php, which could cause problems. The home page has always been an html page, so I have no idea how the php file was created. I deleted the php file from the directory and resubmitted the homepage in Webmaster Tools for indexing. Within about half an hour the page was appearing in the SERPs again in my original position for the important keywords. Two days later it's gone again. I've tried everything I can think of and resubmitted the page to Google for indexing. I can see it is being indexed, as it comes up when I type the URL into the search bar, but it will not come up in the search results either for my two keywords or for the actual name of my business. When I type in the business name it brings up lots of other pages from the site, but not the home page. I've spoken to 3 SEO companies and nobody knows what is causing it. Please help with any suggestions - this will definitely impact our business if we can't figure it out. Has this happened to anyone else? My website is NSFW but is www.aprilnites.com.au
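As a side note on the duplicate index files: if the site is on an Apache host (an assumption here; Rapidweaver sites can be hosted anywhere), a small mod_rewrite rule in .htaccess is a common way to make sure direct requests for /index.html or /index.php collapse onto the root URL, so only one version of the home page can be reached. A hypothetical sketch:
RewriteEngine On
# Redirect any direct request for an index file back to the site root
RewriteCond %{THE_REQUEST} \s/index\.(html|php) [NC]
RewriteRule ^index\.(html|php)$ / [R=301,L]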
-
Need to update Google Search Console profile for http to https change. Will a "change of address" option suffice or do we need to create a new GSC profile?
In the past I have seen most clients create a new Google Search Console profile when they update to an https URL. However, a colleague of mine asked if just updating the change of address option (https://support.google.com/webmasters/answer/83106) will suffice. Would it be best to just update the change of address for the Google Search Console profile to keep the data seamless? Thanks
-
Question about Syntax in Robots.txt
So if I want to block any URL that contains a particular parameter from being indexed, what is the best way to put that in the robots.txt file? Currently I have:
Disallow: /attachment_id
Where "attachment_id" is the parameter. The problem is I still see these URLs indexed, and this has been in the robots.txt now for over a month. I am wondering if I should just do Disallow: attachment_id or Disallow: attachment_id= but figured I would ask you guys first. Thanks!
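For what it's worth, Googlebot supports * wildcards in robots.txt, so the usual way to match a query parameter anywhere in a URL - rather than as a path that literally starts with /attachment_id - is a pattern like the hypothetical sketch below (adjust the parameter name to whatever actually appears in the URLs):
User-agent: *
Disallow: /*attachment_id=
Keep in mind that robots.txt only blocks crawling; URLs that are already indexed, or that are linked from elsewhere, can still show up in results, which may be why they are still visible after a month.
-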
Should comments and feeds be disallowed in robots.txt?
Hi, my robots file is currently set up as listed below. From an SEO point of view, is it good to disallow feeds, rss and comments? I feel allowing comments would be a good thing because it's new content that may rank in the search engines, as the comments left on my blog often refer to questions or companies folks are searching for more information on. And the comments are added regularly. What's your take? I'm also concerned about /page/ being blocked. Not sure how that benefits my blog from an SEO point of view either. Look forward to your feedback. Thanks. Eddy
User-agent: Googlebot
Crawl-delay: 10
Allow: /*

User-agent: *
Crawl-delay: 10
Disallow: /wp-
Disallow: /feed/
Disallow: /trackback/
Disallow: /rss/
Disallow: /comments/feed/
Disallow: /page/
Disallow: /date/
Disallow: /comments/
# Allow Everything
Allow: /*
-
Rankings going down down down
Hi guys, I know this is a little open-ended, but any advice/ideas would be greatly appreciated. I launched a new site about 3 months ago (www.transfersandshuttles.co.za). I have had unique, useful articles written for it, and the site does provide a useful service. I have done a little link building and continue to do so. The site was making decent progress moving up the rankings for a few weeks, but now it just seems to get worse and worse.
I'm not looking for an entire site audit or strategy here, but if anything jumps out at you that seems very poor for SEO, please let me know. Thanks so much,
Marc
-
Ranking a site in the USA
I'm UK-based and looking at setting up a site to rank in the USA. As I understand it, a .com TLD is best, but these are used worldwide, so do I simply need to set the geotargeting to USA in Webmaster Tools? Or is there a better domain to use? With the site hosted in the US and on-page content related to US cities (I plan to create a page for each US city I operate in, with the city name in the H1 tag), will that be enough for Google to understand that the page should rank in the US version of Google? Also, how can I view Google USA search results? When I go to google.com it automatically redirects to google.co.uk, and I can only change the location on the left-hand side to UK cities. Any help much appreciated!
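On that last point, a couple of commonly used tricks (assuming Google still honours them) are to visit https://www.google.com/ncr, which switches off the country redirect, or to set the country and language directly in the search URL, for example https://www.google.com/search?q=your+keyword&gl=us&hl=en - the gl parameter sets the results country and hl the interface language; the keyword itself is just a placeholder.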
-
Negative impact on crawling after uploading a robots.txt file on HTTPS pages
I experienced a negative impact on crawling after uploading a robots.txt file for the HTTPS pages. You can find both URLs as follows:
Robots.txt file for HTTP: http://www.vistastores.com/robots.txt
Robots.txt file for HTTPS: https://www.vistastores.com/robots.txt
I have disallowed all crawlers for the HTTPS pages with the following syntax:
User-agent: *
Disallow: /
Does that matter here? If I have done anything wrong, please give me more ideas on how to fix this issue.
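For context, serving a different robots.txt over HTTPS than over HTTP is usually done with a rewrite rule, since both protocols would otherwise share the same file. A hypothetical sketch for an Apache server (the robots_https.txt filename is made up for illustration):
RewriteEngine On
# When the request comes in over HTTPS, serve the HTTPS-specific robots file instead
RewriteCond %{HTTPS} on
RewriteRule ^robots\.txt$ /robots_https.txt [L]
The internal rewrite keeps a single site serving protocol-specific robots rules without maintaining two document roots.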