After Server Migration - Crawling Gets Slow and Dynamic Pages with Content Changes Are Not Getting Updated
-
Hello,
I performed a server migration two days ago.
All is well, and traffic has moved to the new servers.
However, with the previous host, a newly submitted article would get indexed within minutes. Now, even after submitting a page for indexing, it takes a while to show up in search engines, and on some pages where content is updated daily, the changes are not getting reflected despite being submitted for indexing.
Site name is - http://www.mycarhelpline.com
I have checked robots.txt, meta tags, and URL structure; all remain intact. No unknown errors are reported in Google Webmaster Tools.
Could someone advise: is this normal due to the name server and IP address change, and should I expect it to correct itself automatically, or am I missing something?
Kindly advise. Thanks.
-
Thanks all for the inputs.
I searched Google and found this note from Google about what may happen after a server migration:
https://support.google.com/webmasters/answer/6033412?hl=en&ref_topic=6033383
A note about Googlebot’s crawl rate
It’s normal to see a temporary drop in Googlebot’s crawl rate immediately after the launch, followed by a steady increase over the next few weeks, potentially to rates that may be higher than from before the move.
This fluctuation occurs because we determine crawl rate for a site based on many signals, and these signals change when your hosting changes. As long as Googlebot does not encounter any serious problems or slowdowns when accessing your new serving infrastructure, it will try to crawl your site as fast as necessary and possible.
@Thompson Paul - Appreciated. Yes, it's a good suggestion; I will look into adding a sitemap.
-
The one thing you haven't mentioned, which is likely to be most critical for this issue, is your XML sitemap. I couldn't find it at any of the standard URLs (/sitemap.xml and /sitemap_index.xml both lead to generic 404 pages). Also, there's no directive to the sitemap in your robots.txt.
Given that the sitemap.xml is the clearest and fastest way for you to help Google to discover new content, I'd strongly recommend you get a clean, dynamically updated sitemap.xml implemented for the site, submit it through both Google and Bing webmaster tools, and place the proper pointer to it in your robots.txt file.
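To make that concrete, here's a rough sketch in Python of what a dynamically generated sitemap could look like. This is just an illustration: the article URL, date, and function name are placeholders, not your site's actual setup.

```python
# Illustrative sketch only: build a minimal sitemap.xml from a list of
# (URL, last-modified date) pairs. The page URL below is a placeholder.
from datetime import date
from xml.etree import ElementTree as ET

def build_sitemap(pages):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

pages = [("http://www.mycarhelpline.com/some-new-article", str(date.today()))]
xml = build_sitemap(pages)
```

The robots.txt pointer is then a single line, e.g. `Sitemap: http://www.mycarhelpline.com/sitemap.xml`.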
Once it's been submitted to the webmaster tools, you'll be able to see exactly how frequently it's being discovered and crawled.
Hope that helps?
Paul
-
The good news is, this actually sounds pretty normal. Taking 24 hours to reflect content changes is better than many sites manage. I can't account for why it went from 4 hours to 24, but I'd say this is still in the range of "good".
-
@Cyrus
Certain pages.
Earlier it was less than 4 hours, but now it's taking around 24 hours, based on data updated in the search engine results that I only found today; I thought it was not getting updated at all.
Fetch & Render: no issues, and submission works. No errors in GWT.
Ran a speed test: no noticeable improvement in loading time, but no unnecessary increase in page size or load time either.
I was wondering whether this could be a temporary phenomenon, where crawl speed is slow now and will return to normal later. It's been less than 72 hours since the server was migrated.
Google Search Console Crawl Stats was last updated on 16th June, so I can't figure it out from there. No errors in Webmaster Tools.
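Since the GSC Crawl Stats report is lagging, one rough way to check the crawl rate yourself is to count Googlebot hits per day in the server access log. A sketch of the idea (the log lines below are made up for illustration, not from the actual server):

```python
# Hypothetical sketch: count Googlebot requests per day from an access
# log in common/combined log format, to gauge crawl rate while the
# GSC Crawl Stats report is out of date. Sample lines are invented.
import re
from collections import Counter

DATE = re.compile(r'\[(\d{2}/\w{3}/\d{4})')

def googlebot_hits_per_day(lines):
    hits = Counter()
    for line in lines:
        if "Googlebot" in line:  # crude user-agent match for a sketch
            m = DATE.search(line)
            if m:
                hits[m.group(1)] += 1
    return hits

sample = [
    '66.249.66.1 - - [18/Jun/2015:10:12:01 +0000] "GET /a HTTP/1.1" 200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [18/Jun/2015:11:40:22 +0000] "GET /b HTTP/1.1" 200 987 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '10.0.0.5 - - [18/Jun/2015:11:41:00 +0000] "GET /a HTTP/1.1" 200 1234 "-" "Mozilla/5.0"',
]
counts = googlebot_hits_per_day(sample)  # Counter({'18/Jun/2015': 2})
```

Comparing a day of log counts before and after the migration would show whether Googlebot really slowed down.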
-
Howdy,
A couple of questions:
1. Are there certain pages that aren't getting updated, or is it your entire site?
2. How often are changes in the pages reflected in Google's cache? Is it a case where Google simply displays old/outdated information all the time? Finally, have you done a "Fetch and Render" check in Google Webmaster Tools?
-
@Anirban
Thanks, no errors in GWT. Loading time: I could not observe a change, and per GTmetrix it is well within limits. Pages with dynamic content are not getting updated in the search engine, which previously happened almost immediately.
-
It should not be an issue, but check your page load time. If pages take longer to load, Googlebot may bounce off. Also check your Webmaster Tools to see if any server errors are showing.
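For example, a quick spot-check of time-to-first-byte could look like the sketch below (hypothetical; swap in your own URL):

```python
# Hypothetical sketch: measure time-to-first-byte (TTFB) for a page. A
# slow server response after a migration can make Googlebot crawl less.
import time
import urllib.request

def time_to_first_byte(url, timeout=10):
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read(1)  # block until the first byte arrives
    return time.monotonic() - start

# Example (placeholder URL):
# ttfb = time_to_first_byte("http://www.mycarhelpline.com/")
```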