After Server Migration - Crawling Slows Down and Changes to Dynamic Pages Are Not Getting Updated
-
Hello,
I performed a server migration two days ago.
All went well, and traffic has moved to the new servers.
However, with the previous host, a newly submitted article would get indexed within minutes. Now, even after submitting a page for indexing, it takes a while to appear in the search engines, and on some pages where content is updated daily, the changes are not getting reflected despite being submitted for indexing.
The site is http://www.mycarhelpline.com
I have checked robots.txt, the meta tags, and the URL structure; everything remains intact, and no unknown errors are reported through Google Webmaster Tools.
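For reference, this is roughly how I double-checked the robots and page-level signals (a minimal sketch in Python; only the domain is real, the page path and everything else are illustrative):

```python
import urllib.request
from html.parser import HTMLParser

SITE = "http://www.mycarhelpline.com"
PAGE = SITE + "/"  # illustrative; in practice, check a recently updated article

# 1. Confirm robots.txt is reachable on the new server and not blocking crawlers
with urllib.request.urlopen(SITE + "/robots.txt") as resp:
    print(resp.read().decode("utf-8", errors="replace"))

# 2. Check page-level signals: the X-Robots-Tag header and the meta robots tag
class MetaRobotsParser(HTMLParser):
    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "meta" and (attr_map.get("name") or "").lower() == "robots":
            print("meta robots:", attr_map.get("content"))

with urllib.request.urlopen(PAGE) as resp:
    print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag"))
    MetaRobotsParser().feed(resp.read().decode("utf-8", errors="replace"))
```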
Could someone advise: is this normal due to the name server and IP address change, and can I expect it to correct itself automatically, or am I missing something?
Kindly advise. Thanks.
-
Thanks, everyone, for the input.
I searched Google and found this note from Google describing what may happen after a server migration:
https://support.google.com/webmasters/answer/6033412?hl=en&ref_topic=6033383
A note about Googlebot’s crawl rate
It’s normal to see a temporary drop in Googlebot’s crawl rate immediately after the launch, followed by a steady increase over the next few weeks, potentially to rates that may be higher than from before the move.
This fluctuation occurs because we determine crawl rate for a site based on many signals, and these signals change when your hosting changes. As long as Googlebot does not encounter any serious problems or slowdowns when accessing your new serving infrastructure, it will try to crawl your site as fast as necessary and possible.
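Based on that, I plan to simply watch the crawl rate recover by counting Googlebot hits per day in the server access logs. A rough sketch of that idea (the log path is illustrative, and for a strict check you would also verify the client IPs via reverse DNS, since the user agent string can be faked):

```python
import re
from collections import Counter

googlebot = re.compile(r"Googlebot")
log_date = re.compile(r"\[(\d{2}/\w{3}/\d{4})")  # e.g. [18/Jun/2015 in combined log format

hits = Counter()
# Path is illustrative; point it at your real access log
with open("/var/log/apache2/access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if googlebot.search(line):
            match = log_date.search(line)
            if match:
                hits[match.group(1)] += 1

# Daily Googlebot hit counts; after a migration you'd expect a dip, then recovery
for day, count in sorted(hits.items()):
    print(day, count)
```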
An add-on for Thompson Paul - appreciated. Yes, it's a good suggestion; I will look into including a sitemap.
-
The one thing you haven't mentioned, which is likely to be most critical for this issue, is your XML sitemap. I couldn't find it at any of the standard URLs (/sitemap.xml and /sitemap_index.xml both lead to generic 404 pages). Also, there's no Sitemap directive pointing to it in your robots.txt.
Given that the sitemap.xml is the clearest and fastest way for you to help Google to discover new content, I'd strongly recommend you get a clean, dynamically updated sitemap.xml implemented for the site, submit it through both Google and Bing webmaster tools, and place the proper pointer to it in your robots.txt file.
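To give a sense of how simple this can be, here's a rough sketch of generating one (illustrative only; in practice your CMS or database would supply the real URL list and last-modified dates):

```python
from datetime import date
from xml.sax.saxutils import escape

# Entries are made up; pull the real ones from your CMS/database
pages = [
    ("http://www.mycarhelpline.com/", date.today()),
    ("http://www.mycarhelpline.com/some-new-article", date.today()),
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for url, lastmod in pages:
    lines.append("  <url><loc>%s</loc><lastmod>%s</lastmod></url>"
                 % (escape(url), lastmod.isoformat()))
lines.append("</urlset>")

with open("sitemap.xml", "w", encoding="utf-8") as out:
    out.write("\n".join(lines))

# Then add the pointer in robots.txt, e.g.:
#   Sitemap: http://www.mycarhelpline.com/sitemap.xml
```

Regenerate it whenever content is published so the lastmod dates stay honest.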
Once it's been submitted to the webmaster tools, you'll be able to see exactly how frequently it's being discovered/crawled.
Hope that helps?
Paul
-
The good news is, this actually sounds pretty normal. 24 hours to reflect changes in content is better than many sites manage. I can't account for why it dropped from 4 hours to 24, but I'd say this is still in the range of "good".
-
@Cyrus
Certain pages.
Earlier it took less than 4 hours, but now it's taking around 24 hours, based on when the data was updated in the search engine results, which I only noticed today; until then I thought it was not getting updated at all.
Fetch & Render shows no issues, and submission works. No errors in GWT.
I ran a speed test: there was no noticeable improvement in loading time, but no unnecessary increase in page size or load time either.
I was wondering: could this be a temporary phenomenon where the crawl speed slows and then returns to normal later? It has been less than 72 hours since the server was migrated.
Google Search Console's Crawl Stats were last updated on 16th June, so I am unable to figure it out from there. No errors in Webmaster Tools.
-
Howdy,
A couple of questions:
1. Are there certain pages that aren't getting updated, or is it your entire site?
2. How often are changes in the pages reflected in Google's cache? Is it a case where Google simply displays old/outdated information all the time?
3. Finally, have you done a "Fetch and Render" check in Google Webmaster Tools?
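If you want a quick sanity check alongside Fetch and Render, you can also compare what your new server returns to a Googlebot user agent versus a normal browser one. A minimal sketch (the URL is your homepage; note that only reverse DNS can verify real Googlebot traffic, so this just spot-checks your server's behavior):

```python
import urllib.request

URL = "http://www.mycarhelpline.com/"  # any recently updated page works
AGENTS = {
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "browser": "Mozilla/5.0",
}

bodies = {}
for name, user_agent in AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        bodies[name] = resp.read()
        print(name, resp.status, len(bodies[name]), "bytes")

# Dynamic pages may differ slightly between any two requests, but a large
# difference here could mean the new server treats bots differently
print("identical responses:", bodies["googlebot"] == bodies["browser"])
```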
-
@Anirban
Thanks. No errors in the GWT tools. I could not observe a change in loading time; tested with GTmetrix, it is well within limits. Pages with dynamic content are not getting updated in the search engine, which earlier was happening almost immediately.
-
It should not be slow. Check your page load time: if pages take longer to load, Googlebot may bounce off. Also check your Webmaster Tools to see if there are any server errors showing.
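A quick way to spot-check load time (a minimal sketch; what counts as "too slow" is a judgment call, but a time to first byte of well under a second is a reasonable target):

```python
import time
import urllib.request

URL = "http://www.mycarhelpline.com/"

start = time.time()
with urllib.request.urlopen(URL) as resp:
    ttfb = time.time() - start   # time until response headers arrived
    body = resp.read()
    total = time.time() - start  # time until the full body was downloaded
    status = resp.status

print("status %d, TTFB %.2fs, total %.2fs, %d bytes" % (status, ttfb, total, len(body)))
```

Run it a few times from a couple of locations; if the numbers look very different from the old host's, that is worth investigating.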