Is it safe to not have a sitemap if Google is already crawling my site every 5-10 min?
-
I work on a large news site that is constantly being crawled by Google; Googlebot hits the homepage every 5-10 minutes. We are in the process of moving to a new CMS, which has left our sitemap nonfunctional. Since we are crawled so often, I've met resistance from an overwhelmed development team that does not see creating sitemaps as a priority. My question is: are they right? What reasons can I give to support my claim that an XML sitemap will improve crawl efficiency and indexing when new stories already appear in Google SERPs within 10-15 minutes of publication? Is there a way to quantify what the difference would be if we added a sitemap?
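One rough way to quantify it is to measure the gap between each article's publish time and Googlebot's first request for it in the server access logs, before and after a sitemap goes live. A minimal sketch, assuming combined-format logs and a CMS export of publish times (the log path and data are placeholders):

```python
import re
from datetime import datetime

LOG_PATH = "/var/log/nginx/access.log"  # hypothetical log location
GOOGLEBOT = re.compile(r"Googlebot")
# Combined log format: IP - - [timestamp] "GET /path HTTP/1.1" status ...
REQUEST = re.compile(r'\[([^\]]+)\] "GET ([^ "]+) HTTP')

def first_googlebot_hits(log_path):
    """Map each URL path to the first time Googlebot requested it."""
    hits = {}
    with open(log_path) as log:
        for line in log:
            if not GOOGLEBOT.search(line):
                continue
            m = REQUEST.search(line)
            if not m:
                continue
            ts = datetime.strptime(m.group(1), "%d/%b/%Y:%H:%M:%S %z")
            path = m.group(2)
            if path not in hits or ts < hits[path]:
                hits[path] = ts
    return hits

# Publish times exported from the CMS (placeholder data):
published = {
    "/news/some-story": datetime.strptime(
        "01/May/2012:14:00:00 +0000", "%d/%b/%Y:%H:%M:%S %z"),
}
hits = first_googlebot_hits(LOG_PATH)
for path, pub_time in published.items():
    if path in hits:
        print(path, "first crawled", hits[path] - pub_time, "after publish")
```

Running the same comparison over a window with and without the sitemap would give the dev team a concrete number instead of a hunch.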
-
I agree with Robert on all points.
To keep it out of the dev team's overwhelmed hands, just use http://code.google.com/p/googlesitemapgenerator/ or one of the many free generators online to create your sitemaps intermittently.
Maybe three or six months down the road, when they're less crushed from the site move, the dev team can build something similar to the Google XML Sitemaps plugin for WordPress, which updates the sitemap every time you add new content. Until then, submitting the freely generated ones should give Google at least a little heads-up and let you feel like you're doing the right thing.
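For what it's worth, the plugin's behavior can be approximated with a short script called from the CMS's publish event. A sketch using only the Python standard library; the sitemap path and the hook wiring are assumptions about your stack:

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timezone

SITEMAP_PATH = "/var/www/html/sitemap.xml"  # hypothetical location
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def add_url_to_sitemap(url, path=SITEMAP_PATH):
    """Append a <url> entry with today's <lastmod> and rewrite the file."""
    ET.register_namespace("", NS)  # keep the default sitemap namespace
    tree = ET.parse(path)
    entry = ET.SubElement(tree.getroot(), f"{{{NS}}}url")
    ET.SubElement(entry, f"{{{NS}}}loc").text = url
    ET.SubElement(entry, f"{{{NS}}}lastmod").text = (
        datetime.now(timezone.utc).strftime("%Y-%m-%d"))
    tree.write(path, xml_declaration=True, encoding="UTF-8")

# Wired into a hypothetical post-publish hook in the CMS:
# add_url_to_sitemap("https://www.example.com/news/new-story")
```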
-
As to your 1, I would agree and suggest that it is important on a couple of SEO levels. If you have just updated a story, you have freshened the content by virtue of that, and I would want it indexed quickly to move it up if at all possible. However, if you can tell in GWMT that the site is being crawled a couple of times an hour, I am not sure it strengthens your argument.
As to your 2, I would say yes, but if you did a canonical or a 301 on the previous URL - as you should have - it is irrelevant.
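To make point 2 concrete, here is a minimal sketch of the 301 option, written as a WSGI wrapper with a hypothetical URL mapping; the canonical alternative is noted in the closing comment:

```python
# Hypothetical mapping of URLs that changed during the CMS move.
MOVED = {"/old-cms/story-123": "https://www.example.com/news/story-123"}

def app(environ, start_response):
    """Issue a 301 for moved URLs; a real wrapper would hand anything
    else off to the CMS instead of returning 404."""
    target = MOVED.get(environ.get("PATH_INFO", ""))
    if target:
        start_response("301 Moved Permanently", [("Location", target)])
        return [b""]
    start_response("404 Not Found", [("Content-Type", "text/plain")])
    return [b"Not found"]

# The canonical alternative is a tag in the <head> of the old page:
# <link rel="canonical" href="https://www.example.com/news/story-123" />
```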
Best,
-
Thanks Robert. As you surmised, our URLs are not changing (thankfully!). Fortunately, for now, our Google News sitemap still works. The only arguments I've come up with so far are:
- Having a sitemap will help SEs recrawl updated stories faster.
- Having a sitemap will help SEs find out when a URL has changed.
In my experience, Google does not index changes to existing pages as quickly as newly published articles. My thinking is that if we supply the changes via sitemap, reindexing speed will improve.
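A minimal sketch of how we'd supply those changes: once the regenerated sitemap carries a fresh <lastmod> for the edited story, ping Google so the file is refetched promptly. The ping endpoint below was Google's documented mechanism when this thread was written and has since been deprecated, so treat its availability as an assumption:

```python
import urllib.parse
import urllib.request

def ping_google(sitemap_url):
    """Notify Google that the sitemap at sitemap_url has changed."""
    ping = ("https://www.google.com/ping?sitemap="
            + urllib.parse.quote(sitemap_url, safe=""))
    with urllib.request.urlopen(ping) as resp:
        return resp.status  # 200 means the ping was received

# Example: ping_google("https://www.example.com/sitemap.xml")
```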
Thoughts?
-
Jon
You state you are a news site and you are moving to a new CMS. Assuming the domain and URLs are the same, I can understand the dev team's resistance. This is from the Webmaster Tools documentation on news sites (bold is mine):
A Google News Sitemap can help you control which content Google News crawls and can speed up the inclusion of your articles in Google News search results. You're welcome to submit your sitemap in your Webmaster Tools account prior to submitting your site for inclusion in Google News. However, only sitemaps associated with an approved site will be crawled without error by Google News.
So, assuming you are already a Google News-approved site, you can most likely move forward without immediately submitting a sitemap. Call me old-fashioned, but I still think a sitemap submission is important. But, again, I do get the dev team's resistance. Hope this at least assists your argument.
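For reference, the news-specific markup that documentation is talking about layers a news namespace over the standard sitemap protocol. A minimal entry, with the publication name and URL as placeholders:

```python
# A minimal Google News sitemap with one entry (placeholder values).
NEWS_SITEMAP = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://www.example.com/news/some-story</loc>
    <news:news>
      <news:publication>
        <news:name>Example News</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2012-05-01T14:00:00+00:00</news:publication_date>
      <news:title>Some Story Headline</news:title>
    </news:news>
  </url>
</urlset>"""
```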
One added bit of info: you could use a sitemap generator to take the load off of them. Here is a list of many sitemap generators. Since I am not in the dev shop, I cannot recommend any, but I do use the Screaming Frog Spider (I've never used their sitemap generator). This way the dev team would have a bit less work.
Hope it helps you out a bit,
Related Questions
-
I need help on how best to do a complicated site migration. Replacing certain pages with all new content and tools, and keeping the same URLs. The rest just need to disappear safely. Somehow.
I'm completely rebranding a website but keeping the same domain. All content will be replaced, and the site will use a different theme and mostly new plugins. I've been building the new site as a separate site in Dev mode on WPEngine, which means it currently has a made-up domain that needs to replace the current site. I know I need to somehow redirect the content from the old version of the site, but I'm never going to use that content again. (I could transfer it to be a Dev site for the current domain and automatically replace it with the click of a button - just as another option.) What's the best way to replace blahblah.com with a completely new blahblah.com if I'm not using any of the old content? There are only about 4 URLs, such as blahblah.com/contact, that will remain the same - with all content replaced. There are about 100 URLs that will no longer be in use or have any part of them ever used again. Can this be done safely?
Intermediate & Advanced SEO | brickbatmove1
-
Is there any benefit in optimising Google's crawl?
Hello, I have an ecommerce site with all pages crawled and indexed by Google. But some pages have multiple URLs, like www.sitename.com/product-name.html and www.sitename.com/category/product-name.html. There is a canonical on all of these pages pointing to the simplest URL, so Google indexes only one page. The duplicate pages are therefore not indexed, but Google still crawls them. My question is: is there any benefit in stopping Google from crawling these pages? Google crawls around 1,500 pages a day on my site, but there are only 800 real pages, and they are all indexed. There is no particular issue, so is it worth changing? Thanks
Intermediate & Advanced SEO | onibi290
-
Google Search Console Site Property Questions
I have a few questions regarding Google Search Console. Google Search Console tells you to add all versions of your website: https, http, www, and non-www. 1.) Do I then add ALL the information for ALL versions? Sitemaps, preferred site, etc.? 2.) If yes, when I add sitemaps to each version, do I add the sitemap URL of the site version I'm on, or my preferred version? For instance, when adding a sitemap to a non-www version of the site, do I use the non-www version of the sitemap? Or, since I prefer https://www.domain.com/sitemap.xml, do I use it there? 3.) When adding my preferred site (www or non-www), do I use my preferred site on all site versions (https, http, www, and non-www)? Thanks in advance. Answers vary throughout Google!
Intermediate & Advanced SEO | Mike.Bean0
-
XML sitemap issue: sitemap generator including only a few pages for indexing
Please help: earlier, my website's XML sitemap included 10,000 pages for indexing, but for the last few days the XML sitemap generator has been including only 3,300 pages. Google Webmaster Tools shows 8,141 pages indexed. I have tried 2-3 paid tools, but all of them include only 3,300 pages. I can't tell what the exact problem is - whether the server is not allowing the crawl, or the XML sitemap generator is at fault. Please help.
Intermediate & Advanced SEO | udistm0
-
Best way to keep Google and Bing from crawling my /en default English pages
Hi guys, I just transferred my old site to a new one and now have language subfolders in my URLs. My default pages, on the front end and in the sitemap, don't show /en after www.mysite.com. The only translation I have is Spanish, which Google crawls at www.mysite.com/es. 1. In the SERPs of Google and Bing, every URL that is crawled shows an extra "/en" in the path. I find that very weird, considering there is no physical /en in my URLs. When I select the link, it automatically redirects to its default, natural page (no /en). The canonical tags do not show /en either; ONLY the SERPs do. Should robots.txt be updated with "Disallow: /en"? 2. During the site transfer, we altered some of the category URLs in our domain, so we've had a lot of 301 redirects. But when searching specific keywords in the SERPs, the #1 ranked URL shows up as our old URL, which redirects to a 404 page, and our newly created URL shows up as #2 and goes to the correct page. Is there any way to tell Google to stop showing our old URLs in the SERPs? And would the "Fetch as Google" option in GWT be a good way to submit all of my URLs so Google's bots crawl only the right pages? Direct message me if you want real examples. Thank you so much!
Intermediate & Advanced SEO | Shawn1240
-
Google local pointing to Google+ page, not homepage
Today my client's homepage dropped off the search results page (it was #1 for months, and in the top results for years). I noticed that in the Places account, everything is suddenly pointing at the Google+ page. The interior pages are still ranking. Any insight would be very helpful! Thanks.
Intermediate & Advanced SEO | stevenob0
-
Development site crawled
We just found out our password-protected development site has been crawled. We are worried about duplicate content - what are the best steps to take to correct this, beyond adding to robots.txt?
Intermediate & Advanced SEO | EileenCleary0
-
What is next from Google Panda and Google Penguin?
Does anyone know what we can expect next from Google Panda/Penguin? We did prepare for this latest update and so far so good.
Intermediate & Advanced SEO | jjgonza0