Is it safe to not have a sitemap if Google is already crawling my site every 5-10 min?
-
I work on a large news site that is constantly being crawled by Google; Googlebot hits the homepage every 5-10 minutes. We are in the process of moving to a new CMS, which has left our sitemap nonfunctional. Since we are crawled so often, I've met resistance from an overwhelmed development team that does not see creating sitemaps as a priority. My question is: are they right? What reasons can I give to support my claim that creating an XML sitemap will improve crawl efficiency and indexing, when new stories already appear in Google SERPs within 10-15 minutes of publication? Is there a way to quantify the difference a sitemap would make?
-
I agree with Robert on all points.
To keep it out of the dev team's overwhelmed hands, just use http://code.google.com/p/googlesitemapgenerator/ or one of the many free generators online to create your sitemaps in the interim.
Maybe 3 or 6 months down the road, when the dev team is less crushed from the site move, they can build something similar to the Google XML Sitemaps plugin for WordPress, which updates the sitemap every time you add new content. Until then, submitting the freely generated ones should give Google at least a little heads-up, and you'll know you're doing the right thing.
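To show how little work a stopgap really is, here is a rough sketch of what any of those generators does under the hood: build an XML `urlset` from a list of article URLs and last-modified dates. This is an illustration, not the Google tool itself, and the URLs and dates are hypothetical placeholders.

```python
# Minimal sitemap-generator sketch (illustrative only; example.com
# URLs and dates below are hypothetical placeholders).
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """entries: list of (loc, lastmod) tuples -> sitemap XML string."""
    # Register the sitemap namespace as the default so tags serialize
    # as <url>, <loc>, <lastmod> rather than with a prefix.
    ET.register_namespace("", SITEMAP_NS)
    ns = "{%s}" % SITEMAP_NS
    urlset = ET.Element(ns + "urlset")
    for loc, lastmod in entries:
        url = ET.SubElement(urlset, ns + "url")
        ET.SubElement(url, ns + "loc").text = loc
        ET.SubElement(url, ns + "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    ("http://www.example.com/news/story-1.html", "2013-05-01"),
    ("http://www.example.com/news/story-2.html", "2013-05-02"),
])
print(sitemap)
```

A cron job that runs something like this against the CMS's article table and drops the output at `/sitemap.xml` is a one-afternoon task, which may help when you pitch it to the dev team.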
-
As to your 1, I would agree, and suggest that it is important on a couple of SEO levels. If you have just updated a story, you have freshened the content by virtue of that, and I would want it indexed quickly to move it up if at all possible. However, if you can tell in GWMT that the site is being crawled a couple of times an hour, I am not sure that strengthens your argument.
As to your 2, I would say yes, but if you did a canonical or a 301 on the previous URL - as you should have - it is irrelevant.
Best,
-
Thanks Robert. As you surmised, our URLs are not changing (thankfully!). Fortunately, for now, our Google News sitemap still works. The only arguments I've come up with so far are:
- Having a sitemap will help SEs recrawl updated stories faster.
- Having a sitemap will help SEs find out when a URL has changed.
In my experience, Google does not index changes to existing pages as quickly as newly published articles. My thinking is that if we supply the changes via sitemap, reindexing speed will improve.
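To make that concrete: the sitemap protocol lets each entry carry a `<lastmod>` date, so a re-edited story is flagged to crawlers even though its URL has not changed and it no longer looks "new" from the homepage. A minimal illustrative entry (the URL and date are placeholders):

```xml
<url>
  <loc>http://www.example.com/news/updated-story.html</loc>
  <lastmod>2013-05-02</lastmod>
  <changefreq>hourly</changefreq>
</url>
```

Bumping `<lastmod>` on edit is the mechanism by which a sitemap can signal "recrawl this" without waiting for the bot to rediscover the page on its own.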
Thoughts?
-
Jon
You state you are a news site and you are moving to a new CMS. Assuming the domain and URLs stay the same, I can understand the dev team's resistance. This is from Webmaster Tools' guidance for news sites (bold is mine):
A Google News Sitemap can help you control which content Google News crawls and can speed up the inclusion of your articles in Google News search results. You're welcome to submit your sitemap in your Webmaster Tools account prior to submitting your site for inclusion in Google News. However, only sitemaps associated with an approved site will be crawled without error by Google News.
So, assuming you are already a Google News-approved site, you can most likely move forward without immediately submitting a sitemap. Call me old-fashioned, but I still think a sitemap submission is important. But, again, I do get the dev team's resistance. Hope this at least assists your argument.
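For reference, a Google News sitemap differs from a regular one in that each entry carries news-specific tags under their own namespace, which is what lets Google News pick up publication date and headline directly. A minimal illustrative entry (publication name, URL, date, and title are placeholders):

```xml
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>http://www.example.com/news/story.html</loc>
    <news:news>
      <news:publication>
        <news:name>Example Daily</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2013-05-01T12:00:00Z</news:publication_date>
      <news:title>Story headline</news:title>
    </news:news>
  </url>
</urlset>
```

Since your Google News sitemap still works, this part is already covered; the open question is only the regular XML sitemap for the rest of the site.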
One added bit of info: you could use a sitemap generator to take a load off of them. Here is a list of many sitemap generators. Since I am not in the dev shop, I cannot recommend any, but I do use the Screaming Frog SEO Spider (never used their sitemap generator). This way the dev team would have a bit less work.
Hope it helps you out a bit,