Proper sitemap update frequency
-
I have 12 sitemaps submitted to Google. After about a week, Google is about 50% of the way through crawling each one.
In the past week I've created many more pages. Should I wait until Google is 100% complete with my original sitemaps, or can I just go ahead and refresh them? When I refresh, the original files will have different URLs.
-
You want your sitemap to include all your important URLs. Don't remove them from the sitemap just because you have been crawled.
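To illustrate the point: a sitemap is just the full list of URLs you want indexed, crawled or not, so already-crawled pages stay in it. A minimal file looks like the sketch below (the URLs and dates are placeholders, not from this thread):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/page-1</loc>
    <lastmod>2014-02-19</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/page-2</loc>
  </url>
</urlset>
```

When you add pages, you simply append more `<url>` entries (or add another sitemap file to your sitemap index) and resubmit.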
-
Agreed, I don't see any issue with it. If you have more URLs, submit them.
-
Nah, I don't think so. If they haven't gotten to them yet, it shouldn't affect it. You could probably change the URLs, change the name of the sitemap, etc., and have it not do anything.
If anything, you would want them to find the new URLs before it's done with the first crawl, rather than index something that is no longer correct.
-
Thanks David. To clarify, the URLs haven't changed; I've just added more of them.
I'm wondering whether it will "throw Google off" if I upload all-new sitemaps with different URLs in them before it's done with the first crawl. I'm getting good crawl frequency now and don't want to disrupt it.
Does that make sense or change your answer at all?
Thanks again.
-
If you have URLs that changed, I would resubmit. If Google hasn't found them yet, what difference would it make to submit more that haven't been found yet? When they do crawl them, they will be crawling the right, updated URL locations.
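If you're regenerating the files by hand each time, it can help to script it. Here's a minimal sketch that rebuilds a sitemap from your current URL list using Python's standard library, so a refresh is just rerunning the script and resubmitting in Webmaster Tools (the URLs and file name are placeholder assumptions, not anyone's real site):

```python
# Sketch: rebuild sitemap.xml from the current list of URLs before resubmitting.
# All URLs below are placeholders for illustration only.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls, path):
    """Write a sitemap file containing one <url><loc> entry per URL."""
    ET.register_namespace("", SITEMAP_NS)  # emit xmlns="..." as the default namespace
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for u in urls:
        url_el = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(url_el, "{%s}loc" % SITEMAP_NS).text = u
    ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

build_sitemap(
    ["https://www.example.com/", "https://www.example.com/new-page"],
    "sitemap.xml",
)
```

Each refresh overwrites the file with the complete, current URL list, which matches the advice above: keep every important URL in the sitemap, whether or not it has been crawled yet.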