Google sitemap for just part of a site?
-
Hi,
I am about to reorganize (content- and SEO-wise) part of a larger site, and I wondered whether it is possible to use a Google sitemap for only some, but not all, pages of a site?
Does anyone know if this has any impact on pages that are not included in the sitemap?
Thanks
-
You can add multiple sitemaps in Google Webmaster Tools; that's not a problem. So you could, I suppose, submit a sitemap of just your new pages.
My opinion, though, is that you should simply generate a fresh sitemap for the whole site at http://www.xml-sitemaps.com, upload it, and resubmit it. That would work just as well as a partial sitemap.
If your site is really large, though, a partial sitemap may be the right answer; just remember, going forward, which pages appear in which sitemap so you don't end up with overlaps or accidental omissions.
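If you do go the multiple-sitemap route, one way to keep that manageable is a sitemap index file that points to one child sitemap per section. As a rough sketch (the domain and file names below are made up, so swap in your own):

<?xml version="1.0" encoding="UTF-8"?>
<!-- Made-up example: a sitemap index referencing one sitemap per site section -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-main.xml</loc>
  </sitemap>
  <sitemap>
    <!-- The reorganized section gets its own file, so it can be updated and resubmitted on its own -->
    <loc>http://www.example.com/sitemap-new-section.xml</loc>
  </sitemap>
</sitemapindex>

You submit the index once and Google picks up the child sitemaps from it, which should save you from having to remember which part lives where.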
Hope this helps.
-
Hi Martin,
No, it's not about excluding pages.
I am working on part of a larger site, and I'd just like my changes to be crawled as quickly as possible; that's all. For that reason I thought about submitting a sitemap just for that part (doing it for the whole site is too time-consuming at the moment).
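Something like this is what I had in mind - just a normal urlset limited to the pages in that section (the URLs and values below are made up):

<?xml version="1.0" encoding="UTF-8"?>
<!-- Made-up example: a partial sitemap covering only the reorganized section -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/new-section/</loc>
    <lastmod>2011-05-20</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
  <url>
    <loc>http://www.example.com/new-section/page-1.html</loc>
    <lastmod>2011-05-20</lastmod>
  </url>
</urlset>

Only <loc> is required; lastmod, changefreq and priority are optional hints.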
Either way, I just wondered whether it's possible. If not, I'll simply wait until Googlebot has finished crawling the new pages.
-
To be honest, a sitemap.xml is predominantly there to inform search engines of all your site's pages and their relative importance within your structure.
It's helpful to keep this up to date so that the right pages appear quickly in the SERPs.
However, a site with the right content in robots.txt and/or robots meta tags will get crawled, and if the site is well structured and the internal links are all in place, all of its pages will end up in the SERPs anyway.
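As a rough sketch (domain and path are examples only), "the right content in robots.txt" usually just means not blocking the pages you care about, and you can point crawlers at your sitemap from the same file:

User-agent: *
Disallow:

# Example only - reference your real sitemap or sitemap index here
Sitemap: http://www.example.com/sitemap.xml

An empty Disallow line allows everything to be crawled; the Sitemap line is optional but makes the file easy for crawlers to find.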
My question back to you would be: are there pages that you don't want to appear in Google's results, and if so, why? I ask because usually the concern is excluding pages from search engines rather than making sure pages are included (assuming site structure and internal links are good).
Related Questions
-
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi, I've just done my usual Monday morning review of clients' Google Search Console (previously Webmaster Tools) dashboards and was disturbed to see that for one client the Sitemaps section is reporting 95 pages submitted yet only 2 indexed (last time I looked, last week, it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th March and processed yesterday. However, the 'Index Status' section is showing a graph of growing indexed pages up to and including yesterday, when they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section is showing 186 pages crawled on the 26th.
It then lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site is HTTPS and has been for a few months now, and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml
The sub-sitemaps are:
http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml
There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below:
"When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
Also, for the sitemap URLs below: "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page", for:
http://domain.com/en/post-sitemap.xml
https://www.domain.com/page-sitemap.xml
https://www.domain.com/post-sitemap.xml
I take it from all the above that the HTTPS sitemap is mainly fine, that despite the reported 0 pages indexed in the GSC sitemap section the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems.
What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master URL indexed is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted/removed - but how do you do that, or even check that that's what's needed? Or should Google just sort this out eventually? I see the graph in 'Crawl > Sitemaps > Web Pages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so - fully indexed pages are reported for 5-day stretches, then zero for a few days, then indexed for another 5 days, and so on!?
Many thanks, Dan
Technical SEO | Dan-Lawrence
-
Why is Google Webmaster Tools showing 404 Page Not Found Errors for web pages that don't have anything to do with my site?
I am currently working on a small site with approx 50 web pages. In the crawl error section in WMT Google has highlighted over 10,000 page not found errors for pages that have nothing to do with my site. Anyone come across this before?
Technical SEO | Pete4
-
Google Dancing?
Hello, I was wondering why my website, for some keywords, sometimes goes from the 2nd or 3rd page in Google to the 7th or even further back? This has been happening for a while. Any suggestions? Thanks. Eugenio
Technical SEO | socialengaged
-
How to optimize for different Google search centers (google.de, google.ch)?
We use the German language and .com domains for the sites. I rank well in google.com, but not so well in google.de and google.ch, while my competitors rank much better in google.de and google.ch. I checked most of their outbound links but got little information. Do links from .de domains, or links from sites located in Germany, help rankings in a particular Google search center (google.de, google.ch)? Or are there other factors I missed? Please help.
Technical SEO | sunvary
-
After our site update, we're no longer turning up in the Google SERP for our brand search, where we used to be #1. Any ideas why?
We recently updated our website, and during the push someone mistakenly 301 redirected "www.brandx.com" to "brandx.com" instead of the other way around. Since then, our website no longer turns up for the search "brandx" on Google. We reversed the mistake a few days ago, but we're still not turning up, and we used to rank #1 in the Google SERP. Could it just be due to timing between crawls, and our www site not making it into Google's index because of this mistake? We submitted our new sitemap to Google a couple of days ago as well. As an aside, we're still showing up #1 in Bing's results, and we should still show up based on SEOmoz's SERP report. Any help would be appreciated, as I'm growing increasingly concerned.
Technical SEO | JoeLin
-
A site is not being indexed by Google, Yahoo, or Bing
This site - http://adoptionconnection.org/ - is not being indexed by any of the search engines. I checked the easy stuff; the robots meta tags are:
<meta name="robots" content="all, index, follow" />
<meta name="robots" content="noodp" />
<meta name="robots" content="noydir" />
I have checked everything I can think of that would cause the issue but have found nothing to prevent it from being indexed. I'm thinking it may be redirects etc. Any answer would be great. Thanks in advance,
Technical SEO | Intergen
-
How to block/notify Google that your domain has been added to sites with very low trustworthiness?
Hey guys, I am writing to the SEOmoz community because a problem occurred that I do not know how to solve: my domain (xyz.com) appeared on very strange sites with very low trustworthiness (even blocked by Google). Checking the site, I found that all of the pictures had ALT=xyz.com. Could this hurt my site's position in Google's rankings? How can I prevent such actions, and what should I do? Thanks for your help in advance!
Technical SEO | Kajmany
-
Index forum sites
Hi Moz Team, somehow the last question I raised a few days ago not only wasn't answered up until now, it was also completely deleted and the credit was not "refunded" - obviously there was some data loss involved with your restructuring. Can you check whether you can still find the last question, and answer it quickly? I need the answer 🙂
Here is one more question: I bought a website that has a huge forum with loads of pages of user-generated content - overall around 500,000 threads with 9 million comments. The complete forum was noindex/nofollow when I bought the site, and now I am thinking about the best way to unleash the potential. The current system is vBulletin 3.6.10.
a) Shall I first update vBulletin to version 4 and use the vSEO tool to make the URLs clean and more user- and search-engine-friendly before I switch to index/follow?
b) Would you recommend having the forum in the folder structure or on a subdomain? As far as I know, a subdomain takes less strength from the TLD; however, it is safer because the subdomain is seen as a separate entity from the regular TLD. Having it in the folder makes it easier to pass strength from the TLD to the forum; however, it puts my TLD at risk.
c) Would you release all forum pages at once or section by section? I think section by section looks rather unnatural, not only to search engines but also to users; however, I am afraid of blasting more than a million pages into the index at once.
d) Would you index only the first page of a thread or all pages of a thread? I fear duplicate content, as the different pages of a thread contain different body content but the same title and possibly the same h1.
Looking forward to hearing from you soon! Best, Fabian
Technical SEO | fabiank