Should XML sitemaps include *all* pages or just the deeper ones?
-
Hi guys,
OK, this is a bit of a sitemap 101 question, but I can't find a definitive answer:
When we're generating XML sitemaps for Google to chew on (we're talking ecommerce and directory sites with many pages inside sub-categories here), is there any point in including the homepage or even the second-level pages? We know Google is crawling and indexing those, and we're thinking we should trim the fat and just submit a map of the bottom-level pages.
What do you think?
-
It is correct that DA, PA, page depth, etc. are all factors in determining which pages get indexed. If your site offers good navigation, reasonable backlinks, sensible anchor text, and so on, you can get close to all pages indexed even on a very large site.
Your sitemap should naturally include a last-modified date on every URL, indicating when the content was added or changed. Even if you submit a list of 10k URLs, Google can evaluate the date on each one and determine which content has been added or modified since your site was last crawled.
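As a rough illustration of that last-modified idea, here is a minimal Python sketch using only the standard library. The URLs and dates are made up for illustration; in practice they would come from your CMS or database.

```python
import xml.etree.ElementTree as ET

# Illustrative data only: in a real build these values come from your CMS or database.
pages = [
    ("https://www.example.com/category/widgets/blue-widget", "2011-05-14"),
    ("https://www.example.com/category/widgets/red-widget", "2011-05-20"),
]

# Build a <urlset> with a <loc> and <lastmod> entry for every URL.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod  # W3C date format (YYYY-MM-DD)

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

The <lastmod> value is what lets Google tell changed content from content it has already seen.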
-
Well yes, that's kinda my point. We do have sensible, crawlable navigation, so there will be no problems there; the sitemap then really becomes an indicator of what needs to be crawled (new and updated pages). But the same question stands...
With other sites we've managed, with thousands of pages, we've found it detrimental to give Google hundreds of pages to crawl via a sitemap when we don't feel those pages are important. We're pretty sure (and SEOmoz staff have supported this) that domain authority and the number of pages you can get into the index are closely related.
-
Tim,
We always include ALL pages in the sitemap... the help documentation for Google XML sitemaps also suggests including all pages of your site.
-
Your sitemap should include every page of your site that you wish to be indexed.
The idea is that if your site does not provide crawlable navigation, Google can use your sitemap to discover and crawl your pages. Some sites are built entirely in Flash, and when a crawler lands on a page there is absolutely nowhere for it to go.
If your site navigation is solid, then a sitemap doesn't offer Google much value other than an indication of when content is updated or added.
Related Questions
-
Query on Sitemap xml Root Path
Is it compulsory to have sitemap.xml at this path: abcd.com/sitemap.xml? My site name is abcd.com. a) If I use CDN services, so the path can be something like xyz.com/sitemap.xml, and I then submit this sitemap in the robots.txt file, is that fine? b) What happens in Webmaster Tools? When we submit a sitemap there, it gives us the domain name abcd.com by default and we just have to add /sitemap.xml.
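As I understand the sitemaps.org cross-submission rules, a sitemap referenced from your robots.txt doesn't have to live at the root of the same domain. A quick sketch for checking what your robots.txt actually declares, assuming the third-party requests library and the hypothetical domains from the question:

```python
import requests  # assumes the requests library is installed

# Hypothetical domain, mirroring the question above.
site = "https://www.abcd.com"

# Fetch robots.txt and list any Sitemap: declarations it contains.
resp = requests.get(f"{site}/robots.txt", timeout=10)
sitemap_lines = [
    line.split(":", 1)[1].strip()
    for line in resp.text.splitlines()
    if line.lower().startswith("sitemap:")
]
print("Declared sitemaps:", sitemap_lines or "none found")
```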
Technical SEO | Johny12345
-
What should I do with all these 404 pages?
I have a website that I'm currently working on that has been fairly dormant for a while and has just been given a facelift and brought back to life. I have some questions about dealing with 404 pages. In Google WMT/Search Console there are reports of thousands of 404 pages going back some years. It says there are over 5k in total, but I am only able to download 1k or so from WMT. I ran a crawl test with Moz and the report it sent back only had a few hundred 404s in it; why is that? I'm also not sure what to do with all the 404 pages. I know that both Google and Moz recommend a mixture of leaving some as 404s and redirecting others, and I'd like to know what the community here suggests. The 404s are a mix of the following: blog posts and articles that have disappeared (some of these have good backlinks too); URLs that look like they used to belong to users (the site used to have a forum) which were deleted when the forum was removed, some of which look like they were removed for spam reasons too, e.g. /user/buy-cheap-meds-online and others like that; and other URLs like /node/4455 (or some other random number). I'm thinking I should permanently redirect the blog posts to the homepage or the blog, but I'm not sure what to do about all the others. Surely having so many 404s like this is hurting my crawl rate?
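A rough way to triage a list like that, assuming the third-party requests library and a plain-text export of the URLs (the file name here is made up for illustration):

```python
import requests  # assumes the requests library is installed

# "404_export.txt" is an illustrative name for the URL list exported from Search Console.
with open("404_export.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# Print each URL's current status so you can decide which to 301-redirect
# (e.g. old posts with backlinks) and which to leave as 404s (e.g. spam user pages).
for url in urls:
    try:
        resp = requests.head(url, allow_redirects=False, timeout=10)
        print(resp.status_code, url)
    except requests.RequestException as exc:
        print("ERROR", url, exc)
```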
Technical SEO | linklander
-
Sitemap
Hi, I am setting up a new sitemap for our website. The website contains about 8,000 to 10,000 pages, of which 6,000 are product pages. I have 10 categories, about 80 sub-categories, and about 400 sub-sub-categories (these are my most important landing pages). At the moment our sitemap is only 1 MB, so from that point of view one sitemap will be enough. But can I gain any SEO advantage by splitting this sitemap into 10 categories? Or are there other ways to set it up for better SEO? Thanks!
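If you do split it, the individual files are tied together by a sitemap index. A minimal Python sketch of that, with hypothetical per-category file names:

```python
import xml.etree.ElementTree as ET

# Hypothetical per-category sitemap files, assumed to be generated elsewhere.
category_sitemaps = [
    "https://www.example.com/sitemap-category-1.xml",
    "https://www.example.com/sitemap-category-2.xml",
]

# A sitemap index simply lists the individual sitemaps.
index = ET.Element("sitemapindex", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc in category_sitemaps:
    sitemap = ET.SubElement(index, "sitemap")
    ET.SubElement(sitemap, "loc").text = loc

ET.ElementTree(index).write("sitemap_index.xml", encoding="utf-8", xml_declaration=True)
```

In practice the main benefit of splitting by category is usually reporting: Webmaster Tools shows indexation per submitted sitemap, so you can see which sections are struggling, rather than the split itself boosting rankings.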
Technical SEO | Leonie-Kramer
-
How to Delete a Page on the Web?
Google reports, and I have confirmed, that the following old page is still accessible on the Web: http://www.audiobooksonline.com/The_Great_American_Baseball_Box_Greatest_Moments_from_the_Last_80_Years_original_audio_collection_compact_discs.html This page hasn't been in our site's directory for some time and is no longer needed by us. What is the best way to fix this Google-reported crawl error?
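A sensible first check is to see what status code that old URL returns now; a quick sketch, assuming the third-party requests library (a 404 or 410 response is generally what you want for a permanently removed page):

```python
import requests  # assumes the requests library is installed

old_url = ("http://www.audiobooksonline.com/The_Great_American_Baseball_Box_"
           "Greatest_Moments_from_the_Last_80_Years_original_audio_collection_compact_discs.html")

resp = requests.get(old_url, allow_redirects=False, timeout=10)
print(resp.status_code)  # 200 means it is still being served; 404/410 means it is gone
```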
Technical SEO | lbohen
-
How to verify a page-by-page level 301 redirect was done correctly?
Hello, I told some tech guys to do a page-by-page relevant 301 redirect (as talked about in Matt Cutts' video https://www.youtube.com/watch?v=r1lVPrYoBkA) when a company wanted to move to a new domain while their site was getting redesigned. I found out they accidentally did a 302 redirect and had to fix that, so now I don't trust that they did the page-by-page relevant redirect. I have a feeling they just redirected all of the pages on the old domain to the homepage of the new domain. How could I confirm this suspicion? When I run the old domain through Screaming Frog it only shows 1 URL: the homepage. Does that mean they took all of the pages on the old domain offline? Thanks!
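One way to confirm it, assuming the third-party requests library and a plain-text list of known old URLs (the file name is illustrative): request each old URL without following redirects and inspect the status code and Location header.

```python
import requests  # assumes the requests library is installed

# "old_urls.txt" is an illustrative name for a list of URLs from the old domain
# (e.g. from an old sitemap, an analytics export, or the Wayback Machine).
with open("old_urls.txt") as f:
    old_urls = [line.strip() for line in f if line.strip()]

for url in old_urls:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    target = resp.headers.get("Location", "no redirect")
    # A proper page-by-page move shows 301s pointing to distinct deep URLs,
    # not every old URL landing on the new homepage.
    print(url, "->", resp.status_code, target)
```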
Technical SEO | EvolveCreative
-
Duplicate pages
Hi, can anyone tell me why SEOmoz thinks these pages are duplicates when they're clearly not? Thanks very much, Kate http://www.katetooncopywriter.com.au/how-to-be-a-freelance-copywriter/picture-1-58/ http://www.katetooncopywriter.com.au/portfolio/clients/other/ http://www.katetooncopywriter.com.au/portfolio/clients/travel/ http://www.katetooncopywriter.com.au/webservices/what-i-do/blog-copywriter/
Technical SEO | ToonyWoony
-
Local SEO for service industry - one landing page for every town...in every county...in every state?
Starting a second locally based service site. Initially going to target a couple of counties and move on from there as the business grows. For my first site I set up a page for each town: [service] + [town] + [state] + [zip]. I am afraid this could get out of control, though, if I don't have unique content on each page. For the last site I simply copied the page and replaced the town name in each, as well as the picture, picture title, and image name, to make it look more unique for users but not necessarily for Google. I had pretty good results, but I want this next site to be done properly. Should I only target a few of the major markets to begin with? What about long-tail searches for smaller towns that currently bring in a good amount of business? I am concerned about having "too many" long-tail pages for each town, which would essentially become a listing of every town and county in the state if I were to maintain the pace I want to. Also, I would need a good number of backlinks to each specific town page URL if I wanted to do well in each of those markets, right? Is this the fine line between niche terms and broad searches? Is there any happy medium?
Technical SEO | kabledesigns
-
ROR Sitemap
Do search engines read ROR sitemaps? Are they necessary? Isn't an XML sitemap enough?
Technical SEO | seoug_2005