I have two sitemaps which partly duplicate each other - one is blocked by robots.txt but I can't figure out why!
-
Hi, I've just found two sitemaps - one of them is a .php file which represents part of the site structure on the website. The second is a .txt file which lists every page on the website. The .txt file is blocked via the robots exclusion protocol (which doesn't appear very logical, as it's the only full sitemap). Any ideas why a developer might have done that?
-
There are standards for .txt and .xml sitemaps (the sitemaps.org protocol), whereas there are no standards for HTML varieties. Neither guarantees the listed pages will be crawled, though. HTML sitemaps have the advantage of potentially passing PageRank, which the .txt and .xml varieties don't.
These days, .xml sitemaps may be more common than .txt sitemaps, but both perform the same function.
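For reference, minimal sketches of the two standardized formats per the sitemaps.org protocol (the example.com URLs are placeholders). A .txt sitemap is just one absolute URL per line, UTF-8 encoded:

http://www.example.com/
http://www.example.com/products.html

An .xml sitemap wraps the same URLs in the protocol's elements:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
  <url>
    <loc>http://www.example.com/products.html</loc>
  </url>
</urlset>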
-
Yes, sitemap.txt is blocked for some strange reason. I know SEOs do this sometimes for various reasons, but in this case it just doesn't make sense - not to me, anyway.
-
Thanks for the useful feedback, Chris - much appreciated. Is it good practice to use both? I guess it's a good idea if the onsite version only includes top-level pages. PS. Just checking the nature of the block!
-
Luke,
The .php one would have been created as a navigation tool to help users find what they're looking for faster, as well as to provide HTML links that help search engine spiders reach all pages on the site. On small sites, such sitemaps often include every page; on large ones, they might cover only high-level pages. The .txt file is not HTML; it exists solely to give search engines a full list of the site's URLs so they can index all of the site's pages.
The robots.txt file can also be used to specify the location of the sitemap.txt file, for example:
Sitemap: http://www.example.com/sitemap_location.txt
Are you sure the sitemap is being blocked by the robots.txt file, or is the robots.txt file just listing the location of the sitemap.txt?
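To make that distinction concrete, here are the two cases side by side (a sketch; the paths are generic):

# This line only announces where the sitemap lives - it does not block anything:
Sitemap: http://www.example.com/sitemap.txt

# This, by contrast, actually blocks crawlers from fetching the file:
User-agent: *
Disallow: /sitemap.txt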
-
Related Questions
-
Robots.txt wildcards - the devs had a disagreement - which is correct?
Hi - the lead website developer was assuming that this wildcard: Disallow: /shirts/?* would block URLs including a ? within this directory, and within all subdirectories of this directory that include a ?. The second developer suggested that this wildcard would only block URLs featuring a ? that comes immediately after /shirts/ - for example: /shirts/?minprice=10&maxprice=20 - BUT argued that this robots.txt directive would not block URLs featuring a ? in subdirectories - e.g. /shirts/blue?mprice=100&maxp=20. So which of the developers is correct? Beyond that, I assumed that the ? should feature a * on each side of it - for example, /*?* - to work as intended above? Am I correct in assuming that?
Intermediate & Advanced SEO | McTaggart
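For what it's worth, a sketch of how those patterns behave under Google's documented wildcard rules (? is a literal character in robots.txt, * matches any sequence of characters, and patterns match from the start of the URL path) - which supports the second developer's reading:

# Matches only URLs whose path begins with /shirts/? - the trailing * is
# redundant, since patterns already match any continuation:
Disallow: /shirts/?*
# Blocked:     /shirts/?minprice=10&maxprice=20
# Not blocked: /shirts/blue?mprice=100&maxp=20

# Matches a ? anywhere under /shirts/, including subdirectories:
Disallow: /shirts/*?
# Blocked:     /shirts/?minprice=10&maxprice=20
# Blocked:     /shirts/blue?mprice=100&maxp=20
-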
Robots.txt Allowed
Hello all, We want to block something that has the following at the end: http://www.domain.com/category/product/some+demo+-text-+example--writing+here So I was wondering if doing: /*example--writing+here would work?
Intermediate & Advanced SEO | ThomasHarvey
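A sketch of the distinction in play, assuming Google's * and $ wildcard support (most major engines honor both):

# Blocks any URL that contains the string anywhere, since matching
# continues past the end of the pattern:
Disallow: /*example--writing+here
# Blocks only URLs that END with the string, via the $ end-of-URL anchor:
Disallow: /*example--writing+here$
-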
Robots.txt - Googlebot - Allow... what's it for?
Hello - I just came across this in robots.txt for the first time and was wondering why it is used. Why would you have to proactively tell Googlebot to crawl JS/CSS, and why would you want it to? Any help would be much appreciated - thanks, Luke

User-Agent: Googlebot
Allow: /*.js
Allow: /*.css
Intermediate & Advanced SEO | McTaggart
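A common reason for rules like those: Google renders pages, so it needs the JS and CSS, and an Allow line can carve those files out of an otherwise disallowed directory. A sketch (the /assets/ path is hypothetical; the longer, more specific rule wins):

User-Agent: Googlebot
# Keep crawlers out of the assets directory in general...
Disallow: /assets/
# ...but let Googlebot fetch the JS/CSS it needs to render pages:
Allow: /assets/*.js
Allow: /assets/*.css
-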
Getting into Google News, URLs & Sitemaps
Hello, I know that one of the 'technical requirements' to get into Google News is that the URLs have unique numbers at the end, BUT that requirement can be circumvented if you have a Google News sitemap. I've purchased the Yoast Google News Sitemap plugin (https://yoast.com/wordpress/plugins/news-seo/) BUT just found out that you cannot submit a Google News sitemap until you are accepted into Google News. Thus, my question is: do you need to add the digits to the URLs temporarily until you get in and can submit a Google News sitemap, OR is it OK to apply without them and take care of the sitemap after you get in? If anyone has any other tips about getting into Google News that would be great! Thanks!
Intermediate & Advanced SEO | stacksnew
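For context, a minimal sketch of what a Google News sitemap contains once you're able to submit one (the URL, publication details, and date are placeholders; a plugin like Yoast's generates this automatically):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>http://www.example.com/news/some-story</loc>
    <news:news>
      <news:publication>
        <news:name>Example Publication</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2014-01-01</news:publication_date>
      <news:title>Some Story</news:title>
    </news:news>
  </url>
</urlset>
-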
Can submitting a sitemap to Google Webmaster Tools improve SEO?
Can creating a fresh sitemap and submitting it to Google Webmaster Tools improve SEO?
Intermediate & Advanced SEO | chanel27
-
What is the best way to rank well in two countries simultaneously with only one ccTLD
I have a .co.nz website and would like to rank on .com.au without setting up a new country-specific website for .com.au. What is the best way to do this?
Intermediate & Advanced SEO | SteveK64
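One mechanism worth knowing for this situation (a sketch, not the whole answer - hreflang signals which regional audience a page targets, though it can't fully override a ccTLD's geotargeting): annotate NZ- and AU-targeted variants on the one site, e.g. with an /au/ section (the paths are hypothetical):

<link rel="alternate" hreflang="en-nz" href="http://www.example.co.nz/page" />
<link rel="alternate" hreflang="en-au" href="http://www.example.co.nz/au/page" />

Both lines would appear on each variant so the annotations are reciprocal.
-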
Best solutions when homepage won't rank in Google?
My homepage (www.LeatherHideStore.com) will not rank for my keywords in Google - Google mostly pulls product pages and some categories for SERP results. In contrast, my homepage consistently shows in Yahoo and Bing, with exceptions where a category is a better match for the keyword. In other words, it is working exactly as it should in Yahoo and Bing.

After a year of this frustration I just upgraded to a new site on Magento Community and, surprise, the same problem! The SEOmoz analyzer has flagged significant duplicate content issues, which I think is at the heart of my problem. I have asked my developer to address these, but let's just say that customer service is not his forte. I am even starting to doubt he knows what to do, although the site appears to be well done. Given that it is a brand new site and duplicate content in Magento is to be expected (from what I have now read), I am deeply discouraged that my developer did not or could not plan for this, so here I am again!

Can anyone give me guidance on what to do? I have read a lot about canonicalization and it seems complicated, especially if you have 1000 duplicate page titles. I have seen that there are some extensions (i.e. Ultimate SEO Suite by aheadWorks) for Magento that claim to be able to solve duplicate content problems, but I am really just grasping at straws and do not have the confidence or skills to implement this on my own. Can anyone please help? Thanks! Hunter
Intermediate & Advanced SEO | leatherhidestore
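On the canonicalization point, the mechanism itself is just one tag in the <head> of each duplicate variant, pointing at the preferred URL - a sketch (the product path is a placeholder; in Magento this is usually handled by a setting or an extension rather than by hand):

<link rel="canonical" href="http://www.leatherhidestore.com/some-product" />
-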
How to place two NAPs on a site (one website, 2 locations)
Hello, For our site: nlpca(dot)com we have 2 locations. One location is based out of a hotel in California, and one location is where we have our offices in Utah. Our site is about both locations, emphasizing California. Do we need to create a Utah page and put the Utah NAP (name, address, phone) on that page with a separate address and phone number? What do we use as an address since we only have a hotel room in California now? What do we need to do to rank for both in the natural and also Places listings? Right now we're #1 for NLP California and #4 for NLP Utah. Thanks!
Intermediate & Advanced SEO | BobGW
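One way to mark up two distinct locations, each on its own page - a sketch using schema.org LocalBusiness in JSON-LD (every name, address, and number below is a placeholder); a second block with the Utah NAP would go on the Utah page:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "NLP California",
  "telephone": "+1-555-000-0001",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Hotel Way",
    "addressLocality": "Sample City",
    "addressRegion": "CA",
    "postalCode": "90000",
    "addressCountry": "US"
  }
}
</script>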