Question about construction of our sitemap URL in robots.txt file
-
Hi all,
This is a Webmaster/SEO question. This is the sitemap URL currently in our robots.txt file:
http://www.ccisolutions.com/sitemap.xml
As you can see, it leads to a page with two URLs on it. Is this a problem? Wouldn't it be better to list both of those XML files as separate line items in the robots.txt file?
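For example, something like this (these file names are just placeholders; we'd use whatever the two XML files are actually called):

User-agent: *
Disallow:

Sitemap: http://www.ccisolutions.com/sitemap-1.xml
Sitemap: http://www.ccisolutions.com/sitemap-2.xml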
Thanks!
Dana
-
Hi Jarno,
Thanks so very much! I have to say I am really liking the A1 generator. How awesome of you to follow up; I really appreciate that. Yes, if you want to send me the complete sitemap via PM, that would be awesome. I certainly hope I can return the favor. Happy Holidays!
Dana
-
Yes, we definitely use XENU, but I think I like Screaming Frog a bit better (although our IT Director swears it's broken).
-
Hi Christopher,
Thanks for the update. Yes, I looked at it too, and other than it not being "pretty" XML, the data seemed to be okay. The one thing the A1 generator did that we couldn't do was assign values for the importance of specific pages and how frequently they are modified. If that data is accurate, that's pretty cool. I'm just not sure it is, although it does seem to have correctly identified the pages that are modified more frequently. I have 30 days to play with the free trial, but so far I think I like it a lot.
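From poking around the generated file, those values appear to map to the protocol's <changefreq> and <priority> tags on each URL entry, something like this (the URL and values here are made up):

<url>
  <loc>http://www.ccisolutions.com/example-page.html</loc>
  <lastmod>2012-12-18</lastmod>
  <changefreq>weekly</changefreq>
  <priority>0.8</priority>
</url>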
Dana
-
Dana,
It just finished scanning; here are the results:
Internal sitemap URLs:
- Listed found: 5248
- Listed deduced: 5301
- Analyzed content: 3110
- Analyzed references: 3176
External URLs:
- Listed found: 700
When I look at the overview of the results I see a number of 301 redirects and canonical redirects (when tested again, they return a 200 OK). But mostly I see a lot of pages.
When I build the sitemap it generates one file (no idea why not more than one) with all the links in it. Google's sitemap protocol states it should follow the schema at sitemaps.org, which it does. The sitemaps.org protocol states that a single sitemap cannot hold more than 50,000 links and should be smaller than 10 MB in file size.
The one I just built for you is only 1 MB and contains fewer than 50,000 URLs, so Google will accept it.
http://www.sitemaps.org/protocol.html
I can send you the complete sitemap in a personal message or through e-mail if you'd like.
Hope this helps you further.
Kind regards,
Jarno
-
I started the scan and it's still running:
2500 analyzed references so far.
Let you know how it turns out.
Jarno
-
Thanks Jarno. I really appreciate that. Yes, I had it set to scan just for images (as prompted when I attempted to create an image sitemap). Let me know what you see. I am wondering if it is going around in circles?
Dana
-
Dana,
Sometimes that happens. Are you scanning just for images, or are you scanning the whole site?
I will check your site tomorrow with my full version and see what it does.
Sometimes you'll get things like this with certain websites, and it can be caused by loads of things. 3,500 pages should not take two hours, only a couple of minutes. I'll check it first thing tomorrow; A1 is not installed on my laptop.
Let you know tomorrow.
Kind regards,
Jarno
-
A1 Sitemap does 2 things:
1) It builds a file named sitemap.xml which contains all files on the website (this one does not conform to the Google requirements).
2) It builds a number of files listed in a sitemap-index.xml, with one sitemap for every 100 pages. So if your website contains 2,800 pages you'll get loads of files: 28 sitemaps (sitemap-1.xml etc.) and 1 sitemap-index.xml file, which does meet the Google standards. Afterwards you can do 2 things in Google Webmaster Tools:
1) Enter the sitemap-index.xml file as a sitemap -> Google will follow everything and come to the grand total of 2,800 pages.
2) Enter each sitemap separately -> same result, but you can pinpoint better where you have 100 pages and Google only indexes fewer (which can happen).
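For reference, the sitemap-index.xml itself is just a small XML file that lists the other sitemaps, roughly like this (example.com and the file names are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-2.xml</loc>
  </sitemap>
  <!-- ...one <sitemap> entry per file, up to sitemap-28.xml in the 2,800-page example -->
</sitemapindex>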
Hope this helps
-
Hi again Jarno,
Is it normal for A1's sitemap generator's "Scan website" function for images to take over two hours? Our site is about 3,500 URLs. So far, under "Internal 'sitemap' URLs," it shows "Listed found: 82,076" (and climbing every few seconds).
I am wondering if there isn't something wrong? (I don't have any frame of reference since I've never used it before). Thanks!
Dana
-
I'm not familiar with the A1 Sitemap generator, but regarding the sitemap protocol, there is a limit on the size of a single sitemap.xml file, so for large sites, the sitemap must be split into multiple sitemap.xml files. And, the protocol has a method for indexing these multiple sitemap.xml files. It's sort of like an index to an index. None of my sites exceed the sitemap file limit, so I don't know which sitemap generators use this approach, but I would guess many of them do.
Sitemap generators I have used include DMXZone, which is a Dreamweaver plugin, and xml-sitemaps.com, which includes a video sitemap generator.
Best,
Christopher
EDIT: PS: Your current sitemap looks fine to me.
-
Thanks Christopher,
Your answer took a moment to sink in, but I think I get it (I think I am coffee-deprived this morning).
So, if I am using the A1 Sitemap generator that Jarno suggested, this sitemap index should automatically be generated based on the size of my generated sitemap. Is that correct?
-
Thanks Jarno,
I have downloaded and am trying the 30-day free trial of the A1 Sitemap Generator right now. Thanks for the tip. Can you comment on Christopher's remark below concerning sitemap indexes for larger sitemaps?
Can either you or Christopher give me more clarification on that? Is this what our IT director has attempted to do with the sitemap in our robots.txt file? If so, has it been done correctly?
Thanks!
-
There is a limit on the size of a sitemap, and to allow large sitemaps to be split into smaller ones, the sitemap protocol includes a sitemap index. See "Using Sitemap index files (to group multiple sitemap files)" here: http://www.sitemaps.org/protocol.html. Of course, it's also possible to list the multiple sitemaps in the robots.txt file, but automated sitemap generators will likely use the sitemap index feature so that the robots.txt file does not have to be modified as the size of the site changes.
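For example, the robots.txt then only ever needs one line pointing at the index (placeholder URL), and the generator can add or remove child sitemaps without touching it:

Sitemap: http://www.example.com/sitemap-index.xml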
Best,
Christopher
-
Another tool to help generate a sitemap and even check broken links is called Xenu (weird logo, but good free product).
-
Dana,
The buildup of your sitemap.xml is very strange to me. I use an external program to build the sitemap.xml for my entire website.
You now have a link in your robots.txt file pointing to a sitemap which references 2 files (both .xml), each containing a map of the site?
Why not use a program (free, or paid like Microsys A1, the one I use) to build 1 sitemap.xml and point to that file from your robots.txt?
Hope this helps.
If you do have any questions, please let me know.
Kind regards,
Jarno