Can't generate a sitemap with all my pages
-
I am trying to generate a sitemap for my site nationalcurrencyvalues.com, but none of the tools I have tried pick up all of my 70,000 HTML pages... I have found that the one at check-domains.com does crawl all of my pages, but when it writes the XML file most of them are missing, seemingly at random.
I have used this same tool before and it worked without a problem. Can anyone help me understand why this is happening, or point me to a utility that will map all of the pages?
Kindly,
Greg
-
Thank you all for the responses... I found them all helpful. I will look into creating my own sitemap with the IIS tool.
I can't do anything about the 70k page count, but the URLs are totally static. I guess I can make one sitemap for all the .aspx pages and another for all the lowest-level .html pages.
Thanks everyone!
-
I definitely agree with Logan. The maximum for an XML sitemap submitted to Search Console is 50,000 URLs, so you won't be able to fit all of yours into one.
That being the case, divide them into separate sitemaps by category or page type, then list all of those in a single sitemap index file and submit that. As a bonus, you'll then be able to see indexation by page type for your website.
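To make that concrete, here is a minimal sketch in Python, assuming you can export your crawled URLs to a text file (urls.txt is a hypothetical file name). It splits the list into sitemap files of at most 50,000 entries each, per the sitemaps.org protocol, and writes a matching sitemap index:

```python
from xml.sax.saxutils import escape

BASE = "https://www.nationalcurrencyvalues.com"  # site root from the question
CHUNK = 50000  # protocol maximum URLs per sitemap file

# Hypothetical input: one crawled URL per line in urls.txt.
with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# Write one sitemap file per chunk of up to 50,000 URLs.
sitemap_names = []
for i in range(0, len(urls), CHUNK):
    name = f"sitemap-{i // CHUNK + 1}.xml"
    sitemap_names.append(name)
    with open(name, "w", encoding="utf-8") as out:
        out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url in urls[i:i + CHUNK]:
            out.write(f"  <url><loc>{escape(url)}</loc></url>\n")
        out.write("</urlset>\n")

# One sitemap index listing every chunk; submit this single file.
with open("sitemap-index.xml", "w", encoding="utf-8") as out:
    out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    out.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for name in sitemap_names:
        out.write(f"  <sitemap><loc>{BASE}/{name}</loc></sitemap>\n")
    out.write("</sitemapindex>\n")
```

Upload the generated files to the site root and submit only sitemap-index.xml in Search Console; it will discover the individual sitemaps from there.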
Finally, I have to ask why you are doing this with a third-party tool and creating a static sitemap, as opposed to a dynamic one that updates automatically when you publish new content. If your site is static and you're not creating new pages, your approach might be fine, but otherwise I'd recommend investigating how to build a dynamic XML sitemap that updates with new content, as sketched below.
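For illustration only, here is what the dynamic pattern can look like. This is a hedged sketch using Python/Flask with a hypothetical get_all_urls() data-access helper; the poster's site is actually ASP.NET on IIS, so this shows the idea rather than his stack:

```python
# Sketch of a dynamic sitemap: the XML is rebuilt from the data store on
# each request, so newly published pages show up without regenerating files.
# Flask and get_all_urls() are illustrative assumptions, not the real stack.
from flask import Flask, Response

app = Flask(__name__)

def get_all_urls():
    # Hypothetical stand-in for a real database or CMS query.
    return ["https://www.example.com/page-1.html",
            "https://www.example.com/page-2.html"]

@app.route("/sitemap.xml")
def sitemap():
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    lines += [f"  <url><loc>{u}</loc></url>" for u in get_all_urls()]
    lines.append("</urlset>")
    return Response("\n".join(lines), mimetype="application/xml")
```

The same route-based approach works in any server framework: the sitemap URL stays constant while its contents always reflect the current database.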
Cheers!
-
Looking at your site, how sure are you that you need 70,000 pages?
For the sitemap, I would stop relying on web-based tools and do it yourself. It looks like you are running IIS, and there is a sitemap generator for IIS that you can install on the server easily and run there. It also looks like you are hosted with GoDaddy; they catch a lot of crap, but I have always found their technical support to be top notch. If you can't figure out how to do it on the server, I would give them a call.
-
Greg,
Have you tried creating multiple XML sitemaps by section of the site, such as by folder or by product detail pages? 70,000 is a huge number of URLs, and even if you could get them all into one sitemap, I wouldn't recommend it. Nesting sitemaps under an index sitemap can help Google understand your site structure and makes it easier for you to troubleshoot indexing problems should they arise.