Can't generate a sitemap with all my pages
-
I am trying to generate a sitemap for my site nationalcurrencyvalues.com, but none of the tools I have tried pick up all 70,000 of my HTML pages. I found that the tool at check-domains.com crawls all my pages, but when it writes the XML file most of them are missing, seemingly at random.
I have used that same tool before and it worked without a problem. Can anyone help me understand why this is happening, or point me to a utility that will map all of the pages?
Kindly,
Greg
-
Thank you all for the responses; I found them all helpful. I will look into creating my own sitemap with the IIS tool.
I can't do anything about having 70k pages, but the URLs are totally static. I guess I can make one sitemap for all the .aspx pages and another for all the lowest-level .html pages.
Thanks everyone!
-
I definitely agree with Logan. The maximum for a single XML sitemap in Search Console is 50,000 URLs, so you won't be able to fit all of yours into one.
That being the case, divide them into separate sitemaps by category or page type, then list all of those in one sitemap index file and submit that. As a bonus, you can then track indexation by page type on your website.
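To make that concrete, a sitemap index is just a small XML file that lists the locations of your child sitemaps. Here's a minimal sketch of writing one in Python; the domain and file names are placeholders for illustration, not Greg's actual layout:

```python
# Minimal sketch: write a sitemap index that points at per-section sitemaps.
# The domain and file names below are placeholders, not the real site layout.
from xml.sax.saxutils import escape

child_sitemaps = [
    "http://www.example.com/sitemap-aspx.xml",
    "http://www.example.com/sitemap-html-1.xml",
    "http://www.example.com/sitemap-html-2.xml",
]

with open("sitemap-index.xml", "w") as f:
    f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
    f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
    for url in child_sitemaps:
        f.write("  <sitemap><loc>%s</loc></sitemap>\n" % escape(url))
    f.write("</sitemapindex>\n")
```

You submit just the index file in Search Console, and Google discovers the child sitemaps from it.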
Finally, I have to ask why you are doing this with a third-party tool that produces a static sitemap, rather than creating a dynamic one that updates automatically when you publish new content. If your site is static and you're not creating new pages, your approach might be fine; otherwise, I'd recommend investigating how to build a dynamic XML sitemap that updates itself with new content.
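To illustrate the dynamic approach, here is a rough sketch of a sitemap generated on each request, shown in Python with Flask purely for brevity. The framework, the sqlite database, and the `pages` table are all illustrative assumptions, not Greg's stack (his site runs ASP.NET, where the same idea ports to a server-side handler):

```python
# Minimal sketch of a dynamically generated sitemap: the XML is built from the
# database on each request, so new pages appear without regenerating any files.
# Flask, site.db, and the pages table are illustrative assumptions.
import sqlite3
from xml.sax.saxutils import escape

from flask import Flask, Response

app = Flask(__name__)

@app.route("/sitemap.xml")
def sitemap():
    conn = sqlite3.connect("site.db")
    urls = [row[0] for row in conn.execute("SELECT url FROM pages")]
    conn.close()

    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    lines += ["  <url><loc>%s</loc></url>" % escape(u) for u in urls]
    lines.append("</urlset>")
    return Response("\n".join(lines), mimetype="application/xml")
```

Note that a single dynamic sitemap would still hit the 50,000-URL cap at your scale, so in practice the dynamic endpoint would serve a sitemap index and paginate the children the same way.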
Cheers!
-
Looking at your site, how sure are you that you need 70,000 pages?
For the sitemap, I would stop trying to use a third-party website and do it yourself. It looks like you are running IIS, and Microsoft has a sitemap generator that you can easily install and run directly on the server. It also looks like you're hosted with GoDaddy; they catch a lot of flak, but I have always found their technical support to be top notch. If you can't figure out how to set it up on the server, I would give them a call.
-
Greg,
Have you tried creating multiple XML sitemaps by section of the site, for example by folder or by product detail pages? 70,000 is a huge number of URLs, and even if you could get them all into one sitemap, I wouldn't recommend it. Nesting sitemaps under an index sitemap can help Google understand your site structure and makes it easier for you to troubleshoot indexing problems should they arise. A sketch of what that splitting might look like follows below.
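Here is a minimal sketch of splitting by section, assuming a flat list of URLs grouped by their first path segment; the input file `urls.txt` and the output naming scheme are hypothetical:

```python
# Minimal sketch: group a flat list of URLs by top-level folder and write one
# sitemap per section, keeping each file under the 50,000-URL cap.
# urls.txt and the output naming scheme are hypothetical.
from collections import defaultdict
from urllib.parse import urlparse
from xml.sax.saxutils import escape

sections = defaultdict(list)
with open("urls.txt") as f:
    for url in (line.strip() for line in f if line.strip()):
        path = urlparse(url).path.strip("/")
        section = path.split("/")[0] if path else "root"
        sections[section].append(url)

for section, urls in sections.items():
    # If one section still exceeds 50,000 URLs, split it into numbered chunks.
    for i in range(0, len(urls), 50000):
        chunk = urls[i:i + 50000]
        with open("sitemap-%s-%d.xml" % (section, i // 50000 + 1), "w") as out:
            out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for u in chunk:
                out.write("  <url><loc>%s</loc></url>\n" % escape(u))
            out.write("</urlset>\n")
```

Each resulting file would then be listed in the sitemap index you submit to Search Console.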