What is the best tool for generating a sitemap for a website with over 4k pages?
-
I have just migrated my website from Hugo to WordPress and I want to submit the sitemap to Google Search Console (I haven't done so in a couple of years). It looks like there are many tools for building a sitemap file, but they probably vary in quality, especially given the size of my site.
-
Screaming Frog is good for crawling an existing sitemap.xml file and can indeed produce sitemap.xml files, but for a medium-sized site (thousands of URLs) you really want a dynamic one. Pretty sure the Yoast SEO plugin for WordPress has this built in, with some tweak options and variables; I'd probably start there.
With Screaming Frog you'd have to keep manually re-building your sitemap XML / XML index file. That sounds pointless and tedious when relatively stable dynamic options exist.
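If you ever do need a one-off static sitemap (say, built from a crawl export) instead of a plugin's dynamic one, generating the file itself is simple. Here is a minimal sketch using only Python's standard library; the URLs are placeholders, and note the sitemap protocol caps each file at 50,000 URLs, after which you need multiple files plus a sitemap index:

```python
# Minimal static sitemap generator (sketch; URLs below are placeholders).
# For very large sites the sitemap protocol requires splitting into
# multiple <urlset> files referenced from a sitemap index.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a <urlset> sitemap document as an XML string."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
    return ET.tostring(urlset, encoding="unicode")

pages = [
    "https://www.example.com/",
    "https://www.example.com/about/",
]
print(build_sitemap(pages))
```

The output can be saved as sitemap.xml and submitted in Search Console, but you would have to re-run it after every content change, which is exactly the maintenance burden a dynamic plugin avoids.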
-
As Keen already stated, Screaming Frog is the best option
-
We use Screaming Frog and we have about 80k pages.
Related Questions
-
Ecommerce Preferred URL Structure for Printing Website
Hello Mozers! We are adding ecommerce functionality to our existing website.
Our company offers a wide range of commercial printing and mail services. We have done a pretty good job over the years of building content, both in terms of our print offerings and a blog section highlighting those offerings. We have finally bitten the bullet and decided to add end-to-end ecommerce functionality. Users will be able to price, pay, upload and order through our website. My question to the community is which subfolder we should use.
The ecommerce functionality is third-party software and needs to sit in a subfolder, and we can't seem to find a good fit. Most of our content pages for print items look something like this: www.website/printing/ is the pillar page, with subpages such as:
www.website/printing/flyer-printing/
www.website/printing/booklet-printing/
www.website/printing/door-hangers/
www.website/printing/business-cards/
Options would be order-printing/ or prints/. We were thinking /orders/ would be the best, but we're not certain and wanted some feedback from the community. If we did go this route the URL structure would be: order/business-cards (the default ecommerce page) and order/business-cards/full-uv-coating-both-sides (an individual product page). What are your thoughts? CH
Technical SEO | | CheapyPP0 -
How to 301 trailing URLs to new domain home page - wildcard?
How would I add a redirect rule so all old domain URLs redirect to a new domain? All the old pages no longer exist on the new website. The domains have been through several CMS platforms, so it would be unnecessary to recreate them. The problem is, they're indexed in search engines from the past 10 years, so they're causing a lot of 404s. Example: search "NARI Tampa Bay" and you'll find 2 old domains: nari-tampabay.com & nari-tampabay.org. The new domain is naritb.org. Those 2 old domains are now pointed to the same nameservers as the new one and listed as parked domains. Here are the current rules in htaccess:
<code>RewriteEngine On
RewriteCond %{HTTP_HOST} ^nari-tampabay.org [NC,OR]
RewriteCond %{HTTP_HOST} ^www.nari-tampabay.org [NC]
RewriteRule ^(.*)$ https://www.naritb.org/$1 [L,R=301]
RewriteEngine On
RewriteCond %{HTTP_HOST} ^nari-tampabay.com [NC,OR]
RewriteCond %{HTTP_HOST} ^www.nari-tampabay.com [NC]
RewriteRule ^(.*)$ https://www.naritb.org/$1 [L,R=301]</code>
Technical SEO | | CartoMark0 -
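Since the old pages no longer exist on the new site, one common approach is to drop the captured path so every old URL sends a single 301 to the new home page. This is a sketch, not tested against this exact hosting setup: it collapses the two domain blocks into one condition, escapes the dots in the hostnames (unescaped dots match any character in a regex), and needs only one RewriteEngine On.

```apache
# Sketch: send every URL on either old domain to the new home page.
# Dropping $1 from the target means no path is mirrored, so nothing 404s.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?nari-tampabay\.(com|org)$ [NC]
RewriteRule ^ https://www.naritb.org/ [L,R=301]
```

One caveat worth hedging: blanket home-page redirects are sometimes treated by Google as soft 404s, so if any old URLs have close equivalents on the new site, mapping those individually first is safer.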
Best Practice - Linking out to client websites in niche industry
I have a client in a niche building industry that provides 4 different services to it. She has provided me with a list of 131 past clients of hers that she wants hyperlinked on her site to theirs. The logic is that a lot of these clients are heavy hitters and quite impressive to their peers, so the links will reinforce my client's value. Is there a best practice for determining whether a link should be follow/nofollow? Should I be checking each client site's spam score, page rank, anything else? Some of these 131 links will be duplicated because my client performed more than one service for them.
Technical SEO | | JanetJ1 -
My SEO friend says my website is not being indexed by Google, considering the keywords he has placed in the page and URL. What does that mean?
We have added some text to the pages with keywords that relate to each page.
Technical SEO | | AlexisWithers0 -
Google Crawling Issues! How Can I Get Google to Crawl My Website Regularly?
Hi Everyone! My website is not being crawled regularly by Google - there are weeks when crawling is regular, but for the past month or so it has gone seven to eight days without being crawled. There are some specific pages that I want to get ranked, but of late they are not being crawled AT ALL unless I use the 'Fetch As Google' tool! That's not normal, right? I have checked and re-checked the on-page metrics for these pages (and the website as a whole), and backlinking is a regular and ongoing process as well! The sitemap is in place too! Resubmitted it once too! This issue is detrimental to website traffic and rankings! Would really appreciate insights from you guys! Thanks a lot!
Technical SEO | | farhanm1 -
Is there a way to get Google to index more of your pages for SEO ranking?
We have a 100-page website, but Google is only indexing a handful of pages for organic rankings. Is there a way to submit more pages for consideration? I have optimized the meta data and get good Moz on-page grades for the pages & terms that I am trying to rank... but Google doesn't seem to pick them up for ranking. Any insight would be appreciated!
Technical SEO | | JulieALS0 -
Product page Canonicalization best practice
I'm getting duplicate content errors in GWT for product list pages that look like this:
-www.example.com/category-page/product
-www.example.com/category-page/product/?p=2
The "p=2" example already has a rel=canonical in place. Shouldn't the non-canonical pages be using the canonical attribute for the first page rather than the additional product pages? Thanks!
Technical SEO | | IceIcebaby0 -
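For reference, consolidating the paginated variant onto the first page would look like the sketch below, using the example URLs from the question. Whether to canonicalize page 2+ to page 1 or let each page self-canonicalize is a strategy choice, not a settled rule, so treat this as one option rather than the required setup:

```html
<!-- On www.example.com/category-page/product (page 1):
     a self-referencing canonical. -->
<link rel="canonical" href="https://www.example.com/category-page/product" />

<!-- On www.example.com/category-page/product/?p=2:
     pointing the canonical at the first page consolidates the
     duplicate-content signal there. -->
<link rel="canonical" href="https://www.example.com/category-page/product" />
```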
Google Webmaster Tools says access denied for 77 URLs
Hi, I am looking in Google Webmaster Tools and I have seen a major problem which I hope people can help me sort out. I am being told that 77 URLs are being denied access. When I look for more information, the message says: Googlebot couldn't crawl your URL because your server either requires login to access the page, or is blocking Googlebot from accessing your site. The response code is 403. Here are a couple of examples:
http://www.in2town.co.uk/Entertainment-Magazine
http://www.in2town.co.uk/Weight-Loss-Hypnotherapy-helped-woman-lose-3-stone
I think the problem could be that I have sent them to another URL in my htaccess file using the 403 redirect, but why would that make Googlebot unable to crawl them? Any help would be great.
Technical SEO | | ClaireH-1848860
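If the 403s really do come from a deny rule covering those legacy URLs in .htaccess, one gentler pattern is to 301 them to their replacements (or, failing that, the home page) so Googlebot gets a response it can follow instead of a 403. This is only a sketch: the redirect targets below are hypothetical placeholders, not real pages on in2town.co.uk.

```apache
# Sketch: replace a deny/403 on moved legacy URLs with 301 redirects so
# Googlebot receives a crawlable redirect instead of "access denied".
# The target paths below are hypothetical placeholders.
Redirect 301 /Entertainment-Magazine https://www.in2town.co.uk/entertainment
Redirect 301 /Weight-Loss-Hypnotherapy-helped-woman-lose-3-stone https://www.in2town.co.uk/lifestyle
```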