How Do I Generate a Sitemap for a Large WordPress Site?
-
Hello Everyone!
I am working with a WordPress site that is in Google News (i.e. every day we have about 30 new URLs to add to our sitemap). The site has years of articles, resulting in about 200,000 pages. Our strategy so far has been to use a sitemap plugin that only generates the last few months of posts, but we want to improve our SEO and submit all of the site's URLs to search engines.
The issue is that the plugins we've looked at generate the sitemap on the fly, i.e. when you request the sitemap, the plugin builds it dynamically at that moment. Our site is so large that even a single request for our sitemap.xml ties up tons of server resources and takes an extremely long time (if the request doesn't time out in the process).
Does anyone have a solution?
Thanks,
Aaron
-
In my case, xml-sitemaps works extremely well. I fully understand that a DB solution would avoid the need to crawl, but the features I get from xml-sitemaps are worth it.
I am running my website on a powerful dedicated server with SSDs, so perhaps that's why I'm not having any problems. I also set limits on the generator's memory consumption and activated the feature that saves temp files in case the generation fails.
-
My concern with recommending xml-sitemaps was that I've always had problems getting good, complete maps of extremely large sites. An internal CMS-based tool grabs pages straight from the database instead of having to crawl for them.
You've found that it gets you a pretty complete crawl of your 5K-page site, Federico?
-
I would go with the paid version of xml-sitemaps.
You can cap the resources you want it to use, and it stores progress in temp files to avoid excessive consumption.
It also offers settings to create large sitemaps using a sitemap index, and there are plugins that create the news sitemap automatically by looking for changes since the last sitemap generation.
I have it running on my site with 5K pages (excluding tag pages) and it takes 10 minutes to crawl.
Then you also have plugins that create the sitemaps dynamically, like SEO by Yoast, Google XML Sitemaps, etc.
-
I think the solution to your server resource issue is to create multiple sitemaps, Aaron. Given that the sitemap protocol allows a maximum of 50,000 URLs per sitemap and Google News sitemaps can't be over 1,000 URLs, this was going to be a necessity anyway, so you may as well use these limitations to your advantage.
There's functionality available for sitemaps called a sitemap index. It's basically a file that lists all the sitemap.xml files you've created, so the search engines can find and index them. You put it at the root of the site and link to it in robots.txt just like a regular sitemap (you can also submit it in GWT). In fact, Yoast's SEO plugin and others already use exactly this functionality for their News add-on.
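For reference, a sitemap index is just a small XML file; the domain and file names below are placeholders, so substitute your own:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- The dynamically generated Google News sitemap -->
  <sitemap>
    <loc>https://www.example.com/sitemap-news.xml</loc>
  </sitemap>
  <!-- Static "archive" segments covering the historical pages -->
  <sitemap>
    <loc>https://www.example.com/sitemap-archive-1.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemap-archive-2.xml</loc>
  </sitemap>
</sitemapindex>
```

Then a single line in robots.txt points the engines at it: Sitemap: https://www.example.com/sitemap_index.xml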
In your case, you could build the News sitemap dynamically to meet its special requirements (up to 1,000 URLs, and Google will only crawl the last 2 days of posts) and to ensure it's up-to-the-minute accurate, as is critical for news sites.
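Each entry in a News sitemap also needs the news namespace and publication details. Everything below (domain, publication name, dates) is placeholder:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <loc>https://www.example.com/2015/06/article-slug</loc>
    <news:news>
      <news:publication>
        <news:name>Example Publication</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2015-06-08T09:00:00Z</news:publication_date>
      <news:title>Article Headline Goes Here</news:title>
    </news:news>
  </url>
</urlset>
```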
Then separately, you would build additional, segmented sitemaps for the existing 200,000 pages. Since these are historical pages, you could easily serve them as static files; they wouldn't need to update once created. With static files there'd be no server load each time they're requested; the only ongoing load is generating the current News sitemap. (I'd actually recommend you keep each static sitemap to around 25,000 URLs to ensure search engines can crawl them easily.)
This approach would involve a bit of fiddling to set up initially, as you'd need to generate the "archive" sitemaps and then convert them to static files. But once it's set up, the News sitemap would take care of itself, and once a month (or on whatever schedule you decide) you'd add the pages "expiring" from the News sitemap to the most recent "archive" segment. A smart programmer might even be able to automate that process.
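As a rough sketch of that automation (this assumes direct read access to the WordPress database; the credentials, table prefix, and the use of guid as the URL are all stand-ins you'd adapt to your install):

```python
import math
import pymysql  # any MySQL client library would do

URLS_PER_FILE = 25000  # stays well under the 50,000-URL protocol limit

# Hypothetical connection details - substitute your own.
conn = pymysql.connect(host="localhost", user="wp",
                       password="secret", database="wordpress")

with conn.cursor() as cur:
    # Published posts, oldest first, so segment N never changes once written.
    # NOTE: guid is a stand-in; on most installs you'd build the real
    # permalink from your permalink structure instead.
    cur.execute(
        "SELECT guid, post_modified FROM wp_posts "
        "WHERE post_status = 'publish' AND post_type = 'post' "
        "ORDER BY post_date ASC"
    )
    rows = cur.fetchall()

for i in range(math.ceil(len(rows) / URLS_PER_FILE)):
    segment = rows[i * URLS_PER_FILE:(i + 1) * URLS_PER_FILE]
    with open(f"sitemap-archive-{i + 1}.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for url, modified in segment:
            f.write(f"  <url>\n    <loc>{url}</loc>\n"
                    f"    <lastmod>{modified:%Y-%m-%d}</lastmod>\n  </url>\n")
        f.write("</urlset>\n")
```

Run it once to build the archive, then re-run it on a monthly cron to fold the URLs expiring out of the News sitemap into the newest segment; since posts are sorted oldest first, only the last file ever changes.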
Does this approach sound like it might solve your problem?
Paul
P.S. Since you'd already have the sitemap index capability, you could also add video and image sitemaps to your site if appropriate.
-
Have you ever tried using a web-based sitemap generator? Not sure how it would respond to your site, but at least it would be running on someone else's server, right?
Not sure what else to say, honestly.