How Do I Generate a Sitemap for a Large WordPress Site?
-
Hello Everyone!
I am working with a WordPress site that is in Google News (i.e. every day we have about 30 new URLs to add to our sitemap). The site has years of articles behind it, resulting in about 200,000 pages. Our strategy so far has been to use a sitemap plugin that only generates the last few months of posts, but we want to improve our SEO and submit every URL on the site to the search engines.
The issue is that the plugins we've looked at generate the sitemap on the fly: when you request sitemap.xml, the plugin builds it dynamically at that moment. Our site is so large that even a single request for the sitemap ties up a huge amount of server resources and takes an extremely long time to complete (if the page doesn't time out in the process).
Does anyone have a solution?
Thanks,
Aaron
-
In my case, xml-sitemaps works extremely well. I fully understand that a database-driven solution would avoid the need to crawl, but the features I get from xml-sitemaps are worth it.
I am running my website on a powerful dedicated server with SSDs, so perhaps that's why I'm not having any problems. I also set limits on the generator's memory consumption and activated the feature that saves temp files, just in case the generation fails.
-
My concern with recommending xml-sitemaps is that I've always had problems getting good, complete maps of extremely large sites from crawlers. An internal CMS-based tool can grab pages straight from the database instead of having to crawl for them.
You've found that it gets you a pretty complete crawl of your 5K-page site, Federico?
-
I would go with the paid solution of xml-sitemaps.
You can cap the resources you want it to use, and it will store its progress in temp files to avoid excessive consumption.
It also offers settings to create large sitemaps using a sitemap_index, and there are plugins that create the news sitemap automatically by looking for changes since the last sitemap generation.
I have it running on my site with 5K pages (excluding tag pages), and it takes 10 minutes to crawl.
Then you also have plugins that create the sitemaps dynamically, like SEO by Yoast, Google XML Sitemaps, etc.
-
I think the solution to your server resource issue is to create multiple sitemaps, Aaron. Given that the sitemap protocol allows a maximum of 50,000 URLs per sitemap, and a Google News sitemap can't contain more than 1,000 URLs, splitting was going to be a necessity anyway, so you may as well use these limitations to your advantage.
There's a piece of functionality available for sitemaps called a sitemap index. It's simply a file that lists all the individual sitemap.xml files you've created, so the search engines can find and index them. You put it at the root of the site and then link to it in robots.txt just like a regular sitemap (you can also submit it in GWT). In fact, Yoast's SEO plugin and others already use exactly this functionality for their News add-ons.
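To make the format concrete, here's a minimal sketch in Python of what writing such an index could look like - the domain and child sitemap filenames are made-up placeholders, not anything a particular plugin produces:
from datetime import date

# Hypothetical child sitemaps: one dynamic News sitemap plus eight static
# "archive" segments (200,000 pages at ~25,000 URLs per segment).
sitemaps = ["sitemap-news.xml"] + [f"sitemap-archive-{i}.xml" for i in range(1, 9)]

entries = "\n".join(
    "  <sitemap>\n"
    f"    <loc>https://www.example.com/{name}</loc>\n"
    f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
    "  </sitemap>"
    for name in sitemaps
)

# Write the index to the web root so it can be referenced from robots.txt
with open("sitemap_index.xml", "w", encoding="utf-8") as f:
    f.write(
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</sitemapindex>\n"
    )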
In your case, you could build the News sitemap dynamically to meet its special requirements (a maximum of 1,000 URLs, covering only the last two days of posts) and to ensure it's up-to-the-minute accurate, as is critical for news sites.
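For illustration only, a rough Python sketch of such a News sitemap - recent_posts is a stand-in for whatever query your CMS exposes (in WordPress you'd pull this from the posts table), and the publication name is a placeholder:
from datetime import datetime, timedelta
from xml.sax.saxutils import escape

# Stand-in for a real CMS query; in WordPress this would be a query against
# the posts table, filtered to articles published in the last 2 days.
cutoff = datetime.utcnow() - timedelta(days=2)
recent_posts = [
    {"url": "https://www.example.com/2014/01/sample-article/",
     "title": "Sample Article",
     "published": datetime.utcnow()},
]

items = []
for post in recent_posts[:1000]:  # Google News caps a sitemap at 1,000 URLs
    if post["published"] >= cutoff:  # only the last 2 days of posts
        items.append(
            "  <url>\n"
            f"    <loc>{escape(post['url'])}</loc>\n"
            "    <news:news>\n"
            "      <news:publication>\n"
            "        <news:name>Example News</news:name>\n"
            "        <news:language>en</news:language>\n"
            "      </news:publication>\n"
            f"      <news:publication_date>{post['published'].strftime('%Y-%m-%d')}</news:publication_date>\n"
            f"      <news:title>{escape(post['title'])}</news:title>\n"
            "    </news:news>\n"
            "  </url>"
        )

with open("sitemap-news.xml", "w", encoding="utf-8") as f:
    f.write(
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"\n'
        '        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">\n'
        + "\n".join(items)
        + "\n</urlset>\n"
    )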
Then, separately, you would build additional segmented sitemaps for the existing 200,000 pages. Since these are historical pages, you could easily serve them from static files, as they wouldn't need to change once created. With static files, there'd be no server load to serve them each time - the only remaining load would be generating the current News sitemap. (I'd actually recommend you keep each static sitemap to around 25,000 URLs to ensure search engines can crawl them easily.)
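Here's a sketch of that segmentation step, assuming you can pull the full URL list straight out of the WordPress database - the all_urls list below is just a placeholder for that query:
from xml.sax.saxutils import escape

# Placeholder for the full historical URL list pulled from the WordPress
# database (roughly 200,000 permalinks in Aaron's case).
all_urls = [f"https://www.example.com/archive/post-{i}/" for i in range(200000)]

CHUNK = 25000  # well under the 50,000-URL protocol limit, as suggested above

for seg, start in enumerate(range(0, len(all_urls), CHUNK), start=1):
    body = "\n".join(
        f"  <url><loc>{escape(url)}</loc></url>"
        for url in all_urls[start:start + CHUNK]
    )
    # Each segment is written once and then served as a plain static file,
    # so there is no per-request generation cost.
    with open(f"sitemap-archive-{seg}.xml", "w", encoding="utf-8") as f:
        f.write(
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{body}\n"
            "</urlset>\n"
        )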
This approach would involve a bit of fiddling to set up initially, as you'd need to generate the "archive" sitemaps and then convert them to static versions. Once it's in place, though, the News sitemap would take care of itself, and once a month (or at whatever interval you decide) you'd just add the pages "expiring" from the News sitemap to the most recent "archive" segment. A smart programmer might even be able to automate that process.
Does this approach sound like it might solve your problem?
Paul
P.S. Since you'd already have the sitemap index capability, you could also add video and image sitemaps to your site if appropriate.
-
Have you ever tried using a web-based sitemap generator? I'm not sure how it would cope with a site your size, but at least it would be running on someone else's server, right?
Not sure what else to say honestly.
Related Questions
-
Sitemaps: Best Practice
What should and what shouldn't go in the sitemap? In particular, pages like subscribe to our newsletter / unsubscribe from our newsletter - is there really any benefit in highlighting those pages to the SEs? Thanks for any advice / anecdotes 🙂
Intermediate & Advanced SEO | Fubra
-
XML Sitemap Questions For Big Site
Hey Guys, I have a few questions about XML Sitemaps. For a social site that is going to have personal accounts created, what is the best way to get them indexed? When it comes to profiles, I found out that Twitter (https://twitter.com/i/directory/profiles) and Facebook (https://www.facebook.com/find-friends?ref=pf) have directory pages, but Google Plus has XML index pages (http://www.gstatic.com/s2/sitemaps/profiles-sitemap.xml). If we go the XML route, how would we automatically add new profiles to the sitemap? Or is the only option to keep updating your XML profiles using third-party software (sitemapwriter)? If a user chooses not to have their profile indexed (by default it will be index-able), how do we go about deindexing that profile? Is there an automatic way of doing this? Lastly, has anyone dabbled with Google sitemap generator (https://code.google.com/p/googlesitemapgenerator/)? If so, do you recommend it? Thank you!
Intermediate & Advanced SEO | keywordwizzard
-
Custom sitemap or sitemap generator tool
I have recently launched a website which is using a free sitemap generator (http://web-site-map.com/). It's a large travel agency site (www.yougoadventure.com) with predominantly dynamically generated content - users can add their products as and when, and be listed automatically. The guy doing the programming for the site says the sitemap generator is not up to the job and that I should be ranking far better for certain search terms than the site is now. He reckons it doesn't provide lastmod info, and that the sitemap should be submitted every time a new directory is added or a change is made. He seems to think that I need to spend £400-£500 for him to custom-build a sitemap. Surely there's a cheaper option out there for a sitemap that can be generated daily, or that can 'ping' Google every time an addition to the site is made or a product is added? Sorry for the non-tech speak - I've got my web designer telling me one thing and the programmer another, so I'm just left trawling through Q&As. Thanks
Intermediate & Advanced SEO | Curran
-
XML Sitemaps - how to create the perfect XML Sitemap
Hello, We have a site that is not updated very often - currently we have a script running to create/update the XML sitemap every time a page is added/edited or deleted. I have a few questions about best practices for creating XML sitemaps.
1. If the site is not updated for months on end, is it a bad idea to force the script to update, i.e. changing the dates once a month? Will Google notice that nothing has changed except the date, i.e. that all the content on the site is exactly the same? Will they start penalising you for updating an XML sitemap when there is nothing new about the website?
2. Is it worth automating the XML file to link into Bing/Google to update via webmaster tools - as I say, even if the site is never updated?
3. Is the use of "priorities" necessary?
4. The changefreq - does that mean Google/Bing expects to see a new file every month?
5. The ordering of the pages - the script seems pretty random and puts the pages in a random order - should we make it order the pages with the most important ones first? Should the home page always be first?
6. Below is a sample of how our XML sitemap appears - is there anything that we should change, i.e. is it all marked up properly?
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.domain.com</loc><lastmod>2013-11-06</lastmod><changefreq>monthly</changefreq></url>
  <url><loc>http://www.domain.com/contact/</loc><lastmod>2013-11-06</lastmod><changefreq>monthly</changefreq></url>
  <url><loc>http://www.domain.com/sitemap/</loc><lastmod>2013-11-06</lastmod><changefreq>monthly</changefreq></url>
</urlset>
Hope someone can help enlighten us on best practices.
Intermediate & Advanced SEO | JohnW-UK
-
Any Suggestions For My Site?
I've recently started a website that is based on movie posters. The site has fundamentally been built for users and not SEO, but I'm wondering if anyone can see any problems, or has any general advice that may help with our SEO efforts? The "content" on the website is the movie posters. I know Google likes text content, but I don't see what else we could add that wouldn't be purely for SEO. My site is: http://www.bit.ly/ZSPbTA
Intermediate & Advanced SEO | whispertera
-
Site Structure Question
Hi All, Got a question about site structure. I currently have a website where everything is hosted on the root of the domain. See example below:
site.com/men
site.com/men-shorts
site.com/men-shorts-[product name]
I want to change the structure to site.com/men/shorts/[product-name]. I have asked a couple of SEOs, and some agree with me that the structure needs to be changed, while others say that as long as I dictate the structure with internal links and breadcrumbs, the URL structure doesn't matter... What do you guys think? Many thanks, Carlos
Intermediate & Advanced SEO | Carlos-R
-
Ranking a site in the USA
I'm UK based and looking at setting up a site to rank in the USA. As I understand it, a .com TLD is best, but these are used worldwide, so do I simply need to set the geotargeting to USA in Webmaster Tools? Or is there a better domain to use? With the site hosted in the US and on-page content related to US cities (I plan to create a page for each US city I operate in, with the city name in the H1 tag), will that be enough for Google to understand that the page should rank in the US version of Google? Also, how can I view Google USA search results? When I go to google.com it automatically redirects to google.co.uk, and I can only change the location on the left-hand side to UK cities. Any help much appreciated!
Intermediate & Advanced SEO | SamCUK
-
Large Site SEO - Dev Issue Forcing URL Change - 301, 302, Block, What To Do?
Hola, Thanks in advance for reading and trying to help me out. A client of mine recently created a large-scale company directory (500k+ pages) in Drupal v6, while the "marketing" type pages of their site were still in manually hard-coded HTML. They redesigned their "marketing" pages, but used Drupal v7. They're now experiencing server conflicts, with the two instances of Drupal unable to communicate/be on the same server. Eventually the directory will be upgraded to Drupal v7, but that could take weeks to months, and the client does not want to wait for the re-launch. The client wants to push the new marketing site live, but also does not want to ruin the overall SEO value of the directory. They have a few options, but I'm looking to help guide them down the path of least resistance:
Option 1: Move the company directory onto a subdomain and the "marketing site" onto the www. subdomain. The client gets to push their redesign live, but large-scale 301s to the directory cause major issues in terms of shaking up the structure of the site, causing ripple effects into getting pulled out of the index for days to weeks. Rankings and traffic drop, subdomain authority gets lost, and the company directory's health looks bad for weeks to months. However, the 301s maintain partial SEO value and some long-tail traffic still exists. Once the directory gets moved to Drupal v7, it would cancel the 301s to the subdomain and revert back to the original www. subdomain URLs.
Option 2: Block the company directory from search engines with robots.txt and meta instructions, essentially cutting off the floodgates from the established marketing pages. No major scaling 301 ripple effect; the directory takes a few weeks to filter out of the index and traffic is completely lost, but once Drupal v7 gets upgraded and the directory is re-opened, it will slowly regain SEO value and get close to the old rankings, traffic, etc.
Option 3: 302 redirect? Lose all accumulated SEO value temporarily... hmm.
Option 4: Something else? As you can see, this is not an ideal situation. However, a decision has to be made, and I'm looking to choose the lesser of evils. Any help is greatly appreciated. Thanks again -Chris
Intermediate & Advanced SEO | Bacon