Is there a way for me to automatically download a website's sitemap.xml every month?
-
From now on we want to store all our sitemap.xml files over the coming years. It's a nice archive to have that allows us to analyse how many pages we have on our website and which ones were removed or redirected.
Any suggestions?
Thanks
-
If you use a MySQL database to store your website data, I think this kind of automatic "archival" work, done with an automatic PHP script, would take between 2 and 5 hours of work. I don't see why it should take more than that.
If someone tells you that it is going to take more than that, I would be suspicious. Either the programmer is not good enough, or they want to cheat you. That unfortunately happens more often than you think!!
Be sure to ask for a step-by-step description of how they plan to complete the job. If you have doubts, please feel free to ask me; I am a fairly experienced PHP programmer. I don't work for others, just for myself (I built and keep tweaking my own websites virtualsheetmusic.com, musicianspage.com and others with very little help from external programmers).
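To give you a rough idea of the scale of the job, here is a minimal sketch of how the MySQL side of such a script might look, assuming the PDO MySQL extension and allow_url_fopen are available; the table name, credentials and sitemap URL below are just placeholders, not a finished implementation:

```php
<?php
// Minimal sketch: log every URL from the current sitemap into MySQL, one row
// per URL per snapshot date, so page counts and removed URLs can be compared
// later. Credentials, table name and sitemap URL are placeholders.

$pdo = new PDO('mysql:host=localhost;dbname=seo_archive', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec("CREATE TABLE IF NOT EXISTS sitemap_urls (
    snapshot_date DATE NOT NULL,
    url VARCHAR(2048) NOT NULL
)");

// Load the sitemap and insert each <loc> entry with today's date.
$sitemap = simplexml_load_file('https://www.example.com/sitemap.xml');
$insert = $pdo->prepare(
    'INSERT INTO sitemap_urls (snapshot_date, url) VALUES (CURDATE(), ?)'
);
foreach ($sitemap->url as $entry) {
    $insert->execute([(string) $entry->loc]);
}
```

With one row per URL per snapshot, a simple query comparing two dates will show which pages were added or removed between them.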
Good luck!
-
Hi Fabrizo,
Approximately how long would it take a PHP programmer to write this code? Since we would have to outsource it, I would like an estimate so I can gauge the costs involved.
Thanks!
-
The way I would do it is to write a simple PHP (or Perl) program that every day, week or month (as you need it) archives your sitemap.xml to a specific directory on your server, and possibly zips it. As a PHP programmer myself, I can tell you that this is really simple to do. Just ask a PHP programmer; I am sure they can do it in a couple of hours!
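For anyone who wants a starting point, here is a minimal sketch of what such a script might look like, assuming PHP with the zlib extension and allow_url_fopen enabled; the sitemap URL and archive directory are placeholders, not a finished implementation:

```php
<?php
// archive_sitemap.php - minimal sketch of the archival script described above.
// The sitemap URL and archive directory are placeholders; adjust to your site.

$sitemapUrl = 'https://www.example.com/sitemap.xml';
$archiveDir = __DIR__ . '/sitemap-archive';

// Make sure the archive directory exists.
if (!is_dir($archiveDir) && !mkdir($archiveDir, 0755, true)) {
    die("Could not create archive directory $archiveDir\n");
}

// Download the current sitemap.
$xml = file_get_contents($sitemapUrl);
if ($xml === false) {
    die("Could not download $sitemapUrl\n");
}

// Save it under a dated, gzip-compressed filename, e.g. sitemap-2012-05-01.xml.gz
$target = $archiveDir . '/sitemap-' . date('Y-m-d') . '.xml.gz';
file_put_contents($target, gzencode($xml, 9));

echo "Archived sitemap to $target\n";
```

Scheduling it is then a single cron entry, for example `0 3 1 * * php /path/to/archive_sitemap.php` to run it at 03:00 on the first day of every month.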
Related Questions
-
'duplicate content' on several different pages
Hi, I have a website with 6 pages identified as 'duplicate content' because they are very similar. The pages look alike because they share the same layout and each only shows a few pictures for its product category, so every page resembles the others even though they are not exactly the same. Is there any way to indicate to Google that the content is not duplicated? I guess they were marked as duplicate because the code is 90% or more the same across the 6 pages. I've been reviewing the 'canonical' method, but I don't think it is appropriate here since the content is not the same. Any advice (other than adding more content)?
Technical SEO | | jcobo0 -
What's the best way to integrate off-site inventory?
I can't seem to make any progress in rankings or traffic for my car dealership client. I feel like I've ruled out most of the common problems; the only other thing I can see is that all their inventory is on a subdomain run by dedicated auto dealership software. Any suggestions for a better way to handle this situation? Am I missing something obvious? The URL is rcautomotive.com. Thanks for your help!
Technical SEO | | GravitateOnline0 -
Webmaster Tools vs Screaming Frog for 404s
Hey guys, I was just wondering which is better for finding the 404s affecting your site. I have been using Webmaster Tools and just purchased Screaming Frog, which has given me a totally different list of 404s compared to WMT. Which do I use, or do I use both? Cheers
Technical SEO | | Adamshowbiz0 -
Best practice for XML sitemap depth
We run an eCommerce site for education products with 20 or so subject-based catalogues (Maths, Literacy, etc.), each catalogue having numerous ranges (Counting, Maths Games, etc.) and then products within those. We carry approximately 15,000 products. My question is about the sitemap we submit nightly and its depth. It is currently set to cover the home page, catalogues and ranges, plus all static content (about us, etc.). Should we be submitting sitemaps that include product pages as well? Does it matter, or would it not make much difference in terms of search? Thanks in advance.
Technical SEO | | TTS_Group0 -
Is it best to create multiple XML sitemaps for different sections of a site?
I have a client with a very big site that includes a blog, videos, a photo gallery, etc. Is it best to create a separate XML file for each of these sections? It seems to me that would be the best way to keep everything organized, or at least to separate the blog out from the main site. Does anybody have any tips or recommendations? I'm not finding any good information about this.
Technical SEO | | MichaelWeisbaum0 -
Exclude Child URLs from XML Sitemap Generator (WordPress)
Hi all, I was recommended the XML Sitemap Generator for WordPress by the very helpful Keith Bloemendaal and John Pring; however, I can't seem to exclude child URLs. There is a section 'Exclude items' and a subsection 'Exclude posts'. I have tried inputting the URLs for the pages I don't want in the sitemap, but that didn't work. Then I read that you have to include a list of "IDs" - I'm not sure where on earth to find that info. I tried the page name and the post= number from the URL, but neither worked. I hope somebody can point me in the right direction - and apologies, I am a WordPress novice, and I got no answers from the WordPress forums so turned right back to SEOmoz! Cheers.
Technical SEO | | markadoi840 -
Issue with 'Crawl Errors' in Webmaster Tools
I have an issue with a large number of 'Not Found' pages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st - 15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10. Furthermore, I have checked a large number of the source pages to double-check that the links don't still exist, and as I expected they don't. Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not. Secondly, if the errors do not actually exist (and never actually have), why do they remain listed in Webmaster Tools, which claims they were found again this month?! Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors, and that if they keep returning 404s they will automatically be removed. Well, I don't know how many times they need to return that 404 in order to get rid of a URL and link that haven't existed for 18-24 months! Thanks.
Technical SEO | | RiceMedia0 -
If non-paying customers only get a 2-minute snippet of a video, can my video length in sitemap.xml be the full length?
I am working on a website whose primary content is videos. They have an assortment of free videos, but the majority are viewable only with a subscription to the site. If you don't have a subscription, you can see a 2-minute clip of the video's contents, but the full videos can be anywhere from 10 minutes to 1.5 hours. When I am auto-generating the sitemap.xml, can I put the full length of the videos for paying members in the video:duration property? Or, because only 2 minutes is publicly available (unless you pay for a membership), is that frowned upon?
Technical SEO | | nbyloff0