Is there a way for me to automatically download a website's sitemap.xml every month?
-
From now on we want to store all our sitemap.xml files over the coming years. It's a nice archive to have, and it lets us analyse how many pages we have on our website and which ones were removed or redirected.
Any suggestions?
Thanks
-
If you use a MySQL database to store your website data, I think writing a PHP script to do this kind of automatic "archival" work would take between 2 and 5 hours. I don't see why it should take more than that.
If someone tells you it will take longer, I would be suspicious: either the programmer is not good enough, or they are trying to overcharge you. That unfortunately happens more often than you'd think!
Be sure to ask for a step-by-step description of how they plan to complete the job. If you have doubts, feel free to ask me; I am a fairly experienced PHP programmer. I don't work for others, just for myself (I built and keep tweaking my own websites, virtualsheetmusic.com, musicianspage.com and others, with very little help from external programmers).
Good luck!
-
Hi Fabrizio,
Approximately how long would it take a PHP programmer to write this code? Since we would have to outsource it, I'd like an estimate so I can gauge the costs involved.
Thanks!
-
The way I would do it is to write a simple PHP (or Perl) program that runs every day, week or month (as you need), archives your sitemap.xml to a specific directory on your server, and optionally zips it. As a PHP programmer myself, I can tell you that's really simple to do. Just ask a PHP programmer; I'm sure they can do it in a couple of hours!
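To make the approach above concrete: the whole job is "fetch the sitemap, compress it, save it under a dated name". The thread suggests PHP or Perl, but the idea is language-agnostic; here is a minimal sketch in Python. The URL and archive directory are placeholders you would substitute for your own site.

```python
import gzip
import urllib.request
from datetime import date
from pathlib import Path

# Hypothetical values: substitute your site's sitemap URL and archive location.
SITEMAP_URL = "https://www.example.com/sitemap.xml"
ARCHIVE_DIR = Path("sitemap-archive")


def archive_name(day: date) -> str:
    """Build a dated, sortable filename, e.g. 'sitemap-2024-01-31.xml.gz'."""
    return f"sitemap-{day.isoformat()}.xml.gz"


def archive_sitemap() -> Path:
    """Download the sitemap and store it gzip-compressed under a dated name."""
    ARCHIVE_DIR.mkdir(exist_ok=True)
    target = ARCHIVE_DIR / archive_name(date.today())
    with urllib.request.urlopen(SITEMAP_URL) as resp:
        target.write_bytes(gzip.compress(resp.read()))
    return target
```

Scheduling is then a one-line cron entry that calls `archive_sitemap()` from a small wrapper script, e.g. `0 3 1 * * /usr/bin/python3 /path/to/archive_sitemap.py` to run at 03:00 on the first of each month. Because the filenames sort chronologically, comparing any two snapshots later (to see which pages were added or removed) is a simple diff of the two XML files.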