Is there a way for me to automatically download a website's sitemap.xml every month?
-
From now on we want to store all our sitemap.xml files over the coming years. It's a nice archive to have that allows us to analyse how many pages we have on our website and which ones were removed/redirected.
Any suggestions?
Thanks
-
If you use a MySQL database to store your website data, I think this kind of automatic "archival" work, done with an automated PHP script, would take between 2 and 5 hours of work. I don't see why it should take more than that.
If someone tells you it is going to take more than that, I would be suspicious: either the programmer is not good enough, or they want to cheat you. That unfortunately happens more often than you think!
Be sure to ask for a step-by-step description of how they plan to complete the job. If you have doubts, please feel free to ask me; I am a fairly experienced PHP programmer. I don't work for others, just for myself (I built and keep tweaking my own websites, virtualsheetmusic.com, musicianspage.com and others, with very little help from external programmers).
Good luck!
-
Hi Fabrizo,
Approximately how long would it take a PHP programmer to write this code? Since we would have to outsource this, I would like an indication so I can estimate the costs involved.
Thanks!
-
The way I would do it is to write a simple PHP (or Perl) program that every day, week or month (as you need), archives your sitemap.xml to a specific directory on your server, and possibly zips it. As a PHP programmer myself, I can tell you that's really simple to do. Just ask a PHP programmer; I am sure they will have it done in a couple of hours!
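For illustration only, here is a minimal sketch of what such a script could look like. The sitemap URL and archive directory are placeholders you would replace with your own, and it assumes a server where PHP can be run from the command line with the zlib extension and allow_url_fopen enabled:

<?php
// archive_sitemap.php - download the sitemap and keep a dated, gzipped copy.
// Hypothetical values: replace the URL and directory with your own.
$sitemapUrl = 'https://www.example.com/sitemap.xml';
$archiveDir = __DIR__ . '/sitemap-archive';

// Create the archive directory on the first run.
if (!is_dir($archiveDir) && !mkdir($archiveDir, 0755, true)) {
    fwrite(STDERR, "Could not create $archiveDir\n");
    exit(1);
}

// Fetch the current sitemap.
$xml = file_get_contents($sitemapUrl);
if ($xml === false) {
    fwrite(STDERR, "Could not download $sitemapUrl\n");
    exit(1);
}

// Store it as e.g. sitemap-2024-06-01.xml.gz so older copies are never overwritten.
$target = sprintf('%s/sitemap-%s.xml.gz', $archiveDir, date('Y-m-d'));
file_put_contents($target, gzencode($xml, 9));
echo "Saved $target\n";

To run it automatically once a month, a cron entry along these lines (the path is a placeholder) would do:

0 3 1 * * php /path/to/archive_sitemap.php

That runs the script at 03:00 on the first day of every month, so you end up with one dated copy of the sitemap per month in the archive directory.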