Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. Whilst we're not completely removing the content - many posts will remain viewable - we have locked both new posts and new replies.
Export Website into XML File
-
Hi,
I am having an agency optimize the content on my sites, and I need to create an XML schema before I export the content into XML.
What is the best way to export the content, including meta tags, for an entire site, and what are the steps involved?
-
I don't know if it does anything more than make an offline copy. I haven't encountered your use case before, so I haven't looked for that. You might check whether that program, or others like it, has options along those lines that could help you.
-
Will this software be able to export the site in XML, or does it basically just make an offline copy?
-
I've used HTTrack Website Copier (http://www.httrack.com/) before. "Website copy software" is one keyword search to get you started on finding tools like this.
-
That would probably work, Keri. What are the tools you speak of?
-
There are tools that will crawl and scrape your entire site and make a local copy of it. Would that work as something you could hand off to the agency?
-
I want a copy of the site content (on-page content and metadata) to give to an agency to optimize. It's a regular site hosted on an Apache server.
-
Are you talking about a WordPress blog? What are you trying to do by exporting the site content/metadata into an XML file? Are you trying to use it as a backup, or something else?
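To make the crawl-and-scrape idea from this thread concrete: below is a minimal Python sketch that crawls same-host pages and writes each page's title, meta tags, and visible text into a single XML file. This is an illustration, not what HTTrack does; the start URL, output filename, element names, and page limit are all assumptions, and a real export would follow whatever XML schema the agency agrees on.

```python
# Minimal sketch: crawl same-host pages and export each page's title,
# meta tags, and visible text to one XML file. START_URL, OUTPUT, the
# element names, and the page limit are placeholders/assumptions.
import xml.etree.ElementTree as ET
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # assumption: swap in the real site
OUTPUT = "site_export.xml"

def crawl_and_export(start_url, output_path, max_pages=500):
    host = urlparse(start_url).netloc
    seen, queue = {start_url}, deque([start_url])
    root = ET.Element("site", url=start_url)
    while queue and len(root) < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        page = ET.SubElement(root, "page", url=url)
        ET.SubElement(page, "title").text = soup.title.string if soup.title else ""
        # Collect name= and property= meta tags (description, og:*, etc.).
        for meta in soup.find_all("meta"):
            key = meta.get("name") or meta.get("property")
            if key:
                ET.SubElement(page, "meta", name=key).text = meta.get("content", "")
        # Body text only, markup stripped, so the agency sees clean copy.
        ET.SubElement(page, "content").text = soup.get_text(" ", strip=True)
        # Queue unseen same-host links.
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    ET.ElementTree(root).write(output_path, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    crawl_and_export(START_URL, OUTPUT)
```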
Related Questions
-
Any Tips for Reviving Old Websites?
Hi, I have a series of websites that have been offline for seven years. Do you guys have any tips that might help restore them to their former SERPs glory? Nothing about the sites themselves has changed since they went offline: same domains, same content, and only a different server. What has changed is the SERPs landscape. I've noticed that competitive terms these sites used to rank on the first page for now return far more results. I have also noticed that some terms return results with similar, thesaurus-style language from traditionally more authoritative websites instead of the exact phrase searched for. This concerns me because I could see a less relevant page outranking me just because it is on a .gov domain with similar vocabulary, even though the result is not what people searching for the term are most likely looking for. The sites have also lost numerous backlinks but still have some really good ones.
Intermediate & Advanced SEO | CopBlaster.com
-
Website copying in Tweets from Twitter
Just noticed a web developer I work with has been copying tweets into the website - and these are displayed (and saved) one page at a time across hundreds of pages (this is so they can populate a Twitter feed, I am told). How would you tackle this, now that the deed's been done? This is in Drupal. Your thoughts would be welcome, as this is a new one to me. Thanks, Luke
Intermediate & Advanced SEO | McTaggart
-
Website completely delisted - reasons?
Hi, I got a request from a potential client who does not understand why his website cannot be found on Google. I checked and found that the complete website is not listed at all (a complete delisting), except for a single PDF file. I've checked his robots.txt, and it is OK. I've checked the META robots tags, and they are index,follow, so OK so far. I've checked his backlinks but could not find any massive linking from bad pages: just six backlinks, and only four of them from designdomains.com, which looks like a link list or so. I've requested access to their GWT account, if available, in the hope of finding more info, but does anyone have a quick idea what else it could be? What could be the issue? I think they got delisted for some bad reason... Let me know your ideas 🙂 THANX 🙂 Sebi
Intermediate & Advanced SEO | TheHecksler
-
Noindex xml RSS feed
Hey, How can I tell search engines not to index my xml RSS feed? The RSS feed is created by Yoast on WordPress. Thanks, Luke.
Intermediate & Advanced SEO | NoisyLittleMonkey
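A note on the feed question above: an XML feed has no HTML head to carry a robots meta tag, so the usual mechanism is an X-Robots-Tag: noindex HTTP header on feed URLs (newer Yoast SEO versions are said to send this for feeds out of the box, but it's worth verifying). A minimal Python sketch for checking whether a feed already sends that header - the feed URL is a placeholder:

```python
# Minimal check: does a feed URL already send X-Robots-Tag: noindex?
# Feeds have no HTML <head>, so noindex has to arrive as an HTTP header.
import requests

FEED_URL = "https://www.example.com/feed/"  # assumption: swap in the real feed

# HEAD is enough here; swap for GET if the server rejects HEAD requests.
resp = requests.head(FEED_URL, allow_redirects=True, timeout=10)
robots = resp.headers.get("X-Robots-Tag", "")
if "noindex" in robots.lower():
    print(f"OK - feed sends X-Robots-Tag: {robots}")
else:
    print("No noindex header found; configure the server or SEO plugin "
          "to send 'X-Robots-Tag: noindex' for feed URLs.")
```
-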
Archiving a festival website - subdomain or directory?
Hi guys, I look after a festival website whose program changes year in and year out. There are a handful of mainstay events in the festival which remain each year, but there are a bunch of other events which change each year around the mainstay programming. This often results in us redoing the website each year (a frustrating experience indeed!). We don't archive our past festivals online, but I'd like to start doing so for a number of reasons: 1. These past festivals have historical value - they happened, and they contribute to telling the story of the festival over the years. They can also be used as useful windows into the upcoming festival. 2. The old events (while no longer running) often get many social shares, high-quality links, and in some instances still drive traffic. We try our best to 301 redirect these high-value pages to the new festival website, but it's not always possible to find a similar alternative (so these redirects often go to the homepage). Anyway, I've noticed some festivals archive their content into a subdirectory - i.e. www.event.com/2012 - however, I'm thinking it would actually be easier for my team to archive via a subdomain like 2012.event.com and always use the www.event.com URL for the current year's event. I'm thinking universally redirecting the content would be easier, as would cloning the site/database etc. My question is: is one approach (i.e. directory vs. subdomain) better than the other? Do I need to be mindful of using a subdomain for archival purposes? Hope this all makes sense. Many thanks!
Intermediate & Advanced SEO | cos2030
-
Different domains for multilingual website
Hey guys, A site that I'm currently working on has different domains for each website language. So for example: word1word2.com for the English version, word3word4.com for the French version, word5word6.com for the Spanish version... Is it better to move all of the different languages to the same domain and use subfolders for each language, e.g. /fr/? Please note that the domains being used bring in organic traffic, and they are EMDs. Thank you.
Intermediate & Advanced SEO | BruLee
-
XML Sitemap Index Percentage (Large Sites)
Hi all, I'm wanting to find out from those who have experience dealing with large sites (10s/100s of millions of pages): what's a typical (or highest) percentage of indexed pages vs. submitted pages you've seen? This information can be found in Webmaster Tools, where Google shows you the pages submitted and indexed for each of your sitemaps. I'm trying to figure out: 1. the average index % out there; 2. whether there is a ceiling (i.e. it will never reach 100%); 3. whether it's possible to improve the indexing percentage further. Just to give you some background, sitemap index files (per the sitemaps.org protocol) have been implemented to improve crawl efficiency, and I'm wanting to find other ways to improve this further. I've been thinking about looking at the URL parameters to exclude, as there are hundreds (e-commerce site), to help Google improve crawl efficiency and utilise the daily crawl quota more effectively to discover pages that have not been discovered yet. However, I'm not sure yet whether this is the best path to take, or if I'm just flogging a dead horse if there is such a ceiling or if I'm already in the average ballpark for large sites. Any suggestions/insights would be appreciated. Thanks.
Intermediate & Advanced SEO | danng
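For background on the sitemap index mechanics mentioned above: an index file simply lists child sitemaps (up to 50,000 of them, each itself limited to 50,000 URLs under the sitemaps.org protocol). A minimal Python sketch of generating one - the child sitemap URLs are placeholders:

```python
# Minimal sketch: write a sitemap index per the sitemaps.org protocol.
# The child sitemap URLs are placeholders; an index may reference up to
# 50,000 child sitemaps, each of which may list up to 50,000 URLs.
import xml.etree.ElementTree as ET
from datetime import date

CHILD_SITEMAPS = [  # assumption: your real per-section sitemap URLs
    "https://www.example.com/sitemap-products-1.xml.gz",
    "https://www.example.com/sitemap-products-2.xml.gz",
    "https://www.example.com/sitemap-categories.xml.gz",
]

index = ET.Element("sitemapindex", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc in CHILD_SITEMAPS:
    sm = ET.SubElement(index, "sitemap")
    ET.SubElement(sm, "loc").text = loc
    ET.SubElement(sm, "lastmod").text = date.today().isoformat()

ET.ElementTree(index).write("sitemap_index.xml", encoding="utf-8", xml_declaration=True)
```
-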
Sitemaps. When compressed do you use the .gz file format or the (untidy looking, IMHO) .xml.gz format?
When submitting compressed sitemaps to Google I normally use a file named sitemap.gz. A customer is banging on that his web guy says sitemap.xml.gz is a better format. Google spiders sitemap.gz just fine, and in Webmaster Tools everything looks OK... Interested to know other SEOmoz Pros' preferences here, and also to check I haven't made an error that is going to bite me in the ass soon! Over to you.
Intermediate & Advanced SEO | NoisyLittleMonkey
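On the compression question itself: the .gz vs. .xml.gz choice is only a filename convention; the compressed bytes are identical, and what matters is that the file is valid gzip and is submitted under the exact name used (as the question notes, Google spiders sitemap.gz just fine). A minimal Python sketch of producing the compressed file - paths are placeholders:

```python
# Minimal sketch: gzip an existing sitemap. The .gz vs .xml.gz choice is
# purely a naming convention; the compressed bytes are identical.
import gzip
import shutil

SRC = "sitemap.xml"       # assumption: your uncompressed sitemap
DEST = "sitemap.xml.gz"   # or "sitemap.gz" - crawlers accept either

with open(SRC, "rb") as f_in, gzip.open(DEST, "wb") as f_out:
    shutil.copyfileobj(f_in, f_out)
```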