Export Website into XML File
-
Hi,
I am having an agency optimize the content on my sites, and I need to create an XML schema before I export the content into XML.
What is the best way to export the content for an entire site, including meta tags, and what are the steps to do it?
-
I don't know if it does anything more than make an offline copy. I haven't encountered your use case before, so I haven't looked for that. You might check whether that program, or others like it, has export options that could help you.
-
Will this software be able to export the site in XML, or does it basically just make an offline copy?
-
I've used HTTrack Website Copier (http://www.httrack.com/) before. "Website copy software" is one keyword search to get you started on finding tools like this.
-
That would probably work, Keri. Which tools do you have in mind?
-
There are tools that will crawl and scrape your entire site and make a local copy of it. Would that work as something you could hand off to the agency?
-
I want a copy of the site content (on-page content and meta data) to give to an agency to optimize. It's a regular site hosted on an Apache server.
-
Are you talking about a WordPress blog? What are you trying to do by exporting the site content and meta data into an XML file? Are you trying to use it as a backup, or something else?
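A minimal sketch of the crawl-and-export approach discussed above, assuming Python with the requests and beautifulsoup4 libraries installed (the start URL and output filename are placeholders):

```python
import xml.etree.ElementTree as ET
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"  # placeholder: the site's root URL

def crawl_and_export(start_url, max_pages=500):
    """Breadth-first crawl of a single domain, exporting each page's
    title, meta description, and body text to one XML file."""
    domain = urlparse(start_url).netloc
    seen, queue = set(), [start_url]
    root = ET.Element("site", domain=domain)

    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip pages that fail to load
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue  # skip images, PDFs, etc.
        soup = BeautifulSoup(resp.text, "html.parser")

        page = ET.SubElement(root, "page", url=url)
        title = soup.find("title")
        ET.SubElement(page, "title").text = title.get_text(strip=True) if title else ""
        desc = soup.find("meta", attrs={"name": "description"})
        ET.SubElement(page, "meta_description").text = desc.get("content", "") if desc else ""
        ET.SubElement(page, "body_text").text = soup.get_text(" ", strip=True)

        # Follow same-domain links only, dropping #fragments
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in seen:
                queue.append(link)

    ET.ElementTree(root).write("site-export.xml", encoding="utf-8", xml_declaration=True)

crawl_and_export(START_URL)
```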
Related Questions
-
My website is penalized by Google with no message in GWT.
On 26 October 2018 my website had around 1 million pages indexed on Google, but an hour later when I checked, my website was banned from Google and all pages were removed. I checked my GWT and did not receive any message. Can anyone tell me what the possible reasons are and how I can recover my website? My website link is https://www.whoseno.com
Intermediate & Advanced SEO | WhoseNo
-
Regex in Disavow Files?
Hi, Will regex expressions work in a disavow file? If I include website.com/* will that work, or would you recommend just website.com? Thanks.
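For reference, Google's documented disavow file format supports only two kinds of lines (plus # comments), so a file looks like this:

```
# Lines starting with # are comments
# Disavow a single URL
http://spam.example.com/bad-page.html
# Disavow every URL on a domain
domain:spam.example.com
```

Regex and wildcards aren't part of that documented syntax, so domain:website.com is the way to cover a whole site rather than website.com/*.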
Intermediate & Advanced SEO | Fubra
-
Website Snippet Update in Search Console?
I have a company that I started working with that has an outdated and inaccurate snippet coming up. See the link below. They changed their name from DK on Pittsburgh Sports to just DK Pittsburgh Sports several years ago, but the snippet still shows the old info, including an outdated and incorrect description. I'm not seeing that title or description anywhere on the site or in a schema plugin. How can we get it updated? I have updated titles, etc. for the home page and done a Fetch to get re-indexed. Does the snippet have a different type of refresh that I can submit or edit? Thanks in advance https://g.co/kgs/qZAnAC
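One thing worth testing (a hedged sketch; the values below are placeholders, not the client's real markup): explicit Organization markup on the home page gives Google an unambiguous, machine-readable brand name and description to work from:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "DK Pittsburgh Sports",
  "url": "https://www.example.com/",
  "description": "Placeholder description reflecting the current brand."
}
</script>
```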
Intermediate & Advanced SEO | jeremyskillings
-
How to integrate two websites, post-merger?
One of my clients has just been bought by a much larger company and thus will be losing their website and brand name. My client's site has built up a lot of traffic and authority in its space, so we are very nervous about losing all of this after the sale has gone through. The purchasing company intends for my client's services to be represented on its own website, so I am wondering, from a technical standpoint, what the best way is to go ahead with this, since my client will continue to work with the new company and would like to keep us on board. Should we do an 80/20 analysis, recreate our most valuable pages (e.g. 70%+ of traffic goes to the home page) on the new site, then 301 each of these pages individually to its equivalent on the new site, while retaining as much of the old pages' on-page content/structure as possible? One thing I am concerned about is the fact that a large chunk of traffic is from brand searches. Again, should we simply recreate the home page with a page title of e.g. "X company is now part of Y company" so that we'll still rank highly for the old company's brand name? Any advice on how to go about this is much appreciated.
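On the mechanics, a per-page 301 map is typically a list of one-to-one redirects with a catch-all fallback. A rough Apache sketch for the old domain, with all target URLs as placeholders:

```apache
# .htaccess on the old domain (all target URLs are placeholders)
# One-to-one 301s for the highest-traffic pages first; first match wins
Redirect 301 /services https://www.new-company.example/old-brand-services
Redirect 301 /about https://www.new-company.example/about-old-brand

# Catch-all: anything without a direct equivalent goes to the new home page
RedirectMatch 301 ^/(.*)$ https://www.new-company.example/
```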
Intermediate & Advanced SEO | zakkyg
-
Large robots.txt file
We're looking at potentially creating a robots.txt file with 1,450 lines in it. This would remove 100k+ pages from the crawl, all of which are old pages (I know the ideal would be to delete/noindex, but that's not viable, unfortunately). Now the issue I'm thinking of is that a large robots.txt will either stop the robots.txt from being followed or will slow our crawl rate down. Does anybody have any experience with a robots.txt of that size?
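For what it's worth, Google's robots.txt parser supports * wildcards and $ end-anchors, so long lists of similar paths can often be collapsed into a few pattern rules. A sketch with placeholder paths:

```
User-agent: *
# A whole retired section in one rule
Disallow: /archive/
# Pattern rules: * matches any characters, $ anchors the end of the path
Disallow: /*-old.html$
Disallow: /news/2009/
```

On size specifically, Google documents a 500 KiB limit for robots.txt; 1,450 short lines is normally well under that.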
Intermediate & Advanced SEO | ThomasHarvey
-
How to de-index old URLs after redesigning the website?
Thank you for reading. After redesigning my website (5 months ago), in my crawl reports (Moz, Search Console) I still get tons of 404 pages, which all seem to be URLs from my previous website (same root domain). It would be nonsense to 301 redirect them, as there are too many URLs (or would it be nonsense?). What is the best way to deal with this issue?
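If the old URLs share recognizable patterns, one option (a sketch with placeholder paths, not a prescription) is a few pattern-based Apache rules instead of thousands of one-to-one redirects:

```apache
# .htaccess sketch (placeholder patterns - adjust to the old URL structure)
# 410 Gone tells crawlers the pages are intentionally removed
RedirectMatch 410 ^/old-blog/.*$

# Where a new equivalent exists, one pattern can 301 a whole section
RedirectMatch 301 ^/products/(.*)$ https://www.example.com/shop/$1
```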
Intermediate & Advanced SEO | Chemometec
-
All page files in root? Or to use directories?
We have thousands of pages on our website - news articles, forum topics, download pages, etc. - and at present they all reside in the root of the domain /. For example:
/aosta-valley-i6816.html
/flight-sim-concorde-d1101.html
/what-is-best-addon-t3360.html
We are considering moving over to a new URL system where we use directories. For example, the above URLs would become the following:
/images/aosta-valley-i6816.html
/downloads/flight-sim-concorde-d1101.html
/forums/what-is-best-addon-t3360.html
Would we have any benefit in using directories for SEO purposes? Would our current system perhaps mean too many files in the root, flagging us as spammy? Would it be even better to use the following system, which removes file endings completely and suggests each page is a directory:
/images/aosta-valley/6816/
/downloads/flight-sim-concorde/1101/
/forums/what-is-best-addon/3360/
If so, what would be better: /images/aosta-valley/6816/ or /images/6816/aosta-valley/? Just looking for some clarity on our problem! Thank you for your help, guys!
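On the technical side, the extensionless version is often implemented with a rewrite layer rather than by moving files. A rough Apache mod_rewrite sketch, leaving the existing flat files in place (illustrative patterns, assuming the i/d/t letters encode the section):

```apache
RewriteEngine On
# Map the new directory-style URLs onto the existing flat files
RewriteRule ^images/([a-z0-9-]+)/([0-9]+)/?$    /$1-i$2.html [L]
RewriteRule ^downloads/([a-z0-9-]+)/([0-9]+)/?$ /$1-d$2.html [L]
RewriteRule ^forums/([a-z0-9-]+)/([0-9]+)/?$    /$1-t$2.html [L]
```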
Intermediate & Advanced SEO | Peter264
-
XML sitemap advice for a website with over 100,000 articles
Hi, I have read numerous articles that support submitting multiple XML sitemaps for websites that have thousands of articles... in our case we have over 100,000. So, I was thinking I should submit one sitemap for each news category. My question is: how many page levels should each sitemap instruct the spiders to crawl? Would it not be enough to just submit the top-level URL for each category and then let the spiders follow the rest of the links organically? If that's true, and I have 12 categories, would the total number of URLs be just 12? If so, how do you suggest handling our home page, where the latest articles are displayed regardless of their category? I.e., the spiders will find links to a given article both on the home page and in the category it belongs to. We are using canonical tags. Thanks, Jarrett
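For reference on the mechanics: a sitemap index file references one child sitemap per category, and each child lists individual article URLs (the protocol allows up to 50,000 URLs per file), so the submission covers far more than 12 URLs. A minimal sketch with placeholder locations:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One child sitemap per news category (placeholder URLs);
       each child lists individual article URLs, up to 50,000 per file -->
  <sitemap>
    <loc>https://www.example.com/sitemaps/news-politics.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.example.com/sitemaps/news-sports.xml</loc>
  </sitemap>
</sitemapindex>
```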
Intermediate & Advanced SEO | jarrett.mackay