How do you create an XML sitemap for a large website?
-
We need to create an XML sitemap for a website that has more than 2 million pages. Please suggest the best software for generating an XML sitemap for a site this size.
Since larger websites use different strategies for submitting sitemaps, what is the best way to submit a sitemap for a website of this size?
Or is there any tool provided by SEOmoz for XML sitemap generation for larger websites?
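One constraint worth knowing for a site this size: the sitemaps.org protocol caps each sitemap file at 50,000 URLs and 50 MB uncompressed, so 2 million pages need 40+ sitemap files tied together by a sitemap index file, and you submit only the index. A minimal sketch of the chunking (the domain, filenames, and URL list are placeholders, not anything from this thread):

```python
from xml.sax.saxutils import escape

MAX_URLS = 50_000  # per-file limit from the sitemaps.org protocol

def write_sitemaps(urls, base="https://www.example.com/sitemaps"):
    """Split `urls` into 50k-URL sitemap files and build a sitemap index.

    Returns ([(filename, xml_text), ...], index_xml_text).
    """
    files = []
    for i in range(0, len(urls), MAX_URLS):
        chunk = urls[i:i + MAX_URLS]
        name = f"sitemap-{i // MAX_URLS + 1}.xml"
        body = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in chunk)
        xml = ('<?xml version="1.0" encoding="UTF-8"?>\n'
               '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
               f"{body}\n</urlset>\n")
        files.append((name, xml))
    index_body = "\n".join(
        f"  <sitemap><loc>{base}/{name}</loc></sitemap>" for name, _ in files)
    index = ('<?xml version="1.0" encoding="UTF-8"?>\n'
             '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
             f"{index_body}\n</sitemapindex>\n")
    return files, index
```

With an approach like this you would upload the child files, then submit just the index URL in Google/Bing webmaster tools; the crawlers discover the individual sitemaps from it.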
-
Thanks a lot for your suggestion. We successfully crawled up to the 500-URI limit using the trial version of Screaming Frog. We will look into the licensed version of the software.
-
Our website is built with ASP, and there is no plugin/extension for XML sitemaps.
-
Screaming Frog is quite good, as I remember; it will crawl all your URLs and then create the sitemap for you. You may need to adjust the "priority" of the pages, but the hard work is pretty much done for you.
-
What's the website built with? Doesn't it have a module/plugin/extension for XML sitemaps?
Related Questions
-
Can I define one area of my website as regular news (no subscription) and the other part as news that only subscribers can read?
Hi, I have a client with a news website. He asked me whether he can define one area of his website as regular news that Google can show in Google News search results (no subscription), while the other part of the website is news that only subscribers can read? Thanks, Roy
Intermediate & Advanced SEO | | kadut1 -
Can you create town focused landing pages for a website without breaking Google guidelines?
I recently watched a webmaster video that said that town focused landing pages are seen as doorway pages if they only exist to capture search traffic. And then I read that just because you can sell your product/service in a certain area, doesn't mean you can have a page for it on your website. Is it possible to create town focused landing pages for a website without breaking Google guidelines?
Intermediate & Advanced SEO | | Silkstream1 -
Best Way to Create SEO Content for Multiple, International Websites
I have a client that has multiple websites serving different countries. For instance, they have a .com website for the US (abccompany.com), a .co.uk website for the UK (abccompany.co.uk), a .de website for Germany (abccompany.de), and so on. They have websites for the Netherlands, France, and even China. These all act as separate websites: they have their own addresses, their own content (some duplicated but translated), their own pricing, their own Domain Authority, their own backlinks, etc.

Right now, I write content for the US site. The goal is to write content for long- and medium-tail keywords. However, the UK site is interested in having me write content for them as well. The issue I'm having is how to differentiate the content, and what the best way is to target content for each country.

Does it make sense to write separate content for each website to target results in that country? The .com site will still show up fairly high in UK web results. Does it make sense to just duplicate the content, but in a different language or for the specific audience in that country?

I guess the biggest question I'm asking is: what is the best way of creating content for multiple countries' search results? I don't want the different websites to compete with each other, nor do I want to spend extra time trying to rank content for multiple sites when I could just focus on ranking one for all countries. Any help is appreciated!
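One standard mechanism for keeping country sites like these from competing with each other is hreflang annotations, where each page lists every language/country alternate of itself (including itself). A sketch using the domains mentioned in the question; the `/widgets/` path is purely illustrative:

```html
<link rel="alternate" hreflang="en-us" href="https://abccompany.com/widgets/" />
<link rel="alternate" hreflang="en-gb" href="https://abccompany.co.uk/widgets/" />
<link rel="alternate" hreflang="de-de" href="https://abccompany.de/widgets/" />
<link rel="alternate" hreflang="x-default" href="https://abccompany.com/widgets/" />
```

The same annotations can alternatively be delivered via HTTP headers or XML sitemap entries; the key requirement is that the alternates point at each other reciprocally.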
Intermediate & Advanced SEO | | cody1090 -
Bing seriously hitting our website
Hi, I have a strange query. Over the last four weeks, Bing has been seriously crawling our site, and while this isn't affecting our servers at the moment, coming into the busy time of the year I am wondering if anyone else is seeing this. They are basically crawling the entire site (including nofollow pages), even though, according to Bing Webmaster Tools, they know our sitemap. Is anybody else seeing any unusual activity from the Bing bot? Thanks, Andy
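If the crawl rate ever does start to strain the servers, Bing honors a `Crawl-delay` directive in robots.txt (Bing Webmaster Tools also offers a crawl-rate control). A sketch; the 5-second value is an arbitrary example to tune for your own capacity:

```text
User-agent: bingbot
Crawl-delay: 5
```

Note that nofollow on a link only declines to pass equity; it does not forbid crawling the target URL, which would need robots.txt `Disallow` rules or a noindex directive instead.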
Intermediate & Advanced SEO | | Andy-Halliday0 -
Google Processing but Not Indexing XML Sitemap
Like it says above, Google is processing but not indexing our latest XML sitemap. I noticed this Monday afternoon - Indexed status was still Pending - and didn't think anything of it. But when it still said Pending on Tuesday, it seemed strange. I deleted and resubmitted our XML sitemap on Tuesday. It now shows that it was processed on Tuesday, but the Indexed status is still Pending. I've never seen this much of a lag, hence the concern. Our site IS indexed in Google - it shows up with a site:xxxx.com search with the same number of pages as it always has. The only thing I can see that triggered this is Sunday the site failed verification via Google, but we quickly fixed that and re-verified via WMT Monday morning. Anyone know what's going on?
Intermediate & Advanced SEO | | Kingof50 -
How important are sitemap errors?
If there aren't any crawling/indexing issues with your site, how important do you think sitemap errors are? Do you work to always fix all errors? I know here: http://www.seomoz.org/blog/bings-duane-forrester-on-webmaster-tools-metrics-and-sitemap-quality-thresholds Duane Forrester mentions that sites with many 302s/301s will be punished. Does anyone know Google's take on this?
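One way to spot the 301/302 problem Duane Forrester describes is to pull the URLs out of a sitemap and check each one's status code without following redirects. A minimal standard-library sketch (the helper names are my own, and a real run needs network access to your site):

```python
import urllib.request
from urllib.error import HTTPError
from xml.etree import ElementTree

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(xml_text):
    """Extract every <loc> value from a urlset sitemap document."""
    root = ElementTree.fromstring(xml_text)
    return [loc.text.strip() for loc in root.iter(NS + "loc")]

class _NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # surface 301/302 as an HTTPError instead of following it

def status_of(url):
    """HEAD-request `url` and return its raw HTTP status code."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.build_opener(_NoRedirect).open(req) as resp:
            return resp.status
    except HTTPError as err:
        return err.code  # 3xx/4xx/5xx land here
```

Anything that comes back 301/302 (or 404) is a candidate to fix or drop from the sitemap, since search engines treat sitemap quality as a trust signal.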
Intermediate & Advanced SEO | | nicole.healthline0 -
How does a competing website with clearly black-hat SEO tactics have a far higher domain authority than our website, which only uses legitimate link building tactics?
Through the SEOmoz link analysis tools, we looked at a competing website's external followed links and discovered a large number of links going to blog pages on domains with authorities in the 90s (the blog pages' authorities were between 40 and 60). However, the single blog post written by this website was exactly the same in every instance and had been posted in August 2011. Some of these blog sites had 160 or so links back to this competing website, whose domain authority is 49 while ours is 28; their MozTrust is 5.43 while ours is 5.18. Examples of the blogs that link to the competing website: http://advocacy.mit.edu/coulter/blog/?p=13 http://pest-control-termite-inspection.posterous.com/ However, many of these links are "nofollow" and yet still show up on Open Site Explorer as some of this competing website's top linking pages. Admittedly, they have 584 linking root domains while we have only 35, but if most of them are the kind of websites posted above, we don't understand how Google is rewarding them with a higher domain authority. Our website is www.anteater.com.au. Are these tactics now the only way to get ahead?
Intermediate & Advanced SEO | | Peter.Huxley590 -
Sitemap not indexing pages
My website has about 5,000 pages submitted in the sitemap, but only 900 are being indexed. When I checked Google Webmaster Tools about a week ago, 4,500 pages were being indexed. Any suggestions about what happened or how to fix it? Thanks!
Intermediate & Advanced SEO | | theLotter0