XML Sitemap for a large automobile website
-
Hello Moz fellas, I need expert advice for PakWheels about XML sitemap generation. There are hundreds of thousands of pages (mostly user-generated) and they are increasing day by day. What is the best practice for managing all these pages in an XML sitemap? Where can we generate a sitemap.xml to submit to Google and Bing Webmaster Tools?
Your input would help us manage these URLs in XML format. Thanks
-
Hi,
Are your pages manually created or driven from a database? If they are in a database of some kind, you should be able to query for the URLs and generate the XML sitemap from the results. If you can't output the sitemap to the standard /sitemap.xml location (for example, if it ends up being sitemap.php instead), make sure you list the sitemap's URL in your robots.txt file.
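As a sketch of that robots.txt pointer (the sitemap URL here is hypothetical):

```text
# robots.txt at the site root
# Point crawlers at the sitemap's actual location if it isn't /sitemap.xml
Sitemap: https://www.example.com/sitemap.php
```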
If the pages aren't database driven, or if your developers aren't able to create an XML sitemap for you, I'd recommend this service:
http://www.xml-sitemaps.com/standalone-google-sitemap-generator.html
You can set up a scheduled job on your server to run this on whatever schedule you want. I've used it on a number of client sites where a database-driven XML sitemap isn't an option.
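For the database-driven case, the query-and-generate approach can be sketched roughly as follows. This is a minimal illustration, not PakWheels' actual setup; fetch_urls() is a hypothetical stand-in for whatever database query returns the page URLs.

```python
from xml.sax.saxutils import escape


def fetch_urls():
    """Hypothetical stand-in for a real database query returning absolute page URLs."""
    return [
        "https://www.example.com/",
        "https://www.example.com/used-cars/",
    ]


def build_sitemap(urls):
    """Render a sitemaps.org-format XML sitemap for the given URLs."""
    entries = "\n".join(
        "  <url><loc>%s</loc></url>" % escape(u) for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        + entries + "\n</urlset>\n"
    )


if __name__ == "__main__":
    # Write the sitemap where crawlers expect it; run this from a daily cron job.
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(build_sitemap(fetch_urls()))
```

Keep in mind the sitemap protocol caps each file at 50,000 URLs, so a site with hundreds of thousands of pages needs several numbered sitemap files plus a sitemap index file listing them.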
Hope that helps.
Related Questions
-
URL indexed but not submitted in sitemap, however the URL is in the sitemap
Dear Community, I have the following problem and it would be super helpful if you could help. Cheers!
Symptoms: In Search Console, Google says that some of our old URLs are indexed but not submitted in the sitemap. However, those URLs are in the sitemap, and the sitemap has been successfully submitted with no error message.
Potential explanation: We have an automatic cache-clearing process that runs once a day, and we use that as the last modification date in the sitemap. Imagine www.example.com/hello was last modified in 2017. Because the cache is cleared daily, the sitemap will say last modified: yesterday, even though the content of the page hasn't changed since 2017. We also have a Z after the sitemap time; could it be that the bot doesn't understand the time format? Finally, the sitemap contains only HTTP URLs, while our HTTPS URLs are not in it. What do you think?
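For what it's worth, the trailing Z is valid: the sitemaps.org protocol uses W3C Datetime, where Z denotes UTC. An entry reflecting the page's real modification date (rather than the daily cache clear) would look like this, with the URL and timestamp taken as illustrative values from the question:

```xml
<url>
  <loc>http://www.example.com/hello</loc>
  <!-- W3C Datetime format; the trailing Z means UTC and is valid -->
  <lastmod>2017-06-01T12:00:00Z</lastmod>
</url>
```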
Intermediate & Advanced SEO | ZozoMe -
In Search Console, why is the XML sitemap "issue" count 5x higher than the URL submission count?
Google Search Console is telling us that there are 5,193 sitemap "issues": URLs that are in the XML sitemap but blocked by robots.txt. However, there are only 1,222 total URLs submitted in the XML sitemap, and I only found 83 instances of URLs that fit their example description. Why is the number of "issues" so high? Does it compound over time as Google re-crawls the sitemap?
Intermediate & Advanced SEO | FPD_NYC -
Same website with separate subfolders, or separate websites? 12 stores in two cities
I have a situation where there are 12 stores in separate suburbs across two cities. Currently the chain has one eCommerce website. I could keep the one website, with all the attendant link-building benefits of a single domain, and keep a separate webpage for each store with address details to assist with local SEO. But (1) each store has slightly different inventory, and (2) I would like to garner the (local) SEO benefits of appearing in a searcher's suburb. So I'm wondering if I should go down the subfolder route, with each store having its own eCommerce store and blog, e.g. example.com/suburb? This is sort of what Apple does (albeit with countries) and is cited as a best practice for international SEO (according to a Moz seminar I watched a while back). Or I could go down the route of a separate eCommerce website domain for each store, but I feel that is too much effort for not much extra return. Any thoughts? Thanks, Bruce.
Intermediate & Advanced SEO | BruceMcG -
Understanding how to fix a 403 issue with my website
Hi guys, I hope you can help solve a mystery for me! My site FranceForFamilies.com has been around for 9 years and has always ranked well, at least until I launched a new WordPress version earlier this year. The purpose of the relaunch was to improve the look of the site, so I kept the content and meta titles the same but created a new design. However, from the day of the launch the search engine rankings plummeted, to the point where most seem to have disappeared altogether. I have found that when Moz crawls the site, it only crawls one page. I asked the Moz team about this and they said that the site is returning a 403. They also tested it with curl (curl -I www.franceforfamilies.com/) and received an HTTP/1.1 406 Not Acceptable response. However, when I check our Google Webmaster Tools I can't recreate the issue. I don't really know what is going on, and I don't have the technical knowledge to solve this. Can you help? Thanks, Daniel
Intermediate & Advanced SEO | LeDanJohnson -
How would the rich snippets be treated in AJAX website?
Hi guys, we have started to rewrite our website http://www.edamam.com in AJAX, and the idea is to have the whole site on AJAX within the next few months. Although it would probably be difficult to index even with the Google AJAX crawling protocol, and some other issues might appear, the engineers insist that from a technology point of view this is the best way to go. We have already rewritten the internal search result pages, e.g. http://www.edamam.com/recipes/pasta, and last week we applied the Google AJAX crawling protocol to some of the individual recipe pages to test it. I'd like to ask your opinion on whether the rich snippets we have in the search results will be affected by this change. Are there specific actions we need to take to preserve them? What other hot tips do you have for dealing with AJAX on any level of the website? Thanks in advance, Lily
Intermediate & Advanced SEO | wspwsp -
Rel=canonical an iframed version of the same website?
My issue is that we have two websites with the same content. For the sake of an example, let's say they are: jackson.com and jacksonboats.com. When you go to jacksonboats.com, the website is an iframed version of jackson.com. However, all of the company's email addresses are example@jacksonboats.com, so a 301 is not possible. What would be the best way to forward the link juice from jacksonboats.com to jackson.com? I'm thinking a rel=canonical tag, but I wanted to ask first. Thanks,
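A sketch of the rel=canonical approach, using the example domains from the question: each page served under jacksonboats.com would carry a canonical tag pointing at its counterpart on jackson.com (the path shown here is a hypothetical placeholder).

```html
<!-- In the <head> of each jacksonboats.com page -->
<link rel="canonical" href="http://jackson.com/some-boat-page/" />
```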
Intermediate & Advanced SEO | BenGMKT -
New Website. Changing TLD or not?
Hi,
At my company we are building a new website because the days of the old one are numbered. We have already decided that the folder structure will change so we have more "clean" URLs. Now we would also like to change from .net/nl to .nl. Since we are already redirecting all URLs (>10,000), we think this is the moment to switch the TLD. What do you think? Has anyone got experience or tips they would like to share?
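For what it's worth, a migration like this is usually handled with one blanket rewrite rule rather than 10,000 individual redirects. A rough Apache .htaccess sketch, using placeholder domains since the real ones aren't given:

```apache
# Send every /nl/... path on the old .net domain to the same path on the new .nl TLD
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.net$ [NC]
RewriteRule ^nl/(.*)$ https://www.example.nl/$1 [R=301,L]
```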
Intermediate & Advanced SEO | SEO_ACSI -
Need a mobile XML Sitemap?
We're going to be running our mobile site on the same domain, generating content for users on mobile devices with style sheets (there will be no m.domain). The content at our URLs will be exactly the same. My question is whether we need to create a mobile XML Sitemap to submit to the search engines. Do we need to create a Sitemap that contains the exact same URLs as our non-mobile Sitemap, just with <mobile:mobile/> tags inside each URL entry? Or do we need a mobile Sitemap at all to alert the search engines that we have mobile content? Thanks!
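For reference, Google's mobile sitemap format (which was designed for feature-phone content) annotates each URL with an empty mobile:mobile element under a dedicated namespace, along these lines:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:mobile="http://www.google.com/schemas/sitemap-mobile/1.0">
  <url>
    <loc>http://www.example.com/page</loc>
    <mobile:mobile/>
  </url>
</urlset>
```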
Intermediate & Advanced SEO | bonnierSEO