Multiple sitemaps for various media?
-
Hello,
We have always included videos, pages, and images in the same sitemap.xml file, and after reading through Google's sitemap documentation, I am wondering if we should break those up into separate sitemaps by type (i.e., one for video, one for images, etc.). If so, how should we name the files and submit them? And should I then also submit a sitemap index file that references them? Note: we have a normal amount of images, videos, and pages; we're not an ecommerce site.
Thanks in advance
-
By DW do you mean DreamWeaver or DemandWare?
I wouldn't build an XML sitemap in Dreamweaver, and I'm sure Demandware has a built-in tool for this.
You can search Google for "free XML sitemap generator" or something similar and find a few good options. I still use the one from Audit My PC from time to time, but there are many others. The one below does include images and video, though I don't know whether it segments them. Worth a try: http://www.xml-sitemaps.com/
-
Hello. No, we actually use DW. I have used Yoast in the past; however, I didn't know about the video plugin. Thanks! Any thoughts on what to use with DW?
Thanks,
L
-
Are you running WordPress as your CMS? If so, give WordPress SEO by Yoast a try along with the additional Video plugin. It's the best out there and takes care of everything related to sitemaps for videos, images, and normal URLs.
-
For pages I use Screaming Frog or http://www.web-site-map.com/ (if I need it quick and dirty and it's under 3k or so pages). For images I don't know of any tools; I usually craft that sitemap by hand, as some attention to detail is needed (geo-location and so on).
It should be fast if you can export the data with a script and then edit it when and where needed.
But again, you need to see the ROI for this. If you don't have that many images and you would need to spend a lot of time building the XML image sitemap, it's not really worth it. If you can deploy one fast, in under 30-60 minutes of work, then it might be worth having it there.
Just my 2c.
-
Thank you so much! Is there a special tool or sitemap generator you use or could recommend for culling the images, videos, and content respectively? Or do I need to build it manually?
Thanks so much again.
-
Hi,
Sitemaps help speed up the indexing process and also let you "manage" the files: you get feedback on when those files were processed, and so on.
It's a good idea to split the XML sitemaps into image, content, and video sitemaps. It's easier to manage and get feedback from each, and there are different tags for each type.
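To illustrate the different tags: Google's image and video sitemap extensions add their own namespaces and elements to an ordinary URL entry. A sketch with placeholder URLs (not from any site in this thread), showing one image entry and one video entry:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <!-- A page with an image attached to it -->
  <url>
    <loc>http://www.example.com/gallery.html</loc>
    <image:image>
      <image:loc>http://www.example.com/images/shoe.jpg</image:loc>
    </image:image>
  </url>
  <!-- A page with a video attached to it -->
  <url>
    <loc>http://www.example.com/video-page.html</loc>
    <video:video>
      <video:thumbnail_loc>http://www.example.com/thumbs/clip.jpg</video:thumbnail_loc>
      <video:title>Example clip</video:title>
      <video:description>Short description of the clip.</video:description>
      <video:content_loc>http://www.example.com/videos/clip.mp4</video:content_loc>
    </video:video>
  </url>
</urlset>
```

You can mix types in one file like this, but splitting them into separate files, as suggested here, makes per-type feedback in Webmaster Tools easier to read.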
As far as names - it doesn’t matter. Just use something that makes sense for you – in order to be able to manage them. For example I use sitemap-images-v2.xml, sitemap-images-v3.xml, same with video and content. I also split and use multiple files for content as for me, it helps to add a new smaller xml sitemap then replace the old one - again, mainly to get feedback and see how google is processing those.
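If you do split the files, you can also submit a single sitemap index that points at all of them, which covers the "directory sitemap" part of the original question. A minimal sketch, assuming the naming convention above and a placeholder domain:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-images-v2.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-video-v2.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-content-v2.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit just the index URL in Webmaster Tools; the individual files are discovered from it.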
Hope it helps.
Related Questions
-
Multiple robots.txt files on server
Hi! I previously hired a developer to put up my site and noticed afterwards that he did not know much about SEO. This led me to start learning myself and applying changes step by step. One of the things I am currently doing is inserting a sitemap reference into the robots.txt file (which was not there before). But just now, when I wanted to upload the file via FTP to my server, I found multiple robots.txt files in different sizes, and I don't know what to do with them. Can I remove them? I have downloaded and opened them, and they seem to be 2 text files and 2 duplicates. Names:
robots.txt (original duplicate)
robots.txt-Original (original)
robots.txt-NEW (other content)
robots.txt-Working (other content, duplicate)
Would really appreciate help and expert suggestions. Thanks!
Technical SEO | | mjukhud
-
Consolidate Multiple Sites Into Brand Website
Hi Moz Community, I am looking for some advice and/or guidance on consolidating multiple sites into the brand website. At the moment, we have our brand website as well as keyword-matching/topic-matching websites for each service. (Note: we're a professional services firm.) The service websites do rank well for their keywords, but it's my hope that the single brand website will create a more unified presence in search. We have just about completed our re-design (consolidating the service websites into the brand website), but before going live, I wanted to reach out to the Moz community to seek advice on things like 301 redirects, submitting sitemaps, etc. This is the first time I am consolidating existing websites into an existing website, so it's a little daunting. Thanks Tim
Technical SEO | | AinsleyAgency
-
Multiple sub domain appearing
Hi everyone, hope we're all well! Have a strange one: a new client's website, http://www.allsee-tech.com, is appearing for every subdomain possible (a.allsee-tech.com, b.allsee-tech.com, and so on). I have requested the .htaccess file, as this is where I think the issue lies, but he advises there isn't anything out of place there. Any ideas in case it isn't? Regards Neil
Technical SEO | | nezona
-
Have I constructed my robots.txt file correctly for sitemap autodiscovery?
Hi, here is my robots.txt:

User-agent: *
Sitemap: http://www.bedsite.co.uk/sitemaps/sitemap.xml

Directories:
Disallow: /sendfriend/
Disallow: /catalog/product_compare/
Disallow: /media/catalog/product/cache/
Disallow: /checkout/
Disallow: /categories/
Disallow: /blog/index.php/
Disallow: /catalogsearch/result/index/
Disallow: /links.html

I'm using Magento and want to make sure I have constructed my robots.txt file correctly for sitemap autodiscovery. Thanks!
Technical SEO | | Bedsite
-
What may be the reason a sitemap is not indexed in Webmaster Tools?
Hi,
I have a problem with a client's website. I searched many related questions here about the same problem but couldn't figure out a solution. Their website is in 2 languages, and they submitted 2 sitemaps to Webmaster Tools. One got 100% indexed; from the second one, only 32 of over 800 URLs are indexed. I checked the following hypotheses for why the second sitemap may not be getting indexed:
sitemap is wrongly formatted - False
sitemap contains URLs that don't return a 200 status - False; there are no URLs that return 404, 301, or 302 status codes
sitemap contains URLs that are blocked by robots.txt - False
internal duplicate content problems - False
issues with meta canonical tags - False
For clarification, URLs from the sitemap that is not fully indexed also don't show up in Google's index. Can someone tell me what else I can check to fix this issue?
Technical SEO | | SorinaDascalu
-
Affects of multiple subdomains on homebrew CDN for images
We're creating our own CDN such that instead of serving images from http://mydomain.com/images/shoe.jpg, each image will appear at all of the following subdomains:
http://cdn1.mydomain.com/images/shoe.jpg
http://cdn2.mydomain.com/images/shoe.jpg
http://cdn3.mydomain.com/images/shoe.jpg
http://cdn4.mydomain.com/images/shoe.jpg
Image tags on our pages will randomly choose any subdomain for the src. The thought is that this will make page loading faster by parallelizing requests across many cookie-less domains. How does this affect:
the ranking of images in Google Image Search;
the ranking of the pages they appear on;
domain authority (images are linked to heavily in our social media efforts, so we will 301 redirect image URLs to cdn1.mydomain.com)?
Should we disallow all but one CDN domain in robots.txt? Will robots.txt on an image-only subdomain even be retrieved? Should we just use one CDN subdomain instead?
Technical SEO | | cat5com
-
Removing Redirected URLs from XML Sitemap
If I'm updating a URL and 301 redirecting the old URL to the new URL, Google recommends I remove the old URL from our XML sitemap and add the new URL. That makes sense. However, can anyone speak to how Google transfers the ranking value (link value) from the old URL to the new URL? My suspicion is that this happens outside the sitemap: if Google already has the old URL indexed, then the next time it crawls that URL, Googlebot discovers the 301 redirect, and that starts the process of transferring the URL's value. I guess my question revolves around whether removing the old URL from the sitemap (or the timing of the removal) can impact Googlebot's transfer of the old URL's value to the new URL.
Technical SEO | | RyanOD
-
When is the best time to submit a sitemap?
What changes to a website warrant resubmitting a sitemap? For example, if I add new in-site links, should I then resubmit? Or is it more for changes to URLs, page titles, etc.?
Technical SEO | | MichaelWeisbaum