Multiple sitemaps for various media?
-
Hello,
We have always included videos, pages, and images in the same sitemap.xml file, but after reading through the Google sitemap info page, I am wondering if we should break those up into separate sitemaps by type (e.g. one for video, one for images, etc.). If so, how should the files be named and submitted? And should I then submit a sitemap index file that lists them all? Note: we have a normal amount of images, videos, and pages; we're not an ecommerce site.
Thanks in advance
-
By DW do you mean Dreamweaver or Demandware?
I wouldn't build an XML sitemap in Dreamweaver, and I'm sure Demandware has a built-in tool for this.
You can search Google for "free XML sitemap generator" or something similar and find a few good options. I still use the one from Audit My PC from time to time, but there are many others. The one below does include images and video, though I don't know whether it segments them into separate files. Worth a try: http://www.xml-sitemaps.com/
-
Hello - No, we actually use DW. I have used Yoast in the past; however, I didn't know about the video plugin. Thanks! Any thoughts on what to use with DW?
Thanks,
L
-
Are you running WordPress as your CMS? If so, give WordPress SEO by Yoast a try along with the additional Video plugin. It's the best out there and takes care of everything related to sitemaps for videos, images, and normal URLs.
-
For web pages I use Screaming Frog or http://www.web-site-map.com/ (if I need it quick and dirty and the site is under 3k or so pages). For images I don't know of any tools - I usually craft that sitemap by hand, since it needs some attention to detail (geo location, etc.).
It can be fast if you export the URL list with a script and then edit it where needed.
But again - you need to look at the ROI here. If you don't have that many images and the XML image sitemap would take a lot of time to build, it's not really worth it. If you can deploy one fast - in under 30-60 minutes of work - then it might be worth having.
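To illustrate the hand-crafted approach, here is a minimal sketch of an image sitemap entry. The URLs are placeholders, and image:geo_location is one of the optional tags in Google's image sitemap extension:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <!-- One <url> entry per page; each can list multiple <image:image> blocks -->
  <url>
    <loc>http://www.example.com/gallery.html</loc>
    <image:image>
      <image:loc>http://www.example.com/images/photo.jpg</image:loc>
      <image:title>Example photo</image:title>
      <image:geo_location>Limerick, Ireland</image:geo_location>
    </image:image>
  </url>
</urlset>
```

The per-image detail (titles, geo location) is exactly the part a generic crawler-based generator won't fill in for you, which is why I end up doing it by hand.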
Just my 2c.
-
Thank you so much! Is there a special tool or sitemap generator you use or could recommend for pulling out the images, videos, and content respectively? Or do I need to build those sitemaps manually?
Thanks so much again.
-
Hi,
Sitemaps help to speed up the indexing process and also let you "manage" the files - you get feedback on when those files were processed, and so on.
It's a good idea to split the XML sitemaps into separate image, content, and video sitemaps. They're easier to manage and you get clearer feedback from each. Each type also has its own set of tags.
As far as names go, it doesn't matter. Just use something that makes sense to you, so you can manage them easily. For example, I use sitemap-images-v2.xml, sitemap-images-v3.xml, and the same pattern for video and content. I also split the content across multiple files, since for me it's easier to add a new, smaller XML sitemap and then replace the old one - again, mainly to get feedback and see how Google is processing them.
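Once the sitemaps are split, you can submit a single sitemap index file that lists them all. A minimal sketch, reusing my example file names from above (the domain is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Each <sitemap> entry points at one of the split files -->
  <sitemap>
    <loc>http://www.example.com/sitemap-content-v2.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-images-v2.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-video-v2.xml</loc>
  </sitemap>
</sitemapindex>
```

You submit just the index file in Webmaster Tools, and it still reports stats per child sitemap - which is the feedback I mentioned above.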
Hope it helps.
Related Questions
-
Sitemap.xml strategy for site with thousands of pages
I have a client that has a HUGE website with thousands of product pages. We don't currently have a sitemap.xml because it would take so much processing power to generate one. I have thought about creating a sitemap for just the key pages on the website, but I don't want to hurt the SEO on the thousands of product pages. If you have a sitemap.xml that only includes some of the pages on your site, will it negatively impact the other pages that Google has indexed but that are not listed in the sitemap.xml?
Technical SEO | jerrico10 -
Can a H1 Tag Have Multiple Spans Within It?
H1 tags on my client's website follow the template [Service] + [Location]. These two parts each have their own span, meaning there are two spans in an H1 tag: <h1><span class="what">Truck Repair near</span> <span class="where">California, CA</span></h1>. How do crawl bots see this? Is that okay for SEO?
Technical SEO | kevinpark1910 -
Are multiple sites needed to rank one website?
My SEO guy for a mortgage website says that we should have 30 websites with about 250 pages each, plus 50 blogs, in order to even think of ranking for mortgage keywords. Is that correct?
Technical SEO | simermeet0 -
XML Sitemap Issue or not?
Hi Everyone, I submitted a sitemap within Google Webmaster Tools and got a warning message with 38 issues. Issue: URL blocked by robots.txt. Description: Sitemap contains URLs which are blocked by robots.txt. The examples given were URLs that we don't want indexed anyway: Sitemap: www.example.org/author.xml Value: http://www.example.org/author/admin/ My issue here is that the number of URLs indexed is pretty low, and I know that robots.txt blocks are bad news, especially if they block URLs that need to be indexed. The URLs that are blocked seem to be URLs that we don't want indexed, but the report doesn't display all the URLs that are blocked. Do you think I have a major problem, or is everything fine? What should I do? How can I fix it? FYI: WordPress is what we use for our website. Thanks
Technical SEO | Tay19860 -
Should each new blog post be added to Sitemap.xml
Hello everyone, I have a website that has only static content. I have recently added a blog to my website and I am wondering if I need to add each new blog post to my sitemap.xml file. Or is there another/better way to get the blog posts indexed? Any advice is greatly appreciated!
Technical SEO | threebiz0 -
Targeting multiple keywords with index page
Quick keyword question... I just started working with a client that is ranking fairly well for a number of keywords with his index page. Right now he has a bunch of duplicate titles, descriptions, etc. across the entire site. There are 5 different keywords in the title of the index page alone. I am wondering if it is OK to target 3 different keywords with the index page, or if I should cut it down to 1. Think blue widgets, red widgets, and widget making machines. I want each of the individual keywords to improve, but I don't want to lose what I have either. Any ideas? THANKS!!!!
Technical SEO | SixTwoInteractive0 -
Multiple URLs in CMS - duplicate content issue?
So about a month ago, we finally ported our site over to a content management system called Umbraco.  Overall, it's okay, and certainly better than what we had before (i.e. nothing - just static pages).  However, I did discover a problem with the URL management within the system. We had a number of pages that existed as follows: sparkenergy.com/state/name However, they exist now within certain folders, like so: sparkenergy.com/about-us/service-map/name So we had an aliasing system set up whereby you could call the URL basically whatever you want, so that allowed us to retain the old URL structure.  However, we have found that the alias does not override, but just adds another option to finding a page.  Which means the same pages can open under at least two different URLs, such as http://www.sparkenergy.com/state/texas and http://www.sparkenergy.com/about-us/service-map/texas.  I've tried pointing to the aliased URL in other parts of the site with the rel canonical tag, without success.  How much of a problem is this with respect to duplicate content?  Should we bite the bullet, remove the aliased URLs and do 301s to the new folder structure?
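For reference, a rel canonical tag on the aliased page pointing at the preferred folder URL (using the Texas URLs from the question) would look like the snippet below; whether Umbraco lets you inject it per page is a separate question:

```xml
<!-- Placed in the <head> of the aliased page, e.g. /state/texas -->
<link rel="canonical" href="http://www.sparkenergy.com/about-us/service-map/texas" />
```

For the canonical hint to work, the tag must appear on every duplicate URL and point at exactly one preferred version.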
Technical SEO | ufmedia0 -
Multiple Domains, Same IP address, redirecting to preferred domain (301) -site is still indexed under wrong domains
Due to acquisitions over time and the merging of many microsites into one major site, we currently have 20+ TLDs pointing to the same IP address as our "preferred domain" for our consolidated website http://goo.gl/gH33w. They are all set up as 301 redirects on Apache, including both the www and non-www versions. When we launched this consolidated website (April 2010), we accidentally left the settings of our site open to accept any of our domains on the same IP. This was later fixed, but unfortunately Google indexed our site under several of these URLs (ignoring the redirects), using the same content from our main website but swapping out the domain. We added some additional redirects on Apache to redirect the individual pages indexed under the wrong domain to the same page under our main domain http://goo.gl/gH33w. This seemed to help resolve the issue and moved hundreds of pages off the index. However, in December of 2010 we made significant changes in our external DNS for our IP addresses, and since December we have seen pages indexed under these redirecting domains on the rise again. If you do a search query of site:laboratoryid.com you will see a few hundred examples of pages indexed under the wrong domain. When you click on a link, it does redirect to the same page under the preferred domain. So the redirect is working and has been confirmed as a 301. But for some reason Google continues to crawl our site and index it under these incorrect domains. Why is this? Is there a setting we are missing? These domain-level and page-level redirects should be decreasing the number of pages indexed under the wrong domain, but it appears to be doing the reverse. All of these old domains currently point to our production IP address, where our preferred domain is also pointing. Could this be the issue? None of the pages indexed today are from the old version of these sites. They only seem to be the new content from the new site, but not under the preferred domain.
Any insight would be much appreciated because we have tried many things without success to get this resolved.
Technical SEO | sboelter0