Multiple Sitemaps
-
Hello everyone!
I am in the process of updating the sitemap of an ecommerce website, and I was thinking of uploading three different sitemaps for different parts of the site (general / categories and subcategories / product groups and products) in order to keep them easy to update in the future.
Am I allowed to do so? Would that be a good idea?
Open to suggestions.
-
Right! I think I now have the complete picture and I can crack on with it!
Thank you very much indeed!
Best Regards
Oscar
-
If you are talking about the sitemap for visitors to your website, then yes: if you think the newly added pages will be helpful to them, you can update your visitor-facing sitemap accordingly. The Sitemap.xml file, however, is a supplemental indexing tool meant to help search engines find the pages on your website easily, and it needs to be updated and resubmitted to the search engines through your webmaster tools accounts whenever new pages are added to your website.
Hope that helps.
Best,
Devanur Rafi
-
Thanks a lot guys!
I really appreciate your help, although all this information has made me realize I have tons of work to do to update the sitemaps, and I have to start creating new ones.
Just another question: after I create the new sitemaps, I will also have to update the sitemap on the website, is that right?
-
It should be added to the end of your robots.txt and be preceded by 'Sitemap:', like:
Sitemap: http://www.example.com/sitemap1.xml
Sitemap: http://www.example.com/sitemap2.xml
-
No problem my friend. You are most welcome. Yes, you just need to give the location of your sitemap.xml file as given below:
Sitemap: http://example.com/sitemap_location.xml
Here you go for more: https://support.google.com/webmasters/answer/183669?hl=en
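To illustrate, a very minimal robots.txt that references a sitemap might look something like this (the domain and file name here are only placeholders):

User-agent: *
Disallow:
Sitemap: http://example.com/sitemap_location.xml

The Sitemap line is independent of the User-agent rules, and you can add one Sitemap line per sitemap file.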
-
Oh I see, thank you very much for your help; I haven't got much experience dealing with sitemaps.
So, in order to put them in the robots.txt, I just have to put the link in it without anything else, is that right?
-
Hi there, the robots.txt file is one of the first things search engine spiders look at when they visit your website, and a reference to the Sitemap.xml file in there helps the spiders get to the important URLs on your website right away.
Best,
Devanur Rafi
-
Why should I put the sitemaps in the robots.txt?
I've been looking around and some sites do and some don't; what's the reason for it?
-
Thanks for the response my friend. The problem without an index sitemap file is that when you have multiple sitemap.xml files to resubmit in your webmaster tools account, you have to resubmit each of them one at a time. With an index sitemap file, you just need to submit the index file and it takes care of the job.
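To give you an idea, a minimal index sitemap might look something like this (the file names below are just placeholders for your three sitemaps):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-general.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-categories.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-products.xml</loc>
  </sitemap>
</sitemapindex>

You would then submit just this one index file in your webmaster tools account.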
Here you go for more: https://support.google.com/webmasters/answer/71453?hl=en
Best,
Devanur Rafi
-
You don't actually need to use a sitemap index file to use multiple sitemaps. You can list and submit them separately in your robots.txt file and in Google Webmaster Tools.
-
Yes, this is fine. From Google:
"Whether you list all URLs in a single Sitemap or in multiple Sitemaps (in the same directory or different directories) is simply based on what's easiest for you to maintain. We treat the URLs equally for each of these methods of organization." More info can be found here: multiple sitemaps in same directory.
-
Hi there, though a single Sitemap.xml file can accommodate up to 50,000 URLs, it is not uncommon to go for multiple Sitemap.xml files for many purposes, even with only a few hundred URLs in each.
You would need a total of 4 Sitemap files: 3 Sitemap.xml files containing the URLs, plus one index sitemap that lists those 3 files.
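For reference, each of the 3 Sitemap.xml files with URLs would simply be a list of <url> entries in the standard format, something along these lines (the URLs and dates are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/categories/shoes</loc>
    <lastmod>2014-06-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/products/red-running-shoe</loc>
  </url>
</urlset>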
Here you go for more: http://www.sitemaps.org/protocol.html
Best,
Devanur Rafi