Delete or re-submit sitemaps for new products? How often?
-
When I add new products (approx. 10 a month), I usually delete the old sitemap and submit a new one. Is this ok to do, or should I just re-submit it with the new info included? Also, is once a month too much?
-
Hi Tiffany,
It looks to me like you're debating between two approaches:
Approach 1 - delete the sitemap file each month and submit a new one with the product updates.
Approach 2 - resubmit the same existing sitemap with the updates included, also once a month.
As Eric mentions below, stick with your current sitemap file and just update it, rather than deleting it and submitting a fresh one each time.
Part 2 - is updating once a month too much? No. Monthly updates are fine, and you can get away with updating more often than that, depending on how much content you have available to update.
-
No need to delete the old one... an update will function just as well.
Once a month is actually fairly infrequent. You could update the sitemap for every one of your 10 monthly additions and not have any problem. I can't imagine any issue kicking in unless you regularly push multiple updates per day, and even doing that once in a while won't matter, especially if they're legitimate updates and not an automated script submitting non-stop.
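To make the "update in place" approach concrete, here is a minimal sketch of regenerating the same sitemap file each month rather than deleting it. The product URLs and date are invented for illustration; a real store would pull these from its catalog.

```python
# Minimal sketch: rewrite the same sitemap file in place instead of
# deleting and resubmitting it. URLs and dates here are made up.
from datetime import date
from xml.etree import ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls, changed_on=None):
    """Return a sitemap XML string for the given product URLs."""
    changed_on = changed_on or date.today().isoformat()
    urlset = ET.Element("urlset", xmlns=NS)
    for loc in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        # lastmod signals that the file changed without any resubmission
        ET.SubElement(url, "lastmod").text = changed_on
    return ET.tostring(urlset, encoding="unicode")

# Each month, append the ~10 new product URLs and rewrite the same file;
# the sitemap URL registered with the search engines never changes.
products = [
    "https://www.example.com/products/blue-widget",
    "https://www.example.com/products/red-widget",
]
with open("sitemap.xml", "w") as f:
    f.write(build_sitemap(products, changed_on="2024-05-01"))
```

Because the file lives at the same URL, crawlers pick up the new entries on their next fetch with no delete/resubmit cycle.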
Related Questions
-
What do you do with product pages that are no longer used? Delete/redirect to category/404, etc.
We have a store with thousands of active items and thousands of sold items. Each product is unique, so there is only one of each. All products are pinned and pushed online... and then they sell, and we have a product page for a sold item. All products are keyword-researched and can often rank well for long-tail keywords. Would you:
1. Delete the page and let it 404 (we will get thousands)?
2. See if the page has a decent PA, incoming links, and traffic, and if so redirect to a RELEVANT category page (again, there will be thousands)?
3. Reuse the page for another product? For example, a sold ruby ring gets replaced with a new ruby ring, and we use that same page/URL for the new item.
Gemma
Technical SEO | acsilver
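One way to operationalize option 2 from the question above is a small triage script that keeps a 301 only for sold-product pages that have earned links or traffic, and lets the rest 404. The thresholds and field names here are made up for illustration, not a standard:

```python
# Sketch: decide per sold-product page whether to 301 to a relevant
# category page or let it 404. Thresholds are illustrative only.
def disposition(page):
    """Return ('301', target) or ('404', None) for a sold product page."""
    has_equity = page.get("inbound_links", 0) > 0 or page.get("monthly_visits", 0) >= 10
    if has_equity and page.get("category_url"):
        return ("301", page["category_url"])  # redirect to relevant category
    return ("404", None)                      # no equity worth preserving

sold = {
    "url": "/products/ruby-ring-1234",
    "inbound_links": 3,
    "monthly_visits": 40,
    "category_url": "/category/ruby-rings",
}
print(disposition(sold))
```

Run over an export of sold items, this turns "thousands of decisions" into a rule you can tune by adjusting the two thresholds.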
Best Topography for eCommerce Site Product Pages (flat nav/off the root OR in a products subfolder)?
Hi, I'm SEO'ing a Shopify site (new/not yet live) at the moment, and all the products are in a 'Products' subfolder along the lines of: domain.com/products/blue-widgets/ etc. I understand that many ecommerce SEOs these days go 'flat navigation', with all products 'off the root' rather than in a subfolder, then communicate product and category/departmental relationships via breadcrumbs and other internal linking. In the case of a platform like Shopify, is this a good idea, or is it best to leave it 'as is', with the 'Products' subfolder being a perfectly good place for the product pages? All Best, Dan
Technical SEO | Dan-Lawrence
Google is not indexing my new URL structure. Why not?
Hi all, We launched a new website for a customer on April 29th. That same day we resubmitted the new sitemap and asked Google to fetch the new website (screenshot attached: GWT Indexed). However, when I look at the Google index (see attachment: Google Index), Automated Production's old website URLs still appear. It's been two weeks. Is it normal for Google's index to take this long to update? Thanks for your help. Cole
Technical SEO | ColeLusby
How to handle outdated products/pages?
Hi, I wonder how I should handle pages with outdated products. Our market is unfortunately changing rapidly, so products we offered in the past no longer exist. For example, one of our products was called "PRODUCT 1". The URL for this product is still:
www.mydomain.com/product-1/ with different subpages about the product like:
www.mydomain.com/product-1/about/
www.mydomain.com/product-1/howto/
www.mydomain.com/product-1/pricing/
www.mydomain.com/product-1/whatever/
Our new product is totally different from the first one. Is it still a good idea to set up 301 redirects, or are there any other ideas? Thanks in advance,
Bastian
Technical SEO | bastelele
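If the 301 route is chosen for a case like the one above, the redirect map can cover the whole retired URL tree with longest-prefix matching, so every subpage inherits a sensible target. A rough sketch; the target URLs and the homepage fallback are invented assumptions, not a recommendation:

```python
# Sketch: longest-prefix 301 map for a retired product and its subpages.
# Source paths come from the question; targets are invented examples.
REDIRECTS = {
    "/product-1/pricing/": "/product-2/pricing/",
    "/product-1/": "/product-2/",
}

def resolve(path):
    """Return the 301 target for a requested path."""
    # Longest prefix wins, so /product-1/pricing/ beats /product-1/
    for prefix in sorted(REDIRECTS, key=len, reverse=True):
        if path.startswith(prefix):
            return REDIRECTS[prefix]
    return "/"  # last-resort fallback used here; pick this deliberately

print(resolve("/product-1/howto/"))
```

The same table translates directly into web-server rewrite rules; the Python version is just an easy way to test the mapping before deploying it.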
Is this 302 re-direct on Magento a problem?
The SEOmoz crawler has pointed out this warning on my Magento store: http://mydomain.co.uk redirects to http://www.mydomain.co.uk. Is this a problem? My site is new, so I don't want to get penalised this early on! Any advice appreciated.
Technical SEO | MoA
On-Site Sitemaps - Guidance Required
Hi, I am looking for good examples of on-site sitemaps. We already submit our XML sitemap regularly through GWMT, but I now wonder if we still need an on-site sitemap, as we have about 30 static pages and 300+ Wordpress blog posts, which in a sense makes it a spammy page: it has too many links and a higher-than-average keyword density. The reason I am looking for good examples is that I want to create a basic on-site sitemap that aids navigation but is styled to look OK as well. The solution I have in mind:
mydomain.com/link-example-one.php
mydomain.com/link-example-two.php
mydomain.com/link-example-ten.php
mydomain.com/blog then links to my 300 WP blog posts, broken down into chunks navigated using breadcrumbs. Will Google crawl this OK, or should I stick to the current format listing ALL posts on one page? Thanks
Technical SEO | tdsnet
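The chunking idea in the question can be sketched as follows; the page size of 100 and the URL pattern are arbitrary choices for illustration, not anything a search engine requires:

```python
# Sketch: split ~300 blog post links into HTML-sitemap pages of 100 so
# no single page is link-heavy. Page size and URL pattern are arbitrary.
def chunk_links(links, per_page=100):
    """Yield (page_url, links_for_that_page) tuples."""
    for i in range(0, len(links), per_page):
        page_no = i // per_page + 1
        yield (f"/blog/sitemap/page-{page_no}/", links[i:i + per_page])

posts = [f"/blog/post-{n}/" for n in range(1, 301)]
pages = list(chunk_links(posts))
print(len(pages), pages[0][0], len(pages[0][1]))
```

Each generated page stays well under common "too many links" thresholds, and the pages can link to each other (or via breadcrumbs) so crawlers can still reach every post.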
URL restructure and phasing out HTML sitemap
Hi SEOMozzies, Love the Q&A resource and have already found lots of useful stuff! I just started as an in-house SEO at a retailer, and my first main challenge is to tidy up the complex URL structures and remove the ugly sub-sitemap approach currently used. I have already found a number of suggestions, but it looks like I am dealing with several challenges that I need to resolve in a single release. So here is the current setup:

The website is an ecommerce site (department store) with around 30k products. We are using multi-select navigation (non-Ajax). The main website uses a third-party search engine to power the multi-select navigation, and that search engine has a very ugly URL structure. For example www.domain.tld/browse?location=1001/brand=100/color=575&size=1 plus various other params, or for multi-select URLs www.domain.tld/browse?location=1001/brand=100,104,506/color=575&size=1 plus various other unused URL params. URLs are easily up to 200 characters long and not at all descriptive to our users. Many of these URLs are indexed by search engines (we currently have 1.2 million of them indexed, including session IDs and all the other nasty URL params).

Next to this, the site is using a "sub site" that is sort of optimized for SEO; I am not 100% sure this counts as cloaking, but it smells like it. It has a simplified navigation structure and a better URL structure for products. The layout is similar to our main site, but all complex HTML elements like multi-select, the large top navigation menus, etc. are removed. Many of these links are indexed by search engines and rank higher than links from our main website. The URL structure is www.domain.tld/1/optimized-url; currently 64,000 of these URLs are indexed. We have links to this sub site in the footer of every page, but a normal customer would never reach it unless they come from organic search. Once a user lands on one of these pages we try to push him back to the main site as quickly as possible.

My planned approach to improve this:

1. Tidy up the URL structure in the main website (e.g. www.domain.tld/women/dresses and www.domain.tld/diesel-red-skirt-4563749). I plan to use Solution 2 as described in http://www.seomoz.org/blog/building-faceted-navigation-that-doesnt-suck to block multi-select URLs from being indexed, and would like to use the URL param "location" as an indicator for search engines to ignore the link. A risk here is that all my currently indexed URLs (1.2 million) will be blocked immediately after I put this live. I cannot redirect those URLs to the optimized URLs, as the old URLs should still be accessible.
2. Remove the links to the sub site (www.domain.tld/1/optimized-url) from the footer and redirect (301) all those URLs to the newly created SEO-friendly product URLs. URLs that cannot be matched, because there is no similar catalog location in the main website, will be redirected (301) to our homepage.

I wonder if this is a correct approach, and whether it would be better to do this in a phased way rather than the currently planned big bang? Any feedback would be highly appreciated; also let me know if things are not clear. Thanks!
Chris
Technical SEO | eCommerceSEO
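Step 1 of the plan above (keeping multi-select URLs out of the index) could be prototyped with a small classifier before touching templates. Treating a comma in a facet value as "multi-select" and keying off the "location" param are assumptions taken from the question itself, not a general rule:

```python
# Sketch: decide whether a faceted-navigation URL should carry a robots
# noindex. The 'location' param and comma-separated facet values are the
# signals described in the question; adjust for the real URL scheme.
from urllib.parse import urlparse, parse_qs

BLOCKED_PARAMS = {"location"}  # params search engines should ignore

def should_noindex(url):
    """True if this faceted URL should be kept out of the index."""
    params = parse_qs(urlparse(url).query)
    if BLOCKED_PARAMS & params.keys():
        return True
    # Multi-select facets show up as comma-separated values, e.g. brand=100,104
    return any("," in v for values in params.values() for v in values)

print(should_noindex("https://www.domain.tld/browse?brand=100,104&color=575"))
```

Running this over a crawl export of the 1.2 million indexed URLs would show exactly how many pages the change will block before it goes live, which helps size the risk mentioned in the question.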