No Sub-Categories in XML Sitemap
-
I have a couple of sites using 3dcart, the ecommerce platform. Their tech support recently told me that they do not list sub-categories in the XML sitemap, only products and top-tier categories.
Am I the only one who sees a problem with this?
Thanks
-
The simple solution is to use a sitemap generator. There are plenty of them available; just Google around and find the one that works best for you.
Personally, I don't like being told "no" by a company I pay. I would take that as an indicator that I need to look for another ecommerce platform.
With the above noted, a sitemap is really not necessary for a well-designed site. Yes, I use one and submit it, mainly because I have an easily automated process for it. As long as your content is well-linked, Google will see all of it.
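For what it's worth, rolling your own generator is not much work if you can export a URL list from the platform. A minimal sketch (the URLs and function name here are placeholders, not anything 3dcart-specific):

```python
# Minimal sitemap generator sketch, assuming you can export your
# category, sub-category, and product URLs as a plain list.
import xml.etree.ElementTree as ET

XMLNS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string containing every URL passed in."""
    urlset = ET.Element("urlset", xmlns=XMLNS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

urls = [
    "https://example.com/",                            # homepage
    "https://example.com/widgets/",                    # top-tier category
    "https://example.com/widgets/red-widgets/",        # sub-category
    "https://example.com/widgets/red-widgets/item-1",  # product
]
print(build_sitemap(urls))
```

The point is that you control the URL list, so sub-categories go in whether or not the platform's built-in sitemap includes them.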
Related Questions
-
Desktop & Mobile XML Sitemap Submitted But Only Desktop Sitemap Indexed On Google Search Console
Hi! The problem: we have submitted a sitemap index to GSC. Within that index there are 4 XML sitemaps, including one for the desktop site and one for the mobile site. The desktop sitemap has 3,300 URLs, of which Google has indexed (according to GSC) approximately 3,000. The mobile sitemap has 1,000 URLs, of which Google has indexed 74. The pages are crawlable and the site structure is logical, and performing a landing page URL search (showing only Google/Organic source/medium) in Google Analytics, I can see that hundreds of those mobile URLs are being landed on. A search on mobile for a long-tail keyword from a (randomly selected) page shows a result in the SERPs for a mobile page that, judging by GSC, has not been indexed. Could this be because we have recently added rel=alternate tags on our desktop pages (and of course corresponding canonical ones on mobile)? Would Google then 'not index' rel=alternate page versions? Thanks for any input on this one. PmHmG
Technical SEO | AlisonMills -
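For anyone hitting the same thing: rel=alternate on its own shouldn't block indexing, but Google's separate-URLs setup does expect the annotations to be bidirectional, roughly like this (the example.com URLs are placeholders):

```html
<!-- On the desktop page, e.g. https://www.example.com/page-1 -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page-1">

<!-- On the corresponding mobile page -->
<link rel="canonical" href="https://www.example.com/page-1">
```

With this setup Google tends to consolidate the mobile URL under the desktop URL's index entry, which can make mobile URLs look "unindexed" in sitemap reports even while they serve in mobile SERPs.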
Woocommerce and individual category/product set-up
Hi All, I'm very new to SEO but trying to make small, meaningful changes to a WordPress site. My question is whether it would be better for me to bypass this category page (http://liliglace.com.br/categoria-produto/personalizados/) (website is in Portuguese) and go straight to the underlying product pages by creating individual categories for each product. I think this would increase SEO efficiency and clarity on the site with regard to these 3 products, but I am worried about having a WooCommerce category page with just one product page. I know that the plugin goes straight to the product page, but is there a risk of duplicate content regarding the unused category page? Also long URLs! The Casamento (Wedding) category is already set up this way, and the same question applies. Any help or guidance would be greatly appreciated. Thanks
Technical SEO | Eoinfitz -
Sitemap all of a sudden only indexing 2 out of 5000+ pages
Any ideas why this happened? Our sitemap looks the same. Also, our total number of pages indexed has not decreased, just the sitemap. Could this eventually affect my pages being in the index?
Technical SEO | rock22 -
Sub-domains for keyword targeting? (specific example question)
Hey everyone, I have a question I believe is interesting and may help others as well. Our competitor heavily uses sub-domains (over 100-200 of them) to rank in the search engines... and is doing quite well. What's strange, however, is that all of these sub-domains are just archives -- they're 100% duplicate content! An example can be seen here where they just have a bunch of relevant posts archived with excerpts. How is this ranking so well? Many of them are top 5 for keywords in the 100k+ range. In fact, their #1 source of traffic is SEO for many of the pages. As an added question: is this effective if you were to actually have a quality/non-duplicate page? Thanks! Loving this community.
Technical SEO | naturalsociety -
Sitemap for 170 K webpages
I have 170K pages on my website which I want to be indexed. I have created multiple HTML sitemaps (e.g. sitemap1.html, sitemap2.html, etc.), with each sitemap page having 3,000 links. Is this the right approach, or should I switch to XML-based sitemaps, also split into multiple files? Please suggest.
Technical SEO | ArtiKalra -
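For reference, the sitemap protocol caps each XML file at 50,000 URLs (and 50MB uncompressed), so 170K pages needs at least four files tied together by a sitemap index, roughly like this (example.com is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/sitemap1.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap2.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap3.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap4.xml</loc></sitemap>
</sitemapindex>
```

You then submit just the index file, and search engines discover the child sitemaps from it.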
Extra Sub Directory
Anything wrong with a URL structure like www.mysite.com/process/widgets/red-widgets, where the DIR /process/ is completely empty? E.g. you get a 404 if you go to www.mysite.com/process/, and it has no content within. This URL structure was set up before they knew what SEO was... wondering if it's worth the pain to 301 and restructure the URLs, or is it OK to leave as is?
Technical SEO | SoulSurfer8 -
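If the empty directory does turn out to be worth fixing, the pain is smaller than it looks: a single pattern rule can 301 the whole tree rather than mapping URLs one by one. A hypothetical Apache sketch, assuming /process/ can simply be dropped from every path:

```apache
RewriteEngine On
# 301 /process/widgets/red-widgets -> /widgets/red-widgets
RewriteRule ^process/(.+)$ /$1 [R=301,L]
```

The same one-rule approach exists for Nginx and most other servers, so the restructure is mostly a matter of deciding the new URL scheme.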
Robots.txt Sitemap with Relative Path
Hi Everyone, In robots.txt, can the sitemap be indicated with a relative path? I'm trying to roll out a robots file to ~200 websites, and they all have the same relative path for a sitemap but each is hosted on its own domain. Basically I'm trying to avoid needing to create 200 different robots.txt files just to change the domain. If I do need to do that, though, is there an easier way than just trudging through it?
Technical SEO | MRCSearch -
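Worth noting: both the sitemaps.org protocol and Google's documentation show the Sitemap: directive as a full, absolute URL, so a relative path is risky. Templating the file per domain avoids the manual trudge; a hypothetical sketch (domain names are placeholders):

```python
# Hypothetical sketch: generate one robots.txt per domain from a
# shared template, substituting only the domain name.
TEMPLATE = """User-agent: *
Disallow:

Sitemap: https://{domain}/sitemap.xml
"""

def robots_for(domain):
    """Return the robots.txt body for a single domain."""
    return TEMPLATE.format(domain=domain)

for domain in ["site-a.example", "site-b.example"]:
    print(robots_for(domain))
```

In practice you would feed in the real list of ~200 domains and write each result to that site's document root.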
How to Submit XML Site Map with more than 300 Subdomains?
Hi,
I am creating sitemaps for a site which has more than 500 subdomains. Pages vary from 20 to 500 across the subdomains, and more will keep being added in the coming months. I have seen sites that create a separate sitemap.xml for each subdomain, which they reference in a separate robots.txt file: http://windows7.iyogi.com/robots.txt. Example XML sitemap for a subdomain: http://windows7.iyogi.com/sitemap.xml.gz. Currently my website has only one robots.txt file for the main domain and subdomains. Please tell me: should I create a separate robots.txt and XML sitemap file for each subdomain, or just one file? Creating a separate XML sitemap for each subdomain is not feasible, as we would have to verify each one in GWT separately. Is there an automatic way, and do I have to ping separately if I add new pages to a subdomain? Please advise me.
Technical SEO | vaibhav45