Sitemap Question - E-commerce - Magento
-
Good Morning...
I have an e-commerce site running on Magento, and the sitemap is automatically generated by Magento from the categories, subcategories, and products.
I have recently created new categories that I want to replace the old categories with, but both sets are in the auto-generated sitemap. The old categories are "active" (as in they still exist if you know the URL to type) but not visible (you can't find them just by navigating through the site). The new category pages are both active and visible...
If I want Google to rank one page (the new category page) and not the other (the old category page), should I remove the old page from the sitemap? Would removing the old page, which used to target the same keywords, improve my rankings for the newer category page?
Sitemap currently contains:
www.example.com/oldcategorypage
www.example.com/newcategorypage
Did I confuse you yet?
Any help or guidance is appreciated.
Thanks,
-
The first thing would be to 301 redirect the old pages to the new ones so the new pages have a chance to rank. If you don't, you might also run into keyword cannibalisation issues, where both the old and new pages try to rank for the same keywords.
In Magento, I believe that if you disable the old category, it will also be removed from the sitemap.xml it generates for you.
If you're generating the sitemap manually, then yes, definitely remove the old URLs from the sitemap after putting the redirects in place.
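If for some reason you end up adding the redirect at the server level instead of through Magento, a minimal .htaccess sketch would be something along these lines (this assumes Apache with mod_alias and uses the placeholder URLs from your question):
# Permanently redirect the old category URL to the new one.
Redirect 301 /oldcategorypage http://www.example.com/newcategorypage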
-
Hey Ian, thanks for the response.
The new categories have already been created, so it seems like it's too late to rename the older categories and URLs.
The question is: should I remove the ones I don't want to rank from the sitemap?
Thanks
-
Is there a reason you need to keep those old categories? In Magento you can rename the old category and its URL to your new category, and it will automatically 301 redirect the old URL to whatever new category URL structure you give it, passing the SEO value along.
-
Related Questions
-
Subdomain Question
Having a difficult time on our site and looking for some advice. Our site pages are indexed perfectly; however, we have a subdomain where we keep all of our images and PDFs. We only have the main domain set up in Search Console with our sitemap. We can't seem to get any of the images that are in the subdomain indexed by Google, yet all the PDFs are indexed. My thought is to add the subdomain to Search Console and create a new sitemap just for the subdomain. Assuming we are not blocking any folders or files with our robots.txt, can anyone think of any other reasons why the images wouldn't get indexed? Thanks
Technical SEO | | cbathd
-
Sitemap Rules
Hello there, I have some questions pertaining to sitemaps that I would appreciate some guidance on.
1. Can an XML sitemap contain URLs that are blocked by robots.txt? Logically, it makes sense to me not to include pages blocked by robots.txt, but I would like some clarity on the matter, i.e. will having pages blocked by robots.txt in a sitemap negatively impact the benefit of the sitemap?
2. Can an XML sitemap include URLs from multiple subdomains? For example, http://www.example.com/www-sitemap.xml would include the home page URLs of two other subdomains, i.e. http://blog.example.com/ & http://blog2.example.com/
Thanks
Technical SEO | | SEONOW1230 -
Blog on subdomain of e-commerce site
Hi guys. I've got an e-commerce site which we have very little control over. As such, we've created a subdomain and are hosting a WordPress install there instead. This means that all the great content we're putting out (via bespoke pages on the subdomain) is less effective than if it were on the main domain. I've looked at proxy forwarding, but unfortunately it isn't possible through our servers, leaving the only option I can see being permanent redirects... What would be the best solution given the limitations of the root site? I'm thinking of wildcard rewrite rules (e.g. link site.com/blog/articleTitle to blog.site.com/articleTitle) but I'm wondering if there's much of an SEO benefit in doing this? Thanks in advance for everyone's help 🙂
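For clarity, the kind of wildcard rule I have in mind would be something like this (just a sketch, assuming Apache with mod_alias and using site.com / blog.site.com as placeholders):
# Permanently redirect any /blog/<slug> request on the root domain to the
# matching URL on the blog subdomain.
RedirectMatch 301 ^/blog/(.*)$ http://blog.site.com/$1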
Technical SEO | | JAR8970 -
Should all pagination pages be included in sitemaps
How important is it for a sitemap to include all of the individual URLs for paginated content? Assuming the rel next and prev tags are set up, would it be OK to just have page 1 in the sitemap?
Technical SEO | | Saijo.George0 -
Content Organization Advice with Big Commerce
Hi folks, We have three places with unique content for our company: our e-commerce site (hosted on Big Commerce), our help desk knowledge base (a subdomain, hosted on Zendesk), and our blog (a separate domain, self-hosted WordPress). We're about to refocus our efforts on generating high-quality content, and I'm trying to figure out the best strategy to organize it. I think from an SEO perspective, if we had all of the content hosted directly on our e-commerce site, that would be best. Unfortunately, Big Commerce doesn't have much in the way of content management. We can't (yet) install a blogging platform or CMS on our root domain. What's the next best option? Does it do any good to move our blog to a subdomain? Should I try to post all content on our root domain and just deal with the lack of content management (i.e. just make a new web page for each blog entry)? Basically, what's the best strategy in this situation for SEO? Any advice appreciated. Thanks so much! Hal
Technical SEO | | AlabuSkinCare0 -
Robots.txt versus sitemap
Hi everyone, Let's say we have a robots.txt that disallows specific folders on our website, but a sitemap submitted in Google Webmaster Tools that lists content in those folders. Who wins? Will the sitemap content get indexed even if it's blocked by robots.txt? I know content that is blocked by robots.txt can still get indexed and display a URL if Google discovers it via a link, so I'm wondering if that would happen in this scenario too. Thanks!
Technical SEO | | anthematic0 -
Redirect question
I would like to redirect http://example.com/index.html to http://www.example.com/
Is the code below correct?
RewriteEngine on
RewriteCond %{HTTP_HOST} ^example.com
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index.html\ HTTP/
RewriteRule ^index.html$ http://www.example.com/ [R=301,L]
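For comparison, a version of the same rules I've seen suggested elsewhere escapes the dots and anchors the host condition; I'm not sure whether those changes are strictly necessary (sketch only, assuming Apache with mod_rewrite in an .htaccess file):
# Redirect the bare domain to the www version, preserving the requested path.
RewriteEngine on
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule (.*) http://www.example.com/$1 [R=301,L]
# Only act on direct requests for index.html and send them to the root URL.
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /index\.html\ HTTP/
RewriteRule ^index\.html$ http://www.example.com/ [R=301,L]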
Technical SEO | | seoug_20050 -
XML Sitemap without PHP
Is it possible to generate an XML sitemap for a site without PHP? If so, how?
Technical SEO | | jeffreytrull11