How to delete the /category/ slug from WordPress category pages
-
Hi all,
I would like to ask what's the best way to remove the /category/ slug from WordPress category pages. I need to drop /category/ to make the URLs SEO friendly. The problem is that my site is an old one, with pages indexed by Google for a long time.
Thanks for your advice.
-
Hi,
What David mentioned in his response is the best way to remove /category/ from the URL. However, based on your comment, you can't remove /category/ because it conflicts with the page URLs, which somewhat contradicts your initial question.
I would suggest reworking both the page URLs and the category URLs. If you use 301 redirects to send the old URLs to the new ones, everything should be fine once Google crawls your site again and sees the redirects.
-
Hi David, thanks for your answer.
I usually use the WordPress SEO by Yoast plugin to "Strip the category base (usually /category/) from the category URL", but in this case I'm not able to use it because it conflicts with the page URLs.
Regards.
-
If you're using the Yoast WordPress SEO plugin (which is the best for SEO, in my opinion), go to Permalinks within the SEO menu; the first item on the list is the option to "Strip the category base (usually /category/) from the category URL."
If you do this, I would of course first compile a list of your current category URLs and all the URLs that live under them, drop them into whatever you use for 301 redirects (I use a plugin called Redirection, which is great), and redirect everything on that list to the version of that URL without /category/.
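For reference, the blanket redirect can also be handled server-side. A minimal sketch, assuming an Apache host with mod_rewrite, and assuming Yoast's "strip category base" option is already enabled so the target URLs resolve (the pattern is illustrative, not site-specific):

```apache
# Hypothetical .htaccess rule: 301-redirect every /category/... URL to the
# same path without the base. Only do this after enabling Yoast's option,
# so the stripped URLs actually resolve.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteRule ^category/(.+)$ /$1 [R=301,L]
</IfModule>
```

A per-URL list in the Redirection plugin achieves the same result and is easier to audit, which is why I prefer it for old, heavily indexed sites.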
Related Questions
-
My translated pages are categorized as subpages of the originals / Importance of hreflang tags
Hi there, We have a website that is originally in German, but has an English translation of every page.
Technical SEO | Jess_Smunch
I recently created a crawl map for it, which showed that all our translated pages are indexed as subpages of the German originals. I wonder if this is normal, or if it will have a negative impact on our SEO. If they are subpages, will Google still index and rank them with the same importance as the originals?
If not, what can I do to make them standalone pages rather than subpages? Also, we have a few issues with hreflang tags that we cannot fix easily, as our CMS does not give us a flexible option for editing our code. I wonder how much impact hreflang tags have on our ranking and whether we can just disregard these issues? We use HubSpot as a CMS, if that matters. Thanks for your feedback!
-
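On the hreflang point: the tags mainly help Google serve the right language version to the right users, so broken hreflang rarely hurts rankings directly, but it can cause the wrong language to surface in results. A minimal sketch of the annotations, with placeholder URLs rather than the actual site:

```html
<!-- Hypothetical hreflang set for a German page and its English translation.
     Each language version must list every alternate, including itself. -->
<link rel="alternate" hreflang="de" href="https://www.example.com/de/seite/" />
<link rel="alternate" hreflang="en" href="https://www.example.com/en/page/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```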
Duplicate pages with "/" and without "/"
I seem to have duplicate pages like the examples below: https://example.com https://example.com/ This is happening on 3 pages and I'm not sure why or how to fix it. The first (https://example.com) is what I want and what I have all my canonicals set to, but that doesn't seem to be doing anything. I've also set up 301 redirects for each page with "/" to redirect to the page without it. Doing this didn't seem to fix anything, as when I use the (https://example.com/) URL it doesn't redirect to (https://example.com) like it's supposed to. This issue has been going on for some time, so any help would be much appreciated. I'm using Squarespace as the design/hosting site.
Technical SEO | granitemountain
-
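One note on this question: for the bare domain, https://example.com and https://example.com/ are the same URL by specification (an empty path normalizes to "/"), so only deeper paths can genuinely duplicate. On a self-hosted Apache site, the fix would be a rewrite rule like the sketch below; on Squarespace, the equivalent has to be configured through the platform's own redirect settings instead, since there is no .htaccess access. This is illustrative only:

```apache
# Hypothetical rule: 301-redirect any non-directory URL ending in "/" to
# the slash-less version.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)/$ /$1 [R=301,L]
```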
JSON-LD schema via Tag Manager for blog posts listed on a blog category page
Hi Experts, I am implementing JSON-LD schema via Tag Manager. My blog category page lists many blog posts, so I have to add BlogPosting markup on the category page via Tag Manager; can anyone guide me on how to implement this? Hope you are getting my question. My blog site looks like: a) abcd.com/blog b) blog category: abcd.com/blog/cloth c) blog post: abcd.com/blog/cloth/my-favourite-dress. My query: the page abcd.com/blog/cloth lists many blog posts, so I have to implement "@type": "BlogPosting" for all posts via Tag Manager; how do I do that? Without Tag Manager I know how to implement it via a loop, but via Tag Manager I don't. Thanks!
Technical SEO | Johny12345
-
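A Custom HTML tag in Tag Manager can inject the markup with a small script. In this sketch the post list is hard-coded for illustration; on a real category page it would be pushed into the dataLayer from the templates and read from there (all names and URLs below are hypothetical):

```html
<script>
  (function () {
    // Hypothetical post data; in practice, push this into the dataLayer
    // from your category-page template and read it here.
    var posts = [
      { headline: 'My favourite dress',
        url: 'https://abcd.com/blog/cloth/my-favourite-dress' }
    ];
    var items = posts.map(function (p) {
      return {
        '@context': 'https://schema.org',
        '@type': 'BlogPosting',
        'headline': p.headline,
        'mainEntityOfPage': p.url
      };
    });
    // Append a single ld+json block containing all BlogPosting items.
    var s = document.createElement('script');
    s.type = 'application/ld+json';
    s.text = JSON.stringify(items);
    document.head.appendChild(s);
  })();
</script>
```

Because Tag Manager injects structured data after page load, it is worth verifying with Google's structured data testing tools that the markup is actually picked up.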
Category URL Pagination where URLs don't change between pages
Hello, I am working on an e-commerce site where there are categories with multiple pages. In order to avoid pagination issues I was thinking of using rel="next" and rel="prev" along with canonical tags. I noticed a site where the URL doesn't change between pages, so whether you're on page 1, 2, or 3 of the same category, the URL stays the same. Would this be a cleaner way of dealing with pagination?
Technical SEO | whiteonlySEO
-
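For reference, the conventional markup pairs a self-referencing canonical with next/prev hints on each paginated URL; note that if the URL never changes between pages (content swapped in with JavaScript), crawlers may only ever see page 1, which is usually worse, not cleaner. A sketch with placeholder URLs:

```html
<!-- Hypothetical tags on /category/widgets/page/2/ of a paginated series.
     The canonical points at the page itself, not at page 1. -->
<link rel="canonical" href="https://example.com/category/widgets/page/2/" />
<link rel="prev" href="https://example.com/category/widgets/" />
<link rel="next" href="https://example.com/category/widgets/page/3/" />
```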
Blog.furnacefilterscanada.com/ or furnacefilterscanada.com/blog/
My shopping cart does not allow me to install a WordPress blog in a subdirectory like furnacefilterscanada.com/blog/, but I can host my blog on another server with a subdomain like blog.furnacefilterscanada.com. From an SEO point of view, is there a difference between the two? Link juice? Page authority? Thank you, BigBlaze
Technical SEO | BigBlaze205
-
Trying to correct a duplicate page content error report, but unable to find which of over 100 blog posts contains the content SEOmoz reported as similar. Is my only option to just delete the blog page?
I am trying to correct duplicate content, but SEOmoz only reports and shows one page of the duplicate pair. I have 5 years' worth of blogs and cannot find the duplicate page. Is my only option to just delete the page to improve my rankings? Brooke
Technical SEO | wianno168
-
OK to block /js/ folder using robots.txt?
I know Matt Cutts suggests we allow bots to crawl CSS and JavaScript folders (http://www.youtube.com/watch?v=PNEipHjsEPU). But what if you have lots and lots of JS and you don't want to waste precious crawl resources? Also, as we update and improve the JavaScript on our site, we iterate the version number ?v=1.1... 1.2... 1.3... etc., and the legacy versions show up in Google Webmaster Tools as 404s. For example:
Technical SEO | AndreVanKets
http://www.discoverafrica.com/js/jquery.cookie.js?v=1.1
http://www.discoverafrica.com/js/global.js?v=1.2
http://www.discoverafrica.com/js/jquery.validate.min.js?v=1.1
http://www.discoverafrica.com/js/json2.js?v=1.1 Wouldn't it just be easier to prevent Googlebot from crawling the /js/ folder altogether? Isn't that what robots.txt was made for? Just to be clear: we are NOT doing any sneaky redirects or other dodgy JavaScript hacks. We're just trying to power our content and UX elegantly with JavaScript. What do you guys say: obey Matt, or run the JavaScript gauntlet?
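For reference, the directive itself is trivial; the trade-off is that blocking /js/ prevents Googlebot from rendering pages the way users see them, which is exactly why Matt Cutts advises against it. Note also that robots.txt will not clear the existing 404 reports, since those legacy versioned URLs are simply gone. The rule in question would look like:

```text
# robots.txt at the site root: block all crawlers from the JS folder.
# Trade-off: Googlebot can no longer render the pages as users see them.
User-agent: *
Disallow: /js/
```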