Removing URL Parentheses in .htaccess
-
I'm reworking a website for a client, and their current URLs contain parentheses. I'd like to get rid of these, but individual 301 redirects in .htaccess are not practical, since the parentheses appear in many URLs.
Does anyone know an .htaccess rule that will simply remove URL parentheses as a 301 redirect?
-
I thought I'd come back and re-post the solution in case this shows up in SERPs or any other Moz members are looking for this answer (courtesy of Noah Wooden of Izoox). The .htaccess code:
<IfModule mod_rewrite.c>
RewriteEngine On
# Strip a set of opening/closing parentheses from the URL and 301 redirect.
RewriteCond %{REQUEST_URI} [()]+
RewriteRule ^(.*)[(]+([^)]*)[)]+(.*)$ /$1$2$3 [R=301,L]
</IfModule>
Remember to put this in the proper order in your .htaccess file if you are doing any other redirecting. The code above 301 redirects URLs containing parentheses to the exact same URL minus the parentheses.
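If you want to sanity-check what that rule does before deploying it, here is a minimal PHP sketch (illustrative only, not part of the .htaccess file) that applies the same regex to a path, using the example URL from further down this thread; the function name is my own:

<?php
// Mimics what the RewriteRule above does to a request path. Because the
// rule issues an external 301, each request strips one parenthesis group
// and the browser follows a short chain of redirects; the loop below
// mirrors that chain.
function strip_parentheses($uri) {
    while (preg_match('#^(.*)[(]+([^)]*)[)]+(.*)$#', $uri, $m)) {
        $uri = $m[1] . $m[2] . $m[3];
    }
    return $uri;
}

echo strip_parentheses('/category1/subcategory/product-name-details-(model-number)-length');
// Prints: /category1/subcategory/product-name-details-model-number-length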
-
Thanks Merlin - I'll have their programmer try this.
-
OK, having looked at the site, I would recommend that the changes are done server-side. I assume the .htaccess file is being used to parse category/subcategory/productname to a controller. Within that controller, if the product is found, clean the product name with the function below, test it against the given URL, and do a 301 redirect to the cleaned name if they do not match (a minimal sketch of this check follows the function).
function clean_name($string) {
    // Remove any character that is not a hyphen, letter, digit, or space
    // (this is what strips the parentheses).
    $non_acceptable = '#[^-a-zA-Z0-9 ]#';
    $string = preg_replace($non_acceptable, '', $string);
    $string = trim($string);
    // Collapse runs of hyphens, underscores, and spaces into a single hyphen.
    $string = preg_replace('#[-_ ]+#', '-', $string);
    return $string;
}
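A minimal sketch of that controller-side check, using the clean_name() function above (the variable names and redirect handling here are my own illustration, not code from the original reply):

<?php
// Hypothetical controller snippet: compare the requested product slug with
// its cleaned form and 301-redirect when they differ.
$path  = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
$slug  = basename($path);     // e.g. "product-name-details-(model-number)-length"
$clean = clean_name($slug);   // e.g. "product-name-details-model-number-length"

if ($slug !== $clean) {
    // Note: urldecode($slug) may be needed first if the parentheses arrive percent-encoded.
    header('Location: ' . rtrim(dirname($path), '/') . '/' . $clean, true, 301);
    exit;
}
// Otherwise continue rendering the product page as normal.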
-
Hi Merlin - thank you.
Here is an example:
www.domain-name.com/category1/subcategory/product-name-details-(model-number)-length
needs to change to:
www.domain-name.com/category1/subcategory/product-name-details-model-number-length
Any suggestion would be great. Their programmer is having trouble creating a rule.
Thanks in advance
-
I have one or two suggestions. Could you provide an example of the URL? This will allow me to give a more definitive solution.