Lowercase Rewrite in web.config. Strange behavior.
-
So I am trying to implement a lowercase rule in the rewrite section of my web.config, and I am using the same snippet below that can be found all over the web.
The lowercasing of my URL works fine, but it also inserts my domain name into the path for reasons I can't explain, so www.SomeSite.com/Upper.aspx becomes www.somesite.com/somesite/upper.aspx.
I removed all other rewrite rules except this one and it still happens. Has anyone seen this?
My server is IIS 7.
<match url="[A-Z]" ignoreCase="false" />
<action type="Redirect" url="{ToLower:{URL}}" />
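For context, here is a minimal sketch of how that match/action pair would normally sit inside a complete web.config rule. Only the match and action come from the snippet above; the rule name, `stopProcessing`, `redirectType`, and the surrounding elements are my assumptions, not part of the original question:

```xml
<!-- Hedged sketch: only the match/action pair comes from the thread;
     the rule name and surrounding elements are assumptions. -->
<system.webServer>
  <rewrite>
    <rules>
      <rule name="LowerCaseUrls" stopProcessing="true">
        <match url="[A-Z]" ignoreCase="false" />
        <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```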
-
Hi Greg, great to hear that it's working OK and that it was a cache issue. Thanks for the update!
-
Hi James,
I am not sure why, but it seems that my problem was cache-related. I went to work and saw that my URLs were rewriting just fine. Sorry I didn't get back to this thread for so long.
Regards,
Greg
-
Hi Greg,
You may need to be more specific in your match pattern, e.g.
<match url=".[A-Z]." ignoreCase="false" />
Instead of:
<match url="[A-Z]" ignoreCase="false" />
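The difference between the two patterns is easy to check outside IIS. This is just an illustration in Python (the IIS rewrite engine uses its own regex flavor, but these simple patterns behave the same way), using hypothetical paths as `<match url>` would see them, i.e. without a leading slash:

```python
import re

# The original pattern: fires on any uppercase letter anywhere in the path.
loose_pattern = r"[A-Z]"
# The suggested pattern: the uppercase letter must have a character on
# each side, so a capital at the very start or end of the path is ignored.
strict_pattern = r".[A-Z]."

# Hypothetical sample paths (IIS feeds <match url> the path without
# a leading slash).
urls = ["Upper.aspx", "upper.aspx", "page/Upper.aspx", "A"]

loose = [u for u in urls if re.search(loose_pattern, u)]
strict = [u for u in urls if re.search(strict_pattern, u)]

print(loose)   # ['Upper.aspx', 'page/Upper.aspx', 'A']
print(strict)  # ['page/Upper.aspx']
```

So the stricter pattern simply skips paths whose only capital letter sits at the very edge of the string; it does not change what the redirect target looks like.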
Related Questions
-
Best Practices For Angular Single Page Applications & Progressive Web Apps
Hi Moz Community, Is there a proper way to do an SPA (client-side rendered) and PWA without having a negative impact on SEO? Our dev team is currently trying to convert most of our pages to an Angular single-page application, client-side rendered. I told them we should use a prerendering service for users that have JS disabled, or use server-side rendering instead, since this would ensure that most web crawlers would be able to render and index all the content on our pages even with all the heavy JS use. Is there an even better way to do this, or some best practices?

In terms of the PWA that they want to add along with changing the pages to an SPA, I told them this is pretty much separate from SPAs because they are not dependent on each other. Adding a manifest and service worker to our site would just be an enhancement. Also, if we do a complete PWA with JS for populating content/data within the shell — meaning not just the header and footer, making the body a template with dynamic JS as well — would that affect our SEO in any way? Any best practices here as well? Thanks!
Technical SEO | znotes
-
Many errors in Search Console (strange parameters)
Hello, I have many strange parameters in my search console that make many 404 pages, for example: mywebsite.com/article-name/&ct=ga&cd=CAIyGjk4YjY4ZDExNTYxOTgzZTk6Y29tOmVuOlVT&usg=AFQjCNFvpYpYpYf9DoyRBBu8jbiQB8JcIQ mywebsite.com/article-name/&sa=U&ved=0ahUKEwj1zMLR0JbLAhUGM5oKHejjBJAQqQIILSgAMAk&usg=AFQjCNEBNFx3dG5B0-16X6eXTS7k-Srm6Q Can someone tell me how to solve it?
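Since those junk URLs all seem to start with `/&` after the article slug, one option is a permanent redirect back to the clean article URL. This is a hedged sketch only, not from the thread — the rule name is made up, and it assumes the stray parameters always begin with `/&ct=` or `/&sa=` as in the two examples above:

```xml
<!-- Hedged sketch: 301 malformed article URLs back to the clean URL.
     Assumes the junk always starts with "/&ct=" or "/&sa=". -->
<rule name="StripStrayTrackingParams" stopProcessing="true">
  <match url="^(.*)/&amp;(ct|sa)=.*$" />
  <action type="Redirect" url="{R:1}/" redirectType="Permanent" />
</rule>
```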
Technical SEO | JohnPalmer
-
Magento Rewrite Issue
Moz's crawler has thrown up a bunch of crawl issues for my site. The site is a Magento-based site and I recently updated the themes, so some routes may have become redundant. Moz has identified 289 pages with a temporary redirect. I thought Magento managed the redirects if I set "Auto-redirect to Base URL" to Yes (301 Moved Permanently), but this is enabled on my store and I still get the errors. The only thing I could think of was to add a robots.txt and handle the redirection of these links from there. But handling redirection for 289 links is no mean task. I was looking for any ideas that could fix this without me doing it manually.
Technical SEO | abhishek1986
-
Domain Mapping - International Config - WP Multisite - Problems/issues experienced ?
Hi, my new client's devs are setting up the new client according to the domain mapping type config below. Is anyone aware of any problems this could cause, or another way it should be set up? Also, will setting up an MA campaign for the root domain 'catch all' work, given the domain mapping aspect etc.? Cheers, Dan

Set up now using WP Multisite. The root domain for the network of websites we are going to roll out is set up as "network.domain.com". This is a "parent" domain from which all language variants will be children of this root domain name, so:
"network.domain.com/uk/" - English Language
"network.domain.com/tr/" - Turkish Language
"network.domain.com/za/" - South African
etc. I then will domain map domain names to each of these children, once I get DNS control of each language's settings. I have already mapped "www.domain.com" to "network.domain.com/uk/", so the English website is set up (and launched, as you know). I fully expect that "www.domain.tr" will be mapped to "network.domain.com/tr/" and so on, but depending on domain availability at the time of purchase. Any domain name can be mapped to each of these children, and the system doesn't care, or mind! I can also map one (or more) domain names to each child, and make ONE of them the primary domain name.
Technical SEO | Dan-Lawrence
-
Two "Twin" Domains Responding to Web Requests
I do not understand this point in my campaign set-up. They are the same site as far as I understand. Can anyone help please? Quote from SEOMOZ: "We have detected that the domain www.neuronlearning.eu and the domain neuronlearning.eu both respond to web requests and do not redirect. Having two "twin" domains that both resolve forces them to battle for SERP positions, making your SEO efforts less effective. We suggest redirecting one, then entering the other here." Thanks, John
Technical SEO | johnneuron
-
How to find original URLS after Hosting Company added canonical URLs, URL rewrites and duplicate content.
We recently changed hosting companies for our ecommerce website. The hosting company added some functionality such that duplicate content and/or mirrored pages appear in the search engines. To fix this problem, the hosting company created both canonical URLs and URL rewrites. Now, we have page A (which is the original page with all the link juice) and page B (which is the new page with no link juice or SEO value). Both pages have the same content, with different URLs.

I understand that a canonical URL is the way to tell the search engines which page is the preferred page in cases of duplicate content and mirrored pages. I also understand that canonical URLs tell the search engine that page B is a copy of page A, but page A is the preferred page to index.

The problem we now face is that the hosting company made page A a copy of page B, rather than the other way around. But page A is the original page with the SEO value and link juice, while page B is the new page with no value. As a result, the search engines are now prioritizing the newly created page over the original one.

I believe the solution is to reverse this and make it so that page B (the new page) is a copy of page A (the original page). Now, I would simply need to put the original URL as the canonical URL for the duplicate pages. The problem is, with all the rewrites and changes in functionality, I no longer know which URLs have the backlinks that are creating this SEO value. I figure if I can find the backlinks to the original page, then I can find out the original web address of the original pages. My question is, how can I search for backlinks on the web in such a way that I can figure out the URL that all of these backlinks are pointing to, in order to make that URL the canonical URL for all the new, duplicate pages.
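For reference, once the original URL is identified, pointing a duplicate at it is a single tag in the duplicate page's head. The URLs below are placeholders, not from the question:

```html
<!-- On page B (the duplicate), assuming page A is the original: -->
<head>
  <link rel="canonical" href="https://www.example.com/page-a" />
</head>
```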
Technical SEO | CABLES
-
Missing Cache - very strange
Anyone have experience with a cache going missing from a page that had a cache in the past? We're overhauling a page and noticed the cache was gone from Google results. We don't know if this event is good/bad/doesn't matter, but I am curious why this happens. I am positive the cache was missing before we updated the page today, because a programmer mentioned they tried checking for a cache for a historical load time prior to today for a different project. I have attached two screenshots to illustrate two things: 1) What Google delivers for a cache: of the page instead of the normal cache page 2) Even though you can see a cache of any of the indexed pages we have from a SERP, the cached link is missing in SERPs for the mentioned page. Has anyone seen this before? Thanks!
Technical SEO | CouponCactus