How do you find old linking URLs that contain uppercase letters?
-
We have recently moved our back office systems. On the old system we had the ability to use upper and lower case letters in the URLs; on the new system we can only use lower case, which we are happy with.
However, any old URLs from external sites linking into us that still contain uppercase lettering now hit the 404 error page. So, how do we find them, and what solutions are there?
Example:
http://www.christopherward.co.uk/men.html - works
http://www.christopherward.co.uk/Men.html - Fails
Kind regards
Mark
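One way to surface the affected URLs is to scan the web server's access log for 404 responses on paths that contain uppercase letters, then sort by how often each is requested. A rough Python sketch, assuming an Apache-style common/combined log format and a hypothetical log path (adjust both to your setup):

```python
import re
from collections import Counter

# Matches the request and status fields of Apache's common/combined log format:
# 1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /Men.html HTTP/1.1" 404 1234 ...
LINE_RE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

hits = Counter()
with open("/var/log/apache2/access.log") as log:  # hypothetical log location
    for line in log:
        match = LINE_RE.search(line)
        if not match:
            continue
        path, status = match.group("path"), match.group("status")
        # Old mixed-case links show up as 404s on paths containing uppercase letters
        if status == "404" and re.search(r"[A-Z]", path):
            hits[path] += 1

# Most-requested broken URLs first; these are the redirects worth adding soonest
for path, count in hits.most_common():
    print(f"{count:6d}  {path}")
```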
-
Hi Corey
Thanks for the response.
I'm looking at 301s and adding them in as required. Hosting is Apache based. As always, going through the data turned up pre-existing issues that were causing a knock-on effect.
Cheers
Mark
-
I would definitely set up some 301 redirects, done dynamically if it's for a lot of pages. You'll want that old 'link juice', as well as any inbound visitors. Depending on whether your hosting is IIS or Apache-based, it's done a little differently, but it should be pretty straightforward.
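On Apache, for example, a blanket lowercase redirect is commonly done with a RewriteMap. A minimal sketch, assuming mod_rewrite is enabled and you can edit the server or virtual-host config (RewriteMap can't be declared in .htaccess):

```apache
# In the server or virtual-host config: define a map that lowercases its input
RewriteMap lc int:tolower

RewriteEngine On
# Only touch requests whose path contains at least one uppercase letter
RewriteCond %{REQUEST_URI} [A-Z]
# Permanently (301) redirect to the lowercased form of the same path
RewriteRule (.*) ${lc:$1} [R=301,L]
```

IIS's URL Rewrite module has a built-in ToLower function that can be used for the same thing. Just confirm that no legitimate URLs on the site actually need uppercase characters before applying a blanket rule.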
Related Questions
-
Doubts with URL structure
Hi guys, I have some doubts about the correct URL structure for a new site. The question is about how to show the city, the district, and also the filters. I would do this: www.domain.com/category/city/district, but maybe it is better to do this: www.domain.com/category/city-district. I also have 3 filters ("individual/collective", "indoor/outdoor" and "young/adult"), but they are not really interesting for the queries, so where and how do I put these filters? At the end of the URL, like this: www.domain.com/category/city/district#adult#outdoor#collective ? I really don't know what to do with the filters, so please see if you can help me with that. I am also very interested in knowing whether it is better to use www.domain.com/category-city or domain.com/category/city, and what the difference is. Thank you very much!
Intermediate & Advanced SEO | omarmoscatt0 -
If it's not in Webmaster Tools, is it a Duplicate Title?
I am showing a lot of errors in my SEOmoz reports for duplicate content and duplicate titles, many of which appear to be related to capitalization vs. non-capitalization in the URL. Case in point: a URL containing a lowercase character, such as: http://www.gallerydirect.com/art/product/allyson-krowitz/distinct-microstructure-i as opposed to the same URL having an uppercase character in the structure: http://www.gallerydirect.com/art/product/allyson-krowitz/distinct-microstructure-I I am finding that some of the internal links on the site use the former structure and other links use the latter. These show as duplicate titles/content in the SEOmoz reports, but they don't appear as duplicate titles in Webmaster Tools. My question is: should I work with our developers to create a script to change all of the destination links containing capital letters internally on the site, or is this a non-issue since it doesn't appear in Webmaster Tools?
Intermediate & Advanced SEO | sbaylor0 -
Is it a good idea to have a directory on your website?
Currently I have a directory on a subdomain, but Google apparently sees it as part of my main domain, so could all the outgoing links be affecting my rankings?
Intermediate & Advanced SEO | Valarlf0 -
Looking for good examples of websites' geotargeting
I am looking for some examples of sites that handle their global presence well (geotargeting and languages), ideally from a single .com domain. Thanks! Stephen
Intermediate & Advanced SEO | firstconversion0 -
How to Handle Fashion Jewelry Stores' Listings?
Hi! I'm working with a few brands who hardly ever make the same designs again... which is a good thing for my buyers. But I'm worried about how to handle this with SEO in mind. Should I delete pages after a product is sold out, or is there some other, better way to handle this? Thanks, Vinku
Intermediate & Advanced SEO | vinku0 -
Best way to find broken links on a large site?
I've tried using Xenu, but this is a bit time consuming because it only tells you if the link isn't found and doesn't tell you which pages link to the 404'd page. Webmaster Tools seems a bit dated and unreliable; several of the links it lists as broken aren't. Does anyone have any other suggestions for compiling a list of broken links on a large site?
Intermediate & Advanced SEO | nicole.healthline1 -
Rel canonical element for different URLs
Hello, We have a new client that has several sites with the exact same content. They do this for tracking purposes. We are facing political objections to combining them and tracking differently, so basically we have no choice but to deal with the situation as given. We want to avoid duplicate content issues and want to SEO only one of the sites. The other sites don't really matter for SEO (they have offline campaigns pointing to them); we just want one of the sites to get all the credit for the content. My questions: 1. Can we use the rel canonical element on the irrelevant pages/URLs to point to the site we care about? I think I remember Matt Cutts saying this can't be done across URLs. Am I right or wrong? 2. If we can't, what options do I have (without making the client change their entire tracking strategy) to make the site we are SEO'ing the relevant content? Thanks a million! Todd
Intermediate & Advanced SEO | GravitateOnline0 -
Robots.txt: Link Juice vs. Crawl Budget vs. Content 'Depth'
I run a quality vertical search engine. About 6 months ago we had a problem with our sitemaps, which resulted in most of our pages getting tossed out of Google's index. As part of the response, we put a bunch of robots.txt restrictions in place in our search results to prevent Google from crawling through pagination links and other parameter based variants of our results (sort order, etc). The idea was to 'preserve crawl budget' in order to speed the rate at which Google could get our millions of pages back in the index by focusing attention/resources on the right pages. The pages are back in the index now (and have been for a while), and the restrictions have stayed in place since that time. But, in doing a little SEOMoz reading this morning, I came to wonder whether that approach may now be harming us... http://www.seomoz.org/blog/restricting-robot-access-for-improved-seo
http://www.seomoz.org/blog/serious-robotstxt-misuse-high-impact-solutions Specifically, I'm concerned that a) we're blocking the flow of link juice and that b) by preventing Google from crawling the full depth of our search results (i.e. pages >1), we may be making our site wrongfully look 'thin'. With respect to b), we've been hit by Panda and have been implementing plenty of changes to improve engagement, eliminate inadvertently low quality pages, etc, but we have yet to find 'the fix'... Thoughts? Kurus
Intermediate & Advanced SEO | kurus0