Duplicate title tags caused by upper case and lower case versions of URLs
-
Hi
GWT is reporting lots of duplicate titles for a client's new site.
Mainly these are due to two different versions of each URL: one with words starting with an upper case character and the other all lower case.
The client's dev is saying this has something to do with the Windows server and is OK!
Is this correct, or should I be telling them to delete the upper case versions and 301 redirect them to the lower case (since lower case is better practice), and will that deal with the reported duplicate titles?
All Best
Dan
-
Hi Tom
Sorry to be a pain, but please can you confirm, re my latest reply to your comment: does your advice still stand if the client is on a Windows server (since I don't think they have .htaccess files, or do/can they)?
And if not, is there a similar course of action that can be taken?
Cheers
Dan
-
Many thanks, Raymond!
All Best
Dan
-
That's great Tom, thanks a lot for your advice!
The client's on a Windows server; am I right in thinking they don't have .htaccess files, since those are only on Apache?
If so, is the principle/code the same, just done in a different file? If so, any ideas what that file is? Or am I mistaken, and Windows servers do have .htaccess files?
Cheers
Dan
-
Hi Dan
I wouldn't want to leave anything to doubt and would prefer to have one version of each URL available.
Fortunately, a fairly simple solution can be put in place in your htaccess file. As always, please back up and test before trying any implementation - I can't tell you how many times I've made a simple mistake in the htaccess file that causes big problems!
Anyway, the code you'd want to enter at the top of the file is:

# Note: RewriteMap is only valid in the main server config
# (httpd.conf or a VirtualHost block), not in .htaccess itself;
# define the map there, and the rule below can then reference it.
RewriteEngine On
RewriteMap lc int:tolower
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule (.*) ${lc:$1} [R=301,L]

That code will rewrite any URL containing uppercase letters to the same URL using only lowercase.
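To illustrate what that rule does, here is a small sketch (Python, with placeholder example.com URLs, not the client's site) of the lowercase mapping it performs on the path:

```python
# Sketch of the mapping the rewrite rule performs: lowercase the
# path of a URL while leaving the scheme, host and query untouched.
# The example.com URLs are placeholders.
from urllib.parse import urlsplit, urlunsplit

def lowercase_path(url: str) -> str:
    """Return the URL with its path lowercased."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path.lower(),
                       parts.query, parts.fragment))

print(lowercase_path("http://www.example.com/Some-Page/"))
# -> http://www.example.com/some-page/
```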
Redirects are quicker and more reliable than canonical tags in my experience and this doesn't take long to get implemented, so best not leave anything to chance.
Hope this helps.
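As an aside on the Windows question raised above: IIS doesn't use .htaccess files, but with Microsoft's URL Rewrite module installed, an equivalent lowercase redirect can go in the site's web.config. A rough sketch (the rule name is arbitrary):

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301 any URL containing uppercase letters to its lowercase form -->
        <rule name="LowercaseRedirect" stopProcessing="true">
          <match url="[A-Z]" ignoreCase="false" />
          <action type="Redirect" url="{ToLower:{URL}}" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```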
-
GWT sometimes takes a while to register the change. This actually just happened on one of my clients' pages: a bug in the code made by the developer produced 30,000 duplicate title and description errors. It was fixed about 2 months ago, and while the errors have dropped to 2,000 or so, it is still not at zero yet, so it does take some time depending on how often Google crawls the site.
If the developer says they did it, I would recommend the first thing you do is simply check that they did it. Go to one of the pages in question, both the version with capitals and the one without, and do a view source on both to check that the rel="canonical" tag is there and is correct. Programmers don't always know much about SEO, so they may be implementing them wrong.
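For reference, the tag to look for in the page source looks like this (example.com stands in for the client's domain); both the uppercase and lowercase versions of a page should point at the single lowercase URL:

```html
<!-- in the <head> of BOTH /Some-Page/ and /some-page/ -->
<link rel="canonical" href="http://www.example.com/some-page/" />
```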
-
Thanks Raymond
The developers are saying they have done this.
Does that mean that if it's still showing in GWT, it hasn't been done?
Cheers
Dan
-
As a simpler solution, why don't you just add a rel="canonical" tag to each page specifying the one you want? That should take care of any duplicate content issues.
I hope this helps.