Removing UpperCase URLs from Indexing
-
This search - site:www.qjamba.com/online-savings/automotix
gives me this result from Google:
Automotix online coupons and shopping - Qjamba
https://www.qjamba.com/online-savings/automotix
Online Coupons and Shopping Savings for Automotix. Coupon codes for online discounts on Vehicles & Parts products.
Google also tells me there is another result, which is 'very similar'. When I click to see it I get:
Automotix online coupons and shopping - Qjamba
https://www.qjamba.com/online-savings/Automotix
Online Coupons and Shopping Savings for Automotix. Coupon codes for online discounts on Vehicles & Parts products.
This is because I recently changed my program to redirect every URL containing uppercase characters to its lowercase version, since all-lowercase URLs appear to be strongly recommended.
I assume that having two indexed URLs for the same content dilutes link juice. Can I safely remove all of my uppercase indexed pages from Google without it affecting the indexing of the lowercase URLs? And if so, what is the best way -- there are thousands of them.
-
Hi AMHC,
It makes sense that with hardly any backlinks built up, Google won't find my uppercase URLs since all the page links have been changed. However, I am logging every URL that gets redirected and emailing the list to myself, and from that I can tell that Google is finding them -- I guess they keep a list of URLs from prior indexing that they crawl independently of whatever their crawler turns up from links.
I'll keep watching what they have indexed, and if it turns out they just aren't crawling certain pages, I will put those in a sitemap to be crawled. It's a good idea for taking care of the problem quickly -- so if things progress too slowly I'll do that.
Thanks very much for your answers!
-
Google needs to crawl the bad pages that you 301'd. If there are no live links to those pages, then Google can't find them to see the 301. In short, if you created new lowercase URLs without Google ever discovering the redirects, you just increased your duplicate content problem.
To solve this problem, build an HTML sitemap with all of the bad URLs. Have Google fetch and submit the page and all of the pages it links to. Google will crawl all of your old pages and apply the 301s.
-
Thanks AMHC. In my case, I just don't have many backlinks, so I don't have the urgency that you faced in getting Google to see all the redirects. But I'm still not understanding -- it sounds like you believe that once Google sees the redirect, it removes the old uppercase URL from its index. That doesn't look like what happened in my case, because Google is currently indexing BOTH, which means it has crawled my new lowercase URLs, and I know it isn't crawling any uppercase ones anymore (it can't -- they are all redirected). So that's why I wonder if I have to remove those uppercase URLs... does that make sense, or am I just not understanding it still?
EDIT: I just discovered I wasn't doing a 301 redirect, so the move wasn't being treated as permanent. Fixing that, if I understand it right, will get the uppercase URLs removed from Google's index permanently.
-
Canonicals still drain link juice; they aren't like a 301. The link juice stays on the canonicalized page. All a canonical does is tell Google, in the case of duplicate content, which page is primary. Canonicals handle the duplicate content issue; they do not handle the link juice issue. If I have two duplicate pages, /product-name/ and /product-name=?khdfpohfo/, I can, via a canonical, tell Google to ignore the page with the variable string and rank the page without it. But if the page with the variable string has links, the link juice stays on that page.
The HTML sitemap is there to tell Google about the 301s. The sitemap would look something like this:
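A bare-bones sketch (these URLs are just placeholders -- you'd list every one of your own old mixed-case URLs):

```html
<!-- Nothing fancy: one plain link per old mixed-case URL, so Googlebot
     has something to follow straight into each 301 redirect. -->
<html>
  <head><title>Old URL index</title></head>
  <body>
    <ul>
      <li><a href="https://www.example.com/online-savings/Automotix">Automotix</a></li>
      <li><a href="https://www.example.com/online-savings/SomeStore">SomeStore</a></li>
      <!-- ...one entry for every uppercase URL being redirected... -->
    </ul>
  </body>
</html>
```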
After you do the 301 redirect, as well as set up the rules in the .htaccess file (I think -- I'm not the developer on this), everything should redirect to the lowercase URL. The problem is that if you do a 301 redirect for your entire site, Google may not figure it out very quickly. When it crawls your site from the home page downward, it's only going to see the new URLs, and it can't crawl the old 301'd URLs because there aren't any internal links pointing at them. The only way Google will see the 301 is via an external backlink. The way we solved this was to create an HTML sitemap of all of the old uppercase URLs. We then had Google fetch and crawl/index the sitemap. As it crawls the sitemap, where every URL is a 301 redirect, it will likewise point all of the link juice at the new URLs.
-
I gotcha. Yeah, different thing going on here... these URLs can be really difficult! I have uppercase vs lowercase, https vs http, URLs that serve different content (not just formatting) on mobile vs desktop and vice versa, mobile URLs that don't even exist for desktop, and desktop URLs that don't exist for mobile... all under the same domain, with thousands of internal pages. In the desire to create a good website for users, I've created an SEO monster, because I didn't realize the many consequences with regard to search indexes.
If you know a true expert in these areas, I need him/her. Four years on this site, it's finally live (two months now), and I'm discovering all of these things that have to be fixed, but I can't afford thousands of dollars. I'll do the work, I just need the knowledge!
-
I see where you are coming from, and I don't have a good answer then. When I did a lowercase redirect, I started by creating the new lowercase pages and setting canonicals pointing to them. After a few months I removed the uppercase versions and redirected them to the new lowercase URLs.
-
Hutch, thanks.
The site is dynamic, with thousands of pages that are now being redirected to lowercase, so I'm not seeing how using a canonical would work, because the uppercase URLs aren't on the site anymore. I think of canonical as being useful when you have ongoing content on the site that duplicates one or more other pages on the same site. In my case none of the uppercase URLs exist anymore, so they don't have 'ongoing' content. I'm still new to this, so if it sounds like I have it wrong, please correct me.
-
Another quick fix would be to use a canonical tag on all of your pages pointing to the full lowercase versions.
So for the URLs example.com/UPPER, example.com/Upper, and example.com/upper, you would place the following into the head so Google knows that these are just variations of the same page, and it will point search to the desired page, example.com/upper.
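Something like this on each variation (assuming example.com/upper is the version you want indexed):

```html
<link rel="canonical" href="https://example.com/upper" />
```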
-
AMHC, thank you for your response. I'm in the middle of quite a mess, as this is one of several issues, so I really appreciate your help. I must confess to not following everything you wrote exactly:
In your situation, I think I understand the redirect -- it is the same reason I am doing a redirect: so that anyone coming to this site with uppercase in the URL will end up on the lowercase page, and in Google's case it will then index the page as a lowercase page. BTW, for me that has been easy, as I am doing it via PHP -- if the URL doesn't equal the strtolower() of itself, I redirect to the strtolower() version.
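For anyone curious, a rough sketch of that kind of check (illustrative only, not my exact code, and example.com stands in for the real domain):

```php
<?php
// Rough sketch: if the requested path contains any uppercase characters,
// send a 301 (permanent) redirect to the all-lowercase version.
$path  = $_SERVER['REQUEST_URI'];
$lower = strtolower($path);

if ($path !== $lower) {
    // Note: this naive version also lowercases any query string;
    // a real implementation would treat the query string separately.
    // Using 301 (not the default 302) is what makes the move "permanent".
    header('Location: https://www.example.com' . $lower, true, 301);
    exit;
}
// ...otherwise serve the page as normal...
```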
I think I get what you are saying about the sitemap -- it speeds up Google crawling the site and seeing, from your redirects, that all those uppercase URLs should be lowercase. In my case, I don't have the concern about Google discovering them that you did, because my site is only a couple of months old. Also, I have never given Google a sitemap, so many of my pages aren't crawled yet (I am trying to clean up my entire URL structure before I submit a sitemap to them -- however, they have already crawled perhaps 20% of the site, so I'm now trying to examine what Google has crawled and how it has been indexed to figure out what needs to be done).
What I'm not understanding is this: it seems to me that what you described should succeed, going forward, at getting both Google and your users to the right ending page, but I don't see how it removes the prior uppercase URLs from Google's index. What is it that tells Google your prior uppercase URLs should no longer be in their index? Is it the fact that they aren't in the sitemap you provide now? Or do they literally have to be removed using some kind of removal or disavow tool? I discovered this (as you see in the OP) because Google appears to never have removed the uppercase ones, even though it is indexing the lowercase now.
Ted
-
We had the same issue. Boy, was it an education. I had no idea that URLs were case sensitive for Google, and neither did my SEO buddies. I bet if you asked 100 SEOs if URLs were case sensitive for Google, 95 would answer "No". We discovered the problem in GWT and GA when they had different statistics for the mixed case and all lower case versions of the URL. We believed that we had both a duplicate content issue as well as a link juice splitting issue, with backlinks being pointed at both URLs.
We solved the problem by doing a 301 redirect, but as we are an ecommerce site with thousands of products, it was a messy process. We had to redirect pretty much every page on the site since the mixed case categories contaminated subcategories and products.
The 301 went pretty smoothly, and we saw a minor bump up in some of our rankings. I would strongly suggest that you create an HTML sitemap for every uppercase URL that you are going to 301. Here was our thinking -- we could be wrong on this. If we just 301 a page and don't tell Google, then Google won't know about it unless it tries to crawl the page. We felt like we needed to show Google that all of the pages were being redirected ASAP. Create an HTML sitemap with all of your uppercase URLs. After you do the 301, have Google fetch and index the sitemap page and all of the pages that it links to. Leave the map up for a few days, and then you can take it down. This will expedite moving the link juice to the correct pages, as Google will index the 301 for every page in the sitemap.