301 Not Allowed...Other Solutions?
-
A client's site has both the www. and non-www. versions being indexed. The non-www. version has roughly 1,000 or so links pointing back to the site, while the www. version has over twice as many. In addition, the www. version has higher domain authority.
Their programmer has said that they can't implement 301 (permanent) redirects across their site for a few reasons.
My question is: what would be the best alternative to block or redirect the non-www. version so it isn't indexed, while still passing link juice?
-
Hey James - I am curious as to why you think a 'canonical tag' wouldn't be a "long term fix"?
-
Thanks for the responses, everyone. Everyone seems to be in agreement that a 301 is the proper course of action, and that it should be explained to the developer that way.
I just have never run into a developer opposing the idea. So, thanks again for the feedback!
-
First of all, if the developer says he won't use a 301, then he's one very strange developer; it's about as basic as web development gets. But be that as it may: a 301 is the best and most viable option, but you do have a few others:
- Tell GWT your preferred domain.
- Have the developer add a dynamic rel=canonical tag, something like this (in PHP):
<link rel="canonical" href="<?php echo checkURL($_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI']); ?>" />
In the above situation he would of course have to write a function called checkURL that tests whether the URL begins with http://www and returns the correctly formatted version. And tell GWT what domain you prefer.
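A minimal sketch of what that checkURL helper could look like, purely as an assumed illustration (only the function name comes from the answer above; the logic and the example domain are placeholders):

<?php
// Hypothetical sketch: force whatever host/path was requested onto the www. version.
function checkURL($url) {
    // Drop any protocol that was passed in.
    $url = preg_replace('#^https?://#', '', $url);
    // Prepend "www." if the host doesn't already start with it.
    if (strpos($url, 'www.') !== 0) {
        $url = 'www.' . $url;
    }
    return 'http://' . $url;
}

// Both of these print http://www.example.com/page.php
echo checkURL('example.com/page.php') . "\n";
echo checkURL('http://www.example.com/page.php') . "\n";
?>

Keep in mind this only hints at the preferred version; unlike a 301, it leaves the duplicate URLs reachable.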
But again, the only right way is a 301, and it's ridiculously simple to set up.
-
If you properly communicate why 301 redirects are the only proper solution, and you still keep running up against a dev who makes up pie-in-the-sky technical "reasons" why they can't, then the question is whether they respect you enough to put real value in your view and your recommendation.
Setting that aside, the canonical tag, coupled with going into Google Webmaster Tools and setting the www version as the preferred version, will help, but as has been pointed out, not in anything close to an ideal way.
For long-term sanity, I highly recommend you explore why you're getting the resistance. Is it because the dev feels threatened, or doesn't want to do the work? Setting up server-wide 301 redirects is NOT that difficult or time-consuming for anyone who knows what they're doing. So you may want to provide them links to the "how-to" for their particular server configuration.
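As a rough illustration only (assuming an Apache server with mod_rewrite enabled; example.com is a placeholder), a site-wide non-www to www 301 can be as short as a few lines in the site's .htaccess:

RewriteEngine On
# Send any request for the bare domain to the www. version,
# keeping the path and query string, with a permanent (301) redirect.
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

Nginx, IIS, and most other servers have equivalents that are just as short, which is why the "too hard" objection rarely survives a closer look.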
If they are lazy, you'll need to find a way to show the decision maker(s) that failing to implement them is costing the company revenue.
-
My advice is to make a deck/business case for the programmer that shows him the problems with having two versions of the website indexed. I have encountered a few developers who are not really in tune with the issues around duplicate content and SEO. I think the best approach is to get on the same page: show the developer some respect, acknowledge that the design is good, but then also educate him about SEO.
If you use a canonical tag, sorry to say, it is not going to be a long-term fix; it will just be a short-term fix.
Try and push for the 301s.
-
I have never encountered a developer who resisted redirecting the non-www URLs to the www form. If you have access to the server, the change should be straightforward to make.
If you decide not to use a 301, use a canonical tag to identify the correct version of the page. I would also use both Google and Bing Webmaster Tools to indicate which URL format you wish to use.