301 Not Allowed...Other Solutions?
-
A client's site has both the www. and non-www. versions being indexed. The non-www. version has roughly 1,000 links pointing back to the site, while the www. version has more than twice as many. In addition, the www. version has higher domain authority.
Their programmer has suggested that they can't implement 301 (permanent) redirects across their site, for a few reasons.
My question is: what would be the best alternative to keep the non-www. version from being indexed while still passing link juice?
-
Hey James - I am curious as to why you think a 'canonical tag' wouldn't be a "long term fix"?
-
Thanks for the responses, everyone. Everyone seems to be in agreement that a 301 is the proper course of action, and that it should be explained to the client that way.
I just have never run into a developer opposing the idea. So, thanks again for the feedback!
-
First of all, if the developer says he won't use a 301, then he's one very strange developer - it's a basic part of web development. But be that as it may, a 301 is the best and most viable option. You do have a few others, though:
- tell GWT (Google Webmaster Tools) your preferred domain.
- have the developer add a dynamic rel="canonical" tag, something like this (in PHP):
<link rel="canonical" href="<?php echo checkURL($_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI']); ?>" />
In the above situation he would of course have to write a function called checkURL that tests whether the URL begins with http://www and returns the correctly formatted version. And again, tell GWT which domain you prefer.
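For what it's worth, a minimal sketch of such a helper might look like this - the checkURL name comes from the answer above, but the body is an assumption, not the original code:
<?php
// Hypothetical sketch of the checkURL helper described above.
function checkURL($url) {
    // Drop any scheme and a leading "www." so every variant is rebuilt the same way
    $bare = preg_replace('#^https?://#i', '', $url);
    $bare = preg_replace('#^www\.#i', '', $bare);
    // Always return the preferred, www-prefixed form
    return 'http://www.' . $bare;
}
?>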
But again, the only right way is a 301, and it's ridiculously simple to set up.
-
If you properly communicate why 301 redirects are the only proper solution, and you continue to run up against a dev who makes up pie-in-the-sky technical "reasons" why they can't be done, then the question is whether they respect you enough and put enough value in your view and recommendation.
Setting that aside, the canonical tag, coupled with going into Google Webmaster Tools and setting the www version as the preferred version, will help - but, as has been pointed out, not in anything close to an ideal way.
For long-term sanity, I highly recommend you explore why you're getting the resistance. Is it because the dev feels threatened, or doesn't want to do the work? Setting up server-wide 301 redirects is NOT that difficult or time-consuming for anyone who knows what they're doing, so you may want to provide them links to the how-to for their particular server configuration (see the sketch below).
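To illustrate how small the job usually is: on Apache, a server-wide non-www to www 301 is typically just a few lines of mod_rewrite. This is a hedged sketch - example.com is a placeholder, the exact file (.htaccess vs. httpd.conf) depends on their setup, and you'd use https if the site runs on it:
# Redirect every non-www request to the www version with a 301 (Apache mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]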
If they are lazy, you'll need to find a way to show the decision maker(s) that failing to implement them is costing the company revenue.
-
My advice is to make a deck/business case for the programmer and show him the problems with having two versions of the website indexed. I have encountered a few developers who are not really in tune with the issues around duplicate content and SEO. I think the best idea is to get on the same page: show the developer some respect, acknowledge that the design is good, but then also educate him about SEO.
If you do use a canonical tag, sorry to say, it is not going to be a long-term fix - it will just be a short-term fix.
Try and push for the 301's.
-
I have never encountered a developer who resisted redirecting the non-www URLs to the www form. If you have access to the server, the change should be easy to make.
If you decide not to use a 301, use a canonical tag to identify the correct version of the page. I would also use both Google and Bing Webmaster Tools to indicate which URL format you wish to use.
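For reference, the canonical tag itself is a single line in each page's head section - example.com stands in for the real domain here:
<link rel="canonical" href="http://www.example.com/current-page/" />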
Related Questions
-
How to speed up transition towards new 301 redirected landing pages?
Hi SEOs, I have a question about moving local landing pages from many separate pages towards integrating them into a search results page. Currently we have many separate local pages (e.g. www.3dhubs.com/new-york). For both scalability and conversion reasons, we'll integrate our local pages into our search page (e.g. www.3dhubs.com/3d-print/Bangalore--India).
Implementation details: To mitigate the risk of a sudden organic traffic drop, we're currently running a test on just 18 local pages (Bangalore = 1/18). We applied a 301 redirect from the old URLs to the new URLs 3 weeks ago. Note: we didn't yet update the sitemap for this test (technical reasons) and will only do this once we 301 redirect all local pages. For the 18 test pages I manually told the crawlers to index them in Webmaster Tools. That should do, I suppose.
Results so far: The old URLs of the 18 test cities are still generating >99% of the traffic while the new pages are already indexed (see: https://www.google.nl/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=site:www.3dhubs.com/3d-print/&start=0). Overall organic traffic on the test cities hasn't changed.
Questions:
1. Will updating the sitemap for this test have a big impact? Google has already picked up the new URLs, so that's not the issue. Furthermore, the 301 redirect on the old pages should tell Google to show the new page instead, right?
2. Is it normal that search impressions will slowly shift from the old page towards the new page? How long should I expect it to take before the new pages are consistently shown over the old pages in the SERPs?
Intermediate & Advanced SEO | robdraaijer
-
Robots.txt - Googlebot - Allow... what's it for?
Hello - I just came across this in robots.txt for the first time, and was wondering why it is used? Why would you have to proactively tell Googlebot to crawl JS/CSS, and why would you want it to? Any help would be much appreciated - thanks, Luke
User-agent: Googlebot
Allow: /*.js
Allow: /*.css
Intermediate & Advanced SEO | McTaggart
-
Best practices for robots.txt -- allow one page but not the others?
So, we have a page, like domain.com/searchhere, but its results are being crawled (and shouldn't be); results look like domain.com/searchhere?query1. If I block /searchhere?, will it also block crawlers from the single page /searchhere (because I still want that page to be indexed)? What is the recommended best practice for this?
Intermediate & Advanced SEO | nicole.healthline
-
Effect of 301 redirect to a relative url to homepage?
One of our new clients recently encountered a site-wide ranking drop for many keywords, and I'm pretty confident their link profile is about 98% legit.
Background:
1. The client's full site is https, and all http pages are 301 redirected to their https counterparts.
2. The client has ~50 link partners (all legitimate sites, schools, etc.) linking to the client with URLs such as www.example.com/portal/123.aspx that redirect to www.example.com.
3. The client homepage 301 redirects from www.example.com to www.example.com/default.aspx and then 301 redirects to the relative URL "/Home.aspx".
4. The client launched some testing with the Google Website Optimizer tool ~1-2 months ago.
Symptoms:
1. Rankings dropped for basically many/all 30-40+ keywords by ~15 positions.
2. SEOmoz reports close to double the existing pages + (600+) duplicate content in the same date range. Webmaster Tools only reports 80 duplicate titles, though.
3. Domain authority per SEOmoz dropped a bit, and backlinks recorded by SEOmoz to the website nearly halved in the past 2 months.
I'm not sure if I've narrowed this down in the right direction, and it isn't clear when the relative-URL 301 redirect was implemented:
1. The 301 redirect to the relative page (www.example.com/default.aspx to "/Home.aspx") is accounting for the loss of links recorded by SEOmoz.
2. The ~50 links the client currently uses as a tracking tool (www.example.com/portal/123.aspx 301 redirecting to www.example.com, also relative) are being considered 301 redirect abuse.
3. Maybe something went wrong with using the Google optimizer tool for SEO purposes? Visitor traffic to each of the tested pages looked fine.
I would greatly appreciate any advice/insights on what I might be missing in terms of direction/factors. Thanks! Alex
Intermediate & Advanced SEO | sixspokemedia
-
301 redirect help
Hey guys, I normally work in WordPress and just use a 301 redirect plugin. I bought a site and, rather than maintain two similar ones, have decided to redirect one to the other. I am having trouble with the .htaccess file. Here is an example. These are two redirects:
redirect 301 /category/models/next/2
redirect 301 /category/models
I want both of these URLs to redirect to the same URL on the new site. However, the /category/models one is the only one working. It redirects to the new page just fine. The /category/models/next/2 one is redirecting to nearly the same URL on the new site, only it is adding /next/2 to the end, and that is bringing up a 404. Why is it adding /next/2 to the new URL? How can I fix this? There are several doing this. Help appreciated!
Intermediate & Advanced SEO | DanDeceuster
-
New web site - 404 and 301
Hello, I have spent a lot of time on the forum trying to make sure how to deal with my client's situation. I will tell you my understanding of the strategy to apply, and I would appreciate it if you could tell me whether the strategy will be okay.
CONTEXT: I am working on a project where our client wants to replace its current web site with a new one. The current web site has at least 100,000 pages. The new web site will replace all the existing pages of the current site. What I have heard of the strategy the client wants to adopt is to 404 each page and to 301 redirect each page. Every page would be redirected to a page that makes sense on the new web site. But after reading other answers and reading the following comment, I am starting to be concerned:
'(4) Be careful with a massive number of 301s. I would not 301 100s of pages at once. There's some evidence Google may view this as aggressive PR sculpting and devalue those 301s. In that case, I'd 301 selectively (based on page authority and back-links) and 404 the rest.'
I have also read about performance issues...
QUESTION: So, if we suppose that we can manage to map each of the old site's pages to a page on the new web site, is it a problem to do it? Do you see a performance issue or a potential devaluation issue? If it is a problem, please comment on the strategy I might consider suggesting:
1. Identify the pages for which I gain links.
2. From that group, identify the pages that give me most of my juice.
3. 301 redirect them and, for the others, create a really great 404.
Thanks! Nancy
Intermediate & Advanced SEO | EnigmaSolution
-
Title tag solution for a mid-sized site
It's the same old story - we all know it well. I have a client with a site of 20k+ pages (not too big) and traffic levels around 450k/month. Now, we have identified 15 pages with various conversion points/great backlink metrics etc. that we are going to explicitly target in the first round of recs. However, we are looking at about 18,000 duplicate title tags that I'd like to clean up. The site is not on a CMS, and in the past I've had the dev team write a script to adopt the h1 tag or the name of the page, etc., as the title tag. This can cause a problem when some of the pages being found in long-tail search lose their positions. I'm more hesitant than ever to make this move with this current client because they get a ton of long-tail traffic spread over a ton of original content they wrote. How does everyone else usually handle this? Thoughts? Thanks in advance, Mozzers!
Intermediate & Advanced SEO | MikeCoughlin
-
Multiple 301 redirects considered a redirection chain?
I need to redirect a ton of duplicate content, so I want to try:
redirect 301 /store/index.php /store
redirect 301 /store/product-old /store/product-new
redirect 301 /store/product-old1 /store/product-new1
redirect 301 /store/product-old2 /store/product-new2
redirect 301 /store/product-old3 /store/product-new3
redirect 301 /store/product-old4/file.html /store/product-old4/new4/file.html
and then a whole bunch of old dead links to the homepage. We've had /index.php redirected to / on other parts of the site for a while, and for the most part /store is a friendly URL, but we have tons of dup content and workarounds that preceded my job here. I'm wondering if the redirects above would be considered a redirect chain, since all the redirects below the /index.php -> /store one count on that one redirect. Thanks for any insight you may be able to give!
Intermediate & Advanced SEO | Hondaspeder