Question on 301s
-
Hi Everyone,
I have a question about 301 redirects; I hope someone can give me some help on this.
Some 301 redirects were made on a number of URLs at the beginning of the year. However, we are now restructuring the whole website, which means the URLs that were given a 301 redirect are now getting another 301.
The question is, should I delete the first 301 redirect from the .htaccess file?
Kind Regards
-
Ryan, your analogy is fantastic. I totally understand this now, and it really makes sense to do it this way.
Thanks for being patient with me.
Again, thanks to all for your feedback on this.
Kind Regards
-
Every URL which is no longer active would require a 301 redirect to the proper page. In the situation you describe:
/a should redirect to /abc
/ab should redirect to /abc
I recognize this seems confusing, so forget it's a website for a moment. Think of it as mail after you move.
You lived at 100 Main Street. That is where you received your mail. Now you move to 200 Elm Street. You put in a forwarding order with the post office (a real-world equivalent of a 301 redirect). Now any mail addressed to 100 Main Street will be received at 200 Elm Street.
Now you move again, to 300 Wall Street. You would put in another forwarding order so your mail from 200 Elm Street gets delivered to your new address. This solution works, BUT your mail from 100 Main Street would be delayed. First it would get forwarded to the 200 Elm Street post office, which would then have to forward it to 300 Wall Street. This process is inefficient (in SEO terms, you lose link juice).
Instead, you want to change your 100 Main Street forwarding order to direct your mail to the 300 Wall Street address. Now all of your mail is taken to the proper location in a single hop.
I hope this analogy helps!
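To translate the analogy back into .htaccess terms, here is a minimal sketch using Apache's mod_alias Redirect directive (the paths stand in for your real URLs):

# 100 Main Street (/a) forwards straight to 300 Wall Street (/abc)
Redirect 301 /a http://www.example.com/abc
# 200 Elm Street (/ab) also forwards straight to 300 Wall Street (/abc)
Redirect 301 /ab http://www.example.com/abc

Each old address points directly at the final destination, so visitors and crawlers arrive in a single hop.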
-
What happens to the URL?
If there are external backlinks going to the URL, are these not going to get lost?
Because, as we have mentioned, there have been three URLs in question across these 301s.
Hope that makes sense.
-
In the simplest terms, the old page should always redirect to the new page. Think of it as a non-stop flight.
-
Hi Ryan,
Thanks for your feedback; however, I am getting a little lost.
So what you are saying, if I understand correctly, is that the 301 should be this:
example.com/a is redirected to example.com/abc
Kind Regards
-
The only thing that concerns me is what CafePress said: "Google stops crawling a link after the 5th redirect or so."
You can offer 100 links on a page. All of the links can point to "seomoz.org" and they will all be crawled, even though the real URL is "www.seomoz.org" and all 100 links get redirected.
What CafePress referred to is a chain of redirects on a single URL.
www.example.com/a redirects to /ab, which redirects to /abc, and so forth. A crawler will only follow a single URL so far through a chain of redirects before the PageRank is completely gone and it stops.
Therefore, the preferred solution is to redirect any old or broken URL to its new URL in a single redirect. I'll share an example based on your site:
Very old URL: example.com/a. It is redirected to example.com/ab
Old URL: example.com/ab. It is redirected to example.com/abc
You could leave these two redirects in place, as-is, and they would work, but it is not recommended. The reason is that any traffic to /a will go through a double redirect: first to /ab, then to the final destination of /abc. This double redirect adds an unnecessary delay, creates extra points of vulnerability, and wastes SEO link juice. The preferred solution is to modify the /a redirect to point to the /abc page directly.
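In .htaccess terms, the change might look something like this (a sketch, assuming you are using the stock mod_alias Redirect directive):

# Before: a chain (/a -> /ab -> /abc)
# Redirect 301 /a http://www.example.com/ab
# Redirect 301 /ab http://www.example.com/abc

# After: both old URLs point directly at the final page
Redirect 301 /a http://www.example.com/abc
Redirect 301 /ab http://www.example.com/abc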
I hope that makes sense.
-
Also, if a page is indexed, which is highly likely (due to XML sitemaps, Google Analytics, Google Toolbar, etc.), then simply removing the 301 redirect (links or no links) means that when the page disappears due to the site changes, you will have an indexed page resulting in a 404 error.
I maintain that you should have single-hop 301 redirects on all of the pages that will be removed or moved as part of the site update.
I also agree with what Ryan Kent says about links - you may have some links that have been discovered but not yet recognized or picked up. If there is a chance that the content has been indexed, then it should have an appropriate redirect.
-
Hi Ryan,
The only thing that concerns me is what CafePress said: "Google stops crawling a link after the 5th redirect or so."
I have another issue regarding the 301 redirects:
We have:
/abcd http://www.example.com/abcde - this is actually a 301 on a product page. However, we have the same product on a shop page, /shop/abcd, and we have decided to do away with the shop directory. Is it best practice to also do a 301 from /shop/abcd to /abcde?
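In the .htaccess, I assume that would look something like this (just my sketch of it, not what is live):

Redirect 301 /abcd http://www.example.com/abcde
Redirect 301 /shop/abcd http://www.example.com/abcde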
Hope that makes sense.
Kind Regards
-
I don't agree with the recommendation to simply delete the 301 due to no visible links. There are two reasons why:
1. It is more work for you to go and research the links to each page.
2. There can always be links you are not aware of, such as bookmarks, e-mail links, links that don't show up for various reasons, etc.
Simply modify the 301 to point to the correct URL and you are all set.
-
Thanks for the fantastic feedback.
An example of what has happened in the .htaccess file:
/abc http://www.example.com/abcd - this is the 301 that was made in March this year.
/abcd http://www.example.com/abcde - this is the new 301.
If I see in Open Site Explorer that there are no links going to /abc, should I just delete this 301?
Kind Regards
-
I would change the original 301 redirect to the new location.
I would then add an additional 301 redirect from the secondary page (the old redirect target) to the new location.
So you will have your original URL and the older redirect target both 301 redirected to where the content now resides. This way each 301 redirect is a single hop, and both old URLs point to the new one.
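Using the URLs from your example, the .htaccess would end up looking something like this (a sketch, assuming the mod_alias Redirect directives you already have in place):

# The original URL, updated to skip the intermediate hop
Redirect 301 /abc http://www.example.com/abcde
# The old redirect target, now also pointing at the final page
Redirect 301 /abcd http://www.example.com/abcde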
-
Should I delete the first 301 redirect from the .htaccess file?
The best results would be achieved if each URL had a single 301 redirect to the target page. To that end, yes, you should delete the old 301 redirect and create a new one.
-
+1
Totally forgot about mentioning the inbound links part. Thanks for picking it up, Rick!
-
Hey Gary,
I partially agree with Cafe. However, I wouldn't remove any redirects for URLs that may have backlinks. It would be a good idea to figure out whether any of the redirects you are removing are from URLs that have earned links. An Open Site Explorer link export would help you figure out if any of those URLs still have value.
-
Hi Gary,
Yes, it is always a good idea to cut down on the number of chained 301 redirects (or any redirects in general) because, if I remember correctly, Google stops crawling a link after the 5th redirect or so. You also lose roughly another 10% of link juice with each additional redirect.
Lastly, don't forget to 301 redirect the URLs from the beginning of the year to the new restructured website.
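If parts of the restructure move whole directories in a predictable way, a single pattern-based rule can cover all of those URLs in one line. A hypothetical sketch using Apache's mod_alias RedirectMatch directive (the /old-section/ and /new-section/ paths are made up here, and this only works if the old paths map one-to-one onto the new ones):

# Send every URL under /old-section/ to its twin under /new-section/ in one hop
RedirectMatch 301 ^/old-section/(.*)$ http://www.example.com/new-section/$1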
Hope that helps!