Question on 301s
-
Hi Everyone,
I have a question about 301 redirects; I hope someone can give me some help on this.
Some 301 redirects were put in place on some of the URLs at the beginning of the year. However, we are now restructuring the whole website, which means the URLs that were given a 301 redirect are now getting another 301.
The question is, should I delete the first 301 redirect from the .htaccess file?
Kind Regards
-
Ryan, your analogy is fantastic. I totally understand this now, and it really makes sense to do it this way.
Thanks for being patient with me.
Again, thanks all for your feedback on this.
Kind Regards
-
Every URL which is no longer active would require a 301 redirect to the proper page. In the situation you describe:
/a should redirect to /abc
/ab should redirect to /abc
I recognize this seems confusing, so forget it's a website for a moment. Think of it as mail after you move.
You lived at 100 Main Street. That is where you received your mail. Now you move to 200 Elm Street. You put in a forward order with the post office (a real world equivalent to a 301 redirect). Now any mail addressed to 100 Main Street will be received at 200 Elm Street.
Now you move again, to 300 Wall Street. You would put in another forwarding order so your mail from 200 Elm Street gets delivered to your new address. This solution works, BUT your mail addressed to 100 Main Street would be delayed. First it would be forwarded to the 200 Elm Street post office, which would then have to forward it on to 300 Wall Street. This process is inefficient (in SEO terms, you lose link juice).
You want to change your 100 Main Street forward order to direct your mail to the 300 Wall Street address. Now all of your mail is taken to the proper location in a single hop.
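In .htaccess terms (a sketch assuming Apache's mod_alias and the example paths above), the updated forwarding orders would look like this:

```apache
# Both old addresses forward directly to the current one (single hop)
Redirect 301 /a  /abc
Redirect 301 /ab /abc
```

Each old URL reaches the final destination in one redirect, so no mail gets delayed along the way.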
I hope this analogy helps!
-
What happens to the URL?
If there are external backlinks going to the URL, are these not going to get lost?
Because, as mentioned, there have been three URLs involved in these 301s.
Hope that makes sense.
-
In the simplest terms, the old page should always be directed to the new page. Think of it as a non-stop flight.
-
Hi Ryan,
Thanks for your feedback, however I am getting a little lost.
So if I understand correctly, what you are saying is the 301 should be this:
example.com/a is redirected to example.com/abc
Kind Regards
-
The only thing that concerns me is what CafePress had said "Google stops crawling a link after the 5th redirect or so."
You can offer 100 links on a page. All the links can point to "seomoz.org" and they will all be crawled, even though the real URL is "www.seomoz.org"; all 100 links will simply get redirected once.
What CafePress referred to is redirects for a single URL.
www.example.com/a redirects to /ab which redirects to /abc and so forth. A crawler will only follow a single URL so far through a chain of redirects before the PR is completely gone and it stops.
Therefore the preferred solution is to redirect any old or broken URLs to their new URL in a single redirect. I'll share an example based on your site:
Very old URL: example.com/a. It is redirected to example.com/ab
Old URL: example.com/ab. It is redirected to example.com/abc
You could leave these two redirects in place as-is and they would work, but it is not recommended. The reason is that any traffic to /a goes through a double redirect: first the traffic goes to /ab, then on to the final destination of /abc. This double redirect adds an unnecessary delay, introduces extra points of failure, and wastes SEO link juice. The preferred solution is to modify the /a redirect to point directly to the /abc page.
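As a sketch (assuming Apache's mod_alias in a typical .htaccess, with the example paths above), the fix is a one-line change to the oldest redirect:

```apache
# Before (a chain): /a hops through /ab before reaching /abc
#   Redirect 301 /a  /ab
#   Redirect 301 /ab /abc

# After: both old URLs point straight at the final page
Redirect 301 /a  /abc
Redirect 301 /ab /abc
```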
I hope that makes sense.
-
Also, if a page is indexed, which is highly likely (due to XML sitemaps, Google Analytics, the Google Toolbar, etc.), then simply removing the 301 redirect (links or no links) means that when the page disappears in the site changes, you will have an indexed page returning a 404 error.
I maintain that you should have single-hop 301 redirects on all of the pages that will be removed or moved in the site update.
I also agree with what Ryan Kent says about links - you may have some links that have been discovered but not yet recognized or picked up. If there is a chance that the content has been indexed, then it should have an appropriate redirect.
-
Hi Ryan,
The only thing that concerns me is what CafePress had said "Google stops crawling a link after the 5th redirect or so."
I have another issue regarding the 301 re-directs:
We have:
/abcd http://www.example.com/abcde - this is actually a 301 on a product page. However, we have the same product on a shop page, /shop/abcd, and we have decided to do away with the shop directory. Is it best practice to also do a 301 from /shop/abcd to /abcde?
Hope that makes sense.
Kind Regards
-
I don't agree with the recommendation to simply delete the 301 due to no visible links. There are two reasons why:
1. It is more work for you to go and research the links to each page
2. There can always be links you are not aware of, such as bookmarks, e-mail links, and links which don't show up for various reasons.
Simply modify the 301 to point to the correct URL and you are all set.
-
Thanks for the fantastic feedback.
An example of what has happened on the .htaccess:
/abc http://www.example.com/abcd - This is the 301 that was made in March this year.
/abcd http://www.example.com/abcde - This is the new 301
If I notice, using Open Site Explorer, that there are no links going to /abc, should I just delete this 301?
Kind Regards
-
I would change the original 301 redirect to the new location.
I would then add a second 301 redirect from the intermediate page (the old redirect target) to the new location.
So you will have your original URL and the older redirected URL both 301 redirected to where the content now resides. This way you only have one hop on the 301 redirects and you have both old URLs pointing to the new one.
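Using the example from this thread (assuming Apache's mod_alias in your .htaccess), that would leave you with:

```apache
# Original URL and the previously-redirected URL both go
# straight to where the content now lives - one hop each
Redirect 301 /abc  http://www.example.com/abcde
Redirect 301 /abcd http://www.example.com/abcde
```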
-
should i delete the first 301 redirect from the htaccess file?
The best results would be achieved if each URL had a single 301 redirect to the target page. To that end, yes, you should delete the old 301 redirect and create a new one.
-
+1
Totally forgot about mentioning the inbound links part. Thanks for picking it up, Rick!
-
Hey Gary,
I partially agree with Cafe. However, I wouldn't remove any redirects for URLs which may have backlinks. Maybe it would be a good idea to figure out if any of the redirects which you are removing are from URLs that have earned links? An Open Site Explorer link export would help you figure out if any of those URLs still have value.
-
Hi Gary,
Yes, it is always a good idea to cut down the number of 301 redirects (or any redirects in general) because if I remember correctly, Google stops crawling a link after the 5th redirect or so. You also lose another 10% link juice for each additional redirect.
Lastly, don't forget to 301 redirect the URLs from the beginning of the year to the new re-structured website.
Hope that helps!