New "Static" Site with 302s
-
Hey all,
Came across a bit of an interesting challenge recently, one that I was hoping some of you might have had experience with!
We're currently in the process of a website rebuild, for which I'm really excited. The new site is using Markdown to create an entirely static site. Load-times are fantastic, and the code is clean. Life is good, apart from the 302s.
One of the weird quirks I've realized with old-school, non-server-generated page content is that every page of the site is an index.html file in a directory. As a result, www.website.com/page-title will 302 to www.website.com/page-title/.
My solution off the bat has been to just be super diligent and try to stay on top of the link profile and send lots of helpful emails to the staff reminding them about how to build links, but I know that even the best laid plans often fail.
Has anyone had a similar challenge with a static site and found a way to overcome it?
-
Wow. I wasn't expecting such a detailed and awesome answer Danny. Thanks so much, I'm in the process of migrating away from S3 anyways (for other reasons) though you're right in that I'm going to miss the cost & load times.
I'm using Middleman for now, though the technical part of my brain is indeed interested in how you're going to accomplish the Jekyll solution. I'll look out for your post!
And thanks for the tip on my site. Another thing to add to the list
Arun
-
Hey Arun,
Thanks for posting! I was beginning to think that I was the only inbound guy anywhere who had to deal with this kind of issue!
Yup, I created the same bug with redirect loops trying to get around the slash issue. The problem is that S3 doesn't consider the slash as part of the rewrite data unless something comes after it.
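That loop is easy to reproduce on paper. Here's a minimal Python sketch simulating S3's prefix matching as I understand it (plain string-prefix comparison; the rule values are illustrative, not anyone's actual config): because "blog/" also starts with the prefix "blog", the rule fires a second time and appends another slash, and so on forever.

```python
def match_rule(key, rule):
    """Simulate an S3 KeyPrefixEquals routing rule (assumption: a plain
    string-prefix match). Returns the rewritten key, or None if no match."""
    prefix = rule["Condition"]["KeyPrefixEquals"]
    if key.startswith(prefix):
        return key.replace(prefix, rule["Redirect"]["ReplaceKeyPrefixWith"], 1)
    return None

rule = {
    "Condition": {"KeyPrefixEquals": "blog"},
    "Redirect": {"ReplaceKeyPrefixWith": "blog/", "HttpRedirectCode": "301"},
}

# "blog"  -> "blog/"   (the redirect you wanted)
# "blog/" -> "blog//"  (the rule matches again: redirect loop)
```

The fix is to make the condition unambiguous, or to stop needing the redirect at all by serving content at the slash-less key.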
Ultimately, my number one suggestion would be to go with a different service that allows you to install a Server App like Nginx or Apache. Others have agreed that redirections set up through a server app are the way that they feel the most comfortable that link equity is being passed.
If you're dead-set on S3, which I would understand as the load times are crazy-awesome-insane, I may have a solution for you soon. Our dev team is working on a script for Jekyll + S3 sites that will essentially create extension-less files (i.e. example.com/contact) that contain a meta refresh + rel=canonical tag.
The script will use a list of desired redirections + rules that is structured the same way an .htaccess file would be. I can't speak to how it will get past S3's default 302ing yet, but I know that it will use cURL. Look for a YouMoz post from me soon!
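The stub-file idea is simple enough to sketch. This is only an illustration of the approach described above, not the actual script; the function name and markup are my own:

```python
def redirect_stub(target_url):
    """Build an HTML stub carrying a meta refresh plus a rel=canonical tag,
    both pointing at the trailing-slash URL. Uploaded to the extension-less
    key (e.g. "contact"), it serves directly instead of 302ing."""
    return (
        "<!DOCTYPE html>\n"
        "<html><head>\n"
        f'<link rel="canonical" href="{target_url}">\n'
        f'<meta http-equiv="refresh" content="0; url={target_url}">\n'
        "</head><body></body></html>\n"
    )

# Example: the stub that would live at the key "contact"
html = redirect_stub("https://example.com/contact/")
```

The canonical tag tells search engines which URL to credit, while the meta refresh moves human visitors along; it's a softer signal than a true 301, which is why a server-level redirect is still preferable where you can get one.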
Anyways, I hope my notes here help! I'm gonna try and make that post soon after the script is created. Just as a last note, in taking a look at your site I noticed that a lot of the internal links on your homepage don't have the trailing slash in them. I would definitely start there and add those slashes, and perform a "submit page + linked page" to Webmaster Tools after!
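Finding slash-less internal links doesn't have to be manual, either. Here's a rough sketch using Python's standard-library HTML parser; the "looks like a directory" heuristic (leading slash, no trailing slash, no file extension in the last segment) is my own assumption and will need tuning for a real site:

```python
from html.parser import HTMLParser

class LinkChecker(HTMLParser):
    """Collect internal hrefs that look like directory links but are
    missing the trailing slash (illustrative heuristic only)."""
    def __init__(self):
        super().__init__()
        self.missing_slash = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        path = href.split("?")[0].split("#")[0]
        if (path.startswith("/") and not path.endswith("/")
                and "." not in path.rsplit("/", 1)[-1]):
            self.missing_slash.append(href)

parser = LinkChecker()
parser.feed('<a href="/about">About</a> <a href="/blog/">Blog</a> '
            '<a href="/logo.png">Logo</a>')
# parser.missing_slash now holds only "/about"
```

Run it over your homepage HTML and you have a fix list instead of an eyeball hunt.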
-
Hi Danny-
I've got the exact same issue (static site on S3 redirecting with 302s), and surprisingly can't find a lot of information out there. If I do a S3 metadata based redirect from (for example) /blog to /blog/ I just end up in a redirect loop.
I checked out your site and it still looks like you're working on it. Did you end up figuring anything out? If there's any way that I can help get to a solution I'd be happy to spend some time on it.
Thanks!
Arun
-
Thanks for the reply David!
Yup, I think that this has just been a case of wrapping my head around a new way of doing things (i.e. redirections in the AWS bucket config rather than using .htaccess). Static sites are a crazy combination of complicated and simple!
Thanks! We're using Jekyll somewhat, although we've had issues with the image hosting. I've actually had better results using the local github client + "Mou", a local Markdown editor.
-
Nice! (for speed at least)
I would show your team some examples of external URLs pointing at the non-trailing-slash versions of your pages and explain the downside of the 302 redirect. Also consider that people and bots visiting those URLs will be adding overhead to your server, and on Amazon that will mean increased cost (small as it may be, the pennies add up!)
Reading the link you provided it looks like the default behaviour of the page metadata redirect under the s3 console is to create a 301 redirect. That makes me think the 302 is coming from somewhere else. Look at the following URL:
http://docs.aws.amazon.com/AmazonS3/latest/dev/HowDoIWebsiteConfiguration.html
It looks like you can add advanced redirects under "Enable website hosting -> edit redirection rules". I'd explore if there are redirects listed there and maybe chat to your developers further.
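For reference, the rules in that panel take an XML form roughly like this (a sketch based on the S3 docs linked above; the prefix values are illustrative). Note that a bare KeyPrefixEquals also matches the slash version of the same path, which is exactly how the redirect loop described elsewhere in this thread arises:

```xml
<RoutingRules>
  <RoutingRule>
    <Condition>
      <KeyPrefixEquals>blog</KeyPrefixEquals>
    </Condition>
    <Redirect>
      <ReplaceKeyPrefixWith>blog/</ReplaceKeyPrefixWith>
      <HttpRedirectCode>301</HttpRedirectCode>
    </Redirect>
  </RoutingRule>
</RoutingRules>
```

If rules like this already exist in the panel, that's the first place to look for the source of the 302.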
While you're at it, I spotted two other issues for you to consider. Currently the index.html files in your directories resolve to the same page as your main directory. I would 301 those pages back to the parent directory (slash version). Or you could add canonical URLs pointing back to the parent directory (with trailing slash). I'd make a case for adding canonical URLs to all pages.
Also, you currently have a number of redirect chains e.g.
http://www.strutta.com/resources/posts/share-your-contests-and-sweepstakes-all-over-social-media 301 redirects to http://www.strutta.com/resources which 302 redirects to http://www.strutta.com/resources/.
You need to find the original redirect and change it to 301 redirect to the trailing slash version of the directory. Screaming Frog can help you find these redirect chains.
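If you'd rather script the chain-hunting than run a crawler, the logic is just "follow the map until it stops." A toy sketch, with a hypothetical redirect map mirroring the example above:

```python
def resolve_chain(url, redirects, max_hops=10):
    """Follow a URL through a redirect map and return every hop taken.
    Any chain with more than two entries means an intermediate redirect
    should be re-pointed straight at the final URL."""
    chain = [url]
    while chain[-1] in redirects and len(chain) <= max_hops:
        chain.append(redirects[chain[-1]])
    return chain

# Illustrative map only; a real one would come from crawl data
redirects = {
    "/resources/posts/old-post": "/resources",   # the original 301
    "/resources": "/resources/",                 # the 302 S3 tacks on
}
chain = resolve_chain("/resources/posts/old-post", redirects)
# chain traces old-post -> /resources -> /resources/
```

Every chain it finds is one redirect you can collapse into a single 301 to the trailing-slash URL.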
-
Hi Danny!
I don't have much to add here, I think the guys have it right in that you'll need to figure out how to make the 301 work. I quickly read that documentation, then realized I wasn't a robot, so I found this: http://aws.typepad.com/aws/2012/10/amazon-s3-support-for-website-redirects.html which was a bit more friendly.
I wish I could help you out more, but I'm not using AWS. I'm assuming you'll be able to use wildcard or regex matching somewhere, and that should solve your problem.
Great site by the way, anything you're using to help out with the static blog? (Jekyll, Octopress?)
-
Follow-up answer:
Our new website (Strutta.com) is entirely static, hosted on S3. No Apache, just straight HTML files. No Apache means no .htaccess.
Instead of using htaccess, we have to use the S3 Console: http://docs.aws.amazon.com/AmazonS3/latest/dev/how-to-page-redirect.html
As far as I can tell, this sets up redirects the same way. Although this doesn't answer my initial question, I'm going to try using the control panel later today to see whether 301ing the directories there to include the / gets recognized before whatever is causing the 302 currently.
-
Thanks all,
I think the problem is coming from the fact that we're hosted on Amazon Web Services, and the devs are using the "aws bucket config" settings to institute redirects instead of .htaccess. SEO vs Dev Battle time.
-
Hey Danny,
As Maximilian suggested above, the best solution is going to be to change those 302s to 301s. I generally like to redirect to trailing-slash URLs for directories and non-trailing-slash URLs for files/pages (that's the standard convention). In practice I find hardly anyone who links organically ever includes a trailing slash when linking to a page, but when it's the homepage I don't worry about it too much; browsers and Google can figure that out.
Basically you need to figure out where the 302 is coming from and hopefully it is in your .htaccess file. If you can edit your .htaccess file you need to change that to a 301 redirect, or you could remove the redirect and just use a canonical URL pointing at the / version of the page. I would prefer to go with the 301 though. Just be sure to look at how these redirects are being implemented and in what order, you don't want to end up with redirect chains either.
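If the server does turn out to be Apache with an editable .htaccess, a common sketch for the slash fix looks like this (illustrative only; stock Apache's mod_dir normally issues the directory-slash redirect as a 301 on its own, so an explicit rule mostly matters when something else is emitting the 302 first):

```apache
# 301 (not 302) slash-less directory URLs to their trailing-slash versions
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```

The condition restricts the rule to real directories, so file URLs like /logo.png are left alone.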
Can you get access to your .htaccess file or is the server running something funky?
-
Perhaps this is too obvious, but can you not change the 302s to 301s?