Usage of HTTP Status Code 303
-
Hello,
is there anybody who has experience with the 303 HTTP status code?
Our software development team would like to use 303 "See Other" instead of 301 to redirect old product links to the site root, rather than showing 404 errors.
What is the best practice for redirecting old product links that are gone, in an online-shop context?
Best regards
Steffen
-
I would recommend using a 301 redirect to the home page, as this will pass link juice. If the old links can be redirected to the relevant product category instead, that would be even better.
An alternative is to keep serving the old page so that it returns a 200, or to 301 to a product-suggestion page. A "products like this" suggestion page, or a product-search page, would likely convert better than a blanket 301 redirect to the home page.
Another thing you could do is create an intelligent "catch" page that takes the search parameter (if there is one), or the title of the referring page, and uses it as a query against your product database to serve up some relevant products.
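A minimal sketch of that catch-page idea (the product names, URL paths, and catalogue below are all hypothetical, purely for illustration): extract a search term from the dead URL's query string or slug, match it against the live catalogue, and suggest those products instead of blanket-redirecting to the home page.

```python
from urllib.parse import urlparse, parse_qs

PRODUCTS = {  # hypothetical live catalogue: product name -> live URL
    "lithium battery 9v": "/products/lithium-battery-9v",
    "alkaline battery 9v": "/products/alkaline-battery-9v",
    "battery charger": "/products/battery-charger",
}

def suggest(old_url):
    """Return live product URLs whose names share a word with the dead URL."""
    parsed = urlparse(old_url)
    qs = parse_qs(parsed.query)
    # Prefer an explicit ?q= search parameter, else fall back to the URL slug.
    term = qs.get("q", [parsed.path.rsplit("/", 1)[-1].replace("-", " ")])[0]
    words = set(term.lower().split())
    return [url for name, url in PRODUCTS.items() if words & set(name.split())]

print(suggest("https://shop.example/old/lithium-battery-9v"))
```

The same lookup could back a suggestion page that returns 200, so the old URL keeps serving useful content instead of a redirect.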
-
A 303 probably will not pass your link juice, if you have any. This is the difference: a 301 passes on roughly 90% of the link juice that inbound links are giving to your pages.
For users it is good to redirect them to something else. The fact that a product's lifecycle is over does not mean it will no longer be searched for. Keeping old pages at least in the sitemap will not bloat your index at all, so I would do that. However, technically, if no inbound links point to the pages you want to 303-redirect, the redirect will not hurt your SEO.
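To make the mechanical difference concrete, here is a small self-contained demo (the paths `/old-301` and `/old-303` are made up for illustration): a 301 declares the move permanent, which is the signal search engines treat as a reason to transfer ranking, while a 303 only says "see this other resource" with no permanence implied. To an ordinary HTTP client both look identical, because the client simply follows the `Location` header either way:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in ("/old-301", "/old-303"):
            # 301 Moved Permanently vs. 303 See Other: same Location header,
            # different semantics (only 301 implies a permanent move).
            self.send_response(301 if self.path == "/old-301" else 303)
            self.send_header("Location", "/")
            self.end_headers()
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"home")

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), RedirectHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# urllib follows 301 and 303 alike; both requests land on "/" and read "home".
# The difference is invisible to users -- it only matters to crawlers.
bodies = {}
for path in ("/old-301", "/old-303"):
    with urllib.request.urlopen(f"http://127.0.0.1:{port}{path}") as resp:
        bodies[path] = (resp.geturl().endswith("/"), resp.read().decode())

server.shutdown()
print(bodies)
```

Since users can't tell the two apart, the choice between 301 and 303 comes down entirely to what you want crawlers to infer about the old URL.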
-
Hi,
our old content is definitely gone. We have a lot of volatile content with a lifetime of 6-12 months, sometimes shorter. I believe keeping old URLs would bloat our indexed pages.
But my general question was about the 303 code. Do you have any experience with the difference between 301 and 303?
BR
-
Hello,
Are your products gone forever, for sure? If you use a 301 or 303, visitors clicking your pages in the SERPs will see new content instead of a 404 error page, so it certainly has benefits on the user side. However, if you are ranking for these products in Google and those keywords are bringing traffic to your site, I would think twice before deleting the pages. If you delete the actual content you are ranking with, users and the engines will see totally new content, so if you lose your product-specific pages you will also lose your rankings sooner or later.
I would keep those pages but do a little reorganization on the landing page. I would push the current content down a bit and place a one- or two-line piece of convincing copy explaining why you have stopped selling those products (why users should not search for them any longer), then offer a better alternative for the product type they are searching for. For example: "We have stopped selling lithium batteries because the new XY technology has twice the lifespan and charges in half the time. You can see these astonishing products here."