Proper way to 404 a page on an Ecommerce Website
-
Hello. I am working on a website that has over 15,000 products.
When one of these is no longer available - like it's discontinued or something - its page 302s to a 404 page.
Example - www.greatdomain.com/awesome-widget
Awesome widget is no longer available
302s to
www.greatdomain.com/404 page.
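For reference, a quick sketch with Python's requests library (using the placeholder domain above) to confirm what the chain actually returns to crawlers:

```python
# Quick check of what a discontinued URL actually returns
# (greatdomain.com is the placeholder domain from the example above).
import requests

url = "http://www.greatdomain.com/awesome-widget"

# Without following redirects: shows the 302 and its Location header.
hop = requests.get(url, allow_redirects=False)
print(hop.status_code, hop.headers.get("Location"))

# Following redirects: shows what the final "404 page" really returns -
# if it answers 200, search engines never actually see a 404.
final = requests.get(url)
print(final.status_code, [r.status_code for r in final.history])
```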
For the most part, these are not worthy of 301s because they lack page rank or suitable landing pages, but is this the correct way to handle them for search engines? I've seen varying opinions.
Thanks!
-
Hi Nakul,
I appreciate your willingness to help! We actually resolved the issue with help from our developer - the discontinued URLs now return a standard 404 page, to both viewers and bots - and we've implemented a routine to regularly search for viable redirects so we can eliminate as many 404s as possible.
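In case it's useful to anyone else, a minimal sketch of that kind of setup (a Flask-style app; the route, template names, and catalogue lookup are placeholders, not our actual stack):

```python
# Sketch: answer on the product's own URL with a real 404 status
# instead of 302-redirecting to a separate /404 page.
from flask import Flask, render_template

app = Flask(__name__)

# Placeholder stand-in for the real catalogue lookup.
ACTIVE_PRODUCTS = {"great-widget"}

@app.route("/<slug>")
def product_page(slug):
    if slug in ACTIVE_PRODUCTS:
        return render_template("product.html", slug=slug)
    # Discontinued or unknown product: serve the friendly error template
    # with a 404 status on the requested URL itself, so viewers and bots
    # both get the same answer - no 302 hop to a separate /404 URL.
    return render_template("404.html"), 404
```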
On a related note - pretty good post on SEOmoz blog today about this very topic - coincidence?!
-
PM me your website URL with an example 404/302 for the discontinued products.
-
Hi Nakul,
These products would be gone forever - like a discontinued item.
The 302 to 404 is my main concern - I agree with each of you that, from a user experience perspective, redirecting to relevant category pages is ideal.
Is this a standard way of setting this up on a large website? (I didn't set it up, and it seems strange to me.) Is there a better way, strictly from the search engine perspective?
Thanks.
-
Agree with Sean. If you were a user and searched for 'Stainless Steel Cookware Set with 4 Saucepans' and that product was no longer available, would you rather land on a 404 page or on a 'Cookware Set' or 'Stainless Steel Cookware' type category page?
-
If these products go away, do you expect them to be gone forever, or might they come back when they are in stock again? Are these "out of stock" scenarios or gone-forever scenarios?
If they are gone forever, a 301 to a category page makes sense from a usability perspective. If I were a blogger who had blogged about one of your products and linked to it, I and all my readers would prefer/expect the link to keep working. If the product is temporarily unavailable, show an out-of-stock message; if it's gone forever, a 301 to a category page or a plain 404 is better. Why do you have a 302 / temporary redirect to a 404 page? Are these products really gone temporarily, or permanently?
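To make that distinction concrete, here's a rough sketch of the logic (the product fields, template names, and category URL are assumptions, not anything from your site):

```python
# Rough sketch of the out-of-stock vs. gone-forever handling described above.
# "product" is a hypothetical record with in_stock, discontinued and
# category_url attributes; the template names are placeholders too.
from flask import redirect, render_template

def respond_for(product):
    if product is None or product.discontinued:
        # Gone forever: permanent redirect to the closest relevant category
        # page if there is one, otherwise a plain 404 on the product URL.
        if product is not None and product.category_url:
            return redirect(product.category_url, code=301)
        return render_template("404.html"), 404
    if not product.in_stock:
        # Temporarily gone: keep the URL live (200) with an out-of-stock
        # message rather than redirecting anywhere.
        return render_template("product.html", product=product, out_of_stock=True)
    return render_template("product.html", product=product)
```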
-
I am not saying this strictly from a link juice perspective but also from a user experience one. I would much rather land on another page than hit a 404 with an image.
-
Thanks Sean. I'm not too concerned about 301s in this case - any products that are worth the links would show up in Webmaster Tools - and search engines expect a certain number of 404s to be returned. I'm just wondering if this is the correct way to report the 404 error to the engines - with a 302 onto a 404 page? Temporary redirects have me scared. Thanks.
-
Depending on how customizable your platform is, you could 301 discontinued products to the next-level page in the breadcrumbs, which would be a category/sub-category page.
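A rough sketch of that idea, assuming your platform can hand you the product's breadcrumb trail as an ordered list of URLs (all names here are placeholders):

```python
# Sketch: 301 a discontinued product one level up its breadcrumb trail
# (sub-category if there is one, otherwise category, otherwise home).
from flask import redirect

def redirect_up_breadcrumbs(breadcrumb_urls):
    # breadcrumb_urls is assumed to be ordered from the homepage down to the
    # product itself, e.g.:
    # ["/", "/cookware", "/cookware/stainless-steel",
    #  "/cookware/stainless-steel/awesome-widget"]
    parent = breadcrumb_urls[-2] if len(breadcrumb_urls) >= 2 else "/"
    return redirect(parent, code=301)
```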