Server returns 404 and 200 status codes
-
Dear All,
We have a website where we show some products. It often happens that we remove a product and add it back again later. Our site has a product list page and a product detail page. If a product gets deleted and a client hits the detail page URL for that deleted product, we return a 404. When that product becomes available again, our server returns a 200. I have two questions:
1. Is this the right way to deal with deleted products?
2. After deploying this, we observed that our keyword rankings are going down. Could this really be the cause?
Thanks,
Om -
If the product is genuinely gone, then a 404 is the correct way to handle it.
You should also be sure to clean up any internal links to these pages once they are gone. Search engines can understand other sites linking to pages that are no longer available, but they do frown on your own site sending users to missing pages.
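The decision described above can be sketched as a small status-code function. This is a minimal illustration, assuming a hypothetical in-memory catalog; a real site would query its product database.

```python
# Hypothetical in-memory catalog standing in for a product database.
PRODUCTS = {"widget-a": {"name": "Widget A"}}

def detail_page_status(slug: str) -> int:
    """Return the HTTP status code the product detail page should send."""
    if slug in PRODUCTS:
        return 200  # product is live (or back in stock): serve the page normally
    return 404      # product removed: tell crawlers the URL is not valid

print(detail_page_status("widget-a"))  # 200
print(detail_page_status("gone-item"))  # 404
```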
-
Hi,
Nice comments. But one thing: our site has millions of products, and most of them we probably won't get back. In that case, the URLs already indexed should be removed and not crawled again. That's why we were returning 404, so that the crawler understands the URL is no longer valid.
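For products that will never return, HTTP 410 (Gone) is an explicitly permanent signal, whereas 404 leaves open the possibility that the URL comes back. A minimal sketch of distinguishing the two, assuming the site can tag permanently retired products (the slug sets here are hypothetical):

```python
# Hypothetical slug sets; a real site would derive these from its database.
LIVE = {"widget-a"}
PERMANENTLY_RETIRED = {"old-gadget"}

def status_for(slug: str) -> int:
    """Pick a status code based on whether the product might come back."""
    if slug in LIVE:
        return 200
    if slug in PERMANENTLY_RETIRED:
        return 410  # Gone: never coming back, an explicit de-indexing hint
    return 404      # Not Found: the product might reappear later
```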
Thanks,
Om
-
Hi Sunit,
Stephanie Chang wrote a nice blog post about how to handle this situation here on Moz.
She points out that there is no single right way to do this. However, I really like the option of leaving the page and its content on the site but marking the product as out of stock or unavailable right now. I used to run my own e-commerce store, and I had many items that were purchased from overseas. At times I could not get an accurate re-stock time from my suppliers due to language and international challenges. My solution was to create an out-of-stock ("email me when available again") notice. I found that when those products did come back in stock, they started selling again quickly due to the emails that went out.
The other thing is that search engines don't particularly like content that comes and goes, so that may be why you noticed a hit on some pages.
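The "leave the page up, mark it out of stock" option can be sketched like this. The catalog and product names are illustrative only, assuming a simple in-memory lookup:

```python
# Hypothetical catalog; "in_stock" drives what the live page shows.
PRODUCTS = {
    "jacket": {"name": "Flight Jacket", "in_stock": False},
}

def render_product(slug: str) -> tuple[int, str]:
    """Return (status code, page body) for a product detail URL."""
    product = PRODUCTS.get(slug)
    if product is None:
        return 404, "Not Found"
    if not product["in_stock"]:
        # Keep the URL live with a 200 so its rankings are preserved,
        # and show a notify-me form instead of the buy button.
        return 200, f"{product['name']}: out of stock - email me when available"
    return 200, f"{product['name']}: add to cart"

print(render_product("jacket"))
```

The key point is that the URL keeps returning 200 the whole time, so nothing is dropped from the index while the product is unavailable.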
Hope that helps,
Don
Related Questions
-
60,000 404 errors
Do 404 errors on a large scale really matter? I'm aware that I now have over 60,000 and was wondering whether the community thinks I should address them by putting 301 redirects in place. Thanks
-
Exclude status codes in Screaming Frog
I have a very large e-commerce site I'm trying to spider using Screaming Frog. The problem is the crawl keeps hanging even though I have turned off the high memory safeguard under Configuration. The site has approximately 190,000 pages according to the results of a Google site: command. The site architecture is almost completely flat. Limiting the search by depth is a possibility, but it will take quite a bit of manual labor as there are literally hundreds of directories one level below the root. There are many, many duplicate pages. I've been able to exclude some of them from being crawled using the exclude configuration parameters. There are thousands of redirects. I haven't been able to exclude those from the spider because they don't have a distinguishing character string in their URLs. Does anyone know how to exclude files by status code? I know that would help. If it helps, the site is kodylighting.com. Thanks in advance for any guidance you can provide.
-
Pages appear fine in browser but 404 error when crawled?
I am working on an e-commerce website built in WordPress with the shop pages in E commerce Plus PHP v6.2.7. All the shop product pages appear to work fine in a browser, but 404 errors are returned when the pages are crawled. WMT also returns a 404 error when "fetch as Google" is used. Here is a typical page: http://www.flyingjacket.com/proddetail.php?prod=Hepburn-Jacket Why is this page returning a 404 error when crawled? Please help!
-
404 error: WordPress automatically adds the domain to the end of pages. Why?
Hello guys, I'm using WordPress and Yoast to help me improve my SEO. Everything went well until today, when Moz found 404 errors while crawling the website, showing the domain of my website appended to the end of 12 URLs. For example: www.domain.com/service-1/www.domain.com and www.domain.com/contact-page/www.domain.com. Do you have any idea where that comes from? Thanks, Alex
-
How to properly remove 404 errors
Hi, According to the SEOmoz report I have two 404 errors on my site (http://screencast.com/t/2FG8fA1dvGB). I removed them through Google Webmaster Central about 2 weeks ago (http://screencast.com/t/MQ8XBvrFm), but they're still showing as errors in the next report (weekly update). Is there anything else to do about 404s, or do you just remove the URLs through GWC? Or maybe the SEOmoz data is delayed? Thanks in advance, JJ
-
Delete 301 redirected pages from server after redirect is in place?
Should I remove the redirected old pages from my site after the redirects are in place? Google is hating the redirects and we have tanked. I did over 50 redirects this week, consolidating content and making one great page out of 3-10 pages that each had very little content. But the old pages are still visible to Google's bot. Also, I have not put a rel=canonical pointing to itself on the new pages. Is that necessary? Thanks! Jean
-
On a dedicated server with multiple IP addresses, how can one group of addresses be slow or time out while all the other IP addresses are OK?
We utilize a dedicated server to host roughly 60 sites on. The server is with a company that utilizes a lady who drives race cars.... About 4 months ago, thanks to monitoring alerts, we realized we had a group of sites down and checked it out. All were on the same IP address, and the sites on the other IP addresses were still up and functioning well. When we contacted support we were at first stonewalled, but eventually they said there was a problem, and it was resolved within about 2 hours. Up until recently we had no problems. As part of our ongoing SEO we check page load speed for our clients. A few days ago, a client whose site is hosted by the same company was running very slow (about 8 seconds to load without cache). We ran every check we could and could not find a reason on our end. The client called the host and was told they needed to be on some other type of server (with the host) at a fee increase of roughly $10 per month. Yesterday, we noticed one group of sites on our server was down and, again, it was one IP address with about 8 sites on it. On chat with support, they kept saying it was our ISP. (We speed tested on multiple computers and were getting 22 Mbps down and 9 Mbps up, +/-2 Mbps.) We ran a trace on the IP address and it went through without a problem on three occasions over about ten minutes. After about 30 minutes the sites were back up. Here's the twist: we had a couple of people in the building who were on other ISPs try, and the sites came up and loaded on their machines. Does anyone have any idea what the issue is?