404's and Ecommerce - Products no longer for sale
-
Hi
We regularly have products which are discontinued and no longer sold. As we have such a large site, Webmaster Tools regularly picks up new 404s.
These 404 pages are no longer linked to from anywhere on the site; however, WMT will still report them as errors. Does this affect site authority?
Thank you
-
Great answers, thank you for the links!
-
410s & 404s are treated almost the same way by bots (the 410 just gives a stronger indication that the page is gone forever) - check this article http://searchengineland.com/googles-matt-cutts-explains-differences-404-410-status-codes-189060 - so a 410 is also a viable option. As most sites already have a custom 404 page, I'd guess the 404 solution is the easiest one.
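To make the 404-vs-410 distinction concrete, here is a minimal sketch of the decision logic. The function name and the `discontinued` flag are hypothetical, just to illustrate the rule: unknown URL gets a 404, a known-but-retired product gets the stronger 410.

```python
def status_for_product(product):
    """Pick an HTTP status code for a product URL.

    `product` is None for URLs that never existed, or a dict with a
    'discontinued' flag for known catalogue items (illustrative schema).
    """
    if product is None:
        return 404   # unknown URL: standard Not Found
    if product.get("discontinued"):
        return 410   # permanently retired product: Gone
    return 200       # still for sale: serve the page


print(status_for_product(None))                     # 404
print(status_for_product({"discontinued": True}))   # 410
print(status_for_product({"discontinued": False}))  # 200
```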
I forgot to mention in my first reply - but there is also a good article on out-of-stock products on Moz: https://moz.com/blog/how-should-you-handle-expired-content
Dirk
-
What's the thought on using a 410 (Gone) server response for products that won't be stocked on the site anymore? I've seen that strategy work to get pages out of the index. Does anyone have objections to using it? That assumes the page isn't linked to on the site (or on other sites, for that matter). I've seen it work well for product pages buried a little deeper in the site architecture.
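For reference, a 410 can often be served without touching application code. A minimal Apache .htaccess sketch, assuming mod_alias is available - the product paths below are placeholders, not real URLs:

```apache
# Sketch: serve 410 Gone for retired product URLs (paths are hypothetical).
Redirect gone /products/old-widget-2013
Redirect gone /products/discontinued-gadget
```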
-
Hi Dirk,
Great, thank you - I don't get any on my site crawls, so that's a plus!
-
It's nothing to worry about - they will disappear after a while (as long as you don't link to them on your site). It's quite normal to have them, and they have no SEO impact (https://support.google.com/webmasters/answer/2409439?hl=en)
You could check this advice from Matt Cutts: http://searchenginewatch.com/sew/news/2334932/ecommerce-seo-tips-for-unavailable-products-from-googles-matt-cutts- for larger sites he advises using the "unavailable after" meta tag (http://googleblog.blogspot.nl/2007/07/robots-exclusion-protocol-now-with-even.html)
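The "unavailable after" directive can also be sent as an X-Robots-Tag response header rather than a meta tag. A minimal sketch, assuming the date format from Google's 2007 announcement (e.g. "25-Aug-2007 15:00:00 EST") is still accepted - the function name here is made up for illustration:

```python
from datetime import datetime


def unavailable_after_header(when: datetime, tz: str = "GMT") -> str:
    """Build an X-Robots-Tag header value that asks Google to drop the
    page from the index after the given date."""
    return "unavailable_after: " + when.strftime("%d-%b-%Y %H:%M:%S ") + tz


print(unavailable_after_header(datetime(2025, 12, 31, 23, 59, 59)))
# unavailable_after: 31-Dec-2025 23:59:59 GMT
```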
Dirk