HTML snapshot creating soft 404
-
Does anyone have any experience with HTML snapshots? We have a recruitment client that serves HTML snapshots for all job pages, as the pages are built with AJAX.
The pages naturally die after around four weeks (when the job vacancy runs out), and whilst the AJAX version of the page returns a hard 404, the HTML snapshot version returns a soft 404. How can we get the snapshot to mirror the dead page with a 404 status?
-
A side note first: something to consider for transient content like these job listings is the unavailable_after meta tag, which I have used on job sites I've worked on and which has worked pretty well.
http://searchengineland.com/googles-matt-cutts-seo-advice-unavailable-e-commerce-products-186882
"The “unavailable_after” Meta tag will allow you to tell Google that a page should expire from the search results at a specific time. "
This way your pages would be removed from the index on the date you specify, and if you have also removed the links from your sitemap etc., Google may never need to crawl them and find the 404 and/or soft 404 in the first place.
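To illustrate, here is a minimal sketch of how that could be wired up on a job board. It assumes a Flask-style endpoint and a simple in-memory job store with an expiry date; the route, the JOBS store and the field names are all hypothetical rather than your actual stack. Each live job page embeds an unavailable_after robots meta tag built from the vacancy's closing date, and expired vacancies answer with a hard 404.

```python
from datetime import datetime, timedelta
from flask import Flask, abort

app = Flask(__name__)

# Hypothetical job store; in practice this would come from the ATS or database.
JOBS = {
    101: {"title": "Senior Recruiter",
          "expires_at": datetime.utcnow() + timedelta(weeks=4)},
}

@app.route("/jobs/<int:job_id>")
def job_page(job_id):
    job = JOBS.get(job_id)
    if job is None or job["expires_at"] < datetime.utcnow():
        abort(404)  # closed vacancies answer with a real 404 status
    # Ask Google to drop the page from its index once the vacancy closes.
    # The date format follows Google's published example, e.g. 25-Aug-2015 15:00:00 GMT.
    closes = job["expires_at"].strftime("%d-%b-%Y %H:%M:%S") + " GMT"
    return (
        "<html><head>"
        f'<meta name="robots" content="unavailable_after: {closes}">'
        f"<title>{job['title']}</title></head>"
        f"<body><h1>{job['title']}</h1></body></html>"
    )
```

With the tag in place the listing can drop out of the index on schedule, even if Google happens to re-crawl it later.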
The soft 404 (according to Google) means your server is not returning a 404 status code for the HTML snapshot version. I would try Fetch as Google on those pages to see what Google is seeing, and that may help you diagnose the situation. It may be that your server is giving a response other than a 404 and Google is questioning it.
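You can also check the raw responses outside of Fetch as Google. Below is a rough sketch using Python's requests library; the URLs are placeholders, and I'm assuming the snapshots are exposed via the old _escaped_fragment_ convention, so adjust it to however yours are actually served.

```python
import requests

# Placeholder URLs: substitute a recently expired job page and its HTML snapshot.
ajax_url = "https://www.example.com/jobs/expired-vacancy"
snapshot_url = ajax_url + "?_escaped_fragment_="

for url in (ajax_url, snapshot_url):
    # Follow any redirect chain so we see the final status code,
    # which is what matters for soft 404 detection.
    response = requests.get(url, allow_redirects=True)
    print(f"{url} -> {response.status_code} (final URL: {response.url})")
```

If the expired AJAX page comes back 404 but the snapshot URL comes back 200 (or redirects to something that returns 200), that mismatch is your soft 404, and the fix is to have whatever generates the snapshots pass through the same 404 status as the underlying page once the vacancy has expired.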
Related Questions
-
Non-standard HTML tags in content
I had coded my website's article content with a non-standard <cnt> tag that surrounded the other, standard tags containing the article content. The whole text was enclosed in a div that used Schema.org markup to identify the contents of the div as the articleBody. When looking at scraped data for stories in Webmaster Tools, the content of the story was there and identified as the articleBody correctly. It has recently been suggested by someone else that the presence of the non-standard <cnt> tags was actually making the content of the article uncrawlable by Googlebot, effectively rendering the content invisible. I did not believe this to be true, since the content appeared to be correctly indexed in Webmaster Tools, but for the sake of a test I agreed to remove them. In the 6 weeks since they were removed there have been no changes in impressions or traffic from organic search, which leads me to believe that removing the <cnt> tags had no effect, since the content was already being indexed successfully and nothing else has changed. My question is whether an encapsulating non-standard tag as I've described would actually make the content invisible to Googlebot, or whether it should make no difference so long as the correct Schema.org markup was in place? Thank you.
Technical SEO | | dlindsey0 -
404 not found page appears as 200 success in Google Fetch. What to do to correct?
We have received messages in Google Webmaster Tools that there is an increase in soft 404 errors. When we check the URLs, they send to the 404 not found page.
For example, http://www.geographics.com/images/01904_S.jpg
redirects to http://www.geographics.com/404.shtml.
When we used Fetch as Google, here is what we got:
#1 Server Response: http://www.geographics.com/404.shtml
HTTP/1.1 200 OK Date: Thu, 26 Sep 2013 14:26:59 GMT
What is wrong and what shall we do? The soft 404 errors are mainly for images that no longer exist on the server. Thanks!
Technical SEO | | Madlena0 -
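The behaviour described above (missing files redirecting to a 404 page that itself returns 200 OK) is a classic cause of soft 404 reports: Google sees a redirect followed by a success status and never a real 404. Here is a rough illustration of the difference, sketched as a hypothetical Flask-style app rather than the Apache/SSI setup this site appears to use; the point is to serve the friendly error page in place with a 404 status instead of redirecting to it.

```python
from flask import Flask

app = Flask(__name__)

NOT_FOUND_HTML = "<h1>Page not found</h1><p>Sorry, that page no longer exists.</p>"

# Anti-pattern (what the Fetch as Google output above shows): the missing image
# URL redirects to /404.shtml, and /404.shtml itself answers with 200 OK.
@app.route("/404.shtml")
def error_page():
    return NOT_FOUND_HTML, 200

# Fix: return the same friendly page directly on the missing URL with a real
# 404 status, so the original URL itself reports "not found".
@app.errorhandler(404)
def not_found(error):
    return NOT_FOUND_HTML, 404
```

On Apache, which the .shtml extension suggests this site runs on, the usual equivalent is an ErrorDocument directive pointing at a local path (served in place with the 404 status) rather than a Redirect or an ErrorDocument pointing at a full URL, which produces the redirect-then-200 behaviour Google flags.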
301 or a 404
Just had a discussion with a colleague about a page on our own website. We have some cases which are outdated. These pages receive some visitors, but they arrive there when they search for the client's brand name, so for us they are irrelevant. What's the best way to handle these kinds of pages? Is a 301 redirect to the showcase overview the way to go, or do we make it a 404 and include the showcase overview on this 404 page?
Technical SEO | | nvs.nim0 -
I am using SEOmoz pro software and my blog tags are bringing up 404 errors.
After checking, they do bring back a 404 page, so I am wondering what to do. Do I remove all the blog tags? We use a Drupal CMS.
Technical SEO | | AITLtd0 -
Half Implemented HTML 5 Structure
Hi there, I have just noticed that a website has a half-implemented HTML5 structure. Well, when I say half implemented, it has the doctype and then one <header> section. After that, all of the divs are custom ones that have been added for the CSS. Could this lack of structure have a negative effect on the site? Cheers, Edward
Technical SEO | | edwardlewis0 -
Micro formats to block HTML text portions of pages
I have a client that wants to use micro formatting to keep a portion of their page (the disclaimer) from being read by the search engines. They want to do this because it will help with their keyword density on the rest of the page and block the "bad keywords" that come from their legally required disclaimer. We have suggested alternative methods to resolve this problem, but they do not want to implement those; they just want a POV from us explaining how this micro formatting process will work. And that's where the problem is: I've never heard of this use case and can't seem to find anyone who has. I'm posting the question to the Moz Community to see if anyone knows how microformats can keep copy from being crawled by the bots. Please include links to any sites that you know are using micro formatting in this way. Have you implemented it and seen results? Do you know of a website that is using it now? We're looking for use cases, please!
Technical SEO | | Merkle-Impaqt0 -
Duplicate XML sitemaps - 404 or leave alone?
We switched over from our standard XML sitemap to a sitemap index. Our old sitemap was called sitemap.xml and the new one is sitemapindex.xml. Webmaster Tools still shows the old sitemap.xml as valid. Also, when you land on our sitemap.xml it will display the sitemap index, when really the index lives at sitemapindex.xml. The reason you can see the sitemap on both URLs is that this is set from the sitemap plugin. So the question is, should we change the plugin setting to let the old sitemap.xml 404, or should we allow the new sitemap index to be accessed on both URLs?
Technical SEO | | Hakkasan0 -
How to find links to 404 pages?
I know that I used to be able to do this, but I can't seem to remember how. One of the sites I am working on has had a lot of pages moving around lately. I am sure some links got lost in the fray that I would like to recover. What is the easiest way to see links going to a domain that are pointing to 404 pages?
Technical SEO | | MarloSchneider0