Are there ways to avoid false positive "soft 404s" by Google
-
Sometimes I get alerts from Google Search Console that it has detected soft 404s on different websites, and since I take great care to never have true soft 404s, they are always false positives.
Today I got one on a website that has pages promoting some events. The language on the page for one event that has sold out says that "tickets are no longer available" which seems to have tripped up Google into thinking the page is a soft 404.
It's kind of incredible to me that in the current era, with things like ChatGPT, Google doesn't seem to understand natural language. But that has me thinking: are there strategies or best practices we can use when writing copy on a page so Google doesn't flag it as a soft 404? It seems like anything that tells a user an item isn't available could trip it up into thinking the page is a 404. In the case of my page, it's actually important to tell the public that an event has sold out, and to use their interest in that event to promote other events. So I don't want the page deindexed or ranking poorly!
-
@IrvCo_Interactive Google's algorithms are not perfect and can sometimes misinterpret the content on a page.
In terms of strategies or best practices for writing copy on a page to avoid triggering a soft 404, one approach is to ensure that the content is unique, relevant, and provides value to the user. Make sure that the page contains substantial content that gives context and information about the event, even if it is sold out. This can include details about past events, photos, videos, or testimonials from attendees.
You can also consider using structured data markup to explicitly indicate that the event is sold out, which can help Google better understand the page's content. In Schema.org Event markup, the "eventStatus" property covers states like cancelled or postponed, while a sold-out event is indicated by setting the "availability" property of the event's offer to "SoldOut".
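As a rough illustration, a minimal JSON-LD sketch for a sold-out event page might look like the following (the event name, date, venue, price, and URLs are all placeholder values; the lines that matter here are eventStatus and the offer's availability):

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Event",
  "name": "Example Event",
  "startDate": "2023-09-15T19:00",
  "location": {
    "@type": "Place",
    "name": "Example Venue",
    "address": "123 Example St, Example City"
  },
  "eventStatus": "https://schema.org/EventScheduled",
  "offers": {
    "@type": "Offer",
    "url": "https://www.example.com/events/example-event",
    "price": "25.00",
    "priceCurrency": "USD",
    "availability": "https://schema.org/SoldOut"
  }
}
</script>

This tells Google the event itself is still scheduled (not cancelled or removed), while the SoldOut availability on the offer explains why tickets can't be purchased.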
Another approach is to use clear and specific language when describing the event's availability. Instead of a bare phrase like "no longer available," consider fuller wording such as "this event is sold out" or "tickets for this event are no longer available," which gives the statement clear context. This can help make it clear to both users and search engines that the page is a live, intentional page rather than a soft 404.
Related Questions
Google Search Console Showing 404 errors for product pages not in sitemap?
We have some products with URL changes over the past several months. Google is showing these as having 404 errors even though they are not in the sitemap (the sitemap shows the correct NEW URL). Is this expected? Will these errors eventually go away/stop being monitored by Google?
Technical SEO | woshea0
Best practices for retiring 100s of blog posts?
Hi. I wanted to get best practices for retiring an enterprise blog with hundreds of old posts whose subject matter won't be repurposed. What would be the best course of action to retire them while maintaining the value of any SEO authority from those old blog pages? Is it enough to move the old posts into an archive subdirectory and let Google deprioritize them over time? Or would a mass redirect of old blog posts to the new blog's home page be acceptable (even though the old blog post content isn't being specifically replaced)? Or would Google basically say that if there aren't 1:1 replacement URLs, the redirects would be seen as soft 404s and treated like 404s?
White Hat / Black Hat SEO | David_Fisher0
Many "spin-off" sites - 301 or 401/410?
Hi there, I've just started a new job with a rental car company with locations all over New Zealand and Australia. I've discovered that we have several websites along the lines of "rentalcarsnewzealand", "bigsaverentals" etc that are all essentially clones of our primary site. I'm assuming that these were set up as some sort of "interesting" SEO attempt. I want to get rid of them, as they create customer experience issues and they're not getting a hell of a lot of traffic (or driving bookings) anyway. I was going to just 301 them all to our homepage - is this the right approach? Several of the sites are indexed by Google and they've been linked up to a number of sites - the 301 move wouldn't be to try to derive any linkjuice or anything of that nature, but simply to get people to our main site if they do find themselves clicking a link to one of those sites. Thanks very much for your advice! Nicole
Technical SEO | AceRentalCars0
Target="_blank"
Do href links that leave a site and use target="_blank" to open a new tab impact SEO?
Technical SEO | ChristopherGlaeser0
Problem with Google SERPs
I am running the Yoast SEO plugin in WP. I just noticed that when I google the client, none of their metadata is showing. I see that I had the Facebook OG option enabled, which looks like it made duplicates of all the titles, etc. Would that be the problem? I have since turned it off and am hoping that was the cause. Also, when the client searches, it says in the meta description "you've viewed this site many times". What is that?
Technical SEO | netviper0
Google WMT continues reporting fixed 404s - why?
I work with a news site that had a heavy restructuring last spring. This involved removing many pages that were duplicates, tags, etc. Since then, we have taken very careful steps to remove all links coming into these deleted pages, but for some reason, WMT continues to report them. By last August, we had cleared over 10k 404s from our site, but this lasted only about 2 months before they started coming back. The "linked from" report gives no data, and other crawlers like SEOmoz aren't detecting any of these errors. The pages aren't in the sitemap and I've confirmed that they're not really being linked from anywhere. Why do these pages keep coming back? Should I even bother removing them over and over again? Thanks -Juanita
Technical SEO | VoxxiVoxxi0
SEOmoz crawler giving false positives?
The SEOmoz crawler indicated a few times that my site has a duplicate home page error (http://mysite.com and www.mysite.com). I eliminated the couple of remaining internal links that pointed to http://mysite.com (all other internal links point to http://www.mysite.com). I ran the crawl again and it said no errors this time. I naturally thought the duplicate page error was fixed. However, this morning I got the regularly scheduled crawl report from SEOmoz that said again that I have those duplicate error pages. No changes were made to any of my site's pages between the crawls. That makes me wonder whether the crawler provides false positives at times, or whether it was wrong when the crawl a couple of days ago said I don't have any errors (no duplicate page error). Now, I don't know what to think.
Technical SEO | finalfrontier0
Google+
With Google Search, plus Your World, would I see results ONLY from Google+ followers, or from someone who is my Facebook friend as well?
Technical SEO | seoug_20050