Just validating a theory I have about why Google is Soft 404ing some of my doctor pages.
-
As of June 1, doctor pages on our website that say "No ratings are available yet" are being flagged as Soft 404s in Google Search Console.
We suspect the issue is that wording, based on this post: https://www.contentkingapp.com/academy/index-coverage/faq/submitted-soft-404/
Just wondering if anyone with more expertise than me on 404s or local SEO can validate that this is likely the issue.
Some examples:
https://www.nebraskamed.com/doctors/neil-s-kalsi
https://www.nebraskamed.com/doctors/leslie-a-eiland
https://www.nebraskamed.com/doctors/david-d-ingvoldstad
-
@terentyev Thanks!
-
@patrick_at_nebraska_medicine I confirm that your assumption about the cause of the soft 404s is correct. I am experiencing the same with several pages in online travel (e.g. destination pages that have zero search results). To fix it, you have to be creative and find a way to display some content on these zero-ratings pages. I am afraid that just removing the "no ratings are available" wording will not solve the problem.
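One way to act on that advice, sketched server-side in Python. This is not Nebraska Medicine's actual stack; the function name and data shape are hypothetical. The idea is to render substantive, page-specific fallback content when a profile has zero ratings, instead of a bare "no ratings" notice that makes the page look empty to Google's soft-404 heuristics:

```python
def render_ratings_section(ratings, provider_name):
    """Build the ratings section of a provider profile page.

    `ratings` is a list of dicts like {"stars": 5}; `provider_name`
    is the doctor's display name. Both are illustrative assumptions.
    """
    if ratings:
        avg = sum(r["stars"] for r in ratings) / len(ratings)
        return f"{provider_name}: {avg:.1f} stars ({len(ratings)} ratings)"
    # Fallback: explain the review process and point readers at the
    # rest of the profile, rather than emitting only the phrase
    # "No ratings are available yet" on an otherwise thin page.
    return (
        f"{provider_name} has not been rated yet. Ratings are collected "
        "from verified patient surveys and published once a minimum "
        "number of responses is reached. See the biography, specialties, "
        "and accepted insurance plans below."
    )
```

The point is that the zero-ratings branch still returns unique, useful copy, so the template never serves a near-empty page.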
Related Questions
-
Page speed or site speed: which one does Google consider a ranking signal?
I've read many threads online claiming that website speed is a ranking factor. A friend's website scores 44 (slow) on Google PageSpeed Insights, yet despite his website being slow, he outranks me in Google search results. It confuses me that I optimized my website for speed, but my competitor's slow site outperforms mine. On Six9ja.com I did the work to hit my target score of 100 (fast) on Google PageSpeed Insights. But Google Search Console shows that some of my pages have average scores and some have slow scores; according to Search Console, none of my pages are fast. So where did the fast metrics go? Could it be because I added three AdSense JavaScript snippets to all my blog posts? If so, that would mean the AdSense code is slowing page performance despite having an async tag. I tested a blog post and found its page score dropped by 48 due to the 3 AdSense scripts, leaving it at 62 (average). So my site speed is 100, but my page speed is 62. Does this mean Google considers page speed rather than site speed as a ranking factor? Screenshots: https://imgur.com/a/YSxSwOG Regarding: https://etcnaija.com
Technical SEO | etcna0
-
My beta site (beta.website.com) has been inadvertently indexed: should I just "NOINDEX" the entire beta site?
My beta site (beta.website.com) has been inadvertently indexed. Its cached pages are taking traffic away from our real website (website.com). Should I just "NO INDEX" the entire beta site and if so, what's the best way to do this? Are there any other precautions I should be taking? Please advise.
Technical SEO | BVREID0
-
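A common way to noindex an entire subdomain is a blanket X-Robots-Tag response header served only on the beta host, so every content type (not just HTML) carries the directive. A minimal sketch in Python, assuming a backend where you can inject response headers per request (the function name is illustrative, not a specific framework's API):

```python
def robots_headers(host):
    """Return extra response headers for a request served on `host`.

    Any request to the beta subdomain gets a blanket X-Robots-Tag so
    crawlers drop those pages from the index. Note: don't also block
    beta.* in robots.txt, because Google must be able to crawl the
    pages to see the noindex directive at all.
    """
    if host.split(":")[0].startswith("beta."):
        return {"X-Robots-Tag": "noindex, nofollow"}
    return {}
```

Merging the returned dict into every beta response (and serving nothing extra on the live host) de-indexes the whole subdomain without touching individual templates.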
Google Sitelinks
Hello, good afternoon. I am having a site issue with sitelinks. For some reason, when I search Google for the brand I represent, "California Olive Ranch", sitelinks are not being generated. When I search for "Cal Olive Ranch", our sitelinks are generated. Our domain is Californiaoliveranch.com. Is there a way to tell Google to change the sitelinks to match our domain and brand name? Is this something that can be done in Google Webmaster Tools? Thank you very much for your help. Adam P
Technical SEO | apost40
-
Duplicate pages in Google index despite canonical tag and URL Parameter in GWMT
Good morning Moz... This is a weird one. It seems to be a "bug" with Google, honest... We migrated our site www.three-clearance.co.uk to a Drupal platform over the new year. The old site used URL-based tracking for heat-map purposes, so for instance www.three-clearance.co.uk/apple-phones.html could be reached via www.three-clearance.co.uk/apple-phones.html?ref=menu or www.three-clearance.co.uk/apple-phones.html?ref=sidebar and so on. GWMT was told of the ref parameter and the canonical meta tag was used to indicate our preference. As expected we encountered no duplicate content issues and everything was good.

This is the chain of events:

1. Site migrated to the new platform following best practice, as far as I can attest to. The only known issue was that the verification for both Google Analytics (meta tag) and GWMT (HTML file) didn't transfer as expected, so between relaunch on the 22nd Dec and the fix on 2nd Jan we have no GA data, and presumably there was a period where GWMT became unverified.
2. URL structure and URIs were maintained 100% (which may be a problem, now).
3. Yesterday I discovered 200-ish 'duplicate meta titles' and 'duplicate meta descriptions' in GWMT. Uh oh, thought I. Expand the report out and the duplicates are in fact ?ref= versions of the same root URL. Double uh oh, thought I.
4. Run, not walk, to Google and do some Fu: http://is.gd/yJ3U24 (9 versions of the same page, in the index, the only variation being the ?ref= URI).
5. Checked Bing and it has indexed each root URL once, as it should.

Situation now:

- The site no longer uses the ?ref= parameter, although of course there still exist some external backlinks that use it. This was intentional and happened when we migrated.
- I 'reset' the URL parameter in GWMT yesterday, given that there's no "delete" option. The "URLs monitored" count went from 900 to 0, but today is at over 1,000 (another wtf moment).
- I also resubmitted the XML sitemap and fetched 5 'hub' pages as Google, including the homepage and the HTML site-map page.
The ?ref= URLs in the index have the disadvantage of actually working, given that we transferred the URL structure and of course the webserver just ignores the nonsense arguments and serves the page. So I assume Google assumes the pages still exist, and won't drop them from the index but will instead apply a dupe content penalty. Or maybe call us a spam farm. Who knows.

Options that occurred to me (other than maybe making our canonical tags bold or locating a Google bug submission form 😄) include: A) robots.txt-ing ?ref=, but to me this says "you can't see these pages", not "these pages don't exist", so isn't correct. B) Hand-removing the URLs from the index through a page removal request per indexed URL. C) Applying a 301 to each indexed URL (hello Bing dirty sitemap penalty). D) Posting on SEOMoz because I genuinely can't understand this.

Even if the gap in verification caused GWMT to forget that we had set ?ref= as a URL parameter, the parameter was no longer in use, because the verification only went missing when we relaunched the site without this tracking. Google is seemingly 100% ignoring our canonical tags as well as the GWMT URL setting. I have no idea why and can't think of the best way to correct the situation. Do you? 🙂

Edited to add: as of this morning the "edit/reset" buttons have disappeared from the GWMT URL Parameters page, along with the option to add a new one. There's no message explaining why, and of course the Google help page doesn't mention disappearing buttons (it doesn't even explain what 'reset' does, or why there's no 'remove' option).
Technical SEO | Tinhat0
-
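For option C, the redirect doesn't have to be per-URL: a single rule can strip the tracking parameter and 301 any request that carried it to the canonical URL. A minimal sketch of that logic in Python (this is an illustration of the technique, not a claim about the Drupal setup; the framework-side wiring that serves the 301 is assumed):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Query parameters that never change page content (assumed set).
TRACKING_PARAMS = {"ref"}

def canonical_redirect(url):
    """Return the canonical URL to 301 to, or None if `url` is already clean.

    A server-side 301 is a stronger consolidation signal than
    rel=canonical or the GWMT parameter setting alone.
    """
    scheme, netloc, path, query, fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query, keep_blank_values=True)
            if k not in TRACKING_PARAMS]
    new_query = urlencode(kept)
    if new_query == query:
        return None  # nothing stripped: serve the page normally
    return urlunsplit((scheme, netloc, path, new_query, fragment))
```

Non-tracking parameters survive the rewrite, so only the ?ref= variants collapse onto their root URLs.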
Does Google Plus now play a role in improving a website's ranking?
I was googling the phrase "best web hosting" and I noticed the 1st and 3rd results were results with Google Plus. Does Google Plus now play a role in improving ranking for a website? I see a person's name next to the website too.
Technical SEO | mainguy0
-
Pages not Indexed after a successful Google Fetch
I am trying to understand why Google isn't indexing key content on my site. www.BeyondTransition.com is indexed and new pages show up within a couple of hours. My key content is 6 pages of information for each of 3000 events (driven by MySQL on a WordPress platform). These pages are reached via a search page, with no direct navigation from the home page. When I link to an event page from an indexed page it doesn't show up in search results. When I use Fetch as Google in Webmaster Tools, the fetch is successful but the page is then not indexed; or, if it does appear in results, it's directed to the internal search page. E.g. http://www.beyondtransition.com/site/races/course/race110003/ has been fetched and submitted with links, but when I search for "BeyondTransition Ironman Cozumel" I get these results.... So what have I done wrong and how do I go about fixing it? All thoughts and advice appreciated. Thanks, Denis
Technical SEO | beyondtransition0
-
2000 pages indexed in Yahoo, 0 in Google, no PR. What is wrong?
Hello everyone. I have a friend with a blog site that has over 2000 pages indexed in Yahoo but none in Google, and no PageRank. The website is http://www.livingorganicnews.com/ I know it is not the best site, but I am guessing something is wrong and I don't see it. Can you spot it? Does he have some settings wrong? What should he do? Thank you.
Technical SEO | QuietProgress0