Are "Author Rank" or User Comments Driving Losses for YMYL Sites?
-
Hi, folks!
So, our company publishes 50+ active, disease-specific news and perspectives websites -- mostly for rare diseases. We are also tenacious content creators: between news, columns, resource pages, and other content, we produce 1K+ pieces of original content across our network. Authors are either PhD scientists or patients/caregivers. All of our sites use the same design.
We were big winners with the August 2018 Medic update and the subsequent September/October update. However, the March core update and the de-indexing bug in April were huge losers for us across our monetized sites (about 10 in total). We've seen some recovery with this early June update, but also some further losses. It's a mixed bag.
Take a look at this attached Moz chart, which shows the jumps and falls around the various updates. The pattern is very similar across many of our sites.
As per JT Williamson's stellar article on EAT, I feel like we've done a good job of meeting those criteria, which has left me wondering what isn't jibing with the new core updates. I have two theories I wanted to run past you all:
1. Are user comments on YMYL sites problematic for Google now?
I was thinking that user comments underneath health news and perspectives articles might now be a concern on YMYL sites. On one hand, a healthy commenting community indicates an engaged user base and speaks to the trust and authority of the content. On the other hand, while the AUTHOR of the article might be a PhD researcher or a patient advocate, how qualified are the people commenting? What if they are spouting off crazy ideas? Could Google's new update see user comments like these as degrading the trust, authority, or expertise of the page? The examples I linked to above have a good number of user comments. Could these now be problematic?
2. Is Google "Author Rank" finally happening, sort of?
From what I've read about EAT -- particularly for YMYL sites -- it's important that authors have "formal expertise" and be, as Williamson puts it, "an expert in the field or topic." He continues that the author's expertise and authority "is informed by relevant credentials, reviews, testimonials, etc." Well -- how is Google substantiating this? We no longer have the authorship markup, but is the algorithm doing its due diligence on authors in some more sophisticated way?
It makes me wonder if we're doing enough to present our authors' credentials on our articles, for example. Take a look -- Magdalena is a PhD researcher, but her user profile doesn't appear at the bottom of the article, and if you click on her name, it just takes you to her author category page (how WordPress-ish).
Even worse -- our resource pages don't even list the author.
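For what it's worth, one way to make author credentials visible to crawlers as well as readers is schema.org structured data. This is only a sketch under assumptions -- the name and URL below are placeholders, and nobody outside Google knows whether this markup feeds EAT -- but `author` markup with a credential and a real profile link is cheap to add:

```python
import json

def author_jsonld(name, credential, profile_url):
    """Build a schema.org Article snippet whose author carries an
    explicit credential and a link to a real author profile page."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": "Article",
        "author": {
            "@type": "Person",
            "name": name,               # placeholder author name
            "honorificSuffix": credential,
            "url": profile_url,         # placeholder profile URL
        },
    }, indent=2)

# Hypothetical profile page for a PhD columnist
print(author_jsonld("Magdalena Example", "PhD",
                    "https://example.com/authors/magdalena"))
```

Emitting this in a `<script type="application/ld+json">` tag on each article would at least give the algorithm something concrete to substantiate.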
Anyhow, I'd love to get some feedback from the community on these ideas. I know that Google has said there's nothing to do to "fix" these downturns, but it'd sure be nice to get some of this traffic back!
Thanks!
-
We have informational and retail websites where we put a LOT of effort into our content. We are trying to produce the best content on the web. All of it is created and edited by people who have both formal education and deep experience in the content area.
There is no way that we would allow user-generated content on these websites -- even though we are not in a YMYL (Your Money or Your Life) type of industry. User-generated content can be excellent, but a high percentage of it is deeply flawed and far, far below our editorial standards. We have experienced people in our own industry who want to submit content, but we reject it because it is below our quality standards.
That, in short, is why we don't allow user-generated content: it doesn't meet our editorial standards.
I have read information published by Google saying that a vigorous comment section can be a sign of a quality website. But I believe that applies to content types where opinion, kibitzing, and prattle are acceptable. Medical sites (and other types of websites) are an entirely different matter: low-quality content can cause problems for the reader, even if it is in a comments section. Nobody knows exactly how Google views this, but I am going to protect my visitors from BS and poor-quality information.
-
Many thanks, EGOL. I agree that the author profiles need to be improved for sure.
What do you think about the possibility that user-generated comments on a health news site are a concern for Google, re: readers reading comments that are not created by established experts? Could user comments now be a negative ranking factor for health sites?
-
Magdalena's example shows that you understand the problem. Implementation might significantly improve your situation. And just as important... implementation will enable your visitors to see Magdalena's credentials. Do it for your visitors even if Google is not a concern. Your authors also deserve to have this work done.
Related Questions
-
Homepage title tag: "Keywords for robots" vs "Phrases for users"
Hi all, We keep hearing and reading articles saying that "Google is all about the user" and that we should just think about users, not search engine bots. I have gone through the title tags of all our competitors' websites: almost every one directly targets primary and secondary keywords, and a few target even more. We have written a very good phrase as our title tag for users, beginning with a keyword, but we are not ranking well compared to less-optimised or less-linked websites. Two things to mention: our title tag is almost 2 years old, and it begins with the secondary keyword rather than the primary one (e.g., "seo google" is the secondary keyword and "seo" is the primary keyword). Do I need to focus entirely on the primary keyword to rank for it? Thanks
Algorithm Updates | vtmoz
-
Dramatic drop in SEO rankings after recovering from hacking
A few months ago my client's website was hacked, which created over 20,000+ spammy links on the site. I removed the malware and got Google to remove the malware warning within a week of the hack. Then I started the long process of doing 301 redirects and disavowing links in Webmaster Tools over these past few months. The hack only caused a slight drop in rankings at the time. But just last week the site had a dramatic drop in rankings. When doing a keyword search, I noticed the homepage doesn't even get listed on Google Maps, and in Google Search the inner pages, like the Contact Us page, show up instead of the homepage. Does anyone have any insight into this sudden drop, and why the inner pages are now ranking higher than the homepage?
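For reference, Google's disavow tool expects a plain-text file of `#` comment lines and one `domain:` (or full-URL) directive per line. A small sketch -- the domains are hypothetical -- that builds such a file from a list of spammy referrers found after a hack:

```python
def build_disavow_file(spam_domains, note="Links created by a site hack"):
    """Render a disavow file in the format Google's disavow tool
    accepts: '#' comment lines and one 'domain:' directive per line."""
    lines = [f"# {note}"]
    # De-duplicate and sort so the file is stable across exports
    lines += [f"domain:{d}" for d in sorted(set(spam_domains))]
    return "\n".join(lines) + "\n"

# Hypothetical spammy referring domains
print(build_disavow_file(["spam-links.example", "bad-seo.example"]))
```

Disavowing at the `domain:` level rather than per-URL is usually safer for hack cleanups, since spam networks rotate individual URLs.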
Algorithm Updates | FPK
-
"Update" in Search Console is NOT an Algo Update
We've had a few questions about the line labeled "Update" in Google Search Console on the Search Analytics timeline graph (see attached image). Asking around the industry, there seems to be a fair amount of confusion about whether this indicates a Google algorithm update. This is not an algorithm update -- it indicates an internal update in how Google is measuring search traffic. Your numbers before and after the update may look different, but this is because Google has essentially changed how they calculate your search traffic for reporting purposes. Your actual ranking and traffic have not changed due to these updates. The latest update happened on April 27th and is described by Google on this page: Data anomalies in Search Console. Given the historical connotations of "update" in reference to Google search, this is a poor choice of words, and I've contacted the Webmaster Team about it.
Algorithm Updates | Dr-Pete
-
Can someone explain the attached rankings?
I just don't understand how we can have a long list of 1's and 2's in Google, but in Yahoo and Bing some are close, some are in the 20's, and some are not even in the top 50. Is there something that Yahoo and Bing care about that Google doesn't? I know about the meta language being more important for Yahoo and Bing than Google, so I added that. There's nothing I can do about the domain age, which I know is important to Yahoo and Bing. Is there anything else? Thanks, Ruben http://i.imgur.com/Vem594l.jpg
Algorithm Updates | KempRugeLawGroup
-
How can I tell Google two sites are non-competing?
We have two sites, both English-language. One is a .ca and the other is a .com, and I am worried that they are hurting one another in the search results. I'd obviously like to direct google.ca toward the .ca domain and google.com toward the .com domain, and let Google know they are connected, non-competing sites.
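The standard mechanism for this is hreflang alternate annotations, which tell search engines that regional sites are language/region variants of each other rather than competitors. A sketch (the URLs are placeholders) that generates the link tags each homepage would carry:

```python
def hreflang_tags(variants):
    """Emit <link rel="alternate" hreflang=...> tags declaring
    regional variants of the same content. Every variant page
    should carry the full set, including a self-reference."""
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in variants
    )

# Hypothetical regional homepages plus a fallback for other locales
print(hreflang_tags([
    ("en-ca", "https://example.ca/"),
    ("en-us", "https://example.com/"),
    ("x-default", "https://example.com/"),
]))
```

Note that hreflang must be reciprocal: the .ca pages must point at the .com pages and vice versa, or the annotations are ignored.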
Algorithm Updates | absoauto
-
Privacy page ranking above home page in serps
I'm using OSE to try and get some clues as to why my privacy page would rank higher than my home page. Could anyone help me figure out which metrics to review to rectify the issue? My keyword is: Mardi Gras Parade Tickets. The URL that is ranking is www.mardigrasparadetickets.com/pages/privacy. I'm happy to be ranking in the top 3 for the keyword, but I'd rather hoped it wouldn't be my privacy page. Any help would be awesome, Cy
Algorithm Updates | Nola504
-
Should I use canonical tags on my site?
I'm trying to keep this a generic example, so apologies if this is too vague. On my main website, we've always had a duplicate content issue. The main focus of our site is breaking content down to specific, brick-and-mortar locations. We have to duplicate the description of the product/service for every geographic location (this is a legal requirement). So for example, you might have the parent "product/service" page targeting the term, and then hundreds of sub-pages with "product/service San Francisco", "product/service Austin", etc. These pages have identical content except the geographic location is dynamically swapped out. There is also additional useful content like a Google map of the area, local resources, etc. As I said, this was always seen as an SEO issue -- specifically, you could see in the way Googlebot crawled pages and how PageRank flowed through the site that having hundreds of pages with identical copy and a swapped-out geographic location wasn't seen as good content. However, we still received traffic and conversions for the long-tail geographic terms, so we left it. Last year, with Panda, we noticed a drop in traffic and thought it was due to this duplicate issue, so I added canonical tags to all our geographic-specific product/service pages pointing back to the parent page. That seemed to be received well by Google, and traffic was back to normal in short order. However, what I notice a LOT in our SERPs recently is that if I type in a geographic-specific term, i.e. "product/service san francisco", our deep page with the canonical tag is what Google is ranking. Google inserts its own title tag on the SERP and leaves the description blank, since it doesn't index the page due to the canonical tag. Essentially, what I think it is rewarding is the site architecture, which organizes the content to the specific geo in the URL: site.com/service/location/san-francisco. Other than that, there is no reason for it to rank that page.
Sorry if this is lengthy, thanks for reading all of that! Essentially my question is, should I keep the canonical tags on the site or take them off since Google insists on ranking the page? If I am ranking already then the potential upside to doing that is ranking higher (we're usually in the 3-6 spot on the result page) and also higher CTR because we can get a description back on our resulting page. The counter argument is I'm already ranking so leave it and focus on other things. Appreciate your thoughts on this!
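The trade-off being weighed can be written down as a simple rule: geo pages with genuinely local content (maps, local resources) self-canonicalize and keep their ranking, description, and CTR, while thin duplicates point at the parent. A sketch, with hypothetical URLs:

```python
def canonical_for(page_url, parent_url, has_unique_geo_content):
    """Pick a canonical target for a geo-specific service page:
    pages with real local content self-canonicalize; pages that
    only swap the city name defer to the parent service page."""
    target = page_url if has_unique_geo_content else parent_url
    return f'<link rel="canonical" href="{target}" />'

# Hypothetical geo page that Google already chooses to rank
print(canonical_for("https://example.com/service/san-francisco",
                    "https://example.com/service/", True))
```

Since Google is ranking the deep pages anyway, removing the canonical (i.e., self-canonicalizing) would at least restore the meta description on those results; the risk is re-exposing the near-duplicate copy to Panda-style evaluation.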
Algorithm Updates | edu-SEO
-
Google Page Rank?
We have had a quality website for 12 years now, and it seems no matter how many more links we get and how much new content we add daily, we have stayed at PR3 for the past 10 years or so. Our SEOMoz domain authority is 52. We have over 950,000 pages linking to us from 829 unique root domains. Is this in line with PR3 or should we be approaching PR4 soon? We do daily blog posts with all unique, fresh quality content that has not been published elsewhere. We try to do everything with 'white hat' methods, and we are constantly trying to provide genuine content and high quality products, and customer service. How can we improve our PR and how important is PR today?
Algorithm Updates | applesofgold