Is "Author Rank," User Comments Driving Losses for YMYL Sites?
-
Hi, folks!
So, our company publishes 50+ active, disease-specific news and perspectives websites -- mostly for rare diseases. We are also tenacious content creators: between news, columns, resource pages, and other content, we produce 1K+ pieces of original content across our network. Authors are either PhD scientists or patients/caregivers. All of our sites use the same design.
We were big winners with the August 2018 Medic update and the subsequent update in September/October. However, the March core update and the de-indexing bug in April hit us hard across our monetized sites (about 10 in total). We've seen some recovery with the early June update, but also some further losses. It's a mixed bag.
Take a look at the attached Moz chart, which shows the jumps and falls around the various updates. The pattern is very similar on many of our sites.
As per JT Williamson's stellar article on EAT, I feel like we've done a good job of meeting those criteria, which has left me wondering what isn't jibing with the new core updates. I have two theories I wanted to run past you all:
1. Are user comments on YMYL sites problematic for Google now?
I was thinking that user comments underneath health news and perspectives articles might now be a concern on YMYL sites. On one hand, a healthy commenting community indicates an engaged user base and speaks to the trust and authority of the content. On the other hand, while the AUTHOR of the article might be a PhD researcher or a patient advocate, how qualified are the people commenting? What if they are spouting off crazy ideas? Could Google's new update see user comments like these as degrading the trust/authority/expertise of the page? The examples I linked to above have a good number of user comments. Could these now be problematic?
2. Is Google "Author Rank" finally happening, sort of?
From what I've read about EAT -- particularly for YMYL sites -- it's important that authors have "formal expertise" and be, as Williamson puts it, "an expert in the field or topic." He continues that the author's expertise and authority "is informed by relevant credentials, reviews, testimonials, etc." Well -- how is Google substantiating this? We no longer have the authorship markup, but is the algorithm doing its due diligence on authors in some more sophisticated way?
It makes me wonder if we're doing enough to present our authors' credentials on our articles, for example. Take a look -- Magdalena is a PhD researcher, but her user profile doesn't appear at the bottom of the article, and if you click on her name, it just takes you to her author category page (how WordPress-ish).
Even worse -- our resource pages don't even list the author.
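One concrete option would be marking up articles with schema.org Article/Person JSON-LD so the credentials are machine-readable as well as visible on the page. To be clear, this is a hypothetical sketch -- nobody outside Google knows whether such markup feeds EAT at all -- and every name, title, and URL below is a placeholder, not our real data. The little Python script just prints the tag you'd embed in the article page:

```python
import json

# Hypothetical schema.org author markup for an article page.
# Every name, title, and URL below is a placeholder, not real data.
article_markup = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "Example rare-disease news article",
    "author": {
        "@type": "Person",
        "name": "Magdalena Example, PhD",
        "honorificSuffix": "PhD",
        "jobTitle": "Research Scientist",
        "url": "https://example.com/author/magdalena/",
        # Public profiles that could corroborate the author's expertise.
        "sameAs": [
            "https://scholar.google.com/citations?user=EXAMPLE",
            "https://www.linkedin.com/in/magdalena-example/",
        ],
    },
}

# Print the <script> block you would embed in the article's HTML.
print('<script type="application/ld+json">')
print(json.dumps(article_markup, indent=2))
print("</script>")
```

Even if the algorithm ignores it, pairing this with a visible bio box at least puts the credentials in front of readers.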
Anyhow, I'd love to get some feedback from the community on these ideas. I know that Google has said there's nothing to do to "fix" these downturns, but it'd sure be nice to get some of this traffic back!
Thanks!
-
We have informational and retail websites where we put a LOT of effort into our content. We are trying to produce the best content on the web. All of it is created and edited by people who have both formal education and deep experience in the content area.
There is no way that we would allow user-generated content on these websites - even though we are not in a YMYL (your money, your life) type of industry. User-generated content can be excellent, but a high percentage of it is deeply flawed and far, far below our editorial standards. We have experienced people in our own industry who want to submit content, but we reject it because it is below our quality standards.
That, in short, is why we don't allow user-generated content: it's an editorial-standards decision.
I have read information published by Google saying that a vigorous comment section can be a sign of a quality website. But I believe that applies to content types where opinion, kibitzing, and prattle are acceptable. Medical sites (and other types of websites) are an entirely different matter. Low-quality content can result in problems for the reader - even if it is in a comments section. Nobody knows exactly how Google views this, but I am going to protect my visitors from BS and poor-quality information.
-
Many thanks, EGOL. I agree that the author profiles need to be improved for sure.
What do you think about the possibility that user-generated comments on a health news site are a concern for Google, given that readers may be taking in comments that are not written by established experts? Could user comments now be a negative ranking factor for health sites?
-
Magdalena's example shows that you understand the problem. Implementing proper author profiles might significantly improve your situation. And just as important... it will enable your visitors to see Magdalena's credentials. Do it for your visitors even if Google is not a concern. Your authors also deserve to have this work done.
Related Questions
-
Delay between being indexed and ranking for new pages.
I've noticed with the last few pages I've built that there's a delay between them being indexed and them actually ranking. Anyone else finding that? And why is it like that? It's not much of an issue, as they tend to pop up after a week or so, but I am curious. Isaac.
Algorithm Updates | isaac663
-
Penguin 3.0 Site Dropped after Update
Hi, We were hit by the Penguin update a long time ago, and we lost a lot of traffic/positions because of it. For a long time we worked really hard to identify all of our links that may have caused us to receive this penalty. After months of work we submitted the disavow file and reconsideration request, and in June 2014 we received confirmation from Google in Webmaster Tools that the manual spam action had been revoked. Over time we then started to receive more traffic and better positions in the SERPs; however, since Penguin 3.0 we have dropped again for a range of keywords, many going from page 1 to 2 or page 2 to 3/4. Any ideas what we should do here? Any help will be really appreciated, as I'm totally confused. We haven't done any link building at all since the penalty/recovery.
Algorithm Updates | AMG100
-
Long term rankings drop after swapping primary domain
Hey... this is my first post on Moz, so please go easy on me! I've recently been baffled by the ranking behavior of a domain I do SEO for. In short, the primary domain was "musashispicymayo.com". After several months of SEO efforts and a really solid PR run, the site managed to climb to #1 for several target keywords. For the purposes of this question I'd like to focus on the term "spicy mayo". "Musashispicymayo.com" climbed steadily from as far back as page 5 until it ultimately reached the #1 rank on Google for "spicy mayo". We also had another domain, "musashifoods.com", which was originally 301 redirecting to "musashispicymayo.com". About 3 months ago (shortly after acquiring the top ranking) the client wanted to reverse the domains, so we started using "musashifoods.com" as the primary and redirecting "musashispicymayo.com" to it. In summary:
ORIGINALLY: musashifoods.com 301 redirect -> musashispicymayo.com
NOW: musashispicymayo.com 301 redirect -> musashifoods.com
At the time of the swap I did the following:
1. Redirected the domain using a 301 via htaccess (made sure "www" requests are forwarded too)
2. Created a new Google Analytics account / Webmaster Tools account for "musashifoods.com"
3. Went into my old Webmaster Tools account and used the Change of Address tool
4. In the new Webmaster Tools account I submitted a sitemap and requested a crawl of the new domain
5. Ensured the new primary domain was properly configured and all pages had the correct URLs in the source code
6. Verified that Google has updated their index and "musashifoods.com" now shows in the results.
Now, of course, musashispicymayo has the keyword in the domain, but I find it hard to believe that that is what caused such a dramatic and swift drop in rankings. In fact, a good portion of the backlinks actually point to "musashifoods.com"... Did I miss something else here? Does Google penalize you for reversing 301 redirects like that instead of just using a new domain altogether? Let me know if I can provide any additional info that would help clarify... any advice is greatly appreciated!
Algorithm Updates | Andy-Twizen
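(A quick aside that may help anyone mid-swap: here's a minimal, hypothetical sanity check, written in Python with the third-party requests library, that the old hostnames answer with a single 301 pointing straight at the new primary. The hostnames come from the question above; everything else is an assumption, not a description of what actually happened on that site.)

```python
import requests

# Hypothetical check: each old hostname should answer with one permanent
# (301) redirect straight to the new primary domain, with no extra hops.
OLD_URLS = [
    "http://musashispicymayo.com/",
    "http://www.musashispicymayo.com/",
]
NEW_HOST = "musashifoods.com"

for url in OLD_URLS:
    resp = requests.head(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    print(f"{url} -> {resp.status_code} {location}")
    # A 302/307 would pass visitors along but may not consolidate ranking
    # signals the way a permanent 301 does; a chain of hops dilutes them.
    if resp.status_code != 301 or NEW_HOST not in location:
        print("  warning: expected a single 301 to", NEW_HOST)
```
-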
Number of Items As a Google Ranking Factor??
If I search for "hiking boots" and scan down the SERPs, I see the following... Google reports "483 items" for the Zappos.com page. Google reports "Results 1 - 36 of 85" for the Shoebuy.com page (and that does not appear in their code). So, Google is obviously paying attention to the depth of your information, or the number of items that you are showing. If they think that information is important enough to count and report in the SERPs, might they also be using it as a ranking factor?? PRACTICAL APPLICATION FOR SEO: If Google is using this information, perhaps people should list all of their color, size, etc. variants on a single page. For example, if you sell widgets in five colors, instead of making one page for each color, list all five on the same page.
Algorithm Updates | EGOL
-
Does having a few URLs pointing to another URL via 301 "create" duplicate content?
Hello! I have a few URLs all related to the same business sector. Can I point them all at my home domain or should I point them to different relevant content within it? Ioan
Algorithm Updates | IoanSaid
-
What is the point of XML sitemaps?
Given how Google uses PageRank to pass link juice from one page to the next, if Google can only find a page through an XML sitemap, that page will have no link juice and will appear very low in search results, if at all.

The priority field in XML sitemaps also seems pretty much irrelevant to me. Google determines the priority of a page based on the number of inbound links to it. If your site is designed properly, the most important pages will have the most links.

The changefreq field could maybe be useful if you have existing pages that are updated regularly, though it seems to me Google tends to crawl sites often enough that it isn't useful. Plus, for most of the web, the significant content of an existing page doesn't change regularly; instead, new pages are added with new content.

This leaves the lastmod field as being potentially useful. If Google starts each crawl of your site by grabbing the sitemap and then crawls the pages whose lastmod date is newer than its last crawl of the site, its crawling could be much more efficient. The sitemap would not need to contain every single page of the site, just the ones that have changed recently.

From what I've seen, most sitemap generation tools don't do a great job with the fields other than loc. If Google can't trust the priority, changefreq, or lastmod fields, they won't put any weight on them.

It seems to me the best way to rank well in Google is by making a good, content-rich site that is easily navigable by real people (and that's just the way Google wants it). So, what's the point of XML sitemaps? Does the benefit (if any) outweigh the cost of developing and maintaining them?
Algorithm Updates | pasware
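(Following the lastmod reasoning above: if Google really does trust lastmod, a sitemap only needs to list the recently changed pages. Here's a minimal, hypothetical sketch using Python's standard library; the URLs and dates are placeholders, not a claim about how Google actually consumes the field.)

```python
from datetime import date, timedelta
from xml.etree import ElementTree as ET

# Hypothetical sketch: a tiny sitemap listing only recently changed pages,
# leaning on <lastmod> -- the one field argued above to be useful.
NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
recently_changed = [
    ("https://example.com/news/article-1/", date.today()),
    ("https://example.com/news/article-2/", date.today() - timedelta(days=3)),
]

urlset = ET.Element("urlset", {"xmlns": NS})
for loc, modified in recently_changed:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = modified.isoformat()  # W3C date format

print('<?xml version="1.0" encoding="UTF-8"?>')
print(ET.tostring(urlset, encoding="unicode"))
```
-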
How To Rank High In Google Places?
Hello SEOmoz, This question has been hounding me for a long time, and I've never seen a single piece of reliable information on the web that answers it. Anyway, here's my question: supposing that there are three Google Places listings for three different websites with the same categories, almost the same keywords, and the same district/city/IP, how does Google rank one higher than the others? Or, simply put, if you owned one of those websites and wanted to rank higher than your competitors in Google Places search results, how would you do it? A number of theories were brought up by some of my colleagues:
1. The age of the listing
2. The number of links pointing to the listing (supposing that one can build links to one's listing)
3. The name/URL of the listing, tags, description, etc.
4. The address of the listing
5. Authority of the domain (linked website)
You see, some listings have no description and only one category, and yet they rank number one for a specific term/keyword, whereas others have complete categories, descriptions, etc. If you could please give me a definite answer, I will surely appreciate it. Thank you very much and more power!
Algorithm Updates | LeeAnn30
-
Were you affected by the "Farmer Update"? What are you doing about it?
I woke up on Friday morning to see that my traffic from Google on Thursday was down 30% on one of my sites. Traffic hasn't bounced back, and I'm wondering why I've been lumped in with the content farms. My site only has original, high-quality content. It has a great link profile with tons of links from .edu pages, and I've always played by Google's rules. I can't understand why my site has been negatively affected, which makes it hard to do something about it. Right now, the only thing I can come up with is to work really hard at building more links. Were you affected? What are you doing about it?
Algorithm Updates | WillyF