News Errors In Google Search Console
-
Years ago, a site I'm working on published news as one form of content.
Since then, it has stopped publishing news, but it still has a Q&A forum, blogs, articles... all kinds of stuff.
Now, it triggers "News Errors" in Google Search Console under crawl errors. These errors are:
"Article disproportionately short"
"Article fragmented" on some q&a forum pages
"Article too long" on some longer q&a forum pages
"No sentences found"
Since there are thousands of these forum pages and the problem seems to be a news-specific critique, I'm wondering what I should do about it. Google appears to be holding these non-news pages to a news standard:
https://support.google.com/news/publisher/answer/40787?hl=en
For instance, is there a way and would it be a good idea to get the hell out of Google News, since we don't publish news anymore? Would there be possible negatives worth considering?
What's baffling is that these are not designated news URLs. The ones we used to have were /news/title-of-the-story per...
https://support.google.com/news/publisher/answer/2481373?hl=en&ref_topic=2481296
Or, does this really not matter, and should I just blow it off as a problem?
The weird thing is that we recently went from http to https, and the Google News interface still lists us as http and gives the option to add https, which I am reluctant to do since we aren't really in the news business anymore.
What do you think I should do?
Thanks!
-
Update: 5 months later, the problem has long since gone away.
-
Thanks for the answers, Matthew & Martijn. I'm going to go with what I have in place: 301s to the main forums page to pick up/recycle the incoming links to that section, and removing the /news category from where Google News looks for news, but not leaving the Google News program altogether right now.
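For anyone implementing the same fix, a minimal sketch of the 301 piece, assuming an Apache server and hypothetical paths (/news/ as the old news section, /forums/ as the main forums page):

```apache
# Hypothetical .htaccess rule: permanently redirect all old /news/ URLs
# to the main forums page so incoming links are recycled.
RedirectMatch 301 ^/news/.* /forums/
```

On nginx the equivalent would be a location block with `return 301 /forums/;`.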
Thanks, again!
-
I'd opt for 301s if there is any link equity on those pages worth pushing to the forum pages. If not, I'd robots.txt them out to save on wasted crawl.
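A minimal sketch of the robots.txt option, assuming the old news section lived under a hypothetical /news/ path (adjust to your actual URL structure):

```
User-agent: *
Disallow: /news/
```

One caveat: Google can't follow a 301 on a URL it's blocked from crawling, so for any URL you want passing link equity via redirect, don't also disallow it.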
-
Hi Matthew,
Thanks for the insight. Regarding your point that "the important part is that Google doesn't suddenly crawl a bunch of 404s that they think are news pages of some importance": would you robots.txt out the old category they looked at (/news), or leave it alone and let Google figure it out on its own from the mass of 301s and the removal of /news from the URL structure we told them to crawl?
-
Hi there,
It wouldn't be a problem to get out of Google News, but by the same logic, it wouldn't hurt just to leave it alone and let them figure out you're not publishing news anymore. It won't affect your web search rankings, since you're not targeting traffic from the news onebox or news.google.com.
These old pages redirect to forum pages? I think the important part is that Google doesn't suddenly crawl a bunch of 404s that they think are news pages of some importance.
-
Possibly; it's hard to guess what your current setup looks like. What you could do alternatively is set up robots.txt Disallow statements that target only the Google News bot (Googlebot-News) instead of the general Googlebot.
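A minimal sketch of that approach, assuming the old news content lived under a hypothetical /news/ path:

```
# Block only Google's news crawler.
User-agent: Googlebot-News
Disallow: /news/

# Leave everything open for all other crawlers.
User-agent: *
Disallow:
```

This keeps those pages eligible for regular web search while telling Google News to skip them.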
-
Hi Martijn,
We still have the old /news URL in Google News. I don't think we've ever submitted a news sitemap.
Is there any downside to deleting the URL structure they're currently looking at, which does forward to our forum? Or would it be better to just get out of Google News altogether, since we don't really have or publish news anymore? Is there a downside to that?
To summarize... delete the URL they look to as news, or get out of Google News altogether? What do you think?
Thanks!
-
Are you still submitting a Google News sitemap in Google Search Console? That's usually the biggest source of these errors: Google picks up these kinds of 'new' pages/articles as news content and then sees that they don't match its guidelines.