My news site is not showing in the "In the news" list on Google Web Search
-
I have a news website (www.tapscape.com) which is six years old and has been in Google News since 2012. However, whenever I publish a news article, it never shows up in the "In the news" list on Google Web Search. I have already added schema.org/NewsArticle markup to the site and verified it with the Google Structured Data Testing Tool; everything shows up correctly there. The site also has a news sitemap (http://www.tapscape.com/news-sitemap.xml) and has been added to Google Webmaster Tools.
News articles show up perfectly fine in the News tab, so why aren't they shown in the "In the news" list on Google Web Search? My site already has a strong backlink profile, so I don't think I need to work on backlinks.
Please let me know what I'm doing wrong and how I can get my news articles into the "In the news" list.
Below is a screenshot I have attached to this question to illustrate what I mean.
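For reference, a minimal Google News sitemap entry follows this shape (a sketch only; the URL, title, and dates are illustrative placeholders, not taken from the actual tapscape.com sitemap):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:news="http://www.google.com/schemas/sitemap-news/0.9">
  <url>
    <!-- One <url> entry per recent article; Google News only wants articles
         from roughly the last two days in this sitemap. -->
    <loc>http://www.tapscape.com/example-article/</loc>
    <news:news>
      <news:publication>
        <news:name>Tapscape</news:name>
        <news:language>en</news:language>
      </news:publication>
      <news:publication_date>2016-01-01T12:00:00+00:00</news:publication_date>
      <news:title>Example Article Title</news:title>
    </news:news>
  </url>
</urlset>
```

The `news:name` value has to match the publication name as it appears in Google News, and `publication_date` should be W3C datetime format.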
-
Well, I'm fairly sure the difference isn't in the Schema.org markup, as any differences there have nothing to do with crawlability or the newsworthiness of the content. The first things I would focus on are making sure Google can find new content as fast as possible and that your crawl rate goes up. That's really important; once you're convinced that's fixed, focus more attention on the authority side.
Neither is easy to fix, but someday you'll get there!
-
I have two more sites on Google News that don't have much authority, and both sites' articles show up in "In the news" on Google Web Search. The site in question, however, has the highest Domain Authority and is the oldest of the three. The other two don't even have backlinks from any high-authority websites.
The only difference between those two sites and the one in question is that I am using JSON-LD schema markup on the other two, while tapscape.com uses Microdata.
Since bing.com doesn't support JSON-LD, I am using Microdata. I looked into whether I could use both Microdata and JSON-LD on one page, but everyone suggested sticking to one of them, since mixing the two could confuse Googlebot.
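For comparison, the same NewsArticle metadata expressed as JSON-LD is a single script block in the page head or body (a sketch with placeholder values, not the actual markup from any of the sites mentioned):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "NewsArticle",
  "headline": "Example Article Title",
  "datePublished": "2016-01-01T12:00:00+00:00",
  "image": "http://www.example.com/article-image.jpg",
  "author": {
    "@type": "Person",
    "name": "Author Name"
  },
  "publisher": {
    "@type": "Organization",
    "name": "Tapscape",
    "logo": {
      "@type": "ImageObject",
      "url": "http://www.example.com/logo.png"
    }
  }
}
</script>
```

Whichever format you keep, the Structured Data Testing Tool should report the same NewsArticle item with the same properties, since Google parses both.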
-
Thanks for your answer, Martijn.
By a strong backlink profile, I meant that big sites such as TechCrunch, Huffington Post, Engadget, ZDNet, The Verge, Computerworld, Wired, and many others have published dofollow links to my articles. I've heard this plays a major role in increasing a website's authority.
My site's Domain Authority is 48 at the moment, so I'm not sure what the issue is.
-
So the "In the news" block in the Google Search results are still powered by the normal Google Search data and not necessarily by the Google News data sets. That means you sometimes can let go of all the requirements like the news-sitemaps that are required for this. Usually authority seems to be the biggest issue to get into the block. What do you consider a strong backlink profile at the moment and could you shed some light on the industry that you're in?