Snippet not working for my article in Google
-
I added the star snippet script to my website, but it is not working on my posts.
You can see it at this URL: https://youtech.ooo/showbox-apk-download/
When I search Google for my custom keyword "Showbox", my competitor shows up with a star snippet in the SERP, but my site doesn't show snippet stars.
Thank you!
-
The same thing is happening to me on random products. I use the "@id"/#Product string to include my Feefo reviews in SERP results, which had worked fine until today; nothing changed on my side either.
GSC and the Rich Results Test both state "Missing field 'review' (optional)", but the old Structured Data Testing Tool shows it working fine, and so does GSC if I test the live URL.
I've also reindexed, cleared caches, re-created the structured data, and so on, but no luck. I can see that "page actions are temporarily disabled", though. Could this be some kind of bug while Google is updating things?
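For anyone comparing notes, here is a minimal sketch of the @id reference pattern described above; the URLs, names, and rating values are hypothetical placeholders rather than the poster's actual markup. The Review node points back at the Product node through itemReviewed:

<!-- Hypothetical example: a standalone Review node linked to a Product node by @id -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Product",
      "@id": "https://example.com/widget/#Product",
      "name": "Example Widget"
    },
    {
      "@type": "Review",
      "itemReviewed": { "@id": "https://example.com/widget/#Product" },
      "reviewRating": { "@type": "Rating", "ratingValue": "5", "bestRating": "5" },
      "author": { "@type": "Person", "name": "Example Customer" }
    }
  ]
}
</script>

If the warning persists with markup of this shape, it may be worth checking that the two @id values match exactly, including the trailing #Product fragment.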
-
For the page, I see schema for FAQPage and SoftwareApplication, among others, but I don't see any product review schema.
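For illustration only, a minimal sketch of what rating markup might look like for a SoftwareApplication page like the one in the question; the app name, category, and rating figures are made-up placeholders:

<!-- Hypothetical example: SoftwareApplication carrying an aggregate rating -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "SoftwareApplication",
  "name": "Example App",
  "operatingSystem": "Android",
  "applicationCategory": "MultimediaApplication",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.2",
    "ratingCount": "125"
  }
}
</script>

Without a rating block of some kind on the page, there is nothing for Google to draw stars from, however many other types (FAQPage and so on) are present.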
Related Questions
-
Fake links indexing in Google
Hello everyone, I have an interesting situation occurring here, and I'm hoping someone has seen something of this nature or can offer some advice. We recently installed WordPress on a subdomain for our business and have been blogging through it. We added the Google Webmaster Tools meta tag, and I've noticed an increase in 404 links. I brought this up to our server admin, and he verified that a lot of IPs were pinging our server looking for links that don't exist. We've combed through our server files and nothing seems to be compromised. Today we noticed that when you search site:ourdomain.com in Google, the WordPress subdomain shows hundreds of these fake links, which return a 404 page when you visit them. Has anyone seen anything like this? What might it be, how can we stop it, and could it negatively impact us in any way? Should we even worry about it? Here's the link to the Google results (the odd links show up on pages 2-3+): https://www.google.com/search?q=site%3Amshowells.com&oq=site%3A&aqs=chrome.0.69i59j69i57j69i58.1905j0j1&sourceid=chrome&es_sm=91&ie=UTF-8
Technical SEO | mshowells0
-
Google Webmasters Quality Issue Message
I am a consultant who works for the website www.skift.com. Today we received an automated message from Google Webmasters saying our site has quality issues. Since the message is very vague and obviously automated, I was hoping to get some insight into whether it is something to be very concerned about and what can be done to correct the issue. From reviewing the Webmaster Quality Guidelines, the site is not in violation of any of them. I am wondering if this message is generated as a result of licensing content from NewsCred, as I have other clients who license content from NewsCred and are getting the same message from Google Webmasters. Thanks in advance for any assistance.
Technical SEO | electricpulp0
-
Google having trouble accessing my site
Hi, Google is having a problem accessing my site. Each day it brings up access-denied errors, and when I checked what this means, I found the following:

Access denied errors: In general, Google discovers content by following links from one page to another. To crawl a page, Googlebot must be able to access it. If you're seeing unexpected Access Denied errors, it may be for the following reasons:

Googlebot couldn't access a URL on your site because your site requires users to log in to view all or some of your content. (Tip: you can get around this by removing this requirement for the user-agent Googlebot.)

Your robots.txt file is blocking Google from accessing your whole site or individual URLs or directories. Test that your robots.txt is working as expected. The Test robots.txt tool lets you see exactly how Googlebot will interpret the contents of your robots.txt file. The Google user-agent is Googlebot. (How to verify that a user-agent really is Googlebot.) The Fetch as Google tool helps you understand exactly how your site appears to Googlebot. This can be very useful when troubleshooting problems with your site's content or discoverability in search results.

Your server requires users to authenticate using a proxy, or your hosting provider may be blocking Google from accessing your site.

I have contacted my hosting company, who said there is no problem but told me to read the following page: http://www.tmdhosting.com/kb/technical-questions/other/robots-txt-file-to-improve-the-way-search-bots-crawl/ I have read it, and as far as I can see my file is set up correctly; it is listed below. They said if I still have problems, I need to contact Google. Can anyone please give me advice on what to do? The errors are response code 403.

User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
Disallow: /xmlrpc/

Technical SEO | ClaireH-184886
-
Google webmaster errors
If you know what these Google Webmaster errors mean, and you can explain them to me in simple English and tell me how I can locate the problem, I would really appreciate it! The errors are: Server error, Soft 404, Access denied, Not found, Not followed. I have many of these errors; is it harming SEO? Yoseph
Technical SEO | Joseph-Green-SEO
-
Google SERPs and NoIndex directives
We have pages that have been added to robots.txt as URL patterns in Disallow. We also have meta noindex tags on the pages themselves. But we are finding the pages in the index. I don't think they rank highly, and they don't have any descriptions, previews, or cached pages. Why does Google show these pages? Could it be due to internal or external linking?
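One common explanation, with a sketch of the relevant tag below: a noindex directive only takes effect if Googlebot can actually crawl the page and read it, so a URL carrying the tag must not also be disallowed in robots.txt.

<!-- On the page itself; Googlebot must be able to fetch the page to see this -->
<meta name="robots" content="noindex">

If robots.txt keeps Googlebot out, the URL can still be indexed from internal or external links, just without a description, preview, or cached copy, which matches what is described here.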
Technical SEO | gaganc0
-
How can I see if my website was penalized by Google?
Hello, I have a website, http://digitaldiscovery.eu, that I have been working on for 7 months. Everything is alright in the index of search engines like Google, Bing, and Yahoo. I also get about 1,000 visits a month, which is not bad for the topic I am targeting in my country. However, my PageRank insists on staying at 0, and I really don't understand why. Some of my competitors that started at the same time already have a PageRank of 3, and they do not have the visitors that I do. In the Alexa ranking I am climbing very fast, and visits to my website are growing. So why doesn't the PageRank climb as well? Thanks in advance, Pedro M Pereira
Technical SEO | PedroM0
-
Google causing Magento Errors
I have an online shop run using Magento. I recently upgraded to version 1.4 and installed an extension called Lightspeed, a caching module which makes tremendous improvements to Magento's performance. Unfortunately, a configuration problem meant I had to disable the module, because it was generating errors relating to the session if you entered the site from any page other than the home page. The site is now working as expected. I have Magento's error notification set to email, and I had not received emails for errors generated by visitors. However, over a 72-hour period I received a deluge of error emails, which were being caused by Googlebot. It was generating an error in a file called lightspeed.php. Here is an example:

URL: http://www.jacksgardenstore.com/tahiti-vulcano-hammock
IP Address: 66.249.66.186
Time: 2011-06-11 17:02:26 GMT
Error: Cannot send headers; headers already sent in /home/jack/jacksgardenstore.com/user/jack_1.4/htdocs/lightspeed.php, line 444

Several things of note: I deleted lightspeed.php from the server before any of these error messages began to arrive. lightspeed.php was never exposed in the URL at any time; it was referred to in a mod_rewrite rule in .htaccess, which I also commented out. If you click the URL in the error message, it loads in the browser as expected, with no error messages. It appears that Google has cached a version of the page which briefly existed while Lightspeed was enabled, but I thought Google cached generated HTML; since when does it cache a server-side PHP file? I've just used the Fetch as Googlebot facility in Webmaster Tools for the URL in the above error message, and it returns the page as expected, with no errors. I've had no errors at all in the last 48 hours, so I'm hoping it has sorted itself out; however, I'm concerned about any Google-related implications. Any insights would be greatly appreciated. Thanks, Ben
Technical SEO | atticus70
-
Best practices for temporary articles
Hello, I would like expert input on the best way to manage temporary content. In my case, I have a page (e.g. mydomain.com/agenda) listing temporary articles, some with a lifetime of 1 to 6 months. My articles also have specific URLs, e.g. mydomain.com/agenda/12-02-2011/thenameofmyarticle/. As you can guess, I get hundreds of 404s 😞 I'm already using a canonical tag; should I use one on the listing page? I'm a bit lost here.
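As one hypothetical reading of the setup described above (the exact tag intended in the question is unclear), a canonical on an expired article page pointing back at the listing page would look like this:

<!-- On an expired article page; the URLs are the hypothetical ones from the question -->
<link rel="canonical" href="http://mydomain.com/agenda/">

Whether a canonical like this, a 410 status, or a redirect is the better choice for expired listings is exactly the open question here.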
Technical SEO | Alexandre_0