"Ghost" errors on blog structured data?
-
Hi,
I'm working on a blog, and its Search Console account is reporting a large number of errors in its structured data.
But when I run the pages through https://developers.google.com/structured-data/testing-tool/ it tells me everything is OK.
Any clue?
Thanks in advance,
-
Hi Everett,
Yes it seems that this is the way.
Thanks a lot.
-
Yes it is.
Well, it's a Magento site with a WordPress blog.
Thank you very much
-
Webicultors,
Read this thread on Google's Product Forums. Let us know if it answers your question. If not, at least you're not alone...
Upon reading several similar threads on various forums and Q&A sites, it appears this is a very common occurrence resulting from a disparity between what the two tools define as an "error". The testing tool appears to be limited to errors in syntax/markup, while Search Console may also count missing elements as errors.
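For illustration only, here's a hypothetical JSON-LD snippet of the kind a blog post might carry. It is syntactically valid, so the testing tool reports no markup errors, but it omits properties such as author, image, and datePublished — the sort of "missing field" Search Console can still count against you:

```html
<!-- Hypothetical example: valid syntax, so the testing tool passes it,
     but properties such as author, image and datePublished are missing,
     which Search Console may still report as errors or warnings. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "Example post title"
}
</script>
```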
-
Hi,
I've just added a couple more screenshots to show that I've already checked the information you're pointing me to.
But the testing tool keeps telling me everything is OK.
Thanks
-
Like Kristen mentions, you should be able to see an overview of the Schema.org implementations along with the number of errors each one has individually. That's probably why you're not seeing anything wrong in the one URL you were testing. From that list you should easily be able to identify the pages with issues.
Related Questions
-
Structured data and Google+ Local business page are conflicting
Hi, A few months ago (almost 8 now) we added structured data to our website, which according to the testing tool should work (our URL: https://www.rezdy.com). However, when searching for our company name, our old local business page from Google+ shows up. I have reached out to Google to tell them that we aren't a local business anymore and want the data from that page removed, but this all takes painfully long. I want my search result to be shown like those of the large businesses (examples: Adroll, Hubspot), including logo, Twitter feed, etc. Will this all work? If so, is there a way to speed up the process? Any suggestions?
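For illustration, a minimal Organization snippet of the sort Google documents for surfacing a company logo and social profiles alongside brand results. This is a hedged sketch: the logo path and profile URLs below are placeholders, not the site's real assets:

```html
<!-- Hypothetical sketch: Organization markup with a logo and social profiles.
     The logo path and sameAs URLs are placeholders to replace. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Rezdy",
  "url": "https://www.rezdy.com",
  "logo": "https://www.rezdy.com/logo.png",
  "sameAs": [
    "https://twitter.com/example",
    "https://www.facebook.com/example"
  ]
}
</script>
```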
Technical SEO | | Niek_Dekker1 -
New "Static" Site with 302s
Hey all, Came across a bit of an interesting challenge recently, one that I was hoping some of you might have had experience with! We're currently in the process of a website rebuild, for which I'm really excited. The new site is using Markdown to create an entirely static site. Load times are fantastic, and the code is clean. Life is good, apart from the 302s. One of the weird quirks I've realized with old-school, non-server-generated page content is that every page of the site is an index.html file in a directory. The result is that www.website.com/page-title will 302 to www.website.com/page-title/. My solution off the bat has been to just be super diligent, try to stay on top of the link profile, and send lots of helpful emails to the staff reminding them how to build links, but I know that even the best-laid plans often fail. Has anyone had a similar challenge with a static site and found a way to overcome it?
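If the static files happen to be served by Apache, one common approach is to issue the trailing-slash redirect yourself as an explicit 301 rather than leaving it to the default behaviour. A minimal sketch, assuming Apache with mod_rewrite (other servers and CDNs have their own equivalents):

```apache
# Minimal sketch, assuming Apache with mod_rewrite:
# if the request maps to a directory but has no trailing slash,
# append the slash with a permanent (301) redirect.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule ^(.+[^/])$ /$1/ [R=301,L]
```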
Technical SEO | | danny.wood1 -
To avoid errors in our Moz crawl, we removed subdomains from our host. (First we tried 301 redirects, also listed as errors.) Now we have backlinks all over the web that are broken. How bad is this, from a pagerank standpoint?
Our Moz crawl kept telling us we had duplicate page content even though our subdomains were redirected to our main site. (Pages from wineracks.vigilantinc.com were 301 redirected to vigilantinc.com/wineracks.) Now, to solve that problem, we have removed the wineracks.vigilantinc.com subdomain. The error report is better, but now we have broken backlinks - thousands of them. Is this hurting us worse than the duplicate content problem?
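For what it's worth, the usual way to preserve those backlinks is to keep the old subdomain resolving and 301-redirect every URL to its new location. A minimal sketch, assuming the old hostname still points at an Apache server; adjust the protocol and target path to match the live site:

```apache
# Minimal sketch, assuming Apache with mod_rewrite on the old hostname:
# send every request on the retired subdomain to the matching path on
# the main domain with a single permanent (301) redirect.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^wineracks\.vigilantinc\.com$ [NC]
RewriteRule ^(.*)$ https://vigilantinc.com/wineracks/$1 [R=301,L]
```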
Technical SEO | | KristyFord0 -
Rel="canonical" again
Hello everyone, I need to rel="canonical" the /en URLs of my two-language website to the original versions without /en. Can I do this from header.php? Should I add rel="canonical" to each /en page (e.g. /en/contatti, /en/pagina) separately, or can I do it all from the general <head> section, before the website title? Thanks if someone can help.
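You can do it once in header.php, but the href has to be computed per page rather than hard-coded. A minimal sketch, assuming a WordPress theme and that the English pages simply mirror the default-language URLs with a /en prefix; the stripping logic is a placeholder to adapt:

```php
<?php
// Minimal sketch for header.php, assuming WordPress and a plain /en prefix:
// build the canonical from the current request, stripping the leading /en
// so that /en/contatti canonicalises to /contatti.
$path      = $_SERVER['REQUEST_URI'];
$canonical = home_url( preg_replace( '#^/en(/|$)#', '/', $path ) );
?>
<link rel="canonical" href="<?php echo esc_url( $canonical ); ?>" />
```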
Technical SEO | | socialengaged0 -
Jigoshop "add to cart" producing 302 redirects
Hi, My site is throwing thousands of 302 redirect warnings on crawl for the add-to-cart process in my WordPress/Jigoshop online store. A sample URL the crawl references is: https://morrowsnuts.com/product/the-best-of-the-best-8-oz/?add-to-cart=6117&_n=9773652185 I have read several other threads here that are similar in nature but haven't discovered a way to eliminate this. I am a store owner with only partial technology skills, and I don't know what to try next. I posted the problem to Jigoshop, but I am not sure if they will provide a solution since this was the first time they had heard of it. The site is Morrow's Nut House, located at https://morrowsnuts.com. Thanks in advance for any direction or suggestions on next steps, John
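One common way to keep crawlers (and crawl-based tools) away from those add-to-cart URLs is to disallow the query parameter in robots.txt. A minimal sketch; wildcard support varies by crawler, so treat it as one option rather than a guaranteed fix:

```
# Minimal robots.txt sketch: block crawling of the add-to-cart URLs
# that generate the 302s. The major search-engine bots support wildcards,
# but confirm that your crawl tool honours them too.
User-agent: *
Disallow: /*?add-to-cart=
Disallow: /*&add-to-cart=
```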
Technical SEO | | MorrowsCandyMan0 -
Help with Webmaster Tools "Not Followed" Errors
I have been doing a bunch of 301 redirects on my site to address 404 pages, and in each case I check the redirect to make sure it works. I have also been using tools like Xenu to make sure that I'm not linking to 404 or 301 content from my site. However, on Friday I started getting "Not Followed" errors in GWT. When I check the URL that they tell me produced the error, it seems to redirect correctly. One example is this... http://www.mybinding.com/.sc/ms/dd/ee/48738/Astrobrights-Pulsar-Pink-10-x-13-65lb-Cover-50pk I tried a redirect tracer and it reports the redirect correctly. Fetch as Googlebot returns the correct page. Fetch as Bingbot in the new Bing Webmaster Tools shows that it redirects to the correct page, but there is a small note that says "Status: Redirection limit reached". I see this on all of the redirects that I check in the Bing webmaster portal. Do I have something misconfigured? Can anyone give me a hint on how to troubleshoot this type of issue? Thanks, Jeff
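One quick way to see whether a long redirect chain is behind the "redirection limit" notes is to trace every hop from the command line, for example with the URL from the question:

```bash
# Follow the chain hop by hop: -s silent, -I headers only, -L follow redirects.
# Each hop's HTTP status and Location header is printed in order, so a chain
# of several 301s (rather than a single hop) shows up immediately.
curl -sIL "http://www.mybinding.com/.sc/ms/dd/ee/48738/Astrobrights-Pulsar-Pink-10-x-13-65lb-Cover-50pk"
```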
Technical SEO | | mybinding10 -
Can name="author" register as a link?
Hi all, We're seeing a very strange result in Google Webmaster Tools. In "Links to your site", there is a site which we had nothing to do with (i.e. we didn't design or build it) showing over 1600 links to our site! I've checked the site several times now, and the only reference to us is in the rel="author" tag. Clearly the agency that did their design / SEO have nicked our meta, forgetting to delete or change the author tag!! There are literally no other references to us on this site, and there never have been (to our knowledge, at least), so I'm very puzzled as to why Google thinks there are 1600+ links pointing to us. The only thing I can think of is that Google will recognise name="author" content as a link... seems strange, though. Plus the content="" only contains our company name, not our URL. Can anybody shed any light on this for me? Thanks guys!
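For clarity, these are the two forms that often get conflated (the values below are placeholders): a plain meta tag carries no URL, so there is nothing to count as a link, whereas a rel="author" link element does point at a URL and can be picked up as one:

```html
<!-- Hypothetical illustration; values are placeholders. -->
<meta name="author" content="Company Name">         <!-- text only, no URL -->
<link rel="author" href="https://www.example.com/">  <!-- an actual link target -->
```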
Technical SEO | | RiceMedia0 -
Blog in subfolder or folder
SEO best practice says that one should put a blog in a subfolder, like www.example.com/blog. In the above case, should we say that the blog is in a folder or a subfolder? Actually, I have been unsure about this folder vs. subfolder distinction. Some examples of this would be appreciated. What would an example of a blog on a subdomain be? Thanks
Technical SEO | | seoug_20050