Google Structured Data Problem
-
Hello everyone,
About one to two weeks ago, I implemented rich snippets (microdata) on the product pages of my e-commerce site. However, in Webmaster Tools, Google is saying that its crawlers did not detect any structured data on my site.
I have also checked my pages using the Structured Data Testing Tool. You can see an example test result at the following address.
What could be causing this problem?
Thank you for your help.
-
One thing I have suspected, but not confirmed, is that Google has multiple crawlers, and the one that extracts structured data seems to take a lot longer than the regular one. I have about 20 clients for whom I have implemented structured product and offer data, and it usually takes a month or so before it starts showing up.
-
It seems that your structured data is well implemented. I would give it a break; hopefully, within two more weeks, the Google data will start to show up. It actually takes some time for Google to re-crawl your site.
Suggestion: Use G+ publisher markup.
Workaround: Use Google's Data Highlighter; it can get your markup recognized and indexed faster.
Hope that helps!
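For anyone comparing their own markup against what the testing tool expects, a minimal schema.org Product block in microdata looks roughly like this (the product name, image, currency, and price are placeholders, not taken from the site in question):

```html
<!-- Minimal schema.org Product markup in microdata; all values are placeholders -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Product</span>
  <img itemprop="image" src="product.jpg" alt="Example Product" />
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="priceCurrency" content="USD">$</span>
    <span itemprop="price" content="19.99">19.99</span>
    <link itemprop="availability" href="http://schema.org/InStock" />In stock
  </div>
</div>
```

If the Structured Data Testing Tool extracts the Product and nested Offer from markup like this, the implementation side is fine and the remaining wait is on Google's crawl/refresh cycle.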
Related Questions
-
Having Problems Indexing All URLs in the Sitemap
Hi all again! Thanks in advance! My client's site is having problems getting all of its pages indexed. I even bought the full extension of XML Sitemaps and the number of URLs increased, but we still have problems getting all of them indexed. What could the reasons be? The robots.txt is open to all robots; we only prohibit users and spiders from entering our intranet. I've read that duplicate content and 404s can be the reason. Anything else?
Technical SEO | Tintanus
-
Google bot notification
Hi there! I've just made some changes to my website in order to optimize it, but I don't know if there's a way to notify Googlebot that some aspects of the configuration (metas) have changed and must be taken into account. The spider visited my site two days ago and obviously processed the sitemap file. I've heard that it's possible to ping certain websites. Is this the way to proceed? I should say that there are not many updates on the site (just one-way information), as the social media activity is still low. Thanks in advance.
Technical SEO | juanmiguelcr
-
Google having trouble accessing my site
Hi, Google is having problems accessing my site. Each day it brings up access denied errors, and when I checked what this means I found the following:
Access denied errors: In general, Google discovers content by following links from one page to another. To crawl a page, Googlebot must be able to access it. If you're seeing unexpected Access Denied errors, it may be for the following reasons:
Googlebot couldn't access a URL on your site because your site requires users to log in to view all or some of your content. (Tip: you can get around this by removing this requirement for the user-agent Googlebot.)
Your robots.txt file is blocking Google from accessing your whole site or individual URLs or directories. Test that your robots.txt is working as expected; the Test robots.txt tool lets you see exactly how Googlebot will interpret the contents of your robots.txt file. The Google user-agent is Googlebot. (How to verify that a user-agent really is Googlebot.)
The Fetch as Google tool helps you understand exactly how your site appears to Googlebot. This can be very useful when troubleshooting problems with your site's content or discoverability in search results.
Your server requires users to authenticate using a proxy, or your hosting provider may be blocking Google from accessing your site.
Now, I have contacted my hosting company, who said there is not a problem, but they told me to read the following page: http://www.tmdhosting.com/kb/technical-questions/other/robots-txt-file-to-improve-the-way-search-bots-crawl/
I have read it, and as far as I can see my file is set up right; it is listed below. They said if I still have problems then I need to contact Google. Can anyone please give me advice on what to do? The errors are response code 403.
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
Disallow: /xmlrpc/
Technical SEO | ClaireH-184886
-
How do I get out of a Google bomb?
Hi all, I have a website named bijouxroom.com; I was on the 7th page for the search term takı in Google, and on the 2nd page for online takı. Now, in one day, my results seem to have dropped to the 13th and 10th pages in Google respectively. I built too much anchor text for takı and online takı. What should I do to regain my positions? Thanks in advance. Regards,
Technical SEO | ozererim
-
Multilingual blogs and site structure
Hi everyone, I have a question about multilingual blogs and site structure. Right now, we have the typical subfolder localization structure, e.g. domain.com/page (English site) and domain.com/ja/page (Japanese site). However, the blog is slightly more complicated. We'd like to have English posts available in other languages (as many of our users are bilingual). The current structure suggests we use a typical domain.com/blog or domain.com/ja/blog format, but we have issues if a Japanese (logged-in) user wants to view an English page: domain.com/blog/article would redirect them to domain.com/ja/blog/article, thus 404-ing the user if the post doesn't exist in the alternate language. One suggestion (that I have seen on sites such as Etsy/Spotify) is to add an /en/ to the blog area, e.g. domain.com/en/blog and domain.com/ja/blog. Would this be the correct way to avoid this issue? I know we could technically work around the 404 issue, but I don't want to create duplicate posts in /ja/ that are in English or vice versa. Would it affect the rest of the site if we use an /en/ subfolder just for the blog? Another option is to use domain.com/blog/en and domain.com/blog/ja, but I'm not sure if this alternative is better. Any help would be appreciated!
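Whichever URL structure is chosen, one common way to make the language variants explicit to Google is hreflang annotations in each page's head. A minimal sketch, using the placeholder domain and paths from the question:

```html
<!-- In the <head> of each language version of a post; both versions
     list the same set of alternates. domain.com and the paths are
     placeholders from the question. -->
<link rel="alternate" hreflang="en" href="http://domain.com/en/blog/article" />
<link rel="alternate" hreflang="ja" href="http://domain.com/ja/blog/article" />
<!-- Optional fallback for users whose language has no matching version -->
<link rel="alternate" hreflang="x-default" href="http://domain.com/en/blog/article" />
```

This tells Google which URLs are translations of each other, independent of where the /en/ subfolder sits.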
Technical SEO | Seiyav
-
Google Links
I am assuming that the list presented by Google Webmaster Tools (TRAFFIC | Links To Your Site) is the one that will actually be used by Google for indexing? There seem to be quite a few links there that should not be there, i.e. assumed NOFOLLOW links. Am I working under an incorrect assumption that all links in Webmaster Tools are actually followed?
Technical SEO | blinkybill
-
Why isn't Google pushing my Schema data to the search results page
I believe we have it set up right. I'm noticing that all my competitors' schema data is showing up, which is really giving them a leg up on us. We have a high-ranking website, so I'm just not sure why ours is not showing up. Here is an example URL: http://www.airgundepot.com/3576w.html. I've used the Google Webmaster Tools tester and it all looks fine. Any ideas? Thanks in advance.
Technical SEO | AirgunDepot
-
Can URL re writes fix the problem of critical content too deep in a sites structure?
Good morning from Wetherby, UK 🙂 OK, imagine this scenario. You ask the developers to design a site where "offices to let" is on level two of the site's hierarchy, so the URL would look like this: http://www.sandersonweatherall.co.uk/office-to-let. But yikes, when it goes live it ends up like this: http://www.sandersonweatherall.co.uk...s/residential/office-to-let. Is a URL rewrite a fix for this? Or is the only fix relocating the offices-to-let content further up the site structure? Any insights welcome 🙂
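If the content has to stay where it is in the CMS, a server-side rewrite can at least expose the short URL. A minimal sketch for an Apache .htaccess, assuming mod_rewrite is available; the deep path below is purely a hypothetical stand-in for the truncated URL in the question:

```apache
# Hypothetical sketch: serve the deep CMS page at the short, level-two URL.
# /offices/residential/office-to-let is a placeholder, not the site's real path.
RewriteEngine On
# Internally serve the short URL from the deep page (the visitor sees the short URL):
RewriteRule ^office-to-let/?$ /offices/residential/office-to-let [PT,L]
```

Note that the deep URL would then need a 301 redirect or a canonical tag pointing at the short version, otherwise the same content lives at two addresses.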
Technical SEO | Nightwing