Received a Notice Regarding Spammy Structured Data. But We Don't Have Any Structured Data... Or Do We?
-
We got a message via Webmaster Tools saying that we have spammy structured data on our site, and we have no idea what they are referring to. We do not use any structured data with schema.org markup. Could they be referring to something else?
The message was:
To: Webmaster of http://www.lulus.com/,
Google has detected structured markup on some of your pages that violates our structured data quality guidelines. In order to ensure quality search results for users, we display rich search results only for content that uses markup that conforms to our quality guidelines. This manual action has been applied to lulus.com/. We suggest that you fix your markup and file a reconsideration request. Once we determine that the markup on the pages is compliant with our guidelines, we will remove this manual action.
What could we be showing them that would be interpreted as structured data, and/or spammy structured data?
-
It's highly unlikely you'd get a manual penalty for incorrect Open Graph markup (especially since Google itself doesn't use it for anything).
Instead of trying to test one-off pages with the data testing tool, have a look in your Google Search Console under the Search Appearance > Structured Data report. There you'll see what Google's crawler thinks about the structured markup on every page of your site it is able to crawl. There's a much better chance the crawler has caught and reported the problem than that you'll find it checking one page at a time.
One of the most common types of markup earning manual penalties recently was recipe markup (due to certain plugins not implementing it correctly). Since your site doesn't include recipes, the other area to check closely is reviews/ratings. If Google thinks you're trying to use these manipulatively, they'll slap you hard, since these actually generate rich snippets in SERPs.
In the brief look I had at your site, it didn't appear your reviews/ratings were using markup, but that's where an exhaustive check using the GSC report would be vastly more effective than my cursory one.
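For reference, review/rating markup (when present) usually looks something like the JSON-LD sketch below; the product name and values here are entirely hypothetical:

<!-- Hypothetical sketch of review/rating markup; name and values are made up for illustration. -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Product",
  "name": "Example Dress",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "reviewCount": "27"
  }
}
</script>

Markup like this only earns legitimate rich snippets when it describes ratings actually visible on the page; marking up hidden or unrelated content is exactly the sort of thing that gets flagged as spammy.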
Hope that all makes sense? Good luck!
Paul
-
Thank you for your insight; the data testing tool is very helpful.
https://search.google.com/structured-data/testing-tool/u/0/
- Kent
-
Hello,
I think whatever Open Graph plugin you are using on your pages might be causing the issue. Take a look at: https://search.google.com/structured-data/testing-tool/u/0/#url=http%3A%2F%2Fwww.lulus.com%2Fcategories%2F13%2Fdresses.html
On a semi-related note, your og:image tag is missing 'og:image:width' and 'og:image:height'.
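For example, a complete set of Open Graph image tags looks like the following (the image URL and dimensions are illustrative placeholders, not your real values):

<!-- Open Graph image tags with explicit dimensions; values are placeholders. -->
<meta property="og:image" content="http://www.example.com/images/dress.jpg" />
<meta property="og:image:width" content="1200" />
<meta property="og:image:height" content="630" />

Declaring the dimensions up front lets scrapers like Facebook's render the image on the very first share instead of having to fetch and measure it first.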
-
Related Questions
-
Changing site URL structure
Hey everybody, I'm looking for a bit of advice. A few weeks ago Google sent me an email saying all pages with any text input on them need to switch to HTTPS. This is no problem; I was slowly switching the site to HTTPS anyway, using 301 redirects. However, my site also has a language subfolder in the URL (mysite.com/en/, mysite.com/ru/, etc.). Due to poor work on my part, the translations of the site haven't been updated in a long time, and lots of the pages are in English even on the Russian version. So I'm thinking of removing this URL structure and just having mysite.com. My plan is to 301 all requests to HTTPS and remove the language subfolder in the URL at the same time. So far the HTTPS switch hasn't changed my rankings. Am I more at risk of losing my rankings by doing this? Thanks!
Technical SEO | Ruhol
-
GoDaddy and Soft 404s
Hello, we've found that a website we manage has a list of not-found URLs in Google Webmaster Tools which are "soft 404s" according to Google. I went to the hosting company, GoDaddy, to explain and to see what they could do. As far as I can see, GoDaddy's servers are responding with a 200 HTTP status code, meaning that the page exists and was served properly. They have sort of disowned this as their problem; their server is not serving up a true 404 response. This is a WordPress site. 1) Has anyone seen this problem before with GoDaddy? Is it a GoDaddy problem? 2) Do you know a way to sort this issue? When I use the command site:mydomain.co.uk, the number of URLs indexed is about right except for 2 or 3 "soft URLs". So I wonder why Webmaster Tools reports so many, yet I can't see them all in the index?
Technical SEO | AL123al
-
Structured markup for WordPress
Hello, I am having problems with marking up my WP posts. I used the All-in-One plugin, which seems to be the most user-friendly, except that when I denote the aspects of the "article" and update, the markup shows up as a box at the bottom of the post (even though the info is in the text). How do I mark these up for Google without the unseemly box at the bottom? Thanks so much in advance for any help! By the way, I am not altogether comfortable just yet with manual schema markup (if you have a really basic manual markup that works across various platforms, I would also appreciate the recommendation). Thanks!
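For reference, a really basic manual markup for an article, sketched as JSON-LD with placeholder values, would look something like this:

<!-- Minimal manual Article markup sketch (JSON-LD, placeholder values).
     JSON-LD sits in the page source and renders nothing visible,
     so it sidesteps the box-at-the-bottom problem entirely. -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Article",
  "headline": "Your Post Title",
  "author": { "@type": "Person", "name": "Author Name" },
  "datePublished": "2016-01-01"
}
</script>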
Technical SEO | lfrazer
-
Why isn't my site searchable on Google?
I am having a hard time figuring out why, when I search for my website name, it doesn't show up in Google's search results. Here's a link to my site. I've been twiddling for days looking for answers in my Google Webmaster Tools. Here's a link to the crawl stats from Google Webmaster Tools. As you can see, it is actually crawling some pages. However, looking at my index status, I am getting 0, as you can see here (http://cl.ly/image/3G1R1p0b3k1P). I've double-checked my robots.txt and nothing seemed to be out of the ordinary there. I am not blocking anything. Any ideas why?
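One classic culprit worth checking alongside robots.txt is a stray noindex meta tag that a theme or plugin may have inserted into the page head, for example:

<!-- A tag like this in the <head> lets Google crawl the page but blocks it from the index. -->
<meta name="robots" content="noindex, nofollow" />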
Technical SEO | herlamba
-
I'm receiving this message...
It's saying Roger can't communicate with my site. I've contacted iPage, which is the host, and they say it's on your end. Please let me know if you need any more info from me. Thanks, Tom 404-447-2868
Technical SEO | NextlevelMD
-
Help With Analytics Data
Hello, I'm seeing the following Analytics data for some of my keywords:
Multiple visits
Pages/Visit: 1
Avg. Visit Duration: 00:00
% New Visits: 100%
Bounce Rate: 100%
The data is the same on all affected keywords. What is going on, and how do I fix it? Thanks for the help!
Technical SEO | AWCthreads
-
How Best to Handle 'Site Jacking' (Unauthorized Use of Someone Else's Dedicated IP Address)
Anyone can point their domain to any IP address they want. I've found at least two domains (same owner), totally unrelated to each other and to us, that are currently pointing to our IP address. The IP address is on our dedicated server (we control the entire physical server) and is exclusive to that one domain, so it isn't a virtual-hosting misconfiguration issue. This has caused Google to index their two domains with duplicate content from our site (found by searching for site:www.theirdomain.com). Their site does not come up in the first 50 results for any of the keywords we rank for, though, so Google obviously knows THEY are the dupe content, not us (our site has been around for 12 years, much longer than theirs). Their registration is private and we have not been able to contact these people. I'm not sure if this is just a DNS mistake on the two domains or if someone is doing this intentionally to try to harm our ranking. It has been going on for a while, so it is most likely not a mistake; with two live sites, they would have noticed long ago that they were pointing to the wrong IP. I can think of a variety of actions to take, but I can find no information anywhere on what Google officially recommends doing in this situation, assuming you can't get a response. Here are my ideas. a) Approach it as a digital copyright violation and go through the lengthy process of having their site taken down. Pro: eliminates the issue. Con: it's sort of a pain, and we could be leaving some link juice on the table. b) Modify .htaccess to 301 redirect any request not using our domain to our domain. This means Google would see several domains all pointing to the same IP, with every domain except ours 301 redirecting to ours. Not sure if THAT would harm (or help) us. Would we then receive link juice from any site out there linking to these other domains? Con: Google will see the context of those backlinks, and their link text will not be related at all to our site. In addition, if any of these other domains pointing to our IP have backlinks from 'bad neighborhoods', I assume it could hurt us. c) Modify .htaccess to return a 404 Not Found or 403 Forbidden error? I posted in other forums and have gotten suggestions that are all over the map. In many cases the posters don't even understand what I'm talking about and think these are just normal backlinks. Argh! So I'm taking this to "The Experts" on SEOmoz.
Technical SEO | jcrist
-
Issue with 'Crawl Errors' in Webmaster Tools
Have an issue with a large number of 'Not Found' webpages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st-15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10. Furthermore, I have checked a large number of the source pages to double-check that the links don't still exist, and as I expected, they don't. Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not. Secondly, if the errors do not actually exist (and never have), why do they remain listed in Webmaster Tools, which claims they were found again this month?! Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that using the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors. The word is that if Google keeps getting 404 responses, the URLs will eventually be removed automatically. Well, I don't know how many times Google needs to get that 404 in order to drop a URL and link that haven't existed for 18-24 months?!! Thanks.
Technical SEO | RiceMedia