Structured data in Google Webmaster Tools
-
Hey,
During the past year I have done everything in my power to please Google with my website. Instead of building links towards the page I have focused on content, content and content. In addition I have worked on HTTPS and page speed. Today my site is faster than 98% of all tested sites in Pingdom Tools and scores 94/83 in Google PageSpeed Insights.
Of course we have had to build some links as well, perhaps 50 links in 8 months. At the same time we have built 700 pages of text. The total number of links built is 180 over 20 months.
On Thursday last week it looked like the site was penalized by Google. I still believe that we can do something about it and get the site back on track again. Hence we have been looking at technical aspects of the site, in case there is anything Google doesn't like.
One thing that I have found is structured data. For some reason the count has dropped from 875 a month ago to 3 today, and I have no clue why. Does anyone know how structured data works and what could have caused this problem? Is it possible that, in our attempts to optimize the site, we did something that affected the structured data? http://imgur.com/a/vurB1
In that case, what effect might this drop in structured data have on SEO? Could it be the reason for the total drop in rankings? (We have basically been wiped on all our keywords.)
From what I can see in Google Webmaster Tools, about 975 pages are still indexed in Google, and that number has been stable for a long time.
Does anyone know more about structured data and what I can do about this?
Thanks in advance! /A
-
Hey Beau,
Thanks for your reply.
We never received anything from Google in Google Webmaster Tools. One day the site was gone from the SERPs with no explanation. That was on 11 August. Since then we have done a lot of things in Google Webmaster Tools to see if that would help. There were no major issues, but we have done everything we can, and now there is nothing left to do. Yesterday, the site was back on track in Google. I can't say why, but positive thinking (the ability to look for solutions rather than someone to blame) is, I believe, one of the most important reasons why we are back.
We have never deliberately used anything to mark up our site with structured data before. The only thing we could find that might have helped us with structured data was a plugin, Author H Review, which I find very good. However, my intention is now to mark up the entire site with structured data.
I read the recommended article about structured data (thanks, it was really interesting). I'm no expert in it, nor is my web developer. Hence I have been thinking of contacting a Schema.org expert. What do you think, would that be a good idea? The more I read about Schema.org, the more it triggers my interest, and the more I realize that it could be good to get some help.

In Google Webmaster Tools we still have 4 errors regarding structured data. However, when we test the pages in Google's own testing tool, everything works. In the control panel we have added all four site versions (http, https, with and without www). For some reason Google shows different results for indexed pages, structured data and sitemaps depending on which of these versions you're looking at. The structured data count is now back, and we have managed to increase it.
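As an aside: when the Structured Data report shows errors but the testing tool passes, one quick sanity check is to confirm that a page's JSON-LD blocks at least parse as valid JSON. Below is a minimal sketch using only the Python standard library; the sample HTML and its values are hypothetical placeholders, not taken from the site discussed above.

```python
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    """Collects the contents of <script type="application/ld+json"> tags."""

    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs with lowercased names
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self._in_jsonld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_jsonld = False

    def handle_data(self, data):
        if self._in_jsonld and data.strip():
            self.blocks.append(data)

# Hypothetical page markup for illustration only
sample_html = """
<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Article", "headline": "Example"}
</script>
</head><body></body></html>
"""

parser = JsonLdExtractor()
parser.feed(sample_html)
for raw in parser.blocks:
    data = json.loads(raw)  # raises ValueError if the block is malformed JSON
    print(data.get("@type"))  # prints "Article"
```

A parse failure here would explain a block silently dropping out of Google's count, though the reverse doesn't hold: valid JSON can still violate Schema.org vocabulary rules, which is what the testing tool checks.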
Thanks a lot for your help, Beau. Your questions and answers helped us start looking more closely at structured data. It was a great relief yesterday when we were back in the search results again. I hope it will stay that way.
Do you think it's worth spending time and resources on a Schema.org consultant who can help us mark up the entire site with structured data?
Have a nice day!
Anders
-
Hello,
First, I have some questions & unanswered needs-to-know which could help you figure this out:
- You mentioned that "it looks like the site was penalized by Google". Was your site issued a manual/link/partial penalty, or is this based on metrics? Was there a message via Search Console/WMT? In the previous paragraph you discuss building links, but one shouldn't assume, even if it fails the smell test.
- Explain what type of structured data markup you were using in the affected areas. Were you using review schema in areas where reviews/products aren't featured? Is there JSON-LD markup on your site, AND does it reference the same content that appears on the page in the HTML?
- What does the Structured Data report look like in GSC/WMT? Errors? Did you test in the Structured Data Testing Tool beforehand? What day did this happen? What did MozCast or SEO chatter around the web look like that day?
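To make the JSON-LD question above concrete, here is a minimal sketch of what an honest schema.org Review object looks like when serialized as JSON-LD. Every name and value is a placeholder for illustration; the key point is that the reviewed item, author, and rating in the markup must match content that actually appears on the page.

```python
import json

# Hypothetical Review object; all values are placeholders.
review = {
    "@context": "https://schema.org",
    "@type": "Review",
    "itemReviewed": {"@type": "Product", "name": "Example Product"},
    "author": {"@type": "Person", "name": "Example Author"},
    "reviewRating": {
        "@type": "Rating",
        "ratingValue": "4",
        "bestRating": "5",
    },
}

# This string would be embedded on the page inside a
# <script type="application/ld+json"> tag.
jsonld = json.dumps(review, indent=2)
print(jsonld)
```

Review markup like this on a page with no visible review is exactly the kind of mismatch that can draw a spammy-structured-markup penalty.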
Here's an older article (2015) from Search Engine Land by Tony Edward covering structured data markup penalty recovery. Ask yourself the above questions (or respond), read the article, and also ask yourself:
- Were my intentions honest? Was my recipe markup for a recipe? Did I buy links via Fiverr? Is this drop in rankings and traffic even related to structured data markup or backlinks?
- If I was doing everything with the best of intentions, is this the work of a negative SEO campaign? What does my recent link profile look like?
Write back - let's figure this out!
Beau