Structured data in Google Webmaster Tools
-
Hey,
During the past year I have done everything in my power to please Google with my website. Instead of building links to the page I have focused on content, content and content. In addition I have worked on HTTPS and page speed. Today my site is faster than 98% of all sites tested in Pingdom Tools and scores 94/83 in Google PageSpeed Insights.
Of course we have had to build some links as well, perhaps 50 links in 8 months. At the same time we have built 700 pages of text. The total number of links built is 180 over 20 months.
On Thursday last week it looked like the site was penalized by Google. I still believe that we can do something about it and get the site back on track. Hence we have been looking at technical aspects of the site to see if there is anything Google doesn't like.
One thing that I have found is structured data. For some reason the number of structured data items reported has dropped from 875 a month ago to 3 today. I have no clue why. Does anyone know how structured data works and what could have caused this problem? Could something we did in our attempt to optimize the site have affected the structured data? http://imgur.com/a/vurB1
In that case, what effect might this drop in structured data have on SEO? Could that be a reason for the total drop in rankings? (We have basically been wiped out on all our keywords.)
From what I can see in Google Webmaster Tools, about 975 pages are still indexed in Google, which has been stable for a long time.
Does anyone know more about structured data and what I can do about this?
Thanks in advance! /A
-
Hey Beau,
Thanks for your reply.
We never received anything from Google in Google Webmaster Tools. One day the site was gone from the SERPs with no explanation. That was on 11 August. Since then we have done a lot of things in Google Webmaster Tools to see if that would help. There were no major issues, but we have done everything we can and now there is nothing left to do. Yesterday, the site was back on track in Google. I can't say why, but I believe positive thinking (the ability to look for solutions rather than for somebody to blame) is one of the most important reasons why we are back.
We had not deliberately used anything to mark up our site with structured data before. The only thing we could find that might have helped us with structured data was an Author hReview plugin, which I find very good. However, my intention now is to mark up the entire site with structured data.
I read the recommended article about structured data (thanks, it was really interesting). I'm no expert in it, nor is my web developer. Hence I have been thinking of contacting a Schema.org expert. What do you think, would that be a good idea? The more I read about Schema.org, the more it triggers my interest and the more I realize that it could be good to get some help.

In Google Webmaster Tools we still have 4 errors regarding structured data. However, when we test the pages in Google's own tool, everything works. In the control panel we have added all four versions of the site (http, https, and the www variants of each). For some reason Google shows different results for indexed pages, structured data and sitemaps depending on which of the versions you're looking at. The structured data is now back and we have managed to increase it.
Thanks a lot for your help, Beau. Your questions and answers helped us to start looking more at structured data. It was a great relief yesterday when we were back in the search results again. I hope it will stay that way.
Do you think it's worth spending time and resources to contact a Schema.org consultant who can help us mark up the entire site with structured data?
Have a nice day!
Anders
-
Hello,
First, I have some questions and need-to-knows which could help you figure this out:
- You mentioned that "it looks like the site was penalized by Google". Was your site issued a manual/link/partial penalty, or is this based on metrics? Was there a message via Search Console/WMT? In the previous paragraph you discuss building links, but one shouldn't assume, even if it fails the smell test.
- Explain what type of structured data markup you were using for areas affected. Were you using review schema for areas where reviews/products aren't featured? Is there JSON-LD markup on your site AND does it reference the same content on page via HTML?
- What does your Structured Data Report section look like in GSC/WMT? Errors? Did you test in the Structured Data Testing Tool beforehand? What day did this happen? What did MozCast or SEO chatter via the web look like that day?
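On the JSON-LD point above, the markup and the visible HTML should describe the same thing. Here is a hypothetical review snippet to illustrate the pattern (the product, rating and names are invented; this is a sketch, not markup from the site in question):

```html
<!-- Hypothetical example: JSON-LD describing a review that is actually
     visible on the page. Markup describing content that is NOT on the
     page is what tends to trigger structured data penalties. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Review",
  "itemReviewed": { "@type": "Product", "name": "Example Widget" },
  "reviewRating": { "@type": "Rating", "ratingValue": "4", "bestRating": "5" },
  "author": { "@type": "Person", "name": "Example Reviewer" }
}
</script>
<!-- The same review should exist in the visible HTML: -->
<p>Example Reviewer rates the Example Widget 4 out of 5.</p>
```

Pasting a snippet like this into the Structured Data Testing Tool shows how Google parses it before it goes live.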
Here's an older article (2015) via Search Engine Land by Tony Edward covering structured data markup penalty recovery. Ask yourself the above questions (or respond), read the article, and also ask yourself:
- Were my intentions honest? Was my recipe markup for a recipe? Did I buy links via Fiverr? Is this drop in rankings and traffic even related to structured data markup or backlinks?
- If I was doing everything with the best of intentions, is this the work of a negative SEO campaign? What does your recent link profile look like?
Write back - let's figure this out!
Beau
Related Questions
-
Google Webmaster Tools -> Sitemap sudden "indexed" drop
Hello Moz, We had a massive SEO drop in June due to unknown reasons and we have been trying to recover since then. I've just noticed this yesterday and I'm worried. See: http://imgur.com/xv2QgCQ Could anyone help by explaining what would cause this sudden drop and what this drop translates to exactly? What is strange is that our index status is still strong at 310 pages, no drop there: http://imgur.com/a1sRAKo And when I do a search on Google for site:globecar.com everything seems normal, see: http://imgur.com/O7vPkqu Thanks,
Intermediate & Advanced SEO | | GlobeCar0 -
Prevent Google from crawling Ajax
With Google figuring out how to make Ajax and JS more searchable/indexable, I am curious about thoughts or techniques to prevent this. Here's my situation: we have a page that we do not ever want to be indexed or crawled. Currently we have the nofollow/noindex directive, but due to technical changes on our site, the method by which this is implemented will no longer be able to block the content from search if it is ever displayed. It is also the decision of the business not to list the file in robots.txt due to the sensitivity of the content. Basically, this content doesn't exist unless something super important happens, and even if something super important happens, we do not want Google to know of its existence. Since the dev team is planning on using Ajax/JS to pull in this content if the business turns it on, the concern is that it will be on the homepage and Google could index it. So the questions that I was asked: if Google can/does index it, how long would that piece of content potentially appear in the SERPs? Can we block Google from caring about and indexing this section of content on the homepage? Sorry for the vagueness of this question; it's very sensitive in nature and I am trying to avoid too many specifics. I am able to discuss this in a more private way if necessary. Thanks!
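For a situation like the one described, one commonly used approach is to send the noindex directive as an HTTP header on the Ajax response itself, so the URL never has to appear in robots.txt or in the page markup. A minimal Apache sketch, assuming mod_headers is enabled and the sensitive content is served from a standalone file (the filename is invented for illustration):

```apache
# Hypothetical: the Ajax endpoint serving the sensitive content.
# X-Robots-Tag keeps the response out of the index without
# advertising the URL in robots.txt.
<Files "alert-content.html">
  Header set X-Robots-Tag "noindex, nofollow, noarchive"
</Files>
```

Unlike a robots.txt entry, this does not reveal the URL's existence, and unlike an on-page meta robots tag it also works for non-HTML responses such as JSON fragments.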
Intermediate & Advanced SEO | | Shawn_Huber0 -
Google and PDF indexing
It was recently brought to my attention that one of the PDFs on our site wasn't showing up when looking for a particular phrase within the document. The user was trying to search only within our site. Once I removed the site restriction - I noticed that there was another site using the exact same PDF. It appears Google is indexing that PDF but not ours. The name, title, and content are the same. Is there any way to get around this? I find it interesting as we use GSA and within GSA it shows up for the phrase. I have to imagine Google is saying that it already has the PDF and therefore is ignoring our PDF. Any tricks to get around this? BTW - both sites rightfully should have the PDF. One is a client site and they are allowed to host the PDFs created for them. However, I'd like Mathematica to also be listed. Query: no site restriction (notice: Teach for america comes up #1 and Mathematica is not listed). https://www.google.com/search?as_q=&as_epq=HSAC_final_rpt_9_2013.pdf&as_oq=&as_eq=&as_nlo=&as_nhi=&lr=&cr=&as_qdr=all&as_sitesearch=&as_occt=any&safe=images&tbs=&as_filetype=pdf&as_rights=&gws_rd=ssl#q=HSAC_final_rpt_9_2013.pdf+"Teach+charlotte"+filetype:pdf&as_qdr=all&filter=0 Query: site restriction (notice that it doesn't find the phrase and redirects to any of the words) https://www.google.com/search?as_q=&as_epq=HSAC_final_rpt_9_2013.pdf&as_oq=&as_eq=&as_nlo=&as_nhi=&lr=&cr=&as_qdr=all&as_sitesearch=&as_occt=any&safe=images&tbs=&as_filetype=pdf&as_rights=&gws_rd=ssl#as_qdr=all&q="Teach+charlotte"+site:www.mathematica-mpr.com+filetype:pdf
Intermediate & Advanced SEO | | jpfleiderer0 -
301 Redirect and Webmaster Central
I've been working on removing canonical issues. My host is Apache. Is this the correct code for my htaccess?
RewriteEngine On
RewriteCond %{HTTP_HOST} ^luckygemstones.com$ [NC]
RewriteRule ^(.*)$ http://www.luckygemstones.com/$1 [R=301,L]
SECOND!!! I have two websites under Google's Webmaster Central: http://luckygemstones.com, which gets NO soft 404 errors, AND http://www.luckygemstones.com, which has 247 soft 404 errors. I think I should DELETE the http://luckygemstones.com site from Webmaster Central; the 301 redirect handles the "www" thing. Is this correct? I hate to hose things (even worse?) Help! Kathleen
Intermediate & Advanced SEO | | spkcp111
-
Google Trends - API? Anyone Using This Data?
I'm looking for a way to track change in search volume for dozens or hundreds of keywords but Google Trends doesn't have an API as far as I can tell. Does anyone know of a good tool or method of extracting large amounts of data from Google Trends?
Intermediate & Advanced SEO | | wattssw1 -
Is Google Webmaster tools Accurate?
Is Google Webmaster Tools data completely inaccurate, or am I just missing something? I noticed a recent surge in 404 errors detected 3 days ago (3/6/11) from pages that have not existed since November 2011. They are links to tag and author archives from pages initially indexed in August 2011. We switched to a new site in December 2011 and created 301 redirects from categories that no longer exist to new categories. I am a little perplexed, since the Google sitemap test shows no 404 errors, and neither does the SEOmoz crawl test, yet under GWT site diagnostics these errors, all 125 of them, just showed up. Any thoughts/insights? We've worked hard to ensure a smooth site migration and now we are concerned. -Jason
Intermediate & Advanced SEO | | jimmyjohnson0 -
How do I get rid of all the 404 errors in Google Webmaster Tools after building a new website under the same domain
I recently launched my new website under the same domain as the old one. I did all the important 301 redirects, but it seems like every URL that was in Google's index is still there, but now with a 404 error code. How can I get rid of this problem? For example, if you Google my company name 'romancing diamonds', half the links under the name are 404 errors. Look at my Webmaster Tools and you'll see the same thing. Is there any way to remove all those previous URLs from Google's index and start anew? Shawn
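A common way to handle leftover indexed URLs after a relaunch is to keep 301s for URLs that have a direct replacement and return 410 Gone for URLs that are retired outright, which tends to get them dropped from the index faster than a plain 404. A hypothetical htaccess sketch (the paths and domain are invented; substitute the URLs actually reported in Webmaster Tools):

```apache
# Hypothetical paths for illustration only.
# 301 old URLs that have a direct replacement on the new site:
RedirectMatch 301 ^/old-catalog/(.*)$ http://www.example.com/catalog/$1
# 410 Gone for retired pages with no equivalent, so Google
# removes them from the index faster than it would a 404:
RedirectMatch 410 ^/discontinued/.*$
```

Webmaster Tools will keep reporting the old URLs for a while either way; the errors clear as Google recrawls and sees the 301/410 responses.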
Intermediate & Advanced SEO | | Romancing0 -
Have we suffered a Google penalty?
Hello, In January we started a new blog to supplement our core ecommerce website. The URL of the website is www.footballshirtblog.co.uk and the idea behind it was that we would write articles related to our industry to build a community, which would ultimately boost our sales. We would add several posts per day, a mix between shorter news stories of around 150 words and more detailed content pages of around 500 words. Everything was going well; we were making slow but sure progress on the main generic keywords and were receiving several thousand visitors a day, mostly finding the posts themselves on Google. The surge in traffic meant we needed to move server, which we did around 6 weeks ago. When we did this, we had a few teething problems with file permissions, etc., which meant we were temporarily unable to add new posts. As our developers were tied up with other issues, this continued for a 7-10 day period, with no new content being added. In this period, the site completely dropped from Google, losing all its rankings and traffic, to the extent it now doesn't even rank for its own name. This is very frustrating as we have put a huge amount of work and content into developing this site. We have added a few posts since, but not a huge amount, as it is frustrating to do it with no return and with the concern that the site has been banned forever. I cannot think of any logical reason why this penalty has occurred as we haven't been link spamming, etc. Does anyone have any feedback or suggestions as to how we can get back on track? Regards, David
Intermediate & Advanced SEO | | ukss1984