Has anyone started using schema.org?
-
On 3rd June 2011, Google announced that it is going to start using Schema.org.
Do you think this will change the way search engines find content? From briefly looking at Schema, I'm concerned that the proposed tags could just turn into another keyword meta tag and be abused.
Have you started using these tags yet, and have you noticed a difference?
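For readers who haven't looked at it yet, schema.org markup takes the form of microdata attributes added to existing HTML. A minimal sketch (the types and property names are real schema.org vocabulary; the product and values are invented for illustration):

```html
<!-- A page section describing a product, marked up with schema.org microdata -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <img itemprop="image" src="widget.jpg" alt="Example Widget">
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    Price: <span itemprop="price">19.99</span>
    <meta itemprop="priceCurrency" content="USD">
  </div>
</div>
```

Note that, unlike a keyword meta tag, the marked-up values are mostly visible page content, which should make wholesale abuse somewhat easier for the engines to detect.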
-
Here's a little tidbit. At SMX Advanced today, Stephan Weitz, Head of Search at Bing, confirmed two things I asked.
1. He was recently interviewed by Eric Enge, owner of Stone Temple Consulting. In that interview, Stephan said that Bing is moving from interpreting words in content as nouns to understanding words as they relate to actions - to understand website intent, and then match that more accurately to searcher intent.
After I read that interview, schema.org was announced - a way to help search engines better understand the intent of the content. Those two concepts clicked in my head, so I asked him: do schema.org and Bing's desire to understand website intent (as described in that interview) go hand in hand? Are they directly related? He said yes - schema.org is the key to it all.
2. He also confirmed that "schema.org is not a ranking factor now, but it will become a ranking factor." THAT is huge. Why? Because people will be required to implement it, since failing to do so will harm their rankings. Adopting a microdata solution is no longer a "it would be nice if you did" - it's going to be standard practice.
The search engines, in one shot, with schema.org, drew a new line in the sand. Either get with the program, or suffer. Not today, but it WILL happen.
My experience with other evolving changes in this industry over the 10 years I've been involved with SEO is that when a major shift occurs, it's usually about a year before it becomes critical to best-practice SEO.
When May Day happened, I saw the writing on the wall back then, and spent the next six to nine months helping various clients change their site architecture accordingly. None of those clients lost rankings when Panda happened. Several clients who came to me AFTER Panda, seriously harmed by it, had already seen a plunge as a result of May Day. May Day was, in many ways, the advance writing on the wall for Panda - at least in my experience.
So that's why I am now more confident than ever that people in this industry have a year to generally get on board. Those who can do it within six to nine months will have a competitive advantage.
-
Barry,
You're on to a good point - it's a brand-new "solution" to the age-old "search engines need help determining quality content" issue. So it's far from polished.
On the other hand, we ARE talking about gambling sites here, not necessarily the lion's share of important mainstream sites, depending on how you look at it.
And I appreciate you saying it could be about the pain it'd be to implement. I believe that's going to ensure a LOT of sites don't do it, which means, if I am correct, those that do will have a significant competitive advantage by the middle of next year.
-
I don't know. I very much like the idea, and coupling it with HTML5 elements like <header> and <footer>, but (although I've not read everything extensively) it doesn't seem to cover every kind of website (specifically the gambling ones I work on).
There is a Casino type, but that's for a physical location rather than a site. I could mark up the promos as an Offer, but that's not really what it's for. I could mark them up as Articles, but that's not a great fit either, and the same goes for Product.
I really only think it'll be useful for categorizing pages that may be shown on different SERPs. Otherwise it doesn't seem like a particularly good idea for it to be a big ranking factor when not all sites can use it.
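For what it's worth, a promo marked up as an Offer might look something like this (a sketch only; the promo details are invented, and whether the engines would treat a casino bonus as a meaningful Offer is exactly the open question):

```html
<!-- A gambling-site promo forced into the schema.org Offer type -->
<div itemscope itemtype="http://schema.org/Offer">
  <span itemprop="name">100% welcome bonus up to $200</span>
  <span itemprop="description">Deposit match on your first deposit.</span>
  <link itemprop="url" href="/promotions/welcome-bonus">
</div>
```

The awkwardness is visible even in the sketch: Offer's other properties (price, seller, itemOffered) don't map cleanly onto a bonus promotion.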
Certain properties like 'mainContentOfPage', 'mainImageOfPage' and 'significantLinks' may count (and are the sort of thing all sites can implement), but for every tag to carry a weight... I'm not sure.
Of course, I may just be trying to convince myself of this, as implementing these tags on dozens of sites is not going to be fun.
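Those site-agnostic WebPage properties are probably the easiest starting point. A rough sketch of what they look like in place (the link target and anchor text are invented):

```html
<!-- Generic WebPage properties any site can use, regardless of vertical -->
<body itemscope itemtype="http://schema.org/WebPage">
  <div itemprop="mainContentOfPage" itemscope itemtype="http://schema.org/WebPageElement">
    <!-- primary page copy goes here -->
  </div>
  <a itemprop="significantLink" href="/reviews/">Casino reviews</a>
</body>
```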
-
I believe Schema.org will be a critical aspect of SEO in 2012. It's going to take that long for the search engines to adapt their algorithms and for usage to become widespread. It will become widespread, in my opinion, for a couple of very important reasons.
All three of the big search engines got on board with this at the same time. That alone is a major signal of how important it is. People discounted sitemaps.org initially, and that was only supported by Google at first. This is all three at once.
Next - I personally have already directed my agency client development teams that schema.org implementation is required - not optional - and that they have 6 months to get with the program. No wiggle room. Several other seasoned industry vets I've spoken with are taking the same action.
Schema.org is going to give the search engines the help they desperately need in better understanding content and data intent as site owners mean it, and matching that against searcher intent. It's going to be a way for the engines to improve the quality of their indexed sites. And amazing as it may sound, it's actually going to help them combat spam even more than they do now, because it brings structured consistency to a site's content at a level they have not previously been able to achieve.
-
I'm still waiting to see how other sites are implementing it. I have an information site, and I just can't see it helping there.
However, I also have a real estate site, and it would make sense to add it there. I plan on doing it, but I have to do more reading to wrap my head around it first!
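For a real estate site, the closest fits in the current vocabulary are probably RealEstateAgent (a LocalBusiness subtype) for the agency and PostalAddress for locations. A rough sketch, with all the business details invented:

```html
<!-- Hypothetical agency markup using schema.org's RealEstateAgent type -->
<div itemscope itemtype="http://schema.org/RealEstateAgent">
  <span itemprop="name">Acme Realty</span>
  <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
    <span itemprop="streetAddress">123 Main St</span>,
    <span itemprop="addressLocality">Springfield</span>
  </div>
  <span itemprop="telephone">555-0123</span>
</div>
```

Individual listings are a weaker fit today; there's no dedicated listing type, so Offer plus Place is about as close as the vocabulary gets.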
-
None of my sites would benefit enough to justify the rewrite necessary to implement the suggestions; however, I will be looking at using it on some new sites.
You are correct that anything users can manipulate can be abused. However, using the tags properly across a whole site takes a reasonable effort - an extra step that's not really necessary for spammers - so while I don't think it'll be abused too much, I also don't think it will be given any ranking weight; instead it will simply be used to sort data.
One advantage may be that you can get into other verticals with your site (for example, if there's a way to show 'events' in the SERPs), but I'm not sure if or when that would be implemented or useful.
I would certainly assume it's too early to see any ranking changes for sites that have implemented it (and if they have, they're ninjas), but I would also be interested to see who is using it.
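If event rich results do materialize, the markup would presumably use the Event type, roughly like this (the event name, date, and venue are invented):

```html
<!-- Hypothetical event markup, in case the engines surface events in SERPs -->
<div itemscope itemtype="http://schema.org/Event">
  <span itemprop="name">SEO Meetup</span>
  <meta itemprop="startDate" content="2011-09-15T19:00">
  <span itemprop="location" itemscope itemtype="http://schema.org/Place">
    <span itemprop="name">Community Hall</span>
  </span>
</div>
```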