We're currently not using schema markup on our website. How important is it? And are websites across the globe actually using it?
-
Schema markup looks like an important way to structure your website and make sure the crawl bots get all the details. I've been reading a lot of articles around the web, and most of them say that schemas are important, yet very few websites are using them. Why is that? Are the schemas on schema.org here to stay, or am I wasting my time?
-
Hi Pawan,
You're welcome! Yes, I believe you are correct in saying that the data highlighter really only translates to Google right now. That said, it seems Bing and Yahoo! are doing very little with structured data at the moment anyway. I think your timeline for adding the markup depends on your industry. If you are in the restaurant, food, or travel industry, I think you really have to start now just to stay competitive. If you're in a niche, maybe it's not so crucial. One thing's for sure: what's true about structured data now will probably be different in six months, so whatever you do now will need to be reviewed over time, just like almost anything else related to SEO. There's always something new and always something changing. That's why we love it, right?
Dana
-
Thanks for your input, Dana. Since you say that schema.org markup adoption is still sporadic, would it be better if I waited a couple of months before making the changes? Or is now the right time?
And the microdata highlighter you're talking about will only help Googlebot, not the crawlers from Bing and Yahoo, right? So wouldn't it be better if I used the schema.org markup itself?
-
I totally agree with Lesley. You asked why so few sites might be using them. I think it's a question of knowledge and implementation. Unless you are extremely comfortable with HTML and XML, schema.org markup can be very intimidating. It also doesn't help that Google is choosing to display only certain elements of structured data right now, and even then it's sporadic. In fact, Google recently went from displaying a lot of authorship information to displaying less. This is all still in an experimental stage. That being said, will it go away? In other words, is it just a search fad?
My answer is no. Structured data (also referred to as "schema," "microdata," "rich snippets," and "microformats") will only become more and more important until search engine bots get better at understanding the different elements of a web page on their own. For example, a product page might show an MSRP, an "our price," and a "regular price." Right now, bots aren't very good at interpreting that: if they crawl three prices, all they understand is a very basic "$10.00," "$8.00," and "$7.00," with no idea how those three prices relate to each other, unless schema.org markup spells it out. Or, as another example, especially relevant for e-commerce: a product page might have many images on it. How does a bot know which image on the page is the main product image? Bots aren't quite smart enough to work this out on their own, because they can't "see" a page the way a human sees a page... they can only crawl code.
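To make that concrete, here is a minimal sketch of what schema.org Product markup could look like for a page like that, using microdata (the product name, price, and image path are invented for illustration; schema.org's core vocabulary has no dedicated MSRP property, but marking up the actual offer at least tells a bot which number is the real selling price and which image is the main product image). Hand-writing markup like this is exactly the part many site owners find intimidating:

```html
<!-- Hypothetical product snippet marked up with schema.org microdata -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Widget</span>
  <!-- itemprop="image" flags this as the main product image,
       even if the page contains dozens of other images -->
  <img itemprop="image" src="/images/acme-widget.jpg" alt="Acme Widget">
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <!-- The $8.00 is now unambiguously the price a customer pays -->
    <span itemprop="price" content="8.00">$8.00</span>
    <meta itemprop="priceCurrency" content="USD">
  </div>
</div>
```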
But fear not! There is help! Google introduced a data highlighter in Google Webmaster Tools sometime last year. If you have a smaller, simpler site, you can use this tool to mark up your pages with schema without knowing a lick of code. Here's how to do it: http://www.danatanseo.com/2013/08/google-finally-demystifies-structured.html
Hope this is helpful!
-
I would consider them important. Most of my clients are e-commerce sites, and I put schema markup on every site I build. A lot of platforms now support it out of the box, if that speaks to its importance.
Related Questions
-
Crawl and Indexation Error - Googlebot can't/doesn't access specific folders on microsites
Hi, my first time posting here. I am just looking for some feedback on an indexation issue we have with a client, and on possible next steps or items I may have overlooked.
To give some background, our client operates a website for the core brand and also a number of microsites based on specific business units, so you have corewebsite.com along with bu1.corewebsite.com, bu2.corewebsite.com, and so on. The content structure isn't ideal, as each microsite follows a structure of bu1.corewebsite.com/bu1/home.aspx, bu2.corewebsite.com/bu2/home.aspx, and so on. In addition, each microsite contains duplicate folders from the other microsites: bu1.corewebsite.com has the indexable folder bu1.corewebsite.com/bu1/home.aspx but also bu1.corewebsite.com/bu2/home.aspx, and likewise bu2.corewebsite.com has bu2.corewebsite.com/bu2/home.aspx but also bu2.corewebsite.com/bu1/home.aspx. There are 5 different business units, so this duplicate content scenario exists across all the microsites. The situation is being addressed in the medium-term development roadmap and will be rectified in the next iteration of the site, but that is still a ways out.
The issue: about 6 weeks ago we noticed a drop-off in search rankings for two of our microsites (bu1.corewebsite.com and bu2.corewebsite.com). Over a period of 2-3 weeks, pretty much all our terms dropped out of the rankings and search visibility fell to essentially zero. I can see that pages from the websites are still indexed, but oddly it is the duplicate content pages: bu1.corewebsite.com/bu3/home.aspx or bu1.corewebsite.com/bu4/home.aspx is still indexed, and similarly on the bu2.corewebsite microsite, bu2.corewebsite.com/bu3/home.aspx and bu4.corewebsite.com/bu3/home.aspx are indexed, but no pages from the BU1 or BU2 content directories seem to be indexed under their own microsites. Logging into Webmaster Tools, I can see the error "Google couldn't crawl your site because we were unable to access your site's robots.txt file." This was a bit odd, as there was no robots.txt in the root directory, but I got some weird results when I checked the BU1/BU2 microsites in the technicalseo.com robots.txt tool. Also, because there is a redirect from bu1.corewebsite.com/ to bu1.corewebsite.com/bu4.aspx, I thought maybe there could be something there, so we removed the redirect and added a basic robots.txt to the root directory of both microsites. After this we saw a small pickup in site visibility, and a few terms popped into our Moz campaign rankings but dropped out again pretty quickly. The error message in GSC also persisted.
Steps taken so far after that:
1. In Google Search Console, I confirmed there are no manual actions against the microsites.
2. Confirmed there are no instances of noindex on any of the pages for BU1/BU2.
3. A number of the main links from the root domain to microsites BU1/BU2 have a rel="noopener noreferrer" attribute, but we looked into this and found it has no impact on indexation.
4. Looking into this issue, we saw that some people had similar problems when using Cloudflare, but our client doesn't use this service.
5. Using a response/redirect header checker tool, we noticed a timeout when trying to mimic Googlebot accessing the site.
6. Following on from point 5, we got hold of a week of server logs from the client. I can see Googlebot successfully pinging the site and not getting 500 response codes from the server, but I couldn't see any instance of it trying to index microsite BU1/BU2 content.
So it seems to me that the issue could be something server-side, but I'm at a bit of a loss as to next steps. Any advice at all is much appreciated!
Intermediate & Advanced SEO | ImpericMedia
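(For context, the "basic robots.txt" mentioned above would typically be a generic, permissive file along these lines; this is a sketch, not the poster's actual file:)

```
# Minimal permissive robots.txt: every crawler may fetch everything.
User-agent: *
Disallow:
```

-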
Why isn't my website being crawled by Google?
Hi mozzers and members, I am having an issue: why isn't my website, http://profilecosmeticsurgery.com/, being crawled by Google? Let me explain more clearly when this started happening. Until about 45 days ago, our website, a statically built site with .html extension pages, was being indexed and crawled quite well without any issues.
We then decided to change to a .php version and have the whole website and its pages served dynamically.
Once we made all the changes, this issue started. It has been more than 45 days, and our website hasn't been crawled since. I don't know what is preventing it. Please help. Thanks in advance.
Intermediate & Advanced SEO | SEOOOOOoooooooo
-
What to do about old URLs that don't logically 301 redirect to the current site?
Mozzers, I have changed my site's URL structure several times. As a result, I now have a lot of old URLs that don't logically redirect to anything on the current site. I started out 404-ing them, but it seemed like Google was penalizing my crawl rate AND wasn't removing them from the index even after crawling them several times. There are way too many (>100k) to use the URL removal tool, even at a directory level. So instead I took some advice and changed them to return 200, but with a "noindex" meta tag, and set them to not render any content. I get fewer errors, but I now have a lot of pages doing this. Should I (a) just 404 them and wait for Google to remove them, (b) keep the 200 + noindex, or (c) is there something else I can do? 410 maybe? Thanks!
Intermediate & Advanced SEO | jcgoodrich
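(For reference, the 200-plus-noindex setup described above amounts to serving a page like this minimal sketch; the title text is invented:)

```html
<!DOCTYPE html>
<html>
<head>
  <!-- Asks compliant crawlers to drop this URL from their index -->
  <meta name="robots" content="noindex">
  <title>This page is no longer available</title>
</head>
<!-- Served with a 200 status and an empty body, as described above -->
<body></body>
</html>
```

-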
Bing flags multiple H1s as an issue of high importance - any case studies?
Going through Bing's SEO Analyzer, I found that Bing flags having multiple H1s on a page as an issue. It's going to be quite a bit of work to remove the extra H1 tags from the various pages. Do you think this is a major issue or not? Does anyone know of any case studies or interviews showing that fixing this leads to improvement?
Intermediate & Advanced SEO | nicole.healthline
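(To illustrate the kind of change being weighed here, a generic before-and-after sketch, with invented heading text:)

```html
<!-- Before: several H1s compete on one page -->
<h1>Allergy Basics</h1>
<h1>Symptoms</h1>
<h1>Treatments</h1>

<!-- After: one H1, with the former H1s demoted to H2s -->
<h1>Allergy Basics</h1>
<h2>Symptoms</h2>
<h2>Treatments</h2>
```

-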
Starting Over with a new site - Do's and Don'ts?
After six months, we've decided to start over with a new website. Here's what I'm thinking. Please offer any constructive Do's or Don'ts if you see that I'm about to make a mistake. Our original site (call it mysite.com), we have come to the conclusion, is never going to make a comeback on Google. It seems to us a better investment to start over than to simply keep hoping. Quite honestly, we're freakin' tired of trying to fix this. We don't want to screw with it any more. We are creative people, and would much rather be building a new race car than trying to overhaul the engine in the old one. We have the matching .net domain, mysite.net, which has been aged about 6 years with some fairly general content on a single page. There are zero links to mysite.net, and it was really only used by us for FTP traffic; there is nothing in the SERPs for mysite.net. Mysite.net will be a complete redesign. All content and images will be totally redone. Content will be new, excellent writing, unique, and targeted. Although the subject matter will be similar to mysite.com, the content, descriptions, keywords, images - all will be brand spankin' new. We will have a clean slate to begin the long, painful link-building process. We will put in the time and bite the bullet until mysite.net rules Google once again. We'll change the URL in all of our AdWords campaigns to mysite.net. My questions are:
1. Mysite.com still gets some OK traffic from Bing. Can I leave mysite.com substantially intact, or does it need to go?
2. If I have "bad links" pointing to mysite.com/123.html, what would happen if I 301 that page to mysite.net/abc.html? Does the "bad link juice" get passed on to the clean site? It would be a better experience for users who know our URL if they could be redirected to the new site. (See the sketch after this question.)
3. Should we put mysite.net on a different server in a different, clean IP block? Or doesn't it matter? We're willing to spend on the new server if it would help.
4. What have I forgotten?
Cheers, all
Intermediate & Advanced SEO | DarrenX
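(Regarding question 2: a single-page 301 is a one-line rule if the old site runs on Apache. A hedged sketch using the example URLs from the question; the hosting setup is an assumption:)

```
# Hypothetical .htaccess rule on mysite.com (assumes Apache mod_alias):
# permanently redirect one old page to its counterpart on the new domain.
Redirect 301 /123.html http://mysite.net/abc.html
```

-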
How Long Before a URL is 'Too Long'?
Hello Mozzers, two of the sites I manage are currently in the process of merging into one site, and as a result many of the URLs are changing. I was under the impression (and I've shared this with my team) that after a certain point, Google starts to discount the value of URLs that are too long. With that in mind, if I had a URL structured as follows, would it be considered 'too long' if I'm trying to get the content to rank highly in Google? Here's an example: yourdomain.com/content/content-directory/article, and in some cases it can go as deep as yourdomain.com/content/content-directory/organization/article. Given that there is currently no way for me to shorten these URLs, is there anything I can do to make sure content residing on a path like this is still eligible to rank highly on Google? How would I go about achieving this?
Intermediate & Advanced SEO | NiallSmith
-
Entering into a new website franchise model, currently subdomains, client wants scalability. Best approach?
This is my first experience with a franchise-model business. It is less than 1 year old, its on-page SEO is in pitiful shape, and it already has hundreds of subdomains for specific locations. What is the best approach to take here? I've seen a lot of debate about subdomains versus folders, and it seems the folder structure may be the best long-term course of action, but I'm still a bit unclear on this. What is the best approach to ensure that all SEO work on the site has the most impact? Moving forward, what is the best method for scaling the SEO for franchisee owners? Specifically: what is the best practice for positioning each location well in search in the future, how much should the corporate franchise site typically provide in terms of SEO services to franchisees, and how does the lead SEO consultant scale those services to franchisees?
Intermediate & Advanced SEO | methods
-
Can I use the same source for two different websites?
I have developed a successful portal-based website but would like to grow my portfolio of sites by expanding into new niches and sectors. I would like to use the same source code to fast-track new sites, but I'm not sure of the dangers involved. Content, meta details, etc. will all be unique; the only similarity will be the HTML code. Another example of how I want to use this: my current site targets the UK, but I want to target a global market with a .com domain, and this would involve using the same source. Is this possible without a penalty, or am I overlooking something?
Intermediate & Advanced SEO | Mulith