Implementing Schema.org on a web page
-
Hi all,
As we know, implementing Schema markup doesn't change the look & feel of a web page for users.
So here is my question:
Could we implement Schema markup on web pages only for bots (not visible to users in the source code), so that page load time doesn't increase?
-
Hello Anirban,
You never want to show Google one thing in the code and show everyone else something different. That is the very definition of cloaking.
Have you looked into using JSON-LD for your Schema markup instead? Builtvisible has a great article on microdata that includes a section about JSON-LD, which lets you put the markup in a script block instead of wrapping it around the HTML.
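As a rough illustration (all values here are placeholders, not from this thread), a JSON-LD block marking up a local business sits in a single script tag rather than being woven through the visible HTML:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "telephone": "+1-555-000-0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example Street",
    "addressLocality": "Springfield"
  }
}
</script>
```

Note that the markup is still present in the source code for everyone - it simply isn't rendered - so this is not cloaking.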
-
Hi,
I am not saying that schema is bad or that you shouldn't do it - it just seems that some big players only use schema on the detail pages of individual products, not on the overview pages. I found an example of a site using it - but in the SERPs only the average rating appears (example: http://www.goodreads.com/author/list/7779.Arthur_C_Clarke).
You can always test what the impact will be - as mentioned before, I'd guess that even for 50 elements fully tagged with Schema the impact on page speed will be minimal. Check your current pages with webpagetest.org and look at the breakdown of load time. The HTML will probably account for only 10-20% of it - the rest being images, JavaScript & CSS files. Adding a few hundred lines of HTML will not fundamentally change this (text can be compressed quite well).
rgds
Dirk
-
Hi,
But using Schema to provide well-structured data will help bots understand what type of content/information is present on a page, and I think that will definitely help a page rank better in Google search, whether it's an SRP or a JD.
Regards,
Anirban
-
Hi,
I am not sure adding schema.org markup on a result page adds a lot of value. If you send 50 different blocks of structured data, how should search engines know which piece is relevant to show in the SERPs? I just checked 2 different sites (allrecipes.com & monster.com) - both seem to use schema markup only on their detail pages, not on their result pages.
If you would like to go ahead, you could always measure the impact by creating two (static) versions of a search result page - one with & one without markup - and testing both versions with webpagetest.org & Google's PageSpeed analyser. An alternative would be to use "lazy loading": you first load the first x results (the part visible on screen), and when the user scrolls you load the next batch... and so on. This way, the impact on loading times remains minimal.
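A minimal sketch of the lazy-loading idea (the `/search` endpoint and the markup are hypothetical - adapt them to your own stack):

```html
<div id="results"><!-- first batch of results, rendered server-side --></div>
<div id="sentinel"></div>
<script>
  // When the sentinel element scrolls into view, fetch and append the next batch.
  var page = 1;
  new IntersectionObserver(function (entries) {
    if (!entries[0].isIntersecting) return;
    page += 1;
    fetch('/search?page=' + page)            // hypothetical endpoint
      .then(function (resp) { return resp.text(); })
      .then(function (html) {
        document.getElementById('results')
          .insertAdjacentHTML('beforeend', html);
      });
  }).observe(document.getElementById('sentinel'));
</script>
```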
In each case, I would not try to show different pages to users & bots.
rgds,
Dirk
-
Hello Dirk,
Thanks for the reply.
Agreed that adding a few lines of extra schema.org code will have near-zero impact on the load time of a page - but it depends entirely on what content the page shows.
I want to implement Schema.org on search result pages where a single page contains more than 50 listings, each with different information such as job title, company name, skills, date posted, etc. For each listing I will have to use the different properties recommended by Google, so the load time of the page will definitely increase.
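For reference, one such listing marked up in JSON-LD with schema.org's JobPosting type might look roughly like this (all values are placeholders) - and a result page with 50 listings would repeat a block like it once per listing:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "JobPosting",
  "title": "Front-End Developer",
  "hiringOrganization": { "@type": "Organization", "name": "Example Corp" },
  "skills": "HTML, CSS, JavaScript",
  "datePosted": "2015-06-01",
  "jobLocation": {
    "@type": "Place",
    "address": { "@type": "PostalAddress", "addressLocality": "Mumbai" }
  }
}
</script>
```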
Please let me know your thoughts on the case described above.
Thanks
-
Try adding schema with meta tags in the HTML, for example:
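A sketch with placeholder values (the phone number and coordinates are made up):

```html
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <span itemprop="name">Example Business</span>
  <!-- meta tags expose machine-readable values without rendering anything -->
  <meta itemprop="telephone" content="+1-555-000-0000" />
  <div itemprop="geo" itemscope itemtype="http://schema.org/GeoCoordinates">
    <meta itemprop="latitude" content="40.7500" />
    <meta itemprop="longitude" content="-73.9800" />
  </div>
</div>
```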
This way you're telling bots your phone number with schema, but it doesn't appear visibly to users. This is normally done with the latitude and longitude schema tags, but you can use it for other properties as well. I wouldn't rely on it as a permanent long-term solution, though, as Google may change its policies on how it interprets content that is not visible to users.
-
It's a game of words. In the context of the question: even if you think of the schema tagging as being "only for bots", the tagged info can still be surfaced in the SERPs, and the bots get a better understanding of what the page is all about. The final goal is of course to serve users the best answers when they search. On the page itself, however, the user doesn't see any difference whether it is tagged with schema or not.
Dirk
-
Dirk, I think you misunderstood my words. Schema being "for users" means exactly what you quoted in your last lines: "Search engines including Bing, Google, Yahoo! and Yandex rely on this markup to improve the display of search results, making it easier for people to find the right Web pages."
Thanks
-
Hi Alick,
Schema.org is not for users - it is "a collection of schemas that webmasters can use to markup HTML pages in ways recognized by major search providers, and that can also be used for structured data interoperability (e.g. in JSON). Search engines including Bing, Google, Yahoo! and Yandex rely on this markup to improve the display of search results, making it easier for people to find the right Web pages."
Source: http://schema.org/
rgds,
Dirk
-
Hi Anirban,
I completely agree with Dirk. Second, I would like to know what the purpose of showing schema only to bots would be. In my limited understanding, we use schema to show things like price and offers to users, not to bots.
Thanks
-
Hi Anirban,
The impact of adding a few lines of extra schema.org code on the load time of your pages will be negligible.
Apart from that, serving different content to bots & human users could be considered cloaking by search engines.
Implementing schema.org on the normal pages should do just fine!
rgds,
Dirk