Implementing Schema.org on a web page
-
Hi all,
As we know, implementing Schema doesn't change the look & feel of a web page for the users.
So here is my Question..
Could we implement Schema markup on web pages only for bots (not visible to users in the source code), so that page load time doesn't increase?
-
Hello Anirbon,
You never want to show Google one thing in the code, and show everyone else something different. That is the very definition of cloaking.
Have you looked into using JSON-LD instead of inline microdata? Builtvisible has a great article on structured data that includes a section about JSON-LD, which lets you put the markup in a script instead of wrapping your HTML.
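For illustration, a minimal JSON-LD block might look something like this (the organization name, URL, and phone number are placeholders, not from this thread):

```html
<!-- JSON-LD: the structured data lives in a script tag, so the visible
     HTML doesn't change. All values below are placeholders. -->
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "http://www.example.com",
  "contactPoint": {
    "@type": "ContactPoint",
    "telephone": "+1-555-0100",
    "contactType": "customer service"
  }
}
</script>
```

Note that because the markup is still in the page source, anyone can see it — it just doesn't alter the rendered page. That keeps you on the right side of the cloaking line.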
-
Hi,
I am not saying that schema is bad or that you shouldn't do it - it just seems that some big players only use schema on the detail pages of individual products, not on the overview pages. I found an example of a site using it on list pages, but in the SERPs only the average rating appears in the first result (example: http://www.goodreads.com/author/list/7779.Arthur_C_Clarke).
You can always test what the impact will be - as mentioned before, I guess even for 50 elements fully tagged with Schema the impact on page speed will be minimal. Check your current pages with webpagetest.org and look at the breakdown of load time. The HTML will probably only account for 10-20% of it - the rest being images, JavaScript & CSS files. Adding a few hundred lines of HTML will not fundamentally change this (text compresses quite well).
rgds
Dirk
-
Hi,
But using Schema to provide well-structured data will help bots understand what type of content/information is present on a page, and I think that will definitely help a page rank better in Google search, whether it's a search result page (SRP) or a job description (JD) page.
Regards,
Anirban
-
Hi,
I am not sure that adding schema.org on a result page adds a lot of value. If you send 50 different blocks of structured data, how should search engines know which piece is relevant to show in the SERPs? I just did a check on 2 different sites (allrecipes.com & monster.com) - they only seem to use schema markup on the detail pages, not on the result pages.
If you would like to go ahead, you could always measure the impact by creating two (static) versions of a search result page - one with and one without markup - and testing both versions with webpagetest.org & Google's PageSpeed analyser. An alternative would be to use "lazy loading": you first load the first x results (the visible part of the screen), and when the user scrolls you load the next batch, and so on. This way the impact on loading times would remain minimal.
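A rough sketch of that lazy-loading idea (simplified, scroll-based; `loadNextBatch` is a hypothetical function that would fetch and append the next set of results):

```html
<!-- Sketch only: the first batch of results is rendered server-side,
     and further batches are appended as the user nears the bottom. -->
<div id="results"><!-- first x results rendered here --></div>
<script>
  var loading = false;
  window.addEventListener('scroll', function () {
    var nearBottom = window.innerHeight + window.pageYOffset
                     >= document.body.offsetHeight - 200;
    if (nearBottom && !loading) {
      loading = true;
      // hypothetical: fetch next results, append them, then reset the flag
      loadNextBatch(function () { loading = false; });
    }
  });
</script>
```

The same approach applies whether or not the results carry schema markup - the markup for later batches simply arrives with the batch.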
In each case, I would not try to show different pages to users & bots.
rgds,
Dirk
-
Hello Dirk,
Thanks for the reply.
Agreed that the impact of adding a few lines of extra schema.org code on the load time of the pages will be zero. But it totally depends on what content you are going to show on a page.
I want to implement Schema.org on the search result pages, where a single page contains more than 50 listings with different information like job title, company name, skills, date posted, etc. For each listing I will have to use the different properties recommended by Google, so the load time of the page will definitely increase.
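For illustration, a single listing marked up with schema.org/JobPosting might look roughly like this (all values are placeholders, and the exact properties to use are whatever Google recommends):

```html
<!-- One of 50+ listings on the result page; all values are placeholders. -->
<div itemscope itemtype="http://schema.org/JobPosting">
  <h3 itemprop="title">Senior SEO Analyst</h3>
  <span itemprop="hiringOrganization" itemscope
        itemtype="http://schema.org/Organization">
    <span itemprop="name">Example Corp</span>
  </span>
  <span itemprop="skills">SEO, Analytics, HTML</span>
  <meta itemprop="datePosted" content="2015-06-01">
</div>
```

Multiplied by 50+ listings, that's the extra HTML I'm concerned about.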
Please let me know your thoughts on the case above.
Thanks
-
Try adding schema with meta tags in the HTML, for example:
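Something along these lines (a sketch - the phone number and coordinates are placeholders):

```html
<!-- Meta tags carry the schema values without rendering anything visible.
     All values are placeholders. -->
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <meta itemprop="telephone" content="+1-555-0100">
  <span itemprop="geo" itemscope itemtype="http://schema.org/GeoCoordinates">
    <meta itemprop="latitude" content="40.7484">
    <meta itemprop="longitude" content="-73.9857">
  </span>
</div>
```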
This way you're telling bots your phone number with schema, but it doesn't appear visibly to users. This is normally done with the latitude and longitude schema tags, but you can use it for other properties as well. I wouldn't rely on this as a permanent long-term solution, though, as Google may change their policies on how they interpret content that is not visible to users.
-
It's a game of words. In the context of the question: even if you provided the schema tagging only to bots, the tagged info could still be listed in the SERPs, and the bots would get a better understanding of what the page is all about. The final goal is of course to serve the user the best answers when searching. On the page itself, however, the user doesn't see any difference whether the page is tagged with schema or not.
Dirk
-
Dirk, I think you misunderstood my words. Schema for the user means exactly what you wrote in those last lines: "Search engines including Bing, Google, Yahoo! and Yandex rely on this markup to improve the display of search results, making it easier for people to find the right Web pages."
Thanks
-
Hi Alick,
Schema.org is not for users - it is "a collection of schemas that webmasters can use to markup HTML pages in ways recognized by major search providers, and that can also be used for structured data interoperability (e.g. in JSON). Search engines including Bing, Google, Yahoo! and Yandex rely on this markup to improve the display of search results, making it easier for people to find the right Web pages."
Source: http://schema.org/
rgds,
Dirk
-
Hi Anirban,
I completely agree with Dirk. Secondly, I would like to know what the purpose is of showing schema to bots only. In my limited understanding, we use schema to show things like price and offers to users, not to bots.
Thanks
-
Hi Anirban,
The impact of adding the few lines of extra schema.org code on the load time of your pages will be close to zero.
Apart from that, serving different content to bots & human users could be considered cloaking by search engines.
Implementing schema.org on the normal pages should do just fine!
rgds,
Dirk