Good references/studies on markup?
-
I'm looking to study the across-the-board impact of structured data, and I'd like to know if anyone has good studies on CTR, and possibly on rankings and overall performance.
Any awesome links would be helpful.
-
Came across this last night too: http://searchengineland.com/from-microdata-schema-to-rich-snippets-markup-for-the-advanced-seo-162902
It covers microdata, Schema.org, and rich snippets.
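For anyone who hasn't seen this kind of markup before, here's a minimal microdata sketch of the sort of thing that article covers. The product name, rating, and price are made-up placeholder values, not from any real page:

```html
<!-- Hypothetical Product markup with an aggregate rating,
     the classic source of star-rating rich snippets -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    <span itemprop="ratingValue">4.5</span> stars from
    <span itemprop="reviewCount">89</span> reviews
  </div>
  <span itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    $<span itemprop="price">19.99</span>
  </span>
</div>
```

It's the nested `itemscope`/`itemtype`/`itemprop` attributes that search engines read; the visible text stays whatever you want it to be.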
-
Thanks for the additional content. I'm working on a study to build a very solid case for improving structured data on one of our sites. With semantic search more prominent than ever, I think it's extremely important for every site to have markup, whether or not Google chooses to show it.
Social media platforms and other future applications could very well use this data to classify the content of a page (see ogp.me). Perhaps even bucketing links via structured data and using it to gauge the relevance of linking pages, etc.
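To make the ogp.me point concrete, here's what a basic set of Open Graph tags looks like. The title, URL, and image are placeholders for illustration:

```html
<!-- Hypothetical Open Graph tags; social platforms read these
     from the <head> to build link previews and classify the page -->
<head>
  <meta property="og:type" content="article" />
  <meta property="og:title" content="Example Page Title" />
  <meta property="og:url" content="http://www.example.com/page" />
  <meta property="og:image" content="http://www.example.com/image.jpg" />
</head>
```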
-
Search Engine Land did a write-up a few years ago about a 30% increase in CTR for results with structured data: http://searchengineland.com/how-to-get-a-30-increase-in-ctr-with-structured-markup-105830
Here's an actual case study by Jason Jersey of SEOVoom on structured data's impact on CTR and rankings in general: http://seovoom.com/central/structured-data/
Matt Cutts covered your question in a Webmaster Help video: http://youtu.be/OolDzztYwtQ
My personal thoughts:
I think anything you can do to help the search engines understand what your site is about and how it is structured will ultimately lead to more traffic. Matt's explanation is pretty good: structured data may help you show up in certain places you normally wouldn't have, because you didn't have it before. NewsArticle schema is a good example of that. But don't let these sway you either way. I use it as much as possible. I also recommend using the Data Highlighter in Google Webmaster Tools if you haven't done so; it helps Google even more.
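Since NewsArticle came up, a minimal microdata sketch of it might look like this. The headline, date, and author are placeholders, not taken from any real article:

```html
<!-- Hypothetical NewsArticle markup; this is the schema type that
     can make a page eligible for news-specific result features -->
<article itemscope itemtype="http://schema.org/NewsArticle">
  <h1 itemprop="headline">Example Headline</h1>
  <time itemprop="datePublished" datetime="2014-01-15">January 15, 2014</time>
  <span itemprop="author">Jane Doe</span>
  <div itemprop="articleBody">Story text goes here.</div>
</article>
```

If you'd rather not touch templates right away, the Data Highlighter mentioned above lets you tag the same fields by pointing and clicking in Webmaster Tools instead.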
There was a post by Barry Schwartz over at Search Engine Roundtable about Google reducing rich snippets by 15% or so. Matt Cutts basically said that it would remove snippets for low-quality sites. I personally think they are getting ready to ramp up testing for AgentRank and to give snippets to "authority" authors (just a hunch/guess; I predicted in 2011, after their patent update, that AgentRank (aka AuthorRank) would roll out in 2015. http://www.vzpro.com/2012-seo-prediction/)