Microformats & Microdata
-
Hi,
Does marking up data with microformats and microdata help Google better understand your content, and could it in turn be used to increase relevancy?
Oh, and does anyone know if it's supported across major browsers?
-
Its basic purpose is to allow search spiders to better understand what is what on your site. For instance, if you are selling a product and implement your product markup correctly, Google can tell (and somewhat understand) everything from the price, to quality (customer reviews), to what it does (description) and what it looks like (image).
Some of these are already integrated into the SERPs; others are expected to be integrated at a later date. Cooking seems to have been a test bed for markup at Google. Look at the differences between this page and this page.
It is not really intended for browsers, just for search spiders. People do not require markup to tell us what is what, because we can make logical sense of what is on web pages.
If you are just starting to mark up your site, use this: http://schema.org/
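To illustrate, here is a minimal sketch of what schema.org product markup can look like using the microdata syntax. The product name, price, and rating values are invented for illustration:

```html
<!-- A hypothetical product marked up with schema.org microdata -->
<div itemscope itemtype="http://schema.org/Product">
  <h1 itemprop="name">Example Widget</h1>
  <img itemprop="image" src="widget.jpg" alt="Example Widget">
  <p itemprop="description">A short description of the widget.</p>

  <!-- Price lives inside a nested Offer item -->
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    Price: <span itemprop="priceCurrency" content="USD">$</span><span itemprop="price">19.99</span>
  </div>

  <!-- Customer reviews summarized as an AggregateRating -->
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    Rated <span itemprop="ratingValue">4.5</span>/5 based on
    <span itemprop="reviewCount">27</span> customer reviews
  </div>
</div>
```

Spiders that understand schema.org can pull the price, rating, and image straight from the `itemprop` attributes, which is what powers rich snippets in the SERPs; browsers simply render this as ordinary HTML, which is why browser "support" isn't really a concern.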
-
It really depends on the data, I believe.
Check this entry: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=146897
Also, this post over on Search Engine Land about Google, MicroHoo, and Schema.org: http://searchengineland.com/schema-org-google-bing-yahoo-unite-79554
As you should be able to see from those two posts, the general answer to your question is: "Yes, and probably increasingly so." However, it will also depend, to some extent, on what data you are trying to tag with microdata.