Does Schema.org markup create a conflict with Power Reviews' standard microformat markup for e-commerce product pages?
-
Does anyone have experience implementing Schema.org markup on e-commerce websites that are already using Power Reviews (now Bazaarvoice)? Google's documentation says it's generally not a good idea to use two types of semantic markup for the same item (reviews, in this case), but I wouldn't think there would be a problem marking up other items on the page with Schema.org, such as price, stock status, etc.
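To make it concrete, here's roughly the kind of markup I have in mind - just a sketch, with property names from the standard Schema.org Product/Offer vocabulary and a simplified placeholder where the Power Reviews snippet (keeping its own markup) would sit:

```html
<div itemscope itemtype="http://schema.org/Product">
  <h1 itemprop="name">Example Widget</h1>

  <!-- Price and stock status marked up with Schema.org -->
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price" content="29.99">$29.99</span>
    <meta itemprop="priceCurrency" content="USD">
    <link itemprop="availability" href="http://schema.org/InStock"> In stock
  </div>

  <!-- Review block left exactly as the vendor outputs it, in its own format -->
  <div id="pr-review-display">...</div>
</div>
```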
Anyone care to provide some insight?
Also, on a related topic, have you all noticed that Google has really dialed back the frequency with which they display rich snippets for product searches? A few weeks ago the site I'm referring to had hundreds of products displaying snippets; now it seems that only about 10% of them are still showing.
Thanks everybody.
-
I actually meant the new one, not the option that was in Labs. You can access it through:
Optimization > Structured Data. In the case of the domain I was referring to (a large e-commerce site), it does show the data and the URLs it appears on.
-
Thanks for the response.
The rich snippet testing tool has actually been available for a long time, but they've recently made improvements and created a menu category for it (it was previously in the labs/beta section). However, the tool doesn't actually predict the snippets you'll get in search results; it just validates the semantic markup and shows an example of what your snippet might look like. At least that's been my experience with it, and I've heard the same from other people around the industry.
Ideally, Power Reviews/Bazaarvoice should just update their markup to the Schema.org format. Since that's the preferred format agreed upon by the powers that be, I don't understand why they wouldn't, unless their framework is just extremely rigid.
I appreciate your feedback about your client sites running Power Reviews. If I understand the (limited) documentation correctly, this shouldn't be a problem unless I were marking up the exact same data in two different formats. I just wanted to see if anyone else had direct experience, so thanks!
-
You can use the new structured data tool in Webmaster Tools; it will show you the markup on the page. I haven't noticed the Power Reviews markup interfering with the proper structure of the Schema.org product markup on my clients' sites.
The tool will show you the markup and the results.
Related Questions
-
Have you ever seen a page get indexed from a website that is blocked by robots.txt?
Hi all, we use a robots.txt file and meta robots tags to block bots from crawling a website or individual pages. Usually robots.txt is used for the whole site, with the expectation that none of its pages will get indexed. But there is a catch: any page from the site can still be indexed by Google even when the site is blocked by robots.txt, because the crawler may find a link to the page somewhere else on the internet, as stated in the last paragraph here. I wonder whether this is really the case for some web pages that have been indexed. And if we use meta tags at the page level, do we still need to block via the robots.txt file? Can we use both techniques at the same time? Thanks
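To make it concrete, these are the two mechanisms I'm asking about, in a simplified sketch (the Disallow rule is just an example):

```html
<!-- robots.txt (site- or folder-wide) stops crawling, but a blocked URL can
     still be indexed if Google finds links to it elsewhere on the web:

       User-agent: *
       Disallow: /

     The page-level meta robots tag below removes a page from the index, but it
     only works if crawlers are allowed to fetch the page and see the tag, so
     the same URL shouldn't also be blocked in robots.txt. -->
<meta name="robots" content="noindex, follow">
```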
Algorithm Updates | vtmoz0 -
Using Google to find a discontinued product.
Hi guys, I mostly use this forum for business questions, but now it's a personal one! I'm trying to find a supplier that might still have a discontinued product: the Behritone C5A speaker monitor. All my searches bring up a plethora of pages that appear to sell the product... but they have no stock. (Wouldn't removing these pages make for a better internet?) No 2nd-hand ones on eBay 😞 Do you have any suggestions about how I can get more relevant results, i.e. find a supplier that might still have stock? Any tips or tricks I may be able to use to help me with this? Many thanks in advance to an awesome community 🙂 Isaac.
Algorithm Updates | isaac6631 -
Rankings fluctuating by around 10 pages between night and day
Hi all, I'm experiencing something very odd with my website's ranking at the moment. My homepage is fluctuating in rank for my main keyword by 10 pages between day and night. During the day I am on page 14, 15 or 16 for my main keyword, yet by night I am on page 5 or 6. This trend has continued for the past 7 days now and I can't quite understand why. I'm using pagewash dot net to carry out manual searches, as well as a ranking tool - both of which produce exactly the same result. Does anyone have any experience of this, or any idea why it is happening? My domain is around 8 years old and has around 50,000 pages. Any pointers would be greatly appreciated.
Algorithm Updates | MarkHincks0 -
Is my page footer the reason keyword rankings have dropped?
Hi all, One of my sites http://henstuff.com/ has seen some ranking drops for major keywords over the past few weeks and I was wondering if it was something to do with Penguin not taking a positive view of link-filled footers. It is something we are looking at phasing out but wanted to get the opinions of the SEOMOZ community. Thanks! Rob
Algorithm Updates | RobertHill0 -
Google.co.uk vs pages from the UK - anyone noticed any changes?
We've started to notice some changes in the rankings of Google UK and Google pages from the UK. Pages from the UK have always typically ranked higher; however, it seems like these are slipping, and Google UK pages (pages from the web) are climbing. We've noticed a similar thing happening in the Bing/Yahoo algorithm as well. Just wondered if anyone else has noticed this? Thanks
Algorithm Updates | Digirank0 -
Google site links on sub pages
Hi all, I had a look for info on this one but couldn't find much. I know that these days, if you have a decent domain, Google will often automatically put sitelinks under your home page if someone searches for your company name; however, has anyone seen these links appear for sub pages? For example, let's say I had a .com domain with /en /fr /de sub folders, each SEO'd for its location. If I were to then have domain.com/en/ at no. 1 in Google for my company in the UK, would I be able to get sitelinks under this, or does it only work on the 'proper' homepage, domain.com/? A client of mine wants to reorganise their website so they have different location sections ranking in different markets, but they also want to keep having sitelinks as they like the look of them. Thanks Carl
Algorithm Updates | Grumpy_Carl0 -
(Ireland & USA) Speilling 'Z' v's 'S'
Hi, my site is called ExampleVirtualisation.ie. It's only new, but when I type Example Virtualisation (with the S in the word) into Google.ie, the suggested spelling Example Virtualization (with the Z) keeps coming up. When I click the suggested spelling, Example Virtualization (Z), another website arrives in position 1. My site is not being recognised for the Z-type spelling. How can I get found? I was thinking of purchasing ExampleVirtualiztion.ie (Z) as well and redirecting it to my S-spelling site, and also optimising my title & description tags with both S & Z spellings. I've only recently submitted my sitemap.xml to Google - hopefully this will help my site to be found. Apologies if the question sounds a bit tricky; your advice is welcome, thank you.
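To show what I mean by covering both spellings in the title and description tags, something roughly like this (the exact wording is just an example):

```html
<title>Virtualisation (Virtualization) Services in Ireland | ExampleVirtualisation.ie</title>
<meta name="description"
      content="Virtualisation - also spelled virtualization - consulting and support across Ireland.">
```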
Algorithm Updates | GlenBOB0 -
When Pandas attack...
I have a predicament. The site I manage (www.duhaime.org) has been hit by the Panda update, but the system seems rigged against this site's purpose. I need some advice on what I'm planning and what could be done. First, the issues:

Content Length
The site is a legal reference, including a dictionary and citation look-up. Hundreds (perhaps upwards of 1,000) of pages are, by virtue of the content, thin. The acronym C.B.N.S. stands for "Common Bench Reports, New Series", a part of the English reports. There really isn't much more to say, nor is there much value to the target audience in saying it.

Visit Length as a Metric
There is chatter claiming Google watches how long a person uses a page to gauge its value. Fair enough, but a large number of the people who visit this site are looking for one small piece of data. They want the definition of a term or a citation, then they return to whatever caused the query in the first place.

My strategy so far…

Noindex some Pages
Identify terms and citations that are really small – less than 500 characters – and put a noindex tag on them. I will also remove the directory links to the pages and clean the sitemaps. This should remove the obviously troublesome pages. We'll have to live with the fact that these pages won't be found in Google's index despite their value.

Create more click incentives
We already started with related terms and now we are looking at diagrams and images. Anything to punch up the content for that ever-important second click.

Expand Content (of course)
The author will focus the next six months on doing his best to extend the content of these short pages. There are images and text to be added in many cases – perhaps 200 pages. We still won't be able to cover them all without a heavy cut-and-paste feel.

Site Redesign
Looking to lighten up the code and boilerplate content shortly. We were working on this anyway. Resulting pages should have fewer than 15 hard-coded site-wide links, and the disclaimer will be loaded with AJAX upon scroll (a rough sketch of that idea is below). Ad units will be kept at 3 per page.

What do you think? Are the super-light pages of the citations and dictionary why site traffic is down 35% this week?
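For the disclaimer, this is roughly the AJAX-on-scroll idea - a sketch only; the endpoint URL and element id are placeholders, not the site's real code:

```html
<div id="disclaimer"></div>
<script>
  // Fetch the boilerplate disclaimer only after the visitor starts scrolling,
  // so it isn't repeated in the initial HTML of every page.
  var disclaimerLoaded = false;
  window.addEventListener('scroll', function () {
    if (disclaimerLoaded) return;   // only fetch once
    disclaimerLoaded = true;
    var xhr = new XMLHttpRequest();
    xhr.open('GET', '/snippets/disclaimer.html');   // placeholder endpoint
    xhr.onload = function () {
      document.getElementById('disclaimer').innerHTML = xhr.responseText;
    };
    xhr.send();
  });
</script>
```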
Algorithm Updates | sprynewmedia0