How could Google define "low quality experience merchants"?
-
Matt Cutts mentioned at SXSW that Google wants to take into account the quality of the experience ecommerce merchants provide and factor it into how they rank in SERPs. Here's what he said, if you missed it:
"We have a potential launch later this year, maybe a little bit sooner, looking at the quality of merchants and whether we can do a better job on that, because we don’t want low quality experience merchants to be ranking in the search results.”
My question: how exactly could Google decide whether a merchant provides a low or high quality experience? I would imagine it would be very easy for Google to decide this for merchants in their Trusted Store program. I wonder what other data sets Google could realistically rely upon to make such a judgment. Any ideas or thoughts are appreciated.
-
I would agree that the Google Trusted Store program would be a good place for them to start; very convenient.
They could also use ratings from other review sites like Yelp, and possibly even social signals via Google Plus, Twitter, and the like.
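To make that concrete, here's a toy sketch of how a blended cross-site merchant score might be computed; every source, rating, and weight in it is hypothetical, not anything Google has confirmed:

```python
# Purely illustrative blend of merchant ratings from several sources.
# All sources, values, and weights below are hypothetical.
ratings = {
    "google_trusted_stores": (4.6, 0.5),  # (average rating, weight)
    "yelp": (4.1, 0.3),
    "social_sentiment": (3.9, 0.2),  # e.g. a scaled mention-sentiment score
}

total_weight = sum(weight for _, weight in ratings.values())
blended = sum(rating * weight for rating, weight in ratings.values()) / total_weight
print(round(blended, 2))  # 4.31
```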
-
I can see them doing something with longevity tracking and repeat customers; they could possibly track whether people are return shoppers to a site.
And then there are the obvious reviews of a store from Google Wallet users when you are a Google Wallet store.
I think there are other things that they could do just in the design of a site. They could possibly figure out what design elements make an experience easier on the shopper.
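Circling back to the repeat-shopper idea above, here's a minimal sketch of what that signal could look like when computed from an order log; the data is made up:

```python
from collections import Counter

# Toy repeat-shopper metric: given (customer_id, order_id) pairs,
# what share of customers placed more than one order?
orders = [("c1", 101), ("c2", 102), ("c1", 103), ("c3", 104), ("c1", 105)]

orders_per_customer = Counter(customer for customer, _ in orders)
repeat_rate = sum(1 for n in orders_per_customer.values() if n > 1) / len(orders_per_customer)
print(f"{repeat_rate:.0%}")  # 33% of customers came back
```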
Related Questions
-
Many meta descriptions ignored by Google
Hi all, We have recently added meta descriptions for more than 50 pages of our website. It's been more than a week and all the pages have been indexed, but I can still see that most of the pages in Google results don't show the recently added meta description; the snippet is pulled from page content, just like it used to be. I wonder what's wrong in this scenario. Please advise if you're aware of what might cause this. Thanks
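One quick way to rule out the basics here is to confirm the tag is actually present in the HTML being served. A minimal sketch, with a placeholder URL; note that even a correct tag is only a suggestion, and Google may still build the snippet from page content when it judges that a better match for the query:

```python
import re
import requests

# Fetch a page and roughly extract its meta description (placeholder URL).
html = requests.get("https://www.example.com/some-page", timeout=10).text
match = re.search(
    r'<meta\s+name=["\']description["\']\s+content=["\'](.*?)["\']',
    html,
    re.IGNORECASE | re.DOTALL,
)
print(match.group(1) if match else "No meta description found")
```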
-
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat and the geek when you truly need them? So SEL (Search Engine Land) said recently that there's no such thing as "duplicate content" penalties: http://searchengineland.com/myth-duplicate-content-penalty-259657. By the way, I'd love to get Rand or Eric or other Mozzers (aka TAGFEE'ers) to weigh in here on this if possible. The reason for this question is to double-check a possible "duplicate content" type penalty (possibly by another name?) that might accrue in the following situation.
1 - Assume a domain has a 30 Domain Authority (per OSE).
2 - The site on the current domain has about 100 pages, all hand coded. Things do very well in SEO because we designed it to do so. The site is about 6 years in the current incarnation, with a very simple e-commerce cart (again, basically hand coded). I will not name the site for obvious reasons.
3 - Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to try to keep the site to under 100 new "pages" using a combination of rel canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, if we assume 500 products and 100 categories, that yields at least 50,000 pages, and with other aspects of the faceted search it could easily create 10X that many.
4 - In Screaming Frog tests of the DEV site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. Screaming Frog has also been known to crash while spidering, and we've discovered thousands of such URLs on live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation, and we can see that both on our DEV site and out in the wild (in Google's supplemental index).
5 - Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation, like burning up a theoretical "crawl budget" or having the bots miss pages or other negative consequences?
6 - Is it also possible that bumping a site that ranks well for 100 pages up to 10,000 pages or more might incur a link juice penalty as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal page link juice, but what are the actual big-dog issues here?
So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
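As a back-of-the-envelope check on the arithmetic in point 3, here's a tiny sketch; the facet count is an assumption for illustration, not an audit of any real site:

```python
# Rough arithmetic for the faceted-URL explosion described above.
products = 500
categories = 100
base_pages = products * categories  # 50,000 product-in-category URLs
facets = 4  # hypothetical: e.g. color, size, brand, price band
# If each facet shows up as an independent crawlable query parameter,
# every on/off combination multiplies the URL space.
print(base_pages * 2 ** facets)  # 800,000 potential crawlable URLs
```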
-
Site not indexed on Google UK after 4 days?
Hello!
Wonder if anyone can help with this one? I have an ecommerce site, www.doggydazzles.co.uk, which went live on Friday and was submitted to Google via Webmaster Tools on Saturday morning, but I can't find any trace of it in a Google search.
I'm a bit stuck with this as it's never happened to any of my other sites.
Can anyone please help, or make suggestions as to what I can do to get ranked quicker? Thanks!
-
Best practice for cleaning up multiple Google Places listings and multiple Google accounts when logins were lost.
We are an inbound marketing agency; most of our clients are not relying on local SEO. I have a pretty good understanding of it when starting fresh, but not so much when joining a "movie in progress" kind of scenario. Recently we've brought on two clients who have had their websites in place for a while and have made small attempts at marketing themselves online over the years, and it's resulted in multiple Google Places listings, variations of the company names (one of them changed their name), and worries that there are yet more accounts out there they aren't aware of (analytics, and others from well-intentioned employees and past service providers; no internal leadership at the company level). In reading Google help forums I'm seeing some people recently having their accounts suspended when they try to clean things up; in one case a person set up a new Google account thinking he would start fresh, and in trying to claim listings, get rid of duplicates, etc., his account was suspended. What is the CURRENT recommended course of action in situations like these? With all the changes going on with Google, I don't know which route to take and have combed the Internet reading articles about this (including Google's resources). I would like some current real-world advice.
-
Why does Google index the IP address instead of the domain name?
I have a website, and Google is now indexing its IP address instead of the domain name. I have set up a 301 redirect to the domain name, but how can I get the indexed IP swapped out for the domain name? And why would Google index the IP address in the first place?
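One sanity check worth doing is confirming the 301 is actually returned when the bare IP is requested directly. A minimal sketch, with placeholder IP and domain:

```python
import requests

# Request the bare IP without following redirects (placeholder values).
response = requests.get("http://203.0.113.10/", allow_redirects=False, timeout=10)
print(response.status_code)              # should be 301, not 200
print(response.headers.get("Location"))  # should point at the domain name
```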
-
Is this a possible Google penalty scenario?
In January we were banned from Google due to duplicate websites caused by a server configuration error by our previous webmaster. Around 100 of our previously inactive domain names were defaulted to the directory of our company website during a server migration, thus showing the exact same site 100 times... obviously Google was not game and banned us. At the end of February we were allowed back into the SERPs after fixing the issue and have since steadily regained long-tail keyword phrase rankings, but in Google we are still missing our main keyword phrase. This keyword phrase brings in the bulk of our best traffic, so obviously it's an issue. We've been unable to get above position 21 for this keyword in Google, but in Yahoo, Bing, and Yandex (a Russian search engine) we're at positions 3, 3, and 7 respectively. It seems to me there has to be a penalty in effect, as this keyword gets between 10 and 100 times as much traffic in Google as any of the ones we're ranked for; what do you think? EDIT: I should mention that in the 4-5 years prior to the banning we had been ranked between 15th and 4th in Google, 80% of the time on the first page.
-
Is there a way to know at what rank my site is listed on Google?
My current client's web page was listed on the 4th page a month ago. I'm trying really hard to make him understand that the traffic from being on the first page is important and that he needs to give me additional resources to make it happen (I don't program at all). So I had the idea of checking every page to see what its current rank is, but instead of looking from page 1 to page X, I was wondering if there was something somewhere that could give me my rank right away. It would help save time. Thanks.
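One programmatic option, rather than paging through results by hand, is Google's Custom Search JSON API; here's a sketch under the assumption that you create the placeholder credentials in the Google Cloud console (results can differ somewhat from personalized web search):

```python
import requests

API_KEY = "YOUR_API_KEY"          # placeholder credentials
CX = "YOUR_SEARCH_ENGINE_ID"

def find_rank(query: str, domain: str, max_pages: int = 5):
    """Return the 1-based position of `domain` for `query`, or None."""
    for page in range(max_pages):
        start = page * 10 + 1  # the API paginates in blocks of 10 results
        data = requests.get(
            "https://www.googleapis.com/customsearch/v1",
            params={"key": API_KEY, "cx": CX, "q": query, "start": start},
            timeout=10,
        ).json()
        for offset, item in enumerate(data.get("items", [])):
            if domain in item.get("link", ""):
                return start + offset
    return None

print(find_rank("best widgets", "example.com"))
```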
-
Rankings in Bing/Yahoo lower than in Google
Other than a few keywords, my rankings are consistently lower in MSN/Bing/Yahoo than in Google. Any ideas or suggestions as to why?