What do Bing and Yahoo look for in a site?
-
Do Bing and Yahoo look for authoritative sites like Google does? Do they punish sites for black-hat tactics or spamming?
The reason I ask is that one of my competitors was ranking first in Google for many great keywords, and they have the highest authority of all their competitors. They must have been penalized by Google, because now they are not ranking for any great keywords there. However, they are ranking first in Bing and Yahoo for most of the top keywords, getting the most visibility of all the sites.
I attached a small graph with the latest visibility for the sites on the top keywords from Google, and I also included the company that was penalized by Google; they are the green circles on the graph.
-
I think TEST is the key word when Duane is talking about the index. Further down the page he makes it quite clear they will kick a page back out again if it's no good.
“If the users love it, it stays. If the users don’t like it, it gets dropped. This is a way to determine if the users feel this was a quality result.”
Duane has said many times that they will not be indexing everything; they only want your best pages.
"Rand: Right, yeah. I was going to say, and Bing has been pretty good about penalizing a lot of the links that look manipulative on the Web too.
Duane: Yeah. It's a natural part of keeping things clean, right? At Bing, we are very keen on having a quality-driven index. So, the main focus we have is making sure that everything that gets in is a good resource, when someone makes a query they get a realistic answer that is actually an answer to their query. Not, here's some shallow depth data. I'm going to click on it, and then oh, it's not really what I want. I go back and I try it again. We're trying to shorten that number of searches to get to the final answer."
And later in the same interview:
"Duane: Right, exactly. I love this idea, Rand, this whole pick your top 200, whatever the number happens to be for you, pick it and run with it. You don't need everything indexed. Pick your best stuff and make sure that's in there. Make sure your quality content is in there, right? Be sure that you look at the site and say, 'What's the goal of this page? Is it to monetize ads? Is it to convert somehow? What is the goal of it? Is it optimized properly to do that? If it is, I want that indexed in the search engine ranking well.'"
http://www.seomoz.org/blog/bings-duane-forrester-on-webmaster-tools-metrics-and-sitemap-quality-thresholds
That's good news about the social media, because everything I build seems to rank high in Bing with no social media. I guess that's something I can fall back on if rankings start to slip.
-
Here are some interesting insights from Duane Forrester, who is a senior product manager at Bing.
http://www.stonetemple.com/search-algorithms-and-bing-webmaster-tools-with-duane-forrester/
Two of the biggest things of interest are:
- The huge weight Bing places on user interaction with the search results as a ranking factor. This was amazing stuff. Basically, Bing is willing to test any page by indexing it. In fact, you can pretty much directly inject any URL you want into their search results using the Submit URL feature of Bing Webmaster Tools. They will then test it, and if the click-interaction data is bad, out (or down) you go.
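As an aside, URL submission is also exposed programmatically through the Bing Webmaster Tools API. The sketch below builds such a request in Python; the endpoint path and JSON payload shape are my assumptions based on Bing's JSON API conventions at the time, and the API key and URLs are placeholders, so treat it as illustrative rather than a guaranteed-current interface.

```python
import json
from urllib import request

# Assumed endpoint; check the current Bing Webmaster Tools API docs before relying on it.
SUBMIT_URL_ENDPOINT = "https://ssl.bing.com/webmaster/api.svc/json/SubmitUrl"

def build_submit_request(api_key: str, site_url: str, page_url: str) -> request.Request:
    """Build (but do not send) a URL-submission request for Bing Webmaster Tools."""
    payload = json.dumps({"siteUrl": site_url, "url": page_url}).encode("utf-8")
    return request.Request(
        f"{SUBMIT_URL_ENDPOINT}?apikey={api_key}",
        data=payload,
        headers={"Content-Type": "application/json; charset=utf-8"},
        method="POST",
    )

# Sending it is one line once you have a real API key:
# with request.urlopen(build_submit_request(KEY, "https://example.com",
#                                           "https://example.com/new-page")) as resp:
#     print(resp.status)
```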
- The ranking of the priorities for publishers in Duane’s eyes. #1 Content #2 Social Media #3 Links. Links were rated as the third most important area. Third.
The article is very easy to read, with the highlights put in front. This is recent information from a couple of months ago.
-
Very interesting. I never knew that.
And wow, that's the old-school Yahoo design. I haven't seen that look since viewing Yahoo.com in the Wayback Machine.
-
Yahoo uses Google in Japan (not that you, or anyone really cares).
-
A large difference I've noticed between Bing and Google over the years is that Google is more inclined to index a site and place it within the SERPs much quicker, basically giving a new site the benefit of the doubt. However, that site must maintain good standing throughout the 'sandbox' period to ensure it doesn't drop off the map after a year or two.
Bing seems to show preference toward aged domains. Its index, at least at one point (I'm sure they're working to update it, or may have done so already), doesn't seem to be as fresh as Google's, which has its advantages as well.
With Google, you'll often find many new sites at the top of the SERPs for any given non-highly-competitive search term. That's just Google's way of getting more information to the masses, whether it's a scraped site or not (unfortunately, I'm still finding scraped sites in the index). Bing, by contrast, seems to favor sites that are tried and true.
Just my observations over the years. However, it's been a while since I've really paid a whole lot of attention to this.
-
From what I have read and my own experience, Bing is a lot more fussy about what it indexes; it's a lot harder to get into the index.
I have found that Bing also likes clean code, free from violations. Your site needs to be easy to crawl.
Bing is also quick to lose trust if you misuse things such as redirects, canonicals, and sitemaps. Duane Forrester told me that, in regard to sitemaps, they will lose trust in your sitemap if your lastmod dates are not accurate or if you have any 404s in it; they only want 200-status pages. You should not only have a sitemap, you should keep it up to date. They have no intention of indexing everything that Google does. I have also gotten sites to rank well in Bing with no or few links, for pretty good keywords, so I don't think they rely on links as much.
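Those sitemap rules (accurate lastmod dates, no 404s, 200-status pages only) are easy to audit yourself. Here is a minimal Python sketch that parses a sitemap and flags entries whose lastmod isn't a valid ISO date; the function name is mine, not any official tool, and the live 200-status check is left as a comment since it needs real HTTP requests.

```python
import xml.etree.ElementTree as ET
from datetime import datetime

# Standard sitemap XML namespace from sitemaps.org.
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def audit_sitemap(xml_text):
    """Return (entries, bad_lastmod): all (loc, lastmod) pairs, plus the locs
    whose lastmod value is not a parseable ISO-8601 date."""
    root = ET.fromstring(xml_text)
    entries, bad_lastmod = [], []
    for url_el in root.iter(SITEMAP_NS + "url"):
        loc = url_el.findtext(SITEMAP_NS + "loc")
        lastmod = url_el.findtext(SITEMAP_NS + "lastmod")
        entries.append((loc, lastmod))
        if lastmod is not None:
            try:
                # Accept both date-only and full datetime forms ("Z" -> UTC offset).
                datetime.fromisoformat(lastmod.replace("Z", "+00:00"))
            except ValueError:
                bad_lastmod.append(loc)
    # To enforce "200s only", you'd also HEAD-request each loc and drop
    # anything that doesn't return HTTP 200 (omitted here: needs the network).
    return entries, bad_lastmod
```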
-
Well, to begin, Yahoo search now runs on the Bing algorithm (algo). So while there may still be a "Yahoo Slurp" crawler out there, it's based on a different algo than before. Bing now completely powers Yahoo search.
Search engines have their own algorithms. There is no specific algo that they all must adhere to. So while rankings for your site might go up in one engine, they might very well go down in another (or not move at all).
I can only assume Bing watches for black-hat SEO tactics; I don't have hard data to back that up, but it's safe to say they do.
A huge mistake website owners make is optimizing their sites for Google only. Google makes up only around 65% of the search market, so by optimizing for Google, and Google alone, you're cutting off a potential 35% of traffic.
There are tons of forums, documentation, and webmaster tools for Bing, just as there are for Google, so you need to put in that extra effort to see what makes a site rank well in Bing.
As long as you stick to the fundamentals (proper internal link structure; solid, safe, relevant backlinks; using your webmaster tools, and SEOmoz ;), to make sure site errors and the like are taken care of; and error-free HTML with proper H1-H6 tags where applicable, title tags, meta tags, etc.), then, and only then, should you start tweaking your site for direct optimization for each engine.