What do Bing and Yahoo look for in a site?
-
Do Bing and Yahoo look for authoritative sites like Google does? Do they punish sites for black-hat tactics or spamming?
The reason I ask is that one of my competitors was ranking first in Google for many great keywords, and they have the highest authority of all their competitors. They must have been penalized by Google, because now they are not ranking for any great keywords there. However, they are ranking first in Bing and Yahoo for most of the top keywords, getting the most visibility of all the sites.
I've attached a small graph showing the latest visibility for the sites on the top keywords from Google, including the company that was penalized by Google; they are the green circles on the graph.
-
I think TEST is the key word when Duane talks about the index. Further down the page it is made quite clear that they will kick it back out again if it's no good.
“If the users love it, it stays. If the users don’t like it, it gets dropped. This is a way to determine if the users feel this was a quality result.”
Duane has said many times that they will not be indexing everything; they only want your best pages.
"Rand: Right, yeah. I was going to say, and Bing has been pretty good about penalizing a lot of the links that look manipulative on the Web too.

Duane: Yeah. It's a natural part of keeping things clean, right? At Bing, we are very keen on having a quality-driven index. So, the main focus we have is making sure that everything that gets in is a good resource; when someone makes a query they get a realistic answer that is actually an answer to their query. Not, here's some shallow-depth data, I'm going to click on it, and then, oh, it's not really what I want, I go back and I try it again. We're trying to shorten that number of searches to get to the final answer."

"Duane: Right, exactly. I love this idea, Rand, this whole pick your top 200, whatever the number happens to be for you: pick it and run with it. You don't need everything indexed. Pick your best stuff and make sure that's in there. Make sure your quality content is in there, right? Be sure that you look at the site and say, "What's the goal of this page? Is it to monetize ads? Is it to convert somehow? What is the goal of it? Is it optimized properly to do that? If it is, I want that indexed in the search engine, ranking well."
http://www.seomoz.org/blog/bings-duane-forrester-on-webmaster-tools-metrics-and-sitemap-quality-thresholds
That's good news about the social media, because everything I build seems to rank high in Bing with no social media. I guess that's something I can fall back on if rankings start to slip.
-
Here are some interesting insights from Duane Forrester, who is a senior product manager at Bing.
http://www.stonetemple.com/search-algorithms-and-bing-webmaster-tools-with-duane-forrester/
Two of the biggest things of interest are:
- The huge weight placed by Bing on user interaction with the search results as a ranking factor. This was amazing stuff. Basically, Bing is willing to test any page by indexing it. In fact you can pretty much directly inject any URL you want into their search results using the Submit URL feature of Bing Webmaster Tools. Then they will test it, and if the click interaction data is bad, out (or down) you go.
- The ranking of the priorities for publishers in Duane’s eyes. #1 Content #2 Social Media #3 Links. Links were rated as the third most important area. Third.
The article is very easy to read, with the highlights put in front. This is recent information from a couple of months ago.
-
Very interesting. I never knew that.
And wow, that's the old-school Yahoo design. Haven't seen that look since viewing Yahoo.com in the Wayback Machine.
-
Yahoo uses Google in Japan (not that you, or anyone really cares).
-
A large difference I've noticed between Bing and Google over the years is that Google is more inclined to index a site and place it in the SERPs much quicker, basically giving a new site the benefit of the doubt; however, that site must maintain good standing throughout the 'sandbox' period to ensure it doesn't drop off the map after a year or two.
Bing seems to show preference toward aged domains. Their search index, at least at one point (I'm sure they're working to update it, or might have even done so already), doesn't seem to be as fresh as Google's, which has its advantages as well.
With Google, you'll often find many new sites at the top of the SERPs for any given non-highly-competitive search term. That's just Google's way of getting more information to the masses, whether it's a scraped site or not (unfortunately, I'm still finding scraped sites in the index), whereas Bing seems to favor sites that are tried and true.
Just my observations over the years. However, it's been a while since I've really paid a whole lot of attention to this.
-
From what I have read and from my own experience, Bing is a lot more fussy about what they index; it's a lot harder to get into their index.
I have found that Bing also likes clean code, free of violations; your site needs to be easy to crawl.
Bing is also quick to lose trust if you misuse things such as redirects, canonicals, and sitemaps. Duane Forrester told me that they will lose trust in your sitemap if your lastmod dates are not accurate or if you have any 404s in it; they only want 200-status pages. You should not only have a sitemap, you should keep it up to date. They have no intention of indexing everything the way Google does. I have also gotten sites to rank well in Bing for pretty good keywords with no or few links, so I don't think they rely on links as much.
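Duane's sitemap rules above (accurate lastmod dates, 200-status pages only, no 404s) are easy to check mechanically. Here's a minimal sketch in Python, standard library only; the function names are my own for illustration, not any Bing or Moz API. It parses a sitemap, then flags entries that don't return 200 or whose lastmod date is in the future:

```python
import urllib.request
import urllib.error
import xml.etree.ElementTree as ET
from datetime import date

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def parse_sitemap(xml_text):
    """Return a list of (loc, lastmod) pairs from sitemap XML."""
    root = ET.fromstring(xml_text)
    entries = []
    for url in root.iter(SITEMAP_NS + "url"):
        loc = url.findtext(SITEMAP_NS + "loc")
        lastmod = url.findtext(SITEMAP_NS + "lastmod")  # None if absent
        entries.append((loc, lastmod))
    return entries

def audit(entries, fetch_status):
    """Flag entries that are not 200 or whose lastmod is in the future.

    fetch_status is injected so the audit can be tested without the network.
    """
    problems = []
    for loc, lastmod in entries:
        status = fetch_status(loc)
        if status != 200:
            problems.append((loc, f"status {status}, should be 200"))
        if lastmod and lastmod[:10] > date.today().isoformat():
            problems.append((loc, "lastmod is in the future"))
    return problems

def http_status(url):
    """Real status check; HEAD keeps it cheap."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code
```

To audit a live sitemap you'd pass `http_status` as the fetcher; anything that comes back in the problems list is a candidate for fixing or dropping from the sitemap before the engines lose trust in it.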
-
Well, to begin, Yahoo search is now run off the Bing algorithm (algo). So while there may still be a "Yahoo Slurp" crawler out there, it's based on a different algo than once before. Bing now completely runs Yahoo search.
Search engines have their own algorithms. There is no specific algo that they all must adhere to. So while rankings for your site might go up in one engine, they might very well go down in another (or not move at all).
And I can only assume Bing watches for black-hat SEO tactics; I don't have any hard data to back that up, but it's safe to say they do.
A huge mistake website owners make is optimizing their sites for Google only. Google makes up only about 65% of the search market, so by optimizing for Google, and Google alone, you're cutting off a potential 35% of traffic.
There are tons of forums, documentation, and webmaster tools for Bing, just as there are for Google, so you need to put in that extra effort to see what makes a site rank well in Bing.
As long as you stick to the fundamentals, i.e. proper internal link structure, attaining solid, safe, relevant backlinks, using your webmaster tools (and SEOmoz ;)) to make sure site errors and such are taken care of, and getting your HTML error-free with proper H1-H6 tags (where applicable), title tags, meta tags, etc., then, and only then, should you start tweaking your site for each individual engine.
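Some of the on-page fundamentals listed above can be smoke-tested with a few lines of code. This is a rough sketch (stdlib-only Python; the class and function names are my own, and the checks are deliberately simplistic): it looks for exactly one H1, a title tag, and a non-empty meta description.

```python
from html.parser import HTMLParser

class OnPageCheck(HTMLParser):
    """Collect the handful of on-page signals we care about."""

    def __init__(self):
        super().__init__()
        self.h1_count = 0
        self.has_title = False
        self.has_meta_description = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "h1":
            self.h1_count += 1
        elif tag == "title":
            self.has_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.has_meta_description = attrs.get("content", "") != ""

def audit_page(html):
    """Return a list of human-readable issues; empty means the page passes."""
    checker = OnPageCheck()
    checker.feed(html)
    issues = []
    if checker.h1_count != 1:
        issues.append(f"expected exactly one <h1>, found {checker.h1_count}")
    if not checker.has_title:
        issues.append("missing <title>")
    if not checker.has_meta_description:
        issues.append("missing meta description")
    return issues
```

It's no substitute for a real crawler or webmaster tools, but running something like this over your templates catches the basics before either engine ever sees them.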