What do Bing and Yahoo look for in a site?
-
Do Bing and Yahoo look for authoritative sites like Google does? Do they punish sites for black-hat tactics or spamming?
The reason I ask is that one of my competitors was ranking in first place in Google for many great keywords, and they have the highest authority of all their competitors. They must have been punished by Google, because now they are not ranking for any great keywords there. However, they are ranking first in Bing and Yahoo for most of the top keywords, getting the most visibility of all the sites.
I attached a small graph showing the latest visibility for the sites on the top keywords from Google; the company that was punished by Google is shown as the green circles on the graph.
-
I think TEST is the key word when Duane is talking about the index. Further down the page he makes it quite clear they will kick a page back out again if it's no good.
“If the users love it, it stays. If the users don’t like it, it gets dropped. This is a way to determine if the users feel this was a quality result.”
Duane has said many times that they will not be indexing everything; they only want your best pages.
"Rand: Right, yeah. I was going to say, and Bing has been pretty good about
penalizing a lot of the links that look manipulative on the Web
too.Duane: Yeah. It's a natural part of keeping things clean, right?
At Bing, we are very keen on having a quality driven index. So, the main focus
we have is making sure that everything that gets in is a good resource, when
someone makes a query they get a realistic answer that is actually an answer to
their query. Not, here's some shallow depth data. I'm going to click on it, and
then oh, it's not really what I want. I go back and I try it again. We're trying
to shorten that number of searches to get to the final answer."Duane: Right, exactly. I love this idea, Rand, this whole pick your top 200, whatever the number happens to be for you, pick it and run with it. You don't need everything indexed. Pick your best stuff and make sure that's in there. Make sure your quality content is in there, right? Be sure that you look at the site and say, "What's the goal of this page? Is it to monetize ads? Is it to convert somehow? What is the goal of it? Is it optimized properly to do that? If it is, I want that indexed in the search engine ranking well."
http://www.seomoz.org/blog/bings-duane-forrester-on-webmaster-tools-metrics-and-sitemap-quality-thresholds

That's good news about the social media factor, because everything I build seems to rank high in Bing with no social media. I guess that's something I can fall back on if rankings start to slip.
-
Here are some interesting insights from Duane Forrester, who is a senior product manager at Bing.
http://www.stonetemple.com/search-algorithms-and-bing-webmaster-tools-with-duane-forrester/
Two of the biggest things of interest are:
- The huge weight placed by Bing on user interaction with the search results as a ranking factor. This was amazing stuff. Basically, Bing is willing to test any page by indexing it. In fact, you can pretty much directly inject any URL you want into their search results using the Submit URL feature of Bing Webmaster Tools (see the sketch below). Then they will test it, and if the click interaction data is bad, out (or down) you go.
- The ranking of the priorities for publishers in Duane's eyes: #1 Content, #2 Social Media, #3 Links. Links were rated as the third most important area. Third.
The article is very easy to read, with the highlights up front. This is recent information from a couple of months ago.
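If you want to experiment with that Submit URL behavior programmatically rather than through the Webmaster Tools UI, here is a minimal sketch. It assumes Bing's URL Submission API endpoint and JSON body as I understand them, so verify both against the current docs before relying on this; the API key and URLs are hypothetical placeholders.

```python
# A minimal sketch of submitting a URL to Bing for indexing/testing.
# Assumptions: the URL Submission API endpoint and payload shape below;
# the API key and URLs are hypothetical placeholders.
import json
import urllib.request

API_KEY = "YOUR_BING_WEBMASTER_API_KEY"       # hypothetical placeholder
SITE_URL = "https://www.example.com"          # your verified site in Webmaster Tools
PAGE_URL = "https://www.example.com/new-page" # the page you want tested/indexed

# Endpoint as documented for Bing's URL Submission API (verify before use).
endpoint = f"https://ssl.bing.com/webmaster/api.svc/json/SubmitUrl?apikey={API_KEY}"

payload = json.dumps({"siteUrl": SITE_URL, "url": PAGE_URL}).encode("utf-8")
request = urllib.request.Request(
    endpoint,
    data=payload,
    headers={"Content-Type": "application/json; charset=utf-8"},
)

with urllib.request.urlopen(request) as response:
    # On success the API conventionally returns a small JSON body
    # (an assumption worth verifying against the docs).
    print(response.status, response.read().decode("utf-8"))
```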
-
Very interesting. I never knew that.
And wow, that's the old-school Yahoo design. I haven't seen that look since viewing Yahoo.com in the Wayback Machine.
-
Yahoo uses Google in Japan (not that you, or anyone, really cares).
-
A large difference I've noticed between Bing and Google over the years is that Google is more inclined to index and place a site within the SERPs much quicker, basically giving a new site the benefit of the doubt; however, that site must maintain a good standing throughout the 'sandbox' period to ensure it doesn't drop off the map after a year or two.
Bing seems to show preference towards aged domains. Their search index, at least at one point (I'm sure they're working to update it, or might have even done so already), doesn't seem to be as fresh as Google's, which has its advantages as well.
With Google, you'll often find many new sites at the top of the SERPs for any given search on a non-highly-competitive term. That's just Google's way of getting more information to the masses, whether it's a scraped site or not (unfortunately, I'm still finding scraped sites in the index), whereas Bing seems to favor sites that are tried and true.
Just my observations over the years. However, it's been a while since I've really paid a whole lot of attention to this.
-
From what I have read and from my own experience, Bing is a lot more fussy about what they index; it's a lot harder to get into the index.
I have found that Bing also likes clean code, free from violations, and your site needs to be easy to crawl.
Bing is also quick to lose trust if you misuse things such as redirects, canonicals, and sitemaps. Duane Forrester told me that they will lose trust in your sitemap if your lastmod dates are not accurate or if you have any 404s in it; they only want 200-status pages. You should not only have a sitemap, you should keep it up to date; they have no intention of indexing everything the way Google does. I have also gotten sites to rank well in Bing for pretty good keywords with few or no links, so I don't think they rely on links so much.
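To make that sitemap-hygiene point actionable, here is a minimal sketch, assuming a standard sitemap.xml with <loc> and <lastmod> entries, that flags any URL in the sitemap that does not return a 200. The sitemap URL is a hypothetical placeholder, and it uses only the Python standard library.

```python
# A minimal sketch, not a definitive tool: fetch a sitemap and flag any
# entry that does not return a 200 status, since Bing reportedly loses
# trust in sitemaps containing 404s or inaccurate lastmod dates.
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # hypothetical placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(SITEMAP_URL) as resp:
    tree = ET.parse(resp)

for url_node in tree.findall(".//sm:url", NS):
    loc = url_node.findtext("sm:loc", namespaces=NS)
    if not loc:
        continue
    lastmod = url_node.findtext("sm:lastmod", default="(missing)", namespaces=NS)
    try:
        with urllib.request.urlopen(loc) as page:
            status = page.status
    except urllib.error.HTTPError as err:
        status = err.code
    note = "" if status == 200 else "  <-- fix or drop this entry"
    print(f"{status}  lastmod={lastmod}  {loc}{note}")
```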
-
Well, to begin, Yahoo search is now run off the Bing algorithm (algo). So while there may still be a "Yahoo Slurp" crawler out there, it's based on a different algo than it once was. Bing now completely powers Yahoo search.
Search engines have their own algorithms. There is no specific algo that they all must adhere to. So while rankings for your site might go up in one engine, they might very well go down in another (or not move at all).
And I can only assume Bing watches for black-hat SEO tactics; although I don't have any hard data to back that up, it's safe to say they do.
A huge mistake website owners make is optimizing their sites for Google only. Google only makes up about 65% of the search market, so by optimizing for Google, and Google alone, you're cutting off a potential 35% of traffic.
There are tons of forums, documentation, and webmaster tools for Bing, just as there are for Google, so you need to put in that extra effort to see what makes a site rank well in Bing.
As long as you stick to the fundamentals, i.e. proper internal link structure, attaining solid, safe, relevant backlinks to your site, using your webmaster tools (and SEOmoz ;)) to make sure site errors and such are taken care of, and getting your HTML error-free with proper H1-H6 tags (where applicable), title tags, meta tags, etc., then, and only then, should you start tweaking your site for direct optimization for each engine.
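As a quick way to eyeball those on-page fundamentals before any engine-specific tweaking, here is a minimal sketch, assuming a publicly fetchable page, that extracts the title tag, meta description, and H1s using only the Python standard library; the URL is a hypothetical placeholder.

```python
# A minimal sketch that pulls the basic on-page elements mentioned above
# (title tag, meta description, H1s) so you can sanity-check a page's
# fundamentals. The URL is a hypothetical placeholder.
import urllib.request
from html.parser import HTMLParser

class FundamentalsParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.h1s = []
        self._in_title = False
        self._in_h1 = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "h1":
            self._in_h1 = True
            self.h1s.append("")
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.meta_description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
        elif tag == "h1":
            self._in_h1 = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif self._in_h1 and self.h1s:
            self.h1s[-1] += data

PAGE_URL = "https://www.example.com/"  # hypothetical placeholder
with urllib.request.urlopen(PAGE_URL) as resp:
    html = resp.read().decode("utf-8", errors="replace")

parser = FundamentalsParser()
parser.feed(html)
print("Title:           ", parser.title.strip() or "(missing)")
print("Meta description:", parser.meta_description.strip() or "(missing)")
print("H1 tags:         ", [h.strip() for h in parser.h1s] or "(missing)")
```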