Why do some sites not rank in Google but rank well in Bing and Yahoo?
-
A few of my sites, e.g. Business-Training-Schools.com and Ultrasoundtechnicians.com, don't get many visits from Google, but they rank at the top in Bing and Yahoo. I have tried searching for an answer to this question but did not find anything convincing.
-
Google uses a different algorithm to calculate rankings than Bing and Yahoo (Yahoo is now powered by Bing, so the two will always give the same results).
While many of the factors that Bing and Google take into account are similar, they are not identical. What gets you hit with a Google penalty (and sees your site wiped off Google's SERPs) might have no effect in Bing.
Here's a handy infographic from pentonmarketingservices.com on the differences between the two. Of course, those differences are always shifting as Google constantly rolls out new updates.
-
I wouldn't worry about it; no one uses Bing or Yahoo!
-
I am relatively new to SEO, but different search engines weigh different ranking factors in their mix, so what works for one search engine might not convince another.
Your strategy should be aligned with your SEO traffic objectives. I personally would focus my SEO efforts on Google, because of its high volume of traffic, in order to meet those objectives.
I have a similar situation, but in reverse: I rank well in Google, I'm in the dumper in Bing, and mediocre in Yahoo. However, the traffic I get from Google is very high and meets my SEO objectives.
Hopefully my comments were of some help!
Thanks for sharing your situation.
Regards!
Related Questions
-
GSC Performance completely dropped off, but Google Analytics is steady. Why can't GSC track my site anymore?
Hey everyone! I'm having a weird issue that I've never experienced before. For one of my clients, GSC has a complete drop-off in the Performance section. All of the data shows that everything fell flat, or almost completely flat. But in Google Analytics, we have steady results. No huge drop-off in traffic, etc. Do any of you know why GSC would all of a sudden be unable to crawl our site? Or track this data? Let me know what you think!
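Not an answer for this specific site, but one pattern worth ruling out: Google Analytics counts JavaScript pageviews from human visitors, while Search Console only reports what Googlebot can crawl and index, so a stray noindex can flatten GSC while GA stays steady. A minimal sketch of a check for that (a hypothetical helper, not a Moz or Google tool; run it against the HTML of a few key URLs):

```python
import re

def find_robots_directives(html: str) -> set[str]:
    """Collect directives from <meta name="robots"> tags (case-insensitive)."""
    directives = set()
    for tag in re.finditer(r'<meta[^>]+name=["\']robots["\'][^>]*>', html, re.I):
        content = re.search(r'content=["\']([^"\']+)["\']', tag.group(0), re.I)
        if content:
            directives.update(d.strip().lower() for d in content.group(1).split(","))
    return directives

# Hypothetical page source for illustration:
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print("noindex" in find_robots_directives(page))  # True: invisible to GSC, still tracked by GA
```

The same drop-off can also come from a robots.txt change or an `X-Robots-Tag` response header, so check those too before assuming GSC itself is broken.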
Algorithm Updates | TaylorAtVelox
-
Are "Author Rank" and User Comments Driving Losses for YMYL Sites?
Hi, folks! Our company publishes 50+ active, disease-specific news and perspectives websites, mostly for rare diseases. We are also tenacious content creators: between news, columns, resource pages, and other content, we produce 1K+ pieces of original content across our network. Authors are either PhD scientists or patients/caregivers. All of our sites use the same design.
We were big winners with the August 2018 Medic update and the subsequent September/October update. However, the March Medic update and the April de-indexing bug were huge losers for us across our monetized sites (about 10 in total). We've seen some recovery with this early June update, but also some further losses. It's a mixed bag. Take a look at the attached Moz chart, which shows the jumps and falls around the various Medic updates; the pattern is very similar on many of our sites. As per JT Williamson's stellar article on EAT, I feel like we've done a good job of meeting those criteria, which has left me wondering what isn't jiving with the new core updates. I have two theories I wanted to run past you all:
1. Are user comments on YMYL sites problematic for Google now? I was thinking that user comments underneath health news and perspectives articles might be concerning on YMYL sites. On one hand, a healthy commenting community indicates an engaged user base and speaks to the trust and authority of the content. On the other hand, while the author of the article might be a PhD researcher or a patient advocate, how qualified are the people commenting? What if they are spouting off crazy ideas? Could Google's new update see user comments such as these as degrading the trust/authority/expertise of the page? The examples I linked to above have a good number of user comments. Could these now be problematic?
2. Is Google "Author Rank" finally happening, sort of? From what I've read about EAT, particularly for YMYL sites, it's important that authors have "formal expertise" and are, according to Williamson, "an expert in the field or topic." He continues that the author's expertise and authority "is informed by relevant credentials, reviews, testimonials, etc." Well, how is Google substantiating this? We no longer have the authorship markup, but is the algorithm doing its due diligence on authors in some more sophisticated way? It makes me wonder if we're doing enough to present our authors' credentials on our articles. For example: Magdalena is a PhD researcher, but her user profile doesn't appear at the bottom of the article, and clicking her name just takes you to her author category page (how WordPress-ish). Even worse, our resource pages don't even list the author.
Anyhow, I'd love to get some feedback from the community on these ideas. I know Google has said there's nothing to do to "fix" these downturns, but it'd sure be nice to get some of this traffic back! Thanks!
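One concrete step on the author-credentials front, whatever the algorithm is actually doing internally: structured data is the machine-readable way to present an author's credentials. A hedged sketch, with every name and URL invented for illustration, of building schema.org Article markup with an author Person node (generated in Python here just to keep it testable; the output would go in a JSON-LD script tag):

```python
import json

def article_author_jsonld(headline, author_name, credential, profile_url):
    """Build schema.org Article JSON-LD with an author Person node.
    All argument values are the caller's; the example values below are made up."""
    return {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {
            "@type": "Person",
            "name": author_name,
            "honorificSuffix": credential,
            # Should point to a real bio page, not a bare author category archive.
            "url": profile_url,
        },
    }

markup = article_author_jsonld(
    "New findings in rare-disease research",  # hypothetical headline
    "Jane Example", "PhD", "https://example.com/authors/jane-example")
print(json.dumps(markup, indent=2))
```

Pairing this with a visible author bio box at the bottom of each article covers both the human reader and whatever author evaluation Google may be doing.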
Algorithm Updates | Michael_Nace
-
SEO Myth-Busters -- Isn't there a "duplicate content" penalty by another name here?
Where is that guy with the mustache in the funny hat, and the geek, when you truly need them? SEL (Search Engine Land) said recently that there's no such thing as a "duplicate content" penalty: http://searchengineland.com/myth-duplicate-content-penalty-259657. By the way, I'd love to get Rand, Eric, or other Mozzers (aka TAGFEE'ers) to weigh in here if possible. The reason for this question is to double-check a possible "duplicate content"-type penalty (possibly by another name?) that might accrue in the following situation:
1. Assume a domain has a 30 Domain Authority (per OSE).
2. The site on the current domain has about 100 pages, all hand-coded. It does very well in SEO because we designed it to do so. The site is about 6 years old in its current incarnation, with a very simple, basically hand-coded e-commerce cart. I will not name the site for obvious reasons.
3. Business is good. We're upgrading to a new CMS (hooray!). In doing so we are implementing categories and faceted search, with plans to keep the site to under 100 new "pages" using a combination of rel=canonical and noindex. I will also not name the CMS for obvious reasons. In simple terms, as the site is built out and launched in the next 60-90 days, assuming we have 500 products and 100 categories, that yields at least 50,000 pages, and with other aspects of the faceted search it could easily create 10X that many.
4. In Screaming Frog tests of the dev site, it is quite evident that there are many tens of thousands of unique URLs that are basically the textbook illustration of a duplicate content nightmare. Screaming Frog has also been known to crash while spidering, and we've discovered thousands of such URLs from live sites using the same CMS. There is no question that spiders are somehow triggering some sort of infinite page generation, which we can see both on our dev site and out in the wild (in Google's supplemental index).
5. Since there is no "duplicate content penalty" and there never was, are there other risks here caused by infinite page generation, like burning up a theoretical "crawl budget" or having the bots miss pages, or other negative consequences?
6. Is it also possible that bumping a site that ranks well with 100 pages up to 10,000 pages or more dilutes link juice as a result of all this (honest but inadvertent) duplicate content? In other words, is inbound link juice and ranking power essentially divided by the number of pages on a site? Sure, it may be somewhat mediated by internal linking, but what are the actual big-dog issues here?
So has SEL's "duplicate content myth" truly been myth-busted in this particular situation? Thanks a million!
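On point 5, the scale of the problem is easy to underestimate, because optional facets multiply rather than add. A back-of-the-envelope sketch (facet counts invented to roughly mirror the question's numbers):

```python
from math import prod

def facet_url_count(facet_values: dict[str, int]) -> int:
    """Number of distinct filter-combination URLs when every facet is optional:
    each facet contributes (values + 1) choices, the +1 being 'not selected'."""
    return prod(v + 1 for v in facet_values.values())

# Hypothetical store, numbers invented for illustration:
facets = {"category": 100, "brand": 20, "color": 10, "price_band": 5}
print(facet_url_count(facets))  # 101 * 21 * 11 * 6 = 139986 crawlable URL variants
```

That's why rel=canonical alone isn't enough: canonicalized pages still get crawled, so crawl budget can be spent on filter permutations while deeper product pages wait. Blocking low-value facet parameters from crawling entirely is the usual complement.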
Algorithm Updates | seo_plus
-
Google creating its own content
I am based in Australia, but a US-based search on 'sciatica' shows an awesome answer on the RHS of the SERP: https://www.google.com/search?q=sciatica&oq=sciatica&aqs=chrome.0.69i59.3631j0j7&sourceid=chrome&ie=UTF-8 The download on sciatica is a PDF created by Google. Firstly, is this common in the US? Secondly, any input on where this is heading for rollout would be appreciated. Is Google now creating its own content to publish?
Algorithm Updates | ClaytonJ
-
Site refuses to improve rankings. Can someone else put a set of eyes on this for me and see what I am missing?
Hello! We've been successful with over 40 clients in our industry, insurance, getting them to great results. We recently acquired a new client whose existing website had prior SEO work: a very spammy blog and many spammy links. We've removed many of the blog articles and disavowed the links using the Google Disavow Tool. We've been monitoring this site in a Moz campaign, but we're seeing zero improvement week to week. Can someone put another set of eyes on this and see if we're simply missing something? Of all 30 of our tracked keywords, zero are in the top 50! I would guess this was an algorithm penalty, but it has been 3 months since we made the changes and nothing is changing, not even a little bit! Any help/suggestions would be GREATLY appreciated. Thank you and enjoy Labor Day weekend!
Algorithm Updates | Tosten
-
Why do in-site search result pages rank better than my product pages?
Maybe this is a common SERP for a generic product type, but I'm seeing it a lot more often. Here is an example SERP for "rolling stools": the top 4 results are dynamic in-site search pages from Sears, eBay, and Amazon (among others). I understand their influence and authority, but why would a search return a dynamic in-site SERP instead of a solid product page? A better question would be: how do I get my in-site SERPs to rank, or how do I get my client's page to rise above the #5 spot it currently ranks at? Thanks
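For what it's worth, Google's own guidance asks site owners to keep internal search result pages out of its index, so the big retailers ranking here are the exception rather than a model to copy. If you decide your own in-site SERPs should not compete, the usual lever is robots.txt (the paths below are placeholders; match them to your site's actual search URL pattern):

```
User-agent: *
# Block crawling of internal search result pages (hypothetical paths)
Disallow: /search/
Disallow: /*?q=
```

Note that robots.txt blocks crawling, not indexing; a noindex meta tag on the search result templates is the stricter option, but Google can only see it if the page stays crawlable, so don't combine the two on the same URLs.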
Algorithm Updates | BenRWoodard
-
Organic ranking vs Local ranking
After reading this blog entry from Dr. Pete on Mirametrix, my question would be: what's more important for a local company, being in the 7-pack or in the top 10 organic results? Which one attracts more clicks? Has optimization for local ranking just become more important than traditional SEO?
Algorithm Updates | echo
Lesser-visited but highly ranked landing pages dropped in rank on Google. Time for a content update?
I noticed that my page-one ranked landing pages that don't get a lot of love from me have dropped in rank big time on Google this week. This is a site with static landing pages (meaning I can't freshen up the content easily) for products that we sell. The pages that dropped are the ones that have the fewest inbound links and don't get much attention on the social media side. Our most important landing pages have also dropped, but just a few spots on page one. This is a first for me. Does anyone think this is a "lack of freshness" penalty? We are still number one on page one for our brand search terms. Would fresh content give me a shot at getting the pages back up? I'm willing to update them slowly, but before I go crazy, I'm reaching out to the pros here.
Algorithm Updates | Ticket_King