Google Has Indexed Most of My Site, So Why Won't Bing?
-
We've got 600K+ pages indexed by Google and have submitted the same sitemap.xml files to Bing, but have only seen 100-200 pages get indexed by Bing.
Is this fairly typical? Is there anything further we can do to increase indexation on Bing?
-
How much time has passed since you submitted your sitemap to Bing?
I am not sure about other SEO specialists, but I don't spend my time on Bing/Yahoo. As far as I know they have less than 30% of the search market combined. I would rather allocate my time to moving more pages of my website to the first page of Google.
If you are still concerned about being indexed on Bing try to compare your website with your competitors (who are on Bing of course).
- See how many pages they have indexed.
- See if you can get the same backlinks.
- See what robots.txt they have. (Maybe the problem is in it.)
- See if your site has any errors.
- See if your site is banned from Bing for some reason.
The points above are the ones I would check first.
Warmest regards,
Slava
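For the robots.txt check in Slava's list, a quick way to test whether Bingbot is being blocked is Python's standard-library robots.txt parser. This is a minimal sketch; the robots.txt body and URLs are made-up examples, not anyone's real site:

```python
from urllib.robotparser import RobotFileParser

def bingbot_allowed(robots_txt: str, url: str) -> bool:
    """Check whether a robots.txt body permits Bingbot to fetch a URL."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("bingbot", url)

# Hypothetical robots.txt that blocks Bingbot from one directory only.
robots = """
User-agent: bingbot
Disallow: /private/

User-agent: *
Disallow:
"""

print(bingbot_allowed(robots, "https://example.com/page1"))     # True
print(bingbot_allowed(robots, "https://example.com/private/x"))  # False
```

In practice you would fetch your live robots.txt and run your key URLs through a check like this before digging into anything more exotic.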
-
Hi James,
Bing is very slow at indexing the pages of any website, while Google is very fast and indexes pages almost as soon as they appear. That speed is part of why Google is number one in the search world. I also use Google most of the time, and Bing and Yahoo after that.
But you don't need to worry, because roughly 80 percent of people use Google and only about 10 percent use Bing.
Bing is also improving gradually, and hopefully it will establish itself as a serious competitor.
Bing's FAQs state that you should do the following on-site SEO to rank well in their search engine:
· Target no more than two keywords per page
· Use unique <title> tags on each page
· Use unique <meta> description tags on each page
· Use H1 tags
· Use text navigation links
· Create content for your human visitors, not the Bing web crawler
· Incorporate keywords into URL strings
You can find all those details here: http://silverrose.hubpages.com/hub/Optimizing-your-website-for-Bing
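The "unique <title> and <meta> description on each page" items in that checklist can be audited programmatically. A minimal sketch using only the Python standard library; the parser class, helper names, and sample pages are illustrative, not any Bing-provided tool:

```python
from collections import Counter
from html.parser import HTMLParser

class TitleMetaParser(HTMLParser):
    """Extract the <title> text and meta description from one HTML document."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def find_duplicates(pages):
    """Return (titles, descriptions) that appear on more than one page."""
    titles, descs = Counter(), Counter()
    for html in pages:
        p = TitleMetaParser()
        p.feed(html)
        titles[p.title] += 1
        descs[p.description] += 1
    return ([t for t, n in titles.items() if n > 1],
            [d for d, n in descs.items() if n > 1])

# Two toy pages sharing a title but not a description.
pages = [
    '<html><head><title>Widgets</title><meta name="description" content="Buy widgets"></head></html>',
    '<html><head><title>Widgets</title><meta name="description" content="Widget specs"></head></html>',
]
dup_titles, dup_descs = find_duplicates(pages)
print(dup_titles)  # ['Widgets']
print(dup_descs)   # []
```

Running something like this over a crawl of your own pages surfaces duplicate titles and descriptions before a search engine penalizes you for them.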
Related Questions
-
Google adding text to SERP title which isn't relevant
Hi guys, I have a site with around 300 articles on it, and these articles came from three old domains which were migrated during a Wordpress domain migration almost four months back. The problem I'm having is that for quite a lot of the articles in the SERP, Google is adding '- Maine Coons' to the end of the title. One of our old domains was related to this breed of cat, so in Google's eyes at least it must have something to do with this, I guess. I've attached a screenshot that shows one such example. What's odd is that a lot of the new content that has been created also has this suffix added, and it doesn't show in any other search engine. So it doesn't appear in other search engines, and it's not coming from the article itself (proved also via developer tools inspecting the code). Google is adding it, but as you can see in this example (there are many more) it has absolutely no relevance to the post. Has anyone seen this behavior or have any idea how to fix it? I've tried all kinds of things and have even hired SEO 'experts' that haven't been able to see any problems. Any clues? Thanks, Matt
Technical SEO | | mattpettitt0 -
404's being re-indexed
Hi All, We are experiencing issues with pages that have been 404'd still being indexed. Originally, these were /wp-content/ index pages that were included in Google's index. Once I realized this, I added a directive to our htaccess to 404 all of these pages, as there were hundreds. I tried to let Google crawl and remove these pages naturally, but after a few months I used the URL removal tool to remove them manually. However, Google seems to be continually re-indexing these pages, even after they have been manually requested for removal in Search Console. Do you have suggestions? They all respond with 404s. Thanks
Technical SEO | | Tom3_151 -
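One step often suggested for the re-indexing problem above is serving 410 ("Gone") instead of 404, since 410 signals a permanent removal and tends to get stale URLs dropped faster. Here is a minimal stdlib sketch for auditing what the removed URLs actually return; the function names are illustrative, and `fetch_status` does a live HTTP request, so run it against your own URLs:

```python
import urllib.error
import urllib.request

def fetch_status(url: str) -> int:
    """Return the HTTP status code a URL actually serves."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def is_gone(status: int) -> bool:
    # 404 works, but 410 ("Gone") tells crawlers the removal is permanent,
    # which is generally the stronger signal for de-indexing.
    return status in (404, 410)

# Example usage against a list of removed URLs:
#   for url in removed_urls:
#       print(url, fetch_status(url))
```

If any of the "removed" pages still return 200 (or redirect somewhere that does), that alone would explain the re-indexing.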
Why isn't our new site being indexed?
We built a new website for a client recently. Site: https://www.woofadvisor.com/ It's been live for three weeks. Robots.txt isn't blocking Googlebot or anything. Submitted a sitemap.xml through Webmasters but we still aren't being indexed. Anyone have any ideas?
Technical SEO | | RobbieD910 -
How Does Google's "index" find the location of pages in the "page directory" to return?
This is my understanding of how Google's search works, and I am unsure about one thing in particular: Google continuously crawls websites and stores each page it finds (let's call it the "page directory"). Google's "page directory" is a cache, so it isn't the "live" version of the page. Google has separate storage called "the index", which contains all the keywords searched. These keywords in "the index" point to the pages in the "page directory" that contain the same keywords. When someone searches a keyword, that keyword is looked up in the "index" and returns all relevant pages in the "page directory". These returned pages are given ranks based on the algorithm. The one part I'm unsure of is how Google's "index" knows the location of relevant pages in the "page directory". The keyword entries in the "index" point to the "page directory" somehow. I'm thinking each page has a URL in the "page directory", and the entries in the "index" contain these URLs. Since Google's "page directory" is a cache, would the URLs be the same as on the live website (and would the keywords in the "index" point to these URLs)? For example, if a webpage is found at www.website.com/page1, would the "page directory" store this page under that URL in Google's cache? The reason I want to discuss this is to understand the effects of changing a page's URL by understanding how the search process works better.
Technical SEO | | reidsteven750 -
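The mental model in the question above (a "page directory" keyed by URL, plus an inverted "index" mapping keywords to those URLs) can be sketched in a few lines. The pages and keywords are toy data, not anything Google-specific:

```python
from collections import defaultdict

# A toy "page directory": document URL -> cached page text.
page_directory = {
    "www.website.com/page1": "cheap flights to maine",
    "www.website.com/page2": "maine coon cat care",
    "www.website.com/page3": "cheap hotel deals",
}

# Build the inverted index: keyword -> set of URLs whose cached text contains it.
index = defaultdict(set)
for url, text in page_directory.items():
    for word in text.split():
        index[word].add(url)

def search(keyword):
    """Look up a keyword in the index, then pull matching pages from the directory."""
    return {url: page_directory[url] for url in index.get(keyword, set())}

print(sorted(search("cheap")))  # ['www.website.com/page1', 'www.website.com/page3']
```

This also answers the "location" question in miniature: the index entries store the same URL the directory is keyed by, which is why changing a page's URL forces the engine to re-crawl and re-link everything pointing at the old key.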
Why my site is not indexing in google
In Google Webmaster Tools I updated my sitemap on Mar 6th. There are around 22,000 links, but Google has fetched only 5,300 of them for a long time...
Technical SEO | | Rajesh.Chandran
I waited for a month with no improvement in Google's index, so on Apr 6th we uploaded a new sitemap (1,200 links in total), but only 4 links were indexed by Google.
Why is Google not indexing my URLs? Does this affect our ranking in the SERPs? How many links is it advisable to submit in a sitemap for a website?0 -
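On the "how many links per sitemap" question above: the sitemap protocol caps a single file at 50,000 URLs, and large sites usually split their URLs across several files referenced by a sitemap index. A minimal stdlib sketch of the splitting step; the URLs are fabricated, and the 10,000-per-file chunk size is just to make the split visible (the real limit is 50,000):

```python
import xml.etree.ElementTree as ET

def chunk(urls, size):
    """Split a URL list into sitemap-sized chunks."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def build_sitemap(urls):
    """Build one <urlset> sitemap document from a chunk of URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")

# 22,000 hypothetical URLs, mirroring the question, split into 3 files.
urls = [f"https://example.com/page{i}" for i in range(22000)]
sitemaps = [build_sitemap(c) for c in chunk(urls, 10000)]
print(len(sitemaps))  # 3
```

Note that a sitemap only invites crawling; it does not guarantee indexing, so a large gap between submitted and indexed counts usually points at content quality or crawl-budget issues rather than the sitemap itself.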
Is submitting your site to Yahoo & Google still relevant?
Good morning from a "sh@t, it's still raining" Wetherby, UK... I just want to make sure the process I go through when a new site is launched is not overlooking some fundamentals. Most sites we launch are not brand new; they already have a link heritage and have been indexed by Google. With that in mind I do not submit a site's URL through the following links: www.google.com/addurl
Technical SEO | | Nightwing
search.yahoo.com/info/submit.html
search.live.com/docs/submit.aspx Am I right in saying you should really only bother with this if the site is a newbie, i.e. no history, no link heritage, and the site is entering cyberspace for the first time? And I wonder if, for example, you launched a new site, made sure the XML sitemap was in place, and it had a few inbound links, it would be indexed anyway. So is the practice of submitting your URL to search engines relevant anymore? Any insights welcome 🙂2
Google.ca is showing our US site instead of our Canada Site
When our Canadian users search on google.ca for our brand (e.g. Travelocity, Travelocity hotels, etc.), the first few results are from our US site (travelocity.com) rather than our Canadian site (travelocity.ca). In Google Webmaster Tools, we've adjusted the geotargeting settings to focus on the appropriate locale, but the wrong country TLD is still coming up at the top via google.ca. What's the best way to ensure our Canadian site comes up instead of the US site on google.ca? Thanks, Tory Smith
Technical SEO | | travelocitysearch
Travelocity0 -
Getting Google to index new pages
I have a site, called SiteB that has 200 pages of new, unique content. I made a table of contents (TOC) page on SiteB that points to about 50 pages of SiteB content. I would like to get SiteB's TOC page crawled and indexed by Google, as well as all the pages it points to. I submitted the TOC to Pingler 24 hours ago and from the logs I see the Googlebot visited the TOC page but it did not crawl any of the 50 pages that are linked to from the TOC. I do not have a robots.txt file on SiteB. There are no robot meta tags (nofollow, noindex). There are no 'rel=nofollow' attributes on the links. Why would Google crawl the TOC (when I Pinglered it) but not crawl any of the links on that page? One other fact, and I don't know if this matters, but SiteB lives on a subdomain and the URLs contain numbers, like this: http://subdomain.domain.com/category/34404 Yes, I know that the number part is suboptimal from an SEO point of view. I'm working on that, too. But first wanted to figure out why Google isn't crawling the TOC. The site is new and so hasn't been penalized by Google. Thanks for any ideas...
Technical SEO | | scanlin0
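For the TOC question above, one quick sanity check is parsing the TOC page yourself and confirming which links carry rel=nofollow, since a nofollow (or a missing href) would explain links the crawler ignores. A stdlib sketch; the class name and sample HTML are illustrative:

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collect every href on a page and flag links marked rel=nofollow."""
    def __init__(self):
        super().__init__()
        self.followed, self.nofollowed = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        if "nofollow" in attrs.get("rel", ""):
            self.nofollowed.append(href)
        else:
            self.followed.append(href)

# Toy TOC markup echoing the question's URL shape.
auditor = LinkAuditor()
auditor.feed('<a href="/category/34404">Ch 1</a> <a rel="nofollow" href="/x">x</a>')
print(auditor.followed)    # ['/category/34404']
print(auditor.nofollowed)  # ['/x']
```

If the TOC's links all come back clean, the likelier explanation is simply crawl scheduling: a bot fetching one pinged page does not commit to crawling its outlinks on the same visit.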