Bing's indexed pages vs pages appearing in results
-
Hi all
We're stepping up our efforts to rank for our keywords on Bing, and I'm discovering a few unexpected challenges. Namely, Bing is reporting that 16,000+ pages have been crawled... yet a site:mywebsite.com search on Bing shows fewer than 1,000 results.
I'm aware that Duane Forrester has said they don't want to show everything, only the best. If that's the case, what factors should we consider most to encourage Bing's engine to display most, if not all, of the pages they crawl on my site?
I have a few ideas of what may be turning Bing off, so to speak (some duplicate content issues, 301 redirects due to URL structure updates), but if there's something in particular we should monitor and/or check, please let us know. We'd like to prioritize accordingly.
Thanks!
-
Yep, if Bing Webmaster Tools doesn't show problems with the sitemap, I'd focus on the points I highlighted back in mid-June on this thread (make content robust and unique, and make sure text is in HTML).
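One quick way to verify the "text is in HTML" point is to check whether a key phrase appears in the raw, served HTML source — roughly what a crawler sees before any JavaScript runs. A minimal sketch in Python (the sample markup here is invented for illustration):

```python
import re

def text_in_raw_html(html: str, snippet: str) -> bool:
    """True if the snippet appears in the served HTML source.

    Crawlers index what is in the HTML they fetch; text injected
    later by JavaScript may never be seen. Whitespace and case are
    normalised before comparing.
    """
    norm = lambda s: re.sub(r"\s+", " ", s).lower()
    return norm(snippet) in norm(html)

served = "<html><body><h1>Road  Trips</h1><script>render()</script></body></html>"
print(text_in_raw_html(served, "road trips"))    # True
print(text_in_raw_html(served, "trip planner"))  # False
```

If a phrase only shows up after the page renders in a browser but fails a check like this against the raw source, that content is likely invisible to Bingbot.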
Good luck,
Kristina
-
Hello again Kristina
Bing's showing 38,885 pages indexed... and I've noticed that the number of pages varies as I click through several pages of results.
So I guess the problem isn't why they aren't indexing, but rather why they aren't showing all pages. I'd assume this is related to page quality (content, on-page ranking factors, etc.)?
-
I haven't heard of Bing keeping historically submitted sitemaps and confusing them, although I know that they're very picky about the number of inaccuracies they find in a sitemap, so it's possible they keep the latest one around so they can refer to it if the current one seems to have holes.
That said - when you search for your site, are the same pages coming up on the first page? What about the second? Third? The number of pages that come up when you search for site:mysite.com are approximations and can vary even as you scroll through the results pages. The more important question is, how many pages does Bing say are indexed in Bing Webmaster Tools?
-
Just an update:
Bing reported a successful crawl after we submitted a new sitemap, then rejected it based on an error it didn't describe. We took it down, made a change to the URL itself (somehow the .gz extension was missing) and resubmitted on 7/7/13.
Since then, Bing has reported a successful crawl, then reported a successful crawl on 6/30/13 (7 days before submission?), then a failed crawl on 7/5/13 (2 days before submission?), and today it's again reporting a successful crawl on 7/7/13.
So my question now is... does Bing keep a record of historically submitted sitemaps and confuse them with new submissions of the same ones? I've yet to see Bing actually index what's in the sitemaps; a site: operator search still fluctuates daily between 1,200 and 3,300 results, sometimes going up to 4,400. Right now, searching site:roadtrippers.com on Bing reports 4,420 results. Later today, I imagine it'll be around 3,300 or 1,200.
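Given that Bing is known to be strict about inaccuracies in sitemaps, one thing worth ruling out before blaming its bookkeeping is the file itself. A rough local sanity check for a gzipped sitemap, sketched in Python (the sample XML is illustrative; the 50,000-URL cap comes from the sitemaps.org protocol):

```python
import gzip
import io
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def validate_sitemap_gz(data: bytes):
    """Decompress and parse a gzipped sitemap, returning its URLs.

    Raises if the file isn't valid gzip/XML, the root element is
    wrong, any <loc> is not an absolute URL, or the file exceeds
    the protocol's 50,000-URL cap.
    """
    with gzip.open(io.BytesIO(data)) as fh:
        root = ET.parse(fh).getroot()
    assert root.tag == NS + "urlset", "root element must be <urlset>"
    urls = [loc.text.strip() for loc in root.iter(NS + "loc")]
    bad = [u for u in urls if not u.startswith(("http://", "https://"))]
    assert not bad, "non-absolute URLs: %r" % bad
    assert len(urls) <= 50000, "sitemap files are capped at 50,000 URLs"
    return urls

xml = b"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://roadtrippers.com/</loc></url>
  <url><loc>http://roadtrippers.com/trips</loc></url>
</urlset>"""
print(validate_sitemap_gz(gzip.compress(xml)))
```

Running a check like this on the exact bytes you upload catches broken gzip encoding, malformed XML, and relative URLs before Bing ever sees them.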
Any suggestions at all would be greatly appreciated.
-
Good luck!
If these tips don't work, you should follow up here again, but include a little more information about your site. It's possible that Bing IS crawling all of your pages properly, but something about them is making Bing think that they aren't valuable enough to be in their indexes. I'd particularly look to see if:
- Content seems to be duplicate, either within your site or if it's duplicated elsewhere
- Content is extremely thin (fewer than 100 words on a page/no unique text above the fold)
- Content is unreadable by Bing: check the cached version of a page that's not indexed and make sure you can read the unique content
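To flag thin pages in bulk, one option is to strip the markup and count the words a crawler can actually read. A minimal sketch using Python's standard-library HTML parser (the 100-word threshold mirrors the rule of thumb above; the sample page is invented):

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, ignoring <script>/<style> contents."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def visible_word_count(html: str) -> int:
    parser = TextExtractor()
    parser.feed(html)
    return len(" ".join(parser.chunks).split())

page = "<html><body><p>Just a short teaser.</p><script>var x=1;</script></body></html>"
print(visible_word_count(page))  # 4 -> well under a 100-word threshold
```

Run this across a crawl of your own site and sort ascending; the pages at the top of that list are the likeliest candidates for Bing's "not valuable enough" bucket.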
Hope this helps! I'm going to mark this question as "answered," only because if you have a follow up question, it'll probably be more specific now that you have more information, and I'd like all of that info to be included in the original question.
Best,
Kristina
-
Hey Kristina
It has not unfortunately.
Bing reports successful crawls; however, it isn't actually crawling the sitemap at all.
After reading more about Bing's sitemap preferences, there are a few things left to try. I'm using this post on Bing's forums http://www.bing.com/blogs/webmaster/f/12248/t/659635.aspx#9602607 as a reference for now. We're going to make a temporary separate sitemap for Bing to test what is suggested in that link. Hopefully something sticks and we can make some progress going forward!
Brandon
-
Hi Brandon,
Just wanted to check in - did using one sitemap work?
Kristina
-
I believe I've found the solution: according to a post from as recently as 2009, Bing was only crawling one sitemap per website. The post also said Bing would only crawl the most recently submitted sitemap, but that doesn't appear to have been the case for our site.
So I've since removed the old sitemap and am waiting to see some evidence of our new sitemap being crawled and indexed.
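After swapping sitemaps, you can also nudge Bing directly: at the time of this thread, Bing (like Google) accepted sitemap pings at bing.com/ping?sitemap=<url>, with the sitemap URL percent-encoded. A small helper to build that ping URL (no request is actually made here; the domain is taken from the thread):

```python
from urllib.parse import urlencode

def bing_ping_url(sitemap_url: str) -> str:
    """Build the Bing sitemap-ping URL for a given sitemap.

    urlencode percent-encodes the sitemap URL so it survives
    being passed as a query-string parameter.
    """
    return "http://www.bing.com/ping?" + urlencode({"sitemap": sitemap_url})

print(bing_ping_url("http://roadtrippers.com/sitemap.xml.gz"))
# http://www.bing.com/ping?sitemap=http%3A%2F%2Froadtrippers.com%2Fsitemap.xml.gz
```

Fetching that URL (e.g. with urllib.request) tells Bing a fresh sitemap is available, which can shorten the wait for evidence of a recrawl.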