Bing's indexed pages vs pages appearing in results
-
Hi all
We're trying to increase our efforts in ranking for our keywords on Bing, and I'm discovering a few unexpected challenges. Namely, Bing is reporting that 16,000+ pages have been crawled... yet a site:mywebsite.com search on Bing shows fewer than 1,000 results.
I'm aware that Duane Forrester has said they don't want to show everything, only the best. If that's the case, what factors should we consider most to encourage Bing's engine to display most, if not all, of the pages they crawl on my site?
I have a few ideas of what may be turning Bing off, so to speak (some duplicate content issues, and 301 redirects due to URL structure updates), but if there's something in particular we should monitor and/or check, please let us know. We'd like to prioritize accordingly.
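On the 301 point, one check we can run ourselves is confirming that each old URL answers with a single 301 pointing straight at its new equivalent (chains and 302s can slow or dilute consolidation). A minimal sketch in Python, using only the standard library; the URL pairs are hypothetical placeholders for a real redirect map:

```python
# Spot-check 301 redirects from an old URL structure without following them.
import http.client
from urllib.parse import urlsplit


def fetch_redirect(url):
    """Issue one HEAD request and return (status, Location header)."""
    parts = urlsplit(url)
    if parts.scheme == "https":
        conn = http.client.HTTPSConnection(parts.netloc, timeout=10)
    else:
        conn = http.client.HTTPConnection(parts.netloc, timeout=10)
    conn.request("HEAD", parts.path or "/")
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location")


def is_clean_301(status, location, expected):
    """True for a single permanent redirect pointing exactly at the new URL."""
    return status == 301 and location == expected


# Usage (makes live requests, so commented out):
# status, location = fetch_redirect("https://example.com/old-path")
# print(is_clean_301(status, location, "https://example.com/new-path"))
```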
Thanks!
-
Yep, if Bing Webmaster Tools doesn't show problems with the sitemap, I'd focus on the points I highlighted back in mid-June on this thread (make content robust and unique, and make sure text is in HTML).
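On the "text is in HTML" point, one quick self-check is to confirm your key copy appears in the raw page source a crawler receives, not only after JavaScript runs. A rough sketch, assuming Python and a phrase of your own unique copy:

```python
# Check that a phrase of unique copy is present in the static HTML,
# ignoring <script>/<style> contents. Standard library only.
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collect visible text nodes, skipping <script> and <style> blocks."""

    def __init__(self):
        super().__init__()
        self._skip_depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.chunks.append(data)


def text_in_html(html, phrase):
    """True if `phrase` appears in the page's static (non-script) text."""
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(" ".join(parser.chunks).split()).lower()
    return phrase.lower() in text
```

Fetch the page source with urllib or curl and feed it in; if the phrase only shows up after JavaScript runs in a browser, that text may be invisible to Bingbot.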
Good luck,
Kristina
-
Hello again Kristina
Bing's showing 38,885 pages indexed... and I've noticed the number of results varies after clicking through several pages.
So I guess the problem isn't why they aren't indexing, but rather why they aren't showing all pages. I'd assume this is related to page quality (content, on-page ranking factors, etc.)?
-
I haven't heard of Bing keeping historically submitted sitemaps and confusing them, although I know that they're very picky about the number of inaccuracies they find in a sitemap, so it's possible they keep the latest one around so they can refer to it if the current one seems to have holes.
That said - when you search for your site, are the same pages coming up on the first page? What about the second? Third? The counts that come up when you search for site:mysite.com are approximations and can vary even as you scroll through the results pages. The more important question is: how many pages does Bing say are indexed in Bing Webmaster Tools?
-
Just an update:
Bing reported a successful crawl after I submitted a new sitemap, then rejected it based on an error that it didn't describe. I took it down, made a change to the URL itself (somehow the .gz extension wasn't there) and resubmitted on 7/7/13.
Since then, Bing has reported a successful crawl, then a successful crawl dated 6/30/13 (7 days before submission?), then a failed crawl dated 7/5/13 (2 days before submission?), and now today it's again reporting a successful crawl on 7/7/13.
So my question now is... does Bing keep records of historically submitted sitemaps and confuse them with new submissions of the same ones? I've yet to see Bing actually index what's in the sitemaps, as a site: operator search still fluctuates daily between 1,200 and 3,300 results, sometimes going up to 4,400. Right now, searching site:roadtrippers.com on Bing reports 4,420 results. Later today, I imagine it'll be around 3,300 or 1,200.
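For what it's worth, a malformed file can at least be ruled out locally before resubmitting. A small sketch, assuming Python, that checks a gzipped sitemap decompresses cleanly, parses as XML, and actually contains URLs:

```python
# Sanity-check a gzipped sitemap before submission: valid gzip, well-formed
# XML, and at least one <loc> entry. Failing fast locally is cheaper than
# waiting on Bing's next crawl report.
import gzip
import xml.etree.ElementTree as ET

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"


def check_sitemap_gz(path):
    """Return the number of <loc> URLs; raise if the file is malformed."""
    with gzip.open(path, "rb") as f:
        root = ET.parse(f).getroot()
    locs = [el.text for el in root.iter(NS + "loc")]
    if not locs:
        raise ValueError("no <loc> entries found")
    return len(locs)
```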
Any suggestions at all would be greatly appreciated.
-
Good luck!
If these tips don't work, you should follow up here again, but include a little more information about your site. It's possible that Bing IS crawling all of your pages properly, but something about them is making Bing think they aren't valuable enough to be in its index. I'd particularly look to see if:
- Content seems to be duplicate, either within your site or if it's duplicated elsewhere
- Content is extremely thin (fewer than 100 words on a page, or no unique text above the fold)
- Content is unreadable by Bing: check the cached version of a page that's not indexed and make sure you can read the unique content
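The first two checks can also be scripted once you've extracted each page's body text. A rough sketch, assuming Python; the duplicate check here is an exact-match fingerprint, whereas a real audit would also look for near-duplicates:

```python
# Flag thin and exactly-duplicated pages. `pages` maps URL -> extracted
# body text. The 100-word threshold mirrors the rule of thumb above.
import hashlib


def audit_pages(pages, min_words=100):
    """Return (thin_urls, duplicate_pairs)."""
    thin = [url for url, text in pages.items() if len(text.split()) < min_words]
    seen = {}
    duplicates = []
    for url, text in pages.items():
        # Normalize whitespace and case so trivial formatting differences
        # don't hide an exact duplicate.
        fingerprint = hashlib.sha1(" ".join(text.lower().split()).encode()).hexdigest()
        if fingerprint in seen:
            duplicates.append((url, seen[fingerprint]))
        else:
            seen[fingerprint] = url
    return thin, duplicates
```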
Hope this helps! I'm going to mark this question as "answered," only because if you have a follow-up question, it'll probably be more specific now that you have more information, and I'd like all of that info to be included in the original question.
Best,
Kristina
-
Hey Kristina
It has not, unfortunately.
Bing reports successful crawls; however, it's not actually crawling the site - at all.
After reading more about Bing's sitemap preferences, there are a few things left to try. I'm using this post on Bing's forums http://www.bing.com/blogs/webmaster/f/12248/t/659635.aspx#9602607 as a reference for now. We're going to make a temporary separate sitemap for Bing to test what is suggested in that link. Hopefully something sticks and we can make some progress going forward!
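For the test, the temporary sitemap only needs the minimal structure from the sitemaps.org protocol (the format Bing consumes). A sketch, assuming Python, with placeholder URLs:

```python
# Build a minimal sitemap per the sitemaps.org protocol. URLs are
# placeholders for the pages in the temporary Bing test sitemap.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(urls):
    """Return sitemap XML as bytes for a list of absolute URLs."""
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for u in urls:
        url_el = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(url_el, "{%s}loc" % SITEMAP_NS).text = u
    return ET.tostring(urlset, encoding="utf-8", xml_declaration=True)


# Example:
# with open("sitemap-bing-test.xml", "wb") as f:
#     f.write(build_sitemap(["https://example.com/", "https://example.com/page-1"]))
```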
Brandon
-
Hi Brandon,
Just wanted to check in - did using 1 sitemap work?
Kristina
-
I believe I've found the solution - as recently as 2009, Bing was only crawling one sitemap per website. The same source said Bing would crawl only the most recently submitted sitemap, but that doesn't appear to have been the case for our site.
So I've since removed the old sitemap and am waiting to see some evidence of our new sitemap being crawled and indexed.