Submitting an HTTPS sitemap.xml to Bing
-
I have been trying to submit my sitemap to Bing [via their Webmaster Tools] for well over a week, and it continues to report 'pending'. My site is HTTPS and the sitemap is accepted by Google. I questioned Bing about this and got this response:
To set your expectations, our Sitemap fetchers use a different pipeline and because of this, we cannot crawl Sitemaps in HTTPS format. We require that you submit an HTTP version of sitemap in order for Bing to properly crawl the file. Please go ahead and delete the current Sitemap and resubmit a new one in HTTP.
Currently I don't and can't have an HTTP version of my site and sitemap, and my developers are telling me that 3hrs worth of dev time will go into coming up with a work-around, which I'm not sure I want to invest in [I have more important things to concentrate my spend on!].
Has anyone else been faced with this problem, and is there any quick/cheap alternative, or do I just accept that Bing won't crawl my sitemap until they update their end?!
-
Hi Matthew, your response makes perfect sense. Thankfully Bing [seems to be!!] indexing my site - well, certainly the pages that count, as we are showing up in search results. We've been trying to come up with a work-around, but all solutions involve an element of dev time, which I don't really think is money well spent - at the moment, anyway!
Cheers
Iain
-
Hey Iain. If it were me, I'd probably just accept that Bing can't crawl the sitemap and let it go. XML sitemaps are important, but not something that will generally make a huge, life-altering difference to your website's performance.
Now, I say "probably" because I'm wondering if you are having indexing problems with Bing. Are there pages you want Bing to index that maybe they can't reach easily (or at all) without an XML sitemap? If that is the case, then maybe it is worth the 3 hours of dev time to get the XML sitemap in place. Alternatively, you could find other ways to link to those pages Bing isn't currently indexing (on your site or others) to get those pages noticed.
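If those indexing problems do turn out to matter, there is a middle ground that may cost less than a full work-around: exempt just the sitemap URL from the HTTP-to-HTTPS redirect, so Bing's HTTP-only fetcher can read it while everything else stays secure. A rough sketch, assuming an Apache server using mod_rewrite for the redirect (example.com and the sitemap path are placeholders; adjust for your actual stack):

```apache
# Redirect all HTTP traffic to HTTPS, except /sitemap.xml,
# so an HTTP-only fetcher can still read http://example.com/sitemap.xml
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteCond %{REQUEST_URI} !^/sitemap\.xml$
RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

Note that the sitemap itself can still list https:// URLs; only the fetch of the sitemap file needs to happen over HTTP.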
Related Questions
-
Moving to HTTPS - Webmaster Tools
Okay, so I've finally moved a site I work on to HTTPS... it wasn't easy, but I won't go into the whys and hows of that. Everything is working fine on the site and my rankings are holding steady, but when I went into GWT today I noticed that my main site (www.domain.com) has lost query and traffic data almost entirely. So I quickly added and verified the https://www.domain.com URL (even though it doesn't ask for the protocol when adding a site) and saw the traffic... kind of. Only 2 queries and a few pages indexed. Now I realize I need to submit a new sitemap and am currently working on that, but I don't know which account to upload it to. Do I really need two GWT sites separately, and why is the other one still getting the impressions and clicks? Basically I'm looking for somebody to shed some light on how to handle this migration from a GWT standpoint. I've attached 3 screenshots to illustrate what I'm talking about; it should be obvious what's what to those who read this post. The Analytics shot is to show that I haven't actually lost traffic in this time, to eliminate the fear of a penalty or anything of the sort. Thanks!
Reporting & Analytics | jesse-landry
Webmaster Tools vs. Google Trends data doesn't add up
I am investigating a two-month 25% drop in organic traffic from Google to a client's site. When I turned to the Webmaster Tools data for the site, there is a clear, gradual drop over the course of a couple months both in impressions and clicks. In general, the drop occurred across many pages and for a large number of queries; there wasn't a core group of keywords or pages that saw the drop...it was more sitewide. Yet, the average rankings reported by WMT were, for the top 100 or so landing pages, not significantly different. The site hosts information about medical conditions, and I wouldn't expect any time-related variations in search volume, and this was confirmed by looking at Google Trends data for a number of the top keywords. I started to look at the data by query for all the top keywords (all ranked in the top 10), and saw the following general trend: impressions were down, rankings stayed in the top 10, and Google Trends showed either flat or rising volumes. So I am trying to make sense of that. If the search volume trend did not decline and rankings held inside the top 10, then how could the number of impressions drop significantly? Am I trusting the WMT data too much? But the reality is that the volume of traffic measured by Google Analytics from Google organic did indeed drop the way Webmaster Tools show it.
Reporting & Analytics | WillW
How to safely exclude search result pages from Google's index?
Hello everyone,
I'm wondering what's the best way to prevent/block search result pages from being indexed by Google. The way search works on my site is that the search form generates URLs like: /index.php?blah-blah-search-results-blah. I want to block everything of that sort, but how do I do it without blocking /index.php? Thanks in advance and have a great day, everyone!
Reporting & Analytics | llamb
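One common approach to the question above - assuming the search results really do always carry a query string on /index.php - is a robots.txt rule that matches the ? but not the bare path. This is a sketch, not a definitive fix; robots.txt only prevents crawling, while a noindex meta tag on the result template is the more reliable way to keep already-discovered pages out of the index:

```
User-agent: *
# Block only parameterized search-result URLs;
# the bare /index.php remains crawlable
Disallow: /index.php?
```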
Google Analytics Content Experiments - Experiment Conversions and Goal Figures Don't Match
Hi, I set up a new content experiment 6 days ago; the experiment says there have been 2 conversions, but the goal associated with it says 5. The experiment is set to target 100% of traffic, distributed evenly among the variations, and the goal is a destination URL goal. I've double-checked the goal set-up and everything seems fine. How can the content experiment report a different figure to the goal associated with it? Has anyone else noticed the same problem? Is this a bug? Is there a workaround available? Or is there a setting I need to be aware of when creating content experiments to prevent this from happening? I need to know I can trust the results the content experiments provide.
Reporting & Analytics | UNIT4
404 errors on page URLs that don't even exist
I am getting a lot of errors on pages with URLs that aren't even legitimate. For example: /videos/support/index.asp. No such path exists on the site. I have /videos and /support off the root, but nowhere on the site is there any reference or file at /videos/support/index.asp, so I get a lot of 404 duplicate page errors. This is just one example of several. How do I stop this?
Reporting & Analytics | GKLWL
Google's New Privacy Policy and Analytics
Does anybody know if Google's new privacy policy allows it to use data gathered by Analytics as a ranking factor in the SERPs?
Reporting & Analytics | Jolora
222.mydomain.com... I'm stumped....
Hi, I'm stumped by this one and am hoping I can get some advice/guidance. I've seen this pop up in my stats, where www.mydomain.com/page is appearing as 222.mydomain.com/page. I have no clue what's going on... This Google result displays the same page 2 ways (see first image), and when I click on the 222. link I get a prompt (see 2nd image). Any idea what's going on? Thanks! Peter
Reporting & Analytics | peterdbaron
Google and Bing search field commands
Does someone have or know of a full list / resource of commands for Google and Bing, including filters for those commands? (e.g. site:domain.com -filter) (like: site:domain.com, link:domain.com, etc.) I use the basic ones, but I know there are many more, and there are several filters that can be used successfully to narrow down results. Thanks.
Reporting & Analytics | eyepaq