Bing Webmaster Tools Incompatibility Issues with new Microsoft Edge Browser
-
Our client received an email from Bing WMTs saying
"We have identified 4 known issues with your website in Microsoft Edge – the new default browser for Windows 10 and Bing – Of the four problems mentioned, only two seem to be relevant (maybe)
- We’ve found that this webpage may include HTML markup that treats Microsoft Edge differently from other modern browsers. The new EdgeHTML rendering engine for Microsoft Edge is document-mode agnostic and designed for fast, modern rendering. We recommend that you implement one code base for all modern browsers and include Microsoft Edge as part of your modern browser test matrix.
- **We've found that this webpage may have missing vendor-specific prefixes** or may have implemented vendor-specific prefixes when they are not required in common CSS properties. This may cause compatibility problems with how this webpage renders across different browsers.
Last month the client received 20K visitors from all IE browsers and this is significant enough to be concerned about.
**Are other folks making changes to their code to adapt to MS Edge?**
-
First, I've been on a Mac since 2008/09, so I missed Windows 7, 8, and 10, but my experience goes back to Windows 3.0, somewhere around 1991/92. Long story short: you should routinely test your website in Edge just to confirm that everything renders and works correctly. If you don't follow their suggestions, you shouldn't see any real impact.
Web developers and designers have had a long, difficult relationship with IE. IE6 broke W3C standards and implemented its own version of them; this is described here: https://en.wikipedia.org/wiki/Internet_Explorer_box_model_bug IE7 fixed some of those bugs but added new ones, and IE8 added another batch of its own. This is covered here: http://www.smashingmagazine.com/2009/10/css-differences-in-internet-explorer-6-7-and-8/ http://code.tutsplus.com/tutorials/9-most-common-ie-bugs-and-how-to-fix-them--net-7764 https://css-tricks.com/ie-css-bugs-thatll-get-you-every-time/ and in many other articles. Eventually developers got fed up and stopped supporting IE altogether, beyond a few version-specific hacks just so users would see something almost correct.
Later, WebKit and Firefox joined the party and added their own versions of CSS properties, prefixed with -webkit- and -moz-. Opera joined too, with an -o- prefix. Over the years, as CSS3 stabilized, almost all of these prefixed properties were standardized, and the vendors finally agreed to abandon the prefixes. All of those vendors also ship auto-updating browsers, pushing new versions to users; today you can hardly imagine targeting anything but the current version of Firefox, Chrome, or Opera. Safari is different, because its latest version is only available on the current version of OS X. In this world only IE is outdated, simply because Microsoft refuses to push new versions to old OSes.
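As a concrete illustration of the prefix pattern described above (the property and values here are a hypothetical example, not taken from anyone's site), each engine got its own copy of a rule, with the unprefixed standard property last so it wins once the browser supports it:

```css
/* Vendor-prefixed CSS from the transitional era: one declaration
   per engine, standard property last so it takes precedence in
   browsers that support it unprefixed (including Edge). */
.box {
  -webkit-transform: rotate(5deg); /* Chrome, Safari (WebKit) */
     -moz-transform: rotate(5deg); /* Firefox */
       -o-transform: rotate(5deg); /* Opera, pre-Blink */
          transform: rotate(5deg); /* standard */
}
```

Bing's warning covers both directions: prefixes that are missing for some engines, and prefixes kept for properties that modern browsers now support unprefixed.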
So today, as Bob Dylan sang, the times they are a-changin', and everything is different. Microsoft is trying to bring back support for all the standards it lacked in the past. But the web is different now, and retrofitting standards support is hard; that's why they send emails about "possible problems". Honestly, my site works in IE 6, 7, 8, 9, 10, and 11 (tested, no joke!) and should work in Edge too (not tested).
So that's why they sent you an email about "possible issues". Every web company has its jokes about IE6 or IE7 support: http://www.smashingmagazine.com/2011/11/but-the-client-wants-ie-6-support/ http://www.sitepoint.com/how-to-stop-wasting-time-developing-for-internet-explorer/ And if Microsoft warns us today about "standards", we should ask where they were all those years they were NOT supporting standards.
-
We have not made any changes to our client sites for Edge.
Since Edge tends to handle rendering better than previous versions of IE, we don't do any conditional formatting for IE.
Now, if you are doing something like forcing a legacy document mode or serving IE-only conditional markup, then I can see where that may try to force Edge to render in ways that the newer engine has already taken care of.
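For readers wondering what kind of markup is meant here: the classic ways sites pinned IE to an older rendering mode were the X-UA-Compatible meta tag and IE conditional comments. A hypothetical example (not taken from the poster's site):

```html
<!-- Forces IE into the IE8 document mode. Edge's EdgeHTML engine
     is document-mode agnostic and does not honor this, so markup
     like it can usually just be removed. -->
<meta http-equiv="X-UA-Compatible" content="IE=8">

<!-- IE conditional comment: parsed only by IE 9 and earlier.
     IE10+ and Edge treat it as a plain HTML comment. -->
<!--[if lt IE 9]>
  <link rel="stylesheet" href="old-ie.css">
<![endif]-->
```

If your pages carry nothing like this, the "same markup" warning is likely one of Bing's generic suggestions rather than a real problem.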
-
Hi Peter: Yes, we have already used that link; it was provided in Bing's message to us, and it's where I pulled the two bulleted "issues" I referenced in the original email.
The question is: do we need to go to this trouble for Edge, and what might it do to legacy versions of IE? We checked other clients, and all of them have alerts (or what Bing is calling "Suggestions").
I don't see everyone following MS's suggestions; if we don't, what serious impact could there be?
-
If you need to support Edge, follow this link: https://dev.windows.com/en-us/microsoft-edge/tools/staticscan/ A static tester there will check your HTML and CSS for Edge-specific issues.
You will find "same markup", "browser detection", and "CSS prefixes" sections there with more information.
The real problem is outdated IE versions. Supporting Edge is easy, but what do you do with the old versions? I still know sites that get visitors on IE7. Now Microsoft says "support us", but what do you do about all of their older versions?
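On the "browser detection" point: the legacy Edge user-agent string deliberately contains "Chrome" and "Safari" tokens, so naive UA sniffing misidentifies it. Feature detection is the real fix, but at minimum the most specific token must be checked first. A minimal sketch (the UA string is a real Edge 12 example; the function names are mine):

```javascript
// A real user-agent string from Edge 12 on Windows 10. Note that it
// contains "Chrome" and "Safari" tokens as well as "Edge".
const edgeUA =
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 " +
  "(KHTML, like Gecko) Chrome/42.0.2311.135 Safari/537.36 Edge/12.10240";

// Naive sniffing: the first matching token wins, so Edge is
// reported as Chrome.
function naiveDetect(ua) {
  if (ua.includes("Chrome")) return "chrome";
  if (ua.includes("Safari")) return "safari";
  return "other";
}

// Slightly safer: check the most specific token first.
function betterDetect(ua) {
  if (ua.includes("Edge/")) return "edge";
  if (ua.includes("Chrome/")) return "chrome";
  if (ua.includes("Safari/")) return "safari";
  return "other";
}

console.log(naiveDetect(edgeUA));  // "chrome" (wrong)
console.log(betterDetect(edgeUA)); // "edge"
```

This is exactly why Microsoft's scan flags browser detection: code written to sniff for old IE tends to put Edge in the wrong bucket.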