I have a 2D HTML5 game engine (www.scirra.com) and really want to detect if WebGL is going to render with Chrome 18's 'SwiftShader' software renderer. If so, we would much prefer to fall back to the ordinary canvas 2D context, as happens in other browsers. The mass of people out there have low-end machines with weak CPUs that turn the game into a slideshow when software rendering, and I think in many cases the 2D canvas would have been hardware accelerated. However, WebGL context creation never fails in Chrome and there is no obvious way to detect SwiftShader.
Things I've tried:
```javascript
// Always returns "WebKit WebGL" regardless of SwiftShader
gl.getParameter(gl.RENDERER);

// Always returns "WebKit" regardless of SwiftShader
gl.getParameter(gl.VENDOR);
```
I could try taking into account things like the maximum texture size or the other MAX_* properties, but how do I know they don't vary between machines even with SwiftShader? And since I guess SwiftShader aims to mimic common hardware, using that approach might still get a lot of false positives.
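For reference, a minimal sketch of reading those MAX_* limits from an existing WebGL context `gl` (the values vary by device, so on their own they can't reliably identify SwiftShader):

```javascript
// Sketch: query the MAX_* limits mentioned above from a WebGL context.
// These are real WebGL 1.0 parameters; the function name is illustrative.
function readLimits(gl) {
  return {
    maxTextureSize: gl.getParameter(gl.MAX_TEXTURE_SIZE),
    maxRenderbufferSize: gl.getParameter(gl.MAX_RENDERBUFFER_SIZE),
    maxVertexAttribs: gl.getParameter(gl.MAX_VERTEX_ATTRIBS)
  };
}
```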
I don't want to write a startup performance test, because:
- we just make an engine, not any particular game, so I don't know how we'd write a fair test which works in the general case for any game of any performance profile with a high degree of accuracy
- A good test would probably need a second or two to finish running, which could interrupt the user experience or make them have to watch some squares being shifted around or whatever
- It could create new complications: for example, if we cache the result, what happens when the user updates their drivers and fixes the problem?
I don't want to flat out disable WebGL on Chrome, because with hardware-accelerated WebGL performance can be over twice as fast as canvas 2D! If we did that, everyone loses.
I don't want to have to add in-game switches or a user setting, because how many users care about that? If the game is slow they'll just quit and most likely not search for a solution. "This game sucks, I'll go somewhere else." I think only a minority of users would bother reading instructions like "by the way, if this game is slow, try changing this setting to 'canvas 2D'..."
My current best guess is to use gl.getSupportedExtensions(). I have found that SwiftShader reports the following extensions:
OES_texture_float, OES_standard_derivatives, WEBKIT_WEBGL_lose_context
...but a real hardware-accelerated context reports:
OES_texture_float, OES_standard_derivatives, WEBKIT_WEBGL_lose_context, WEBKIT_WEBGL_compressed_textures
Note the addition of WEBKIT_WEBGL_compressed_textures. Some quick research indicates that this may or may not be widely supported. See this support table - both GL_EXT_texture_compression_s3tc and GL_ARB_texture_compression appear widely supported on desktop cards. Also, the table only seems to list reasonably old models, so I could hazard a guess that all modern desktop graphics cards would support WEBKIT_WEBGL_compressed_textures. Therefore, my detection criteria for SwiftShader would be:
- Windows OS
- Google Chrome browser
- WebGL context does not support WEBKIT_WEBGL_compressed_textures
- Result: fall back to Canvas 2D
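The criteria above could be sketched like this (the user-agent and platform checks are illustrative assumptions about how one might implement them, not a vetted method):

```javascript
// Hedged sketch of the detection criteria: Windows + Chrome + no
// compressed-textures extension => assume SwiftShader, fall back to 2D.
function looksLikeSwiftShader(gl, userAgent, platform) {
  var isWindows = /Win/.test(platform);       // Windows OS
  var isChrome = /Chrome\//.test(userAgent);  // Google Chrome
  if (!isWindows || !isChrome) return false;
  var exts = gl.getSupportedExtensions() || [];
  // Hardware-backed contexts were observed to expose compressed textures;
  // SwiftShader (as seen in Chrome 18) was not.
  return exts.indexOf("WEBKIT_WEBGL_compressed_textures") === -1;
}
```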
Of course, if SwiftShader adds compressed texture support in future, this breaks again. But I can't see the advantage of compressed textures with a software renderer! Also, it will still get lots of false positives if there are many real working video cards out there that don't support WEBKIT_WEBGL_compressed_textures!
Is there not a better way to detect SwiftShader?
asked May 4, 2012 at 21:40 by AshleysBrain
Comments:
- Remember that the WEBKIT_WEBGL_compressed_textures extension is available on Windows because: 1) it's an experimental hardware feature; 2) hardware acceleration on Chrome/Windows is done through the ANGLE library, which is an OpenGL ES implementation backed by Direct3D (which has much better hardware support than OpenGL on some Intel boards). So that flag is directly related to Direct3D compressed texture support. – Chiguireitor, May 5, 2012
- @Chiguireitor - are you saying that particular extension is common, or a good indicator of real hardware support? That would be reassuring. – AshleysBrain, May 5, 2012
- Don't take my word for it, check the project directly: code.google.com/p/angleproject. I saw a demo by Toji showing texture compression, which is "almost" a certain sign of hardware acceleration, but remember that a => b doesn't necessarily imply b => a... you won't have a compressed_textures extension on non-current hardware. – Chiguireitor, May 5, 2012
4 Answers
Go to http://code.google.com/p/angleproject/wiki/ExtensionSupport and look at the EGL extension EGL_ANGLE_software_display; if it is available, it is because there's a SwiftShader backend.
You say you don't want to write a “startup performance test” — that is, render a few frames using both 2D and WebGL and measure which is faster, then use that one — but I still think it is the best option.
Advantages:
- Selects the renderer that is actually faster on the current hardware, regardless of its apparent attributes.
Disadvantages/caveats:
- You have to load your resources for both renderers, which might increase load time (but you can remember the choice so that it only ever happens once).
- If the system is slow at first (e.g. due to paging) the measurement could be wrong.
- Hardware without active cooling (e.g. phones) may overheat and reduce performance at some later time. You can't directly measure the heat output of the two methods. (If you have a good opportunity to re-measure and re-select the rendering method, such as a static screen while loading a new level, you could do that.)
Addressing your specific concerns:
- Since you are writing a game engine, you will have to provide the game developer with a way to specify sample content to use in the test.
- A second or two isn't a whole lot of additional load time, especially if (as I suspect) there is no reliable way to discriminate renderers otherwise. There is no need to make the canvas elements you are using visible to the user during the test; you could present an entirely unrelated loading animation.
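A rough shape of such a test (all names here are assumptions; `renderFrame` stands for a developer-supplied sample-content callback, not an existing engine API):

```javascript
// Hedged sketch: time a fixed number of frames on each renderer and
// pick whichever finished faster. Real code would render off-screen
// behind a loading animation, as suggested above.
function benchmarkRenderer(renderFrame, ctx, frames) {
  var start = Date.now();
  for (var i = 0; i < frames; i++)
    renderFrame(ctx);                 // draw one frame of sample content
  return Date.now() - start;          // total milliseconds for `frames` frames
}

function pickRenderer(render2d, renderGl, ctx2d, ctxGl) {
  var t2d = benchmarkRenderer(render2d, ctx2d, 60);
  var tGl = benchmarkRenderer(renderGl, ctxGl, 60);
  return tGl < t2d ? "webgl" : "2d";  // prefer WebGL only if measurably faster
}
```

The result could then be cached (e.g. in localStorage) so the test only ever runs once per machine.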
What you really want to know is if it would be better to present your game in Canvas2D instead of WebGL. That's not the same question as whether or not it's running on top of Swiftshader.
Honestly, I don't know why asking the user is unacceptable.
Many of the top selling games of all time have these options including
- Call of Duty Modern Warfare 3
- Battlefield 3
- Angry Birds (http://chrome.angrybirds.com/)
SwiftShader is actually faster than some integrated graphics. So detecting GPU or CPU rendering gives you no guarantees about actual performance. Also, SwiftShader is the fastest software renderer around and should do a really decent job with simple games. Are you sure your application is properly optimized?