So I'm learning WebGL for the first time, and I'm having difficulty understanding WebGL colors.
My concern is that I have seen two formats or conventions for writing RGBA color data (or vectors, if that is what they are called, regarding shaders).
Mozilla uses color on a scale of 0 to 255:
Here is an example
However, I have seen other cases where developers (like this YouTuber) use color on a scale of 0 to 1.0.
Is this a preference? Is the convention you use mandated per situation? If it is preference and not mandate, can I mix and match the two? Do browsers have a preference?
For example, in regards to mixing, is [1.0, 255, 1.0, 255] fully opaque white?
- Your first example shows an ImageData object whose data is a Uint8Array, so values are from 0 to 255 (and by the way, that seems to be the 2D context API docs). In the video he uses the WebGL context's clearColor, which accepts GLclampf values (32-bit floats clamped between 0 and 1). You use what the API accepts. – Kaiido Commented Sep 15, 2017 at 2:45
- @Kaiido How odd of JavaScript? I figured they would have established a standard way of addressing something as simple as color across all of their APIs. – user3186555 Commented Sep 15, 2017 at 2:48
- @DaMaxContent "as simple as color". You just made a legion of graphics programmers turn in their graves. – Paul-Jan Commented Sep 15, 2017 at 6:55
- @Paul-Jan I suppose this is where the stigmas of standards-compliance and graphics-programming clash. This is new to me as a javascripter and it is frankly out of character for the language. "Vanilla" JavaScript's actual name is ECMAScript after all. *sarcasm* I counter with "'You use what the API accepts' just made the forefathers of ECMA turn in their graves." – user3186555 Commented Sep 22, 2017 at 17:56
4 Answers
WebGL generally deals with colors in the 0 to 1 range.
How it stores them, though, is up to you and the given situation.
This is in some ways no different than CSS. In CSS you can specify colors like this: #123456, like this: red, like this: rgb(18,52,86), and like this: hsl(30,100%,50%).
In general WebGL uses 0 to 1 for colors. The complication is when you create buffers and textures. You decide the binary format of the buffer or texture, and then you have to put data in that buffer and tell WebGL how to convert it back into values for WebGL to use.
So for example it's most common to make textures using gl.UNSIGNED_BYTE as the storage format. In this case values in the texture go from 0 to 255, and when used in WebGL they get converted back to 0 to 1.
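As a minimal sketch of that (assuming an existing WebGL1 context in a variable named gl, which is not from the original answer):

// A 1x1 RGBA texture stored as UNSIGNED_BYTE: bytes 0-255 go in,
// normalized 0.0-1.0 values come out when the shader samples it.
const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texImage2D(
    gl.TEXTURE_2D, 0, gl.RGBA,          // target, mip level, internal format
    1, 1, 0,                            // width, height, border
    gl.RGBA, gl.UNSIGNED_BYTE,          // format, type of the data below
    new Uint8Array([255, 0, 0, 255]));  // one opaque red texel
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST); // no mips needed
// In a shader, texture2D(...) on this texture returns vec4(1.0, 0.0, 0.0, 1.0).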
There are other formats like gl.FLOAT that use normal 0 to 1 numbers, but they require 4x the storage space.
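A rough sketch of that case (WebGL1, same hypothetical gl variable): floating point textures also need an extension before you can upload gl.FLOAT data.

const ext = gl.getExtension('OES_texture_float'); // WebGL1 needs this for FLOAT textures
if (ext) {
  const floatTex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, floatTex);
  gl.texImage2D(
      gl.TEXTURE_2D, 0, gl.RGBA,
      1, 1, 0,
      gl.RGBA, gl.FLOAT,
      new Float32Array([1.0, 0.0, 0.0, 1.0])); // 16 bytes per texel vs 4
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
}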
Similarly it's common to store vertex color data as gl.UNSIGNED_BYTE and tell WebGL to convert that back into 0 to 1 values when reading the data.
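That conversion is the normalized flag of vertexAttribPointer. A sketch, assuming a hypothetical colorBuffer already filled with Uint8Array color data and an attribute location colorLoc:

gl.bindBuffer(gl.ARRAY_BUFFER, colorBuffer);
gl.enableVertexAttribArray(colorLoc);
gl.vertexAttribPointer(
    colorLoc,
    4,                 // 4 components per color (RGBA)
    gl.UNSIGNED_BYTE,  // stored as bytes, 0 to 255
    true,              // normalized: WebGL converts back to 0.0 - 1.0
    0, 0);             // stride, offset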
As for mixing, it's not as simple as asking whether [1.0, 255, 1.0, 255] is fully opaque white.
In WebGL it's up to you to draw every pixel, decide how the values are used, and decide how they are blended. You could make a texture with 1 value per pixel (instead of 4, RGBA, like a canvas). Will you use that one value for red? For green? For blue? For 2 of them, all 3 of them? None of them? It's 100% up to you. How will you blend the color with colors already in the canvas? That's also up to you. If you specify [10, 20, 30, 40] that doesn't mean R = 10, G = 20, B = 30, A = 40. It really just means you have 4 values. It's traditional to use the first value as red, but WebGL does not force you to use it as red. Even more, WebGL doesn't really care about colors. It's just a rasterization engine, which is a fancy way to say that it draws values into 2D arrays. Those arrays could be used for color and images, but they can also be used to compute physics or mine bitcoins.
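To make the one-value-per-pixel point concrete, here's a sketch (same hypothetical gl context) using gl.LUMINANCE, where each pixel is a single byte and your shader alone decides what it means:

gl.pixelStorei(gl.UNPACK_ALIGNMENT, 1);   // rows of single bytes need no padding
const oneChannelTex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, oneChannelTex);
gl.texImage2D(
    gl.TEXTURE_2D, 0, gl.LUMINANCE,
    2, 1, 0,
    gl.LUMINANCE, gl.UNSIGNED_BYTE,
    new Uint8Array([0, 255]));            // two pixels, one value each
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
// In the shader that value could be red, a height, a density... your call:
// float h = texture2D(u_tex, v_uv).r;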
It sounds like you're new to WebGL. May I suggest some tutorials?
The WebGL API is based on OpenGL, which uses normalized floating point color values. It's not so much related to JavaScript or the 2D canvas API per se.
In WebGL the actual resulting color is the result of many factors such as lights, shadow, distance, angle, and material (diffuse, reflection, specularity, transparency, etc.), and it needs a much more nuanced approach than the 2D canvas due to the amount of calculation that takes place; hence the normalized floating point values.
You can convert RGBA values to GLclampf values by dividing by 255.
https://community.khronos.org/t/how-to-convert-from-rgb-255-to-opengl-float-color/29288
EXAMPLE: Suppose you want to use rgba(162, 41, 86, 1).
You can do:
// Divide R, G and B by 255; the alpha in CSS rgba() is already 0 to 1
const toGlclampf = (r, g, b, a) => [r/255, g/255, b/255, a];
let rgba = toGlclampf(162, 41, 86, 1.0);
gl.clearColor(rgba[0], rgba[1], rgba[2], rgba[3]);
Or just simply:
gl.clearColor(162/255, 41/255, 86/255, 1.0);
In WebGL it's different: it uses numbers in the range 0.0 to 1.0 because it is itself based on OpenGL, which uses this range for the RGBA model as a rule.
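To circle back to the mixing question: clearColor takes GLclampf values, so anything outside 0.0 to 1.0 is simply clamped. A quick sketch (assuming a context gl):

// 255 is clamped down to 1.0, so this happens to clear to opaque white,
// but only by accident of clamping. Don't mix the two conventions.
gl.clearColor(1.0, 255, 1.0, 255);
gl.clear(gl.COLOR_BUFFER_BIT);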