I've seen shaders that dynamically draw an outline around edges based on the difference between the depth (distance from camera to surface) of a pixel at an edge and the depth of an adjacent pixel (a smaller depth difference can mean a thinner outline, or none at all). Like these renders:
And I'm interested in using such a shader on my three.js renders, but I think I need to figure out how to access depth data for each pixel.
Three.js documentation mentions a depth setting:
depth - whether the drawing buffer has a depth buffer of at least 16 bits. Default is true.
But I'm not sure what it means by the drawing buffer having a depth buffer. The image buffers I'm familiar with are pixel buffers with no depth information. Where would I access this depth buffer?
asked May 25, 2018 by john doe · edited May 25, 2018 · 2 Answers
There's an example on the three.js website that renders the scene to a THREE.WebGLRenderTarget with its depthBuffer attribute set to true. This gives you access to depth data.
The idea is as follows:
- Render the main scene to a WebGLRenderTarget. This target will contain RGB and depth data that can be accessed via its .texture and .depthTexture attributes, respectively.
- Take these two textures and apply them to a plane with custom shaders.
- In the plane's custom shaders, you can access the texture data to perform whatever calculations you want to play with color and depth.
- Render the second scene (which contains only the plane) to the canvas.
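To make the depth-comparison idea concrete, here is a CPU-side sketch in plain JavaScript of what the shader in the third step computes per pixel (the function name and threshold are illustrative; on the GPU this runs per fragment in GLSL):

```javascript
// CPU-side sketch of depth-based edge detection (illustrative only;
// in practice this runs per-fragment in a shader).
// depth: Float32Array of per-pixel depth values,
// width/height: buffer dimensions,
// threshold: minimum depth gap that counts as an outline edge.
function depthOutlineMask(depth, width, height, threshold) {
  const mask = new Uint8Array(width * height);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      const d = depth[y * width + x];
      // Compare against the right and bottom neighbors (clamped at borders).
      const right = x + 1 < width ? depth[y * width + x + 1] : d;
      const down = y + 1 < height ? depth[(y + 1) * width + x] : d;
      const gap = Math.max(Math.abs(d - right), Math.abs(d - down));
      // A larger depth gap means a stronger edge; small gaps produce
      // no outline, matching the "thinner or none at all" behavior.
      mask[y * width + x] = gap > threshold ? 1 : 0;
    }
  }
  return mask;
}
```

A real shader would additionally linearize the depth values (the raw depth buffer is nonlinear) and could scale the outline width by the gap size instead of using a hard threshold.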
Here's the link to the source code of that example. Notice you can comment out the code on line 73 to allow the color data to display.
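As a rough sketch of what that example sets up (not its exact code; this assumes a browser with WebGL plus an existing `renderer`, `scene`, and `camera`, and the uniform names `tDiffuse`/`tDepth` are just illustrative):

```javascript
// Pass 1 target: color goes to .texture, depth to .depthTexture.
const target = new THREE.WebGLRenderTarget(window.innerWidth, window.innerHeight);
target.depthBuffer = true;
target.depthTexture = new THREE.DepthTexture(window.innerWidth, window.innerHeight);

// Pass 2: a full-screen quad whose shader reads both textures.
const postMaterial = new THREE.ShaderMaterial({
  uniforms: {
    tDiffuse: { value: target.texture },      // color from pass 1
    tDepth: { value: target.depthTexture },   // depth from pass 1
    texelSize: { value: new THREE.Vector2(1 / window.innerWidth, 1 / window.innerHeight) },
  },
  vertexShader: `
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = vec4(position, 1.0);
    }
  `,
  fragmentShader: `
    uniform sampler2D tDiffuse;
    uniform sampler2D tDepth;
    uniform vec2 texelSize;
    varying vec2 vUv;
    void main() {
      // Note: raw depth is nonlinear; a real outline shader usually
      // linearizes it using the camera's near/far planes first.
      float d  = texture2D(tDepth, vUv).r;
      float dx = texture2D(tDepth, vUv + vec2(texelSize.x, 0.0)).r;
      float dy = texture2D(tDepth, vUv + vec2(0.0, texelSize.y)).r;
      float gap = max(abs(d - dx), abs(d - dy));
      vec3 color = texture2D(tDiffuse, vUv).rgb;
      // Darken pixels where the depth gap is large -> outline.
      gl_FragColor = vec4(mix(color, vec3(0.0), step(0.01, gap)), 1.0);
    }
  `,
});
const postScene = new THREE.Scene();
postScene.add(new THREE.Mesh(new THREE.PlaneGeometry(2, 2), postMaterial));
const postCamera = new THREE.OrthographicCamera(-1, 1, 1, -1, 0, 1);

function animate() {
  requestAnimationFrame(animate);
  renderer.setRenderTarget(target); // pass 1: scene -> target
  renderer.render(scene, camera);
  renderer.setRenderTarget(null);   // pass 2: quad -> canvas
  renderer.render(postScene, postCamera);
}
animate();
```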
Three.js already has a MeshToonMaterial; there's no need to create a new one.
https://github.com/mrdoob/three.js/blob/master/examples/webgl_materials_variations_toon.html
html, body, iframe {margin:0;width:100%;height:100%;border:0}
<iframe src="https://threejs.org/examples/webgl_materials_variations_toon.html"></iframe>
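For reference, using the built-in material is straightforward (a sketch; the color and geometry here are arbitrary, and an optional `gradientMap` texture controls the number of shading bands):

```javascript
// Toon (cel) shading with the built-in material; note this shades the
// surface in bands but does not by itself draw depth-based outlines.
const material = new THREE.MeshToonMaterial({ color: 0x4488ff });
const mesh = new THREE.Mesh(new THREE.TorusKnotGeometry(1, 0.3), material);
scene.add(mesh);
```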