
javascript - THREE.js read pixels from GPUComputationRenderer texture - Stack Overflow


I have been playing with GPUComputationRenderer on a modified version of this three.js example, which modifies the velocity of interacting boids. It uses GPU shaders to hold, read and manipulate boid position and velocity data.

I have got to a stage where I can put GPU-computed data (predicted collision times) into the texture buffer using the shader. But now I want to read some of that texture data inside the main JavaScript animation script (to find the earliest collision).

Here is the relevant code in the render function (which is called on each animation pass):

//... GPU calculations as per original THREE.js example
gpuCompute.compute();   //... gpuCompute is the GPU computation renderer.
birdUniforms.texturePosition.value = gpuCompute.getCurrentRenderTarget( positionVariable ).texture;
birdUniforms.textureVelocity.value = gpuCompute.getCurrentRenderTarget( velocityVariable ).texture;

var xTexture = birdUniforms.texturePosition.value; //... my variable, OK.

//... From http://zhangwenli.com/blog/2015/06/20/read-from-shader-texture-with-threejs/
//... but note that this reads from the main THREE.js renderer, NOT from the gpuCompute renderer.
//var pixelBuffer = new Uint8Array(canvas.width * canvas.height * 4);
//var gl = renderer.getContext();
//gl.readPixels(0, 0, canvas.width, canvas.height, gl.RGBA, gl.UNSIGNED_BYTE, pixelBuffer);

var pixelBuffer = new Uint8Array( WIDTH * WIDTH * 4 );   //... OK.

//var gl = gpuCompute.getContext(); //... no getContext function!!!

//... from Nick Whaley here: http://stackoverflow.com/questions/13475209/three-js-get-data-from-three-webglrendertarget
//WebGLRenderer.readRenderTargetPixels( renderTarget, x, y, width, height, buffer )

gpuCompute.readRenderTargetPixels( xTexture, 0, 0, WIDTH, WIDTH, pixelBuffer ); //... readRenderTargetPixels is not a function!

As shown in the code, I was "wanting" the gpuCompute renderer object to provide functions such as .getContext() or .readRenderTargetPixels(), but they do not exist for gpuCompute.


EDIT:

Then I tried adding the following code:

//... the WebGLRenderer code is included in the THREE.js build
var myWebglRenderer = new THREE.WebGLRenderer();
var myRenderTarget = gpuCompute.getCurrentRenderTarget( positionVariable );
myWebglRenderer.readRenderTargetPixels(
        myRenderTarget, 0, 0, WIDTH, WIDTH, pixelBuffer );

This executes OK but pixelBuffer remains entirely full of zeroes instead of the desired position coordinate values.


Please can anybody suggest how I might read the texture data into a pixel buffer? (Preferably in THREE.js/plain JavaScript, because I am ignorant of WebGL.)

asked Sep 28, 2016 at 0:21 by steveOw (edited Sep 28, 2016 at 21:03)
  • 1 That will be hard to accept, but... you can't read regular texture pixels back to system memory in OpenGL, only from a FrameBuffer Object (FBO) or RenderBuffer. To achieve that you have to use something other than THREE.WebGLRenderTarget as the render target. I don't know what in three.js will help you in that case, but you will certainly need to modify the gpuCompute code. – weaknespase Commented Sep 28, 2016 at 21:37
  • @VallyN Thanks for the explanation and advice. Tinkering with gpuCompute is beyond my current skills. – steveOw Commented Sep 29, 2016 at 22:12

2 Answers

Up-to-date answer

See WebGL Read pixels from floating point render target
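
In practice the linked answer boils down to reading the float render target directly, which newer platforms allow. A minimal sketch under that assumption (WebGL2 with float readback support; renderer, gpuCompute, positionVariable and WIDTH are the names used in the question):

// Minimal sketch, assuming a WebGL2 context that can read back float
// pixels (EXT_color_buffer_float). `renderer` must be the SAME
// THREE.WebGLRenderer that was passed to GPUComputationRenderer; a
// freshly constructed renderer has its own GL context and cannot see
// render targets created elsewhere, which is likely why the EDIT in
// the question reads back only zeroes.
var target = gpuCompute.getCurrentRenderTarget( positionVariable );

// The compute textures are floating point, so read into a Float32Array
// (4 RGBA channels per texel), not a Uint8Array.
var floatBuffer = new Float32Array( WIDTH * WIDTH * 4 );
renderer.readRenderTargetPixels( target, 0, 0, WIDTH, WIDTH, floatBuffer );

// floatBuffer[ 4 * i + 0 .. 2 ] now holds the position of boid i.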

Out-of-date answer

The short answer is: it won't be easy. In WebGL 1.0 there is no easy way to read pixels back from floating-point textures, which is what GPUComputationRenderer uses.

If you really want to read back the data, you'll need to render the GPUComputationRenderer's floating-point texture into an 8-bit RGBA texture, using some kind of encoding from 32-bit floats to 8-bit values. You can then read that back in JavaScript and decode the values.
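
As a rough illustration of that encoding idea (a sketch only, not the GPUComputationRenderer API: packFloat and unpackFloat are hypothetical helpers, and this classic byte-packing trick only survives the round trip for values in [0, 1), at limited precision):

// Sketch: a fullscreen pass that re-renders one float channel of the
// compute texture into an ordinary 8-bit RGBA render target.
var packFragmentShader = [
    'uniform sampler2D texturePosition;',
    'varying vec2 vUv;',
    'vec4 packFloat( float v ) {',
    '    vec4 enc = fract( vec4( 1.0, 255.0, 65025.0, 16581375.0 ) * v );',
    '    enc -= enc.yzww * vec4( 1.0 / 255.0, 1.0 / 255.0, 1.0 / 255.0, 0.0 );',
    '    return enc;',
    '}',
    'void main() {',
    '    gl_FragColor = packFloat( texture2D( texturePosition, vUv ).x );',
    '}'
].join( '\n' );

// After rendering that pass into an UnsignedByteType render target and
// reading it with renderer.readRenderTargetPixels( ..., pixelBuffer ),
// each group of 4 bytes decodes back to one float:
function unpackFloat( buf, i ) {
    return buf[ i     ] / 255.0
         + buf[ i + 1 ] / ( 255.0 * 255.0 )
         + buf[ i + 2 ] / ( 255.0 * 65025.0 )
         + buf[ i + 3 ] / ( 255.0 * 16581375.0 );
}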

Sorry for the long delay. I've not logged in to SO for a long time.

In the water example with tennis balls, https://threejs.org/examples/?q=water#webgl_gpgpu_water, the height of the water at each ball's position is read back from the GPU.

A 4-component integer texture is used to represent a 1-component float texture.

The texture is 4x1 pixels, where the first pixel is the height and the next two are the normal of the water surface (the last pixel is not used).

This texture is computed and read back for each of the tennis balls, and the ball physics is then performed on the CPU.
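
In code terms, the readback step of that example looks roughly like this (a sketch based on the description above, with names, uniforms and options simplified from memory; doRenderTarget is the helper GPUComputationRenderer provides for rendering such one-off passes):

// A tiny 4x1 unsigned-byte render target receives the packed height
// and normal for one ball.
var readTarget = new THREE.WebGLRenderTarget( 4, 1, {
    minFilter: THREE.NearestFilter,
    magFilter: THREE.NearestFilter,
    format: THREE.RGBAFormat,
    type: THREE.UnsignedByteType,
    depthBuffer: false
} );
var readBuffer = new Uint8Array( 4 * 1 * 4 );

// For each ball: point the readback shader at the ball's position in
// the heightmap's [0,1] uv space, render into the tiny target, then
// pull the pixels back to the CPU. (BOUNDS: the water plane size.)
var u = 0.5 + ball.position.x / BOUNDS;
var v = 0.5 + ball.position.z / BOUNDS;
readWaterLevelShader.uniforms[ 'point1' ].value.set( u, v );
gpuCompute.doRenderTarget( readWaterLevelShader, readTarget );
renderer.readRenderTargetPixels( readTarget, 0, 0, 4, 1, readBuffer );

// The shader packs raw float bits into the bytes, so the buffer can be
// reinterpreted as floats: pixels[ 0 ] is the height, the next two the
// surface normal, and the ball physics then runs on the CPU.
var pixels = new Float32Array( readBuffer.buffer );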
