
Matrix transform chain in GLSL - Stack Overflow


I've been messing with this one all day, and I'm sure at this point I'm missing something stupid.

Essentially, I'm trying to implement a vertex transform chain within a GLSL shader. This works perfectly outside the shader, but inside it's bugging out and giving me junk positions.

Each vertex in the VBO has a transform chain of 1 to 4 matrices. Simple enough: you just perform each transform in order.

To get this into the shader as efficiently as possible, I'm packing the chain into a single integer. It's unpacked as follows:

// unpack four 8-bit matrix indices, most significant byte first
int m0 = (iMatrixChain & (0xff << 24)) >> 24; // signed >> sign-extends, so a 0xff top byte comes out as -1; the 0..254 range check below rejects it either way
int m1 = (iMatrixChain & 0xff0000) >> 16;
int m2 = (iMatrixChain & 0xff00) >> 8;
int m3 = (iMatrixChain & 0xff);
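
For reference, the packing on the C# side looks roughly like this. This is only a sketch of the idea rather than the actual application code; the chain array and the 255 "unused" sentinel are as described above:

// Sketch only: pack up to four 8-bit matrix indices into one int,
// most significant byte first, to match the unpacking order above.
// Unused slots are filled with 255, which the range check skips.
static int PackMatrixChain(byte[] chain)
{
    int packed = 0;
    for (int i = 0; i < 4; i++)
    {
        int index = i < chain.Length ? chain[i] : 255;
        packed |= index << (24 - 8 * i); // slot 0 -> bits 24..31, slot 3 -> bits 0..7
    }
    return packed;
}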

As far as I can tell, this is encoded correctly and passed to the shader. For each of the four indices in turn, if the index is zero or above and below 255, I perform a matrix transform on the vector:

vec3 transformVector(vec3 initialVector, int matrixIndex)
{
    //column major
    float X = (initialVector.x * modelMatricies[matrixIndex][0].x) + (initialVector.y * modelMatricies[matrixIndex][0].y) + (initialVector.z * modelMatricies[matrixIndex][0].z);
    float Y = (initialVector.x * modelMatricies[matrixIndex][1].x) + (initialVector.y * modelMatricies[matrixIndex][1].y) + (initialVector.z * modelMatricies[matrixIndex][1].z);
    float Z = (initialVector.x * modelMatricies[matrixIndex][2].x) + (initialVector.y * modelMatricies[matrixIndex][2].y) + (initialVector.z * modelMatricies[matrixIndex][2].z);
    // ignoreW per DirectX

    X += 1 * modelMatricies[matrixIndex][0].w;
    Y += 1 * modelMatricies[matrixIndex][1].w;
    Z += 1 * modelMatricies[matrixIndex][2].w;
    return vec3(X, Y, Z);
}
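
For comparison, a CPU-side mirror of the same chain walk is handy when checking the shader against known-good results. This is a sketch, not my actual verification code; it assumes each matrix is handed over as 16 floats in the same element order the mat4 occupies in the buffer, and Vector3 is OpenTK's:

// Sketch only: mirrors transformVector and the chain walk on the CPU.
// m[0..3] corresponds to modelMatricies[i][0], m[4..7] to [1], and so on.
static Vector3 TransformVector(Vector3 v, float[] m)
{
    return new Vector3(
        v.X * m[0] + v.Y * m[1] + v.Z * m[2] + m[3],
        v.X * m[4] + v.Y * m[5] + v.Z * m[6] + m[7],
        v.X * m[8] + v.Y * m[9] + v.Z * m[10] + m[11]);
}

static Vector3 TransformChain(Vector3 v, int packedChain, float[][] matrices)
{
    for (int shift = 24; shift >= 0; shift -= 8)
    {
        int index = (packedChain >> shift) & 0xff;
        if (index < 255) // 255 marks an unused slot in the chain
            v = TransformVector(v, matrices[index]);
    }
    return v;
}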

I've verified this works as expected outside of GLSL.

The matrices are stored in a GL buffer, declared in the shader as:

layout(binding = 12, std430) readonly buffer AnimationMatricies {
    mat4 modelMatricies[];
};
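
The buffer also has to be attached to binding point 12 on the C# side. A minimal sketch of that call (not my exact code; bufferName is the buffer created below):

// Attach the SSBO to the binding point declared in the shader (binding = 12).
GL.BindBufferBase(BufferRangeTarget.ShaderStorageBuffer, 12, bufferName);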

The matrices are uploaded to that buffer from the application side (C#, OpenTK) like this:

// note: the pinned pointer isn't actually used; the managed array is passed directly
fixed (float* firstResult = &matriciesToShader[0].Row0.X)
{
    GL.NamedBufferStorage(bufferName, sizeof(Matrix4) * matriciesToShader.Length, matriciesToShader, BufferStorageFlags.DynamicStorageBit);
}

I've verified that pulling the matrices out of the buffer with GL.GetBufferSubData returns what I put in, so as far as I can tell they're OK.


At this point I'm stumped; I'm sure I've missed something obvious, and I'd appreciate any pointers. (I've also attempted this with a uniform array, with similarly broken results.)

The complete code for the shader is here: https://gist.github/leezer3/e24d82a6b07df74c2058dbc34634a26a

asked Nov 20, 2024 at 16:02 by leezer3

1 Answer


For anyone else finding this, it was a stupid mistake....

GL.VertexAttribPointer(VertexLayout.MatrixChain, 1, VertexAttribPointerType.Int, false, vertexSize, offset);

VertexAttribPointer with a type of Int will produce corrupted data for an integer vertex input, as it always sets the attribute up as floating point and converts the values behind the scenes (despite the signature implying it won't).

However, if we use VertexAttribIPointer instead, things actually work as designed....

GL.VertexAttribIPointer(VertexLayout.MatrixChain, 1, VertexAttribIntegerType.Int, 48 + sizeof(int), (IntPtr)offset);
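
As a usage note, the integer path otherwise looks the same as the float one. A rough sketch of the full setup for this attribute, reusing the stride and offset names from the question purely for illustration:

// Sketch: integer attributes go through VertexAttribIPointer so the bits reach
// the shader's int input untouched; enabling the array is unchanged.
GL.EnableVertexAttribArray(VertexLayout.MatrixChain);
GL.VertexAttribIPointer(VertexLayout.MatrixChain, 1, VertexAttribIntegerType.Int, vertexSize, (IntPtr)offset);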