
bit manipulation - How to use ArrayBuffers with DataViews in JavaScript - Stack Overflow


The only real tutorial I have seen for ArrayBuffer is from HTML5Rocks. But I am wondering specifically how to manipulate the individual bytes. For example, this cartoon on ArrayBuffers from Mozilla shows an image of an ArrayBuffer wrapped in a Uint8Array view:

It gives the feeling that you can do this with an ArrayBuffer:

var x = new ArrayBuffer(10)
x[0] = 1
x[1] = 0
...
x[9] = 1

That is, manually setting the bytes. But I haven't seen any documentation on such a feature. Instead, it seems you are supposed to either use one of the TypedArray components, or the DataView:

var x = new ArrayBuffer(100)
var y = new DataView(x)
y.setUint32(0, 1)
console.log(y.getUint32(0)) // 1
console.log(x[0]) // undefined

But again it seems after manipulating the ArrayBuffer with the DataView, you can't actually access any of the bytes on the ArrayBuffer directly.

Trying other things with the ArrayBuffer and DataView I get confused:

var x = new ArrayBuffer(100)
var y = new DataView(x)
y.setUint32(0, 1)
y.setUint32(1, 2)
console.log(y.getUint32(0)) // 0 (incorrect)
console.log(y.getUint32(1)) // 2 (correct)

var x = new ArrayBuffer(100)
var y = new DataView(x)
y.setUint32(0, 1)
y.setUint32(2, 2)
console.log(y.getUint32(0)) // 0 (incorrect)
console.log(y.getUint32(1)) // 0 (?)
console.log(y.getUint32(2)) // 2 (correct)

var x = new ArrayBuffer(100)
var y = new DataView(x)
y.setUint32(0, 1)
y.setUint32(3, 2)
console.log(y.getUint32(0)) // 0 (incorrect)
console.log(y.getUint32(1)) // 0 (?)
console.log(y.getUint32(2)) // 0 (?)
console.log(y.getUint32(3)) // 2 (correct)

Until finally I get to offset 4, which aligns with the 32-bit (4-byte) values. But then it is even stranger:

var x = new ArrayBuffer(100)
var y = new DataView(x)
y.setUint32(0, 1)
y.setUint32(4, 2)
console.log(y.getUint32(0)) // 1
console.log(y.getUint32(1)) // 256
console.log(y.getUint32(2)) // 65536
console.log(y.getUint32(3)) // 16777216
console.log(y.getUint32(4)) // 2

This tells me that I need to manually place the 32-bit values in the appropriate places, but I don't understand where the other values like 256 and 65536 are coming from.

Next, I would like to be able to print out the bytes as 101011100100 etc., the whole ArrayBuffer, or just parts of it.

Finally, I would like to be able to encode values other than 8, 16, and 32 bits, such as base64, or 4 bits, or an odd number of bits. There isn't a generic DataView API for doing that, such as a generic y.setUint(bitsCount, offset, value); I'm not sure why.

To summarize, there is a lot that I am unfamiliar with when it comes to dealing with low-level bit management. However, I would like to learn how to use it. So maybe if one could show quickly how to get a working knowledge of the ArrayBuffer + DataView combination, that would be really helpful.

I understand how to use the Uint8Array and related TypedArrays, I am just trying to learn how to use the lower-level ArrayBuffer.


Asked Jul 21, 2018 at 2:51 by Lance Pollard
  • Well, on the MDN site it states "You cannot directly manipulate the contents of an ArrayBuffer". So that's why you can't do things like x[0] = 1. It also mentions that when you new up an ArrayBuffer, all of its entries initialize to 0. That's why e.g. console.log(y.getUint32(1)) // 0 (?). The remainder of your questions are great, but you're asking a lot of questions for a single question, making this question too broad... – Heretic Monkey, Jul 21, 2018 at 3:15

1 Answer


AFAIK DataView isn't really meant for what you're using it for. It's basically meant for parsing or creating a binary file in a known format and dealing with endian issues. In particular, it handles endianness AND it lets you read 16-bit and 32-bit values that are not aligned. In other words, with Float32Array the offset into the buffer must be some multiple of 4, since a 32-bit float is 4 bytes; there is no way to align a Float32Array on a non-4-byte boundary. With DataView you pass in the offset, and it can be at any byte boundary.
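A quick sketch of those two abilities together: the same four bytes read back as big- or little-endian, starting at an odd (unaligned) offset that no Uint32Array could use:

```javascript
// DataView allows any byte offset and either endianness.
const buf = new ArrayBuffer(8);
const dv = new DataView(buf);

dv.setUint32(1, 0x01020304);        // big endian by default, unaligned offset 1
console.log(dv.getUint32(1));       // 16909060 (0x01020304, big endian)
console.log(dv.getUint32(1, true)); // 67305985 (0x04030201, little endian)
```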

DataView is also known to be slower than pure JavaScript implementations of the same API, at least at the moment.

So, for your use case you probably don't want to be using DataView but rather make your own.

Also, manipulating randomly sized strings of bits is not a very common need, so if you want to read and write by bits you're going to have to write your own library.
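As a sketch of what such a homemade library might look like (the BitWriter name and API here are made up, not a standard), a writer that packs values of arbitrary bit widths, most significant bit first, into a byte array:

```javascript
// Hypothetical bit-level writer: packs values of arbitrary bit widths
// (MSB first) into a Uint8Array backed by an ArrayBuffer.
class BitWriter {
  constructor(byteLength) {
    this.bytes = new Uint8Array(byteLength);
    this.bitPos = 0; // next free bit, counted from the start of the buffer
  }
  write(value, bitCount) {
    for (let i = bitCount - 1; i >= 0; --i) {
      const bit = (value >> i) & 1;
      const byteIndex = this.bitPos >> 3;      // which byte
      const shift = 7 - (this.bitPos & 7);     // which bit within that byte
      this.bytes[byteIndex] |= bit << shift;
      this.bitPos++;
    }
  }
}

const w = new BitWriter(2);
w.write(0b101, 3);  // a 3-bit value
w.write(0b0110, 4); // a 4-bit value
console.log(w.bytes[0].toString(2).padStart(8, '0')); // "10101100"
```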

ArrayBuffer is just a buffer. You cannot access its contents directly. You have to create an ArrayBufferView into that buffer. There are a bunch of different types of views, like Int8Array, Uint32Array, Float32Array, and DataView. They can view all or part of an ArrayBuffer, and they can also create their own ArrayBuffer.

const view = new Uint32Array(100);

It's exactly the same as

const view = new Uint32Array(new ArrayBuffer(400));

You can get access to the ArrayBuffer of any view via its buffer property. So these 3 lines

const buffer = new ArrayBuffer(400);
const view1 = new Uint32Array(buffer);
const view2 = new Uint8Array(buffer);

are the same as these 2 lines

const view1 = new Uint32Array(100);
const view2 = new Uint8Array(view1.buffer);
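Since all these views alias the same memory, bytes poked in through a Uint8Array view show up when read back through a DataView over the same buffer (a small sketch):

```javascript
// Two views over one buffer alias the same bytes.
const buffer = new ArrayBuffer(4);
const bytes = new Uint8Array(buffer);
const dv = new DataView(buffer);

bytes[2] = 1;                 // set the third byte through the Uint8Array view
console.log(dv.getUint32(0)); // 256, i.e. 1 << 8 read back big endian
```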

As for your first example the offset passed to DataView is in bytes so this

var x = new ArrayBuffer(100)
var y = new DataView(x)
y.setUint32(0, 1)
y.setUint32(1, 2)
console.log(y.getUint32(0)) // 0 (incorrect)
console.log(y.getUint32(1)) // 2 (correct)

The first y.setUint32(0, 1) is setting the first 4 bytes of the buffer. Bytes 0, 1, 2, and 3. The second y.setUint32(1, 2) is setting bytes 1, 2, 3, 4. So you're overwriting bytes 1, 2, and 3.

var x = new ArrayBuffer(100)

// x is 100 bytes `[0, 0, 0, 0, 0, 0, 0 ....`

var y = new DataView(x)
y.setUint32(0, 1);  // default is big endian

// x is now [0, 0, 0, 1, 0, 0 ...

y.setUint32(1, 2)

//    offset1 --+ (then 4 bytes over written with big endian 2)
//              |  |  |  |
//              V  V  V  V
// x is now [0, 0, 0, 0, 2, 0, ...  BYTES!!!

console.log(y.getUint32(0)) // 0 (incorrect)

// this gets the first 4 bytes as a big endian Uint32 so it's doing
//
// result = x[0] << 24 + x[1] << 16 + x[2] << 8 + x[3]
//          0    << 24 + 0    << 16 + 0    << 8 + 0
//          0
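The walkthrough above can be checked directly by watching the raw bytes through a Uint8Array view over the same buffer (a small sketch; buffer shortened to 8 bytes):

```javascript
// Re-run the overlapping writes while inspecting the underlying bytes.
const x = new ArrayBuffer(8);
const y = new DataView(x);
const bytes = new Uint8Array(x);

y.setUint32(0, 1); // big endian: bytes 0-3 become 0, 0, 0, 1
console.log(Array.from(bytes.subarray(0, 5))); // [0, 0, 0, 1, 0]

y.setUint32(1, 2); // overwrites bytes 1-3, sets byte 4
console.log(Array.from(bytes.subarray(0, 5))); // [0, 0, 0, 0, 2]

console.log(y.getUint32(0)); // 0, because bytes 0-3 are now all zero
```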

Doing the last example

y.setUint32(0, 1)
y.setUint32(4, 2)

// x is now [0, 0, 0, 1, 0, 0, 0, 2, ....]

console.log(y.getUint32(0)) // 1

// this is reading the bytes 0 to 3
// x[0] << 24 + x[1] << 16 + x[2] << 8 + x[3]  
// 0    << 24 + 0    << 16 + 0    << 8 + 1    = 1 

console.log(y.getUint32(1)) // 256

// this is reading the bytes 1 to 4
// x[1] << 24 + x[2] << 16 + x[3] << 8 + x[4]  
// 0    << 24 + 0    << 16 + 1    << 8 + 0    = 256

console.log(y.getUint32(2)) // 65536

// this is reading the bytes 2 to 5
// x[2] << 24 + x[3] << 16 + x[4] << 8 + x[5]  
// 0    << 24 + 1    << 16 + 0    << 8 + 0    = 65536

console.log(y.getUint32(3)) // 16777216

// this is reading the bytes 3 to 6
// x[3] << 24 + x[4] << 16 + x[5] << 8 + x[6]  
// 1    << 24 + 0    << 16 + 0    << 8 + 0    = 16777216

console.log(y.getUint32(4)) // 2

// this is reading the bytes 4 to 7
// x[4] << 24 + x[5] << 16 + x[6] << 8 + x[7]  
// 0    << 24 + 0    << 16 + 0    << 8 + 2    = 2

Apply the same logic to the rest and hopefully your DataView issues are explained?

You said you wanted to print them as bytes, referencing the MDN diagram and showing lots of 0s and 1s; did you mean bits? You can print a number in bits like this:

const v = 0x3A;
console.log(v.toString(2));

or padded to 8 bits:

const v = 0x3A;
console.log(v.toString(2).padStart(8, '0'));

To print the whole buffer, assuming you have a Uint8Array view (if not, make one):

const v = new Uint8Array([0xDE, 0xAD, 0xBE, 0xEF, 0x21]);
console.log(asBits(v));

function asBits(a) {
  const nums = [];
  a.forEach(v => nums.push(v.toString(2).padStart(8, '0')));
  return nums.join('');
}
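To print just part of a buffer, subarray gives a view over a byte range without copying, so the same helper works on a slice (a sketch reusing asBits from above, repeated here so the snippet is self-contained):

```javascript
const v = new Uint8Array([0xDE, 0xAD, 0xBE, 0xEF, 0x21]);

function asBits(a) {
  const nums = [];
  a.forEach(x => nums.push(x.toString(2).padStart(8, '0')));
  return nums.join('');
}

// Print only bytes 1 and 2 (0xAD and 0xBE); subarray's end index is exclusive.
console.log(asBits(v.subarray(1, 3))); // "1010110110111110"
```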
