
javascript - Nodejs GPU.js slower using GPU than using CPU - Stack Overflow


I have run a benchmark to compare the use of the CPU and the GPU in Node.js with GPU.js. The NVIDIA icon shows GPU activity during the first console timer, but it is slower than the CPU (second timer).

const {GPU} = require('gpu.js');
const gpu = new GPU();

const multiplyMatrix = gpu.createKernel(function(a, b) {
    let sum = 0;
    for (let i = 0; i < 512; i++) {
        sum += a[this.thread.y][i] * b[i][this.thread.x];
    }
    return sum;
}).setOutput([512, 512]);

var a = [];
var b = [];
for (var i = 0; i < 512; i++) {
    a.push([]);
    b.push([]);
    for (var j = 0; j < 512; j++) {
        a[i].push(1);
        b[i].push(-1);
    }
}

console.time("gpu");
const c = multiplyMatrix(a, b);
console.timeEnd("gpu"); //2148ms

console.time("cpu");
var d = [];
for (var i = 0; i < 512; i++) {
    d.push([]);
    for (var j = 0; j < 512; j++) {
        let sum = 0;
        for (let k = 0; k < 512; k++) {
            sum += a[i][k] * b[k][j];
        }
        
        d[i].push(sum);
    }
}
console.timeEnd("cpu"); //710ms

Am I doing something clearly wrong?

asked Dec 19, 2020 at 13:38 by Eduardo Poço
  • No expert here, but from what I understand the GPU has big gains when calculations are kept parallel. sum += is not, so one idea is to use another array to store the results of each loop iteration and calculate the sum from that array. – Keith Commented Dec 19, 2020 at 13:47
  • Wow! I took this example from the GPU.js site, gpu.rocks. I thought it would run in parallel for each pair (i, j) – Eduardo Poço Commented Dec 19, 2020 at 14:06
  • You have a slow GPU or a fast CPU, or both. I have a 1070 and get ~80ms vs 350ms on the CPU. – Sergiu Paraschiv Commented Dec 19, 2020 at 14:17
  • Not sure; if it is on gpu.js, then I would expect it to know how to optimise += & **. Running your code on my machine I get 44ms & 239ms, so the GPU is running about 6 times faster. – Keith Commented Dec 19, 2020 at 14:52

1 Answer


This isn't the right way to benchmark CPU vs GPU:

  1. The GPU has warm-up time, so if you really want to benchmark, compare both of them over 1,000 executions rather than a single execution (see the sketch after this list).

  2. The GPU won't always be faster; it depends on the task and on the GPU's RAM size.

  3. Finally, as Keith mentioned in the comments, the GPU works better than the CPU on small parallel tasks and large batches.
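
A minimal sketch of point 1, reusing the multiplyMatrix kernel and the a/b matrices from the question; the iteration count (RUNS) is an arbitrary value chosen for illustration, not something prescribed by GPU.js:

// Warm-up call: the first invocation compiles the kernel and uploads the
// input matrices, so it should not be counted in the measurement.
multiplyMatrix(a, b);

const RUNS = 100; // arbitrary batch size for illustration

console.time("gpu x" + RUNS);
for (let n = 0; n < RUNS; n++) {
    multiplyMatrix(a, b);
}
console.timeEnd("gpu x" + RUNS);

Timing the CPU loop the same way (many runs, then comparing totals or averages) gives a fairer comparison, since the GPU's fixed setup cost is amortised over the whole batch.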
