
javascript - How to Use Dropbox Upload Session For Files Larger than 150mb? - Stack Overflow


I want to upload a file larger than 150 MB.

In the Dropbox API V2 docs, it says you should start an upload session.

The docs say you can't send a POST with more than 150 MB of data, but I'm unsure how to achieve that with the upload_session API.

asked Oct 18, 2016 at 17:00 by Edward

3 Answers


While any individual request shouldn't be larger than 150 MB (and typically you should use a significantly smaller chunk size), you can upload files larger than that by using multiple requests.

There's an example of using upload sessions below. That example uses the Python SDK rather than the JavaScript SDK, but it should still serve as a useful reference, since the logic is the same. (They both use the same underlying API.)

The example uses the Dropbox Python SDK to upload the local file specified by file_path to the remote path specified by dest_path. It also chooses whether or not to use an upload session based on the size of the file:

import os
import dropbox

# dbx is assumed to be an authenticated client, e.g. dbx = dropbox.Dropbox(ACCESS_TOKEN)

f = open(file_path, 'rb')
file_size = os.path.getsize(file_path)

CHUNK_SIZE = 4 * 1024 * 1024

if file_size <= CHUNK_SIZE:
    # Small file: a single files/upload call is enough
    print(dbx.files_upload(f.read(), dest_path))
else:
    # Large file: start an upload session with the first chunk
    upload_session_start_result = dbx.files_upload_session_start(f.read(CHUNK_SIZE))
    cursor = dropbox.files.UploadSessionCursor(session_id=upload_session_start_result.session_id,
                                               offset=f.tell())
    commit = dropbox.files.CommitInfo(path=dest_path)

    while f.tell() < file_size:
        if (file_size - f.tell()) <= CHUNK_SIZE:
            # Last chunk: finish the session and commit the file to dest_path
            print(dbx.files_upload_session_finish(f.read(CHUNK_SIZE),
                                                  cursor,
                                                  commit))
        else:
            # Intermediate chunk: append it and advance the cursor offset
            dbx.files_upload_session_append(f.read(CHUNK_SIZE),
                                            cursor.session_id,
                                            cursor.offset)
            cursor.offset = f.tell()

f.close()
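
Since the question is tagged javascript, here is a rough JavaScript counterpart of the same logic, written as a minimal sketch against the official Dropbox JavaScript SDK (the dropbox npm package). The access token, file paths and 8 MB chunk size are placeholders of mine, and depending on the SDK version the response may be the result object itself or wrapped in a result property, so the code checks both:

const fs = require('fs');
const { Dropbox } = require('dropbox'); // older SDK versions export the constructor directly instead

const dbx = new Dropbox({ accessToken: 'YOUR_ACCESS_TOKEN' }); // placeholder token
const CHUNK_SIZE = 8 * 1024 * 1024; // 8 MB per request, well under the 150 MB per-request limit

async function uploadLargeFile(filePath, destPath) {
  const fileBuffer = fs.readFileSync(filePath);

  if (fileBuffer.length <= CHUNK_SIZE) {
    // Small file: a single files/upload call is enough.
    return dbx.filesUpload({ path: destPath, contents: fileBuffer });
  }

  // Start the session with the first chunk.
  const start = await dbx.filesUploadSessionStart({
    close: false,
    contents: fileBuffer.slice(0, CHUNK_SIZE),
  });
  const sessionId = start.session_id || (start.result && start.result.session_id);

  let offset = CHUNK_SIZE;

  // Append every chunk except the last one.
  while (fileBuffer.length - offset > CHUNK_SIZE) {
    const chunk = fileBuffer.slice(offset, offset + CHUNK_SIZE);
    await dbx.filesUploadSessionAppendV2({
      cursor: { session_id: sessionId, offset: offset },
      close: false,
      contents: chunk,
    });
    offset += chunk.length;
  }

  // Finish with the last chunk and commit the file to destPath.
  return dbx.filesUploadSessionFinish({
    cursor: { session_id: sessionId, offset: offset },
    commit: { path: destPath, mode: 'add', autorename: true, mute: false },
    contents: fileBuffer.slice(offset),
  });
}

uploadLargeFile('./big-video.mp4', '/big-video.mp4')
  .then((res) => console.log('Uploaded:', res))
  .catch((err) => console.error(err));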

You can quickly upload file chunks using files/upload_session/start, files/upload_session/append_v2 and files/upload_session/finish API endpoints. Here is an example which uses my tiny dropbox v2 api wrapper (dropbox-v2-api):

const CHUNK_LENGTH = 100;
// create two read streams, each producing CHUNK_LENGTH (100) characters: '1' for the first chunk, '2' for the second
const firstUploadChunkStream = () => utils.createMockedReadStream('1', CHUNK_LENGTH); 
const secondUploadChunkStream = () => utils.createMockedReadStream('2', CHUNK_LENGTH);

sessionStart((sessionId) => {
    sessionAppend(sessionId, () => {
        sessionFinish(sessionId);
    });
});

function sessionStart(cb) {
    dropbox({
        resource: 'files/upload_session/start',
        parameters: {
            close: false
        },
        readStream: firstUploadChunkStream()
    }, (err, response) => {
        if (err) { return console.log('sessionStart error: ', err) }
        console.log('sessionStart response:', response);
        cb(response.session_id);
    });
}


function sessionAppend(sessionId, cb) {
    dropbox({
        resource: 'files/upload_session/append_v2',
        parameters: {
            cursor: {
                session_id: sessionId,
                offset: CHUNK_LENGTH
            },
            close: false,
        },
        readStream: secondUploadChunkStream()
    }, (err, response) => {
        if(err){ return console.log('sessionAppend error: ', err) }
        console.log('sessionAppend response:', response);
        cb();
    });
}

function sessionFinish(sessionId) {
    dropbox({
        resource: 'files/upload_session/finish',
        parameters: {
            cursor: {
                session_id: sessionId,
                offset: CHUNK_LENGTH * 2
            },
            commit: {
                path: "/result.txt",
                mode: "add",
                autorename: true,
                mute: false
            }
        }
    }, (err, response) => {
        if (err) { return console.log('sessionFinish error: ', err) }
        console.log('sessionFinish response:', response);
    });
}
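
For context, the snippet above assumes dropbox is an already-authenticated dropbox-v2-api instance, and utils.createMockedReadStream is just a test helper from that answer that emits CHUNK_LENGTH characters. A minimal setup sketch, assuming the wrapper's authenticate entry point and substituting fs.createReadStream byte ranges (my assumption, not part of the original answer) for real file chunks:

const fs = require('fs');
const dropboxV2Api = require('dropbox-v2-api');

// Authenticate the wrapper with an OAuth access token (placeholder value).
const dropbox = dropboxV2Api.authenticate({
    token: 'YOUR_ACCESS_TOKEN'
});

// For a real file, each chunk can be a byte-range read stream rather than a mocked one.
const CHUNK_LENGTH = 100;
const chunkStream = (index) => fs.createReadStream('./big-file.bin', {
    start: index * CHUNK_LENGTH,
    end: (index + 1) * CHUNK_LENGTH - 1 // `end` is inclusive
});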

I have an example!

testFile1Data = "test file data 1";
dbx.filesUploadSessionStart({
  contents: testFile1Data,
  close: true,
})
.then(function (response) {
  file1Start = response;
})
.catch(function (err) {
  console.log(err);
});

testFile2Data = "test file data 2";
dbx.filesUploadSessionStart({
  contents: testFile2Data,
  close: true,
})
.then(function (response) {
  file2Start = response;
})
.catch(function (err) {
  console.log(err);
});

dbx.filesUploadSessionFinishBatch({entries: [
    {cursor: {session_id: file1Start.session_id, offset: testFile1Data.length}, commit: {path: "/testFile1.txt"}},
    {cursor: {session_id: file2Start.session_id, offset: testFile2Data.length}, commit: {path: "/testFile2.txt"}},
  ]})
.then(function (response) {
  finishBatch = response;
})
.catch(function (err) {
  console.log(err);
});

dbx.filesUploadSessionFinishBatchCheck({async_job_id: finishBatch.async_job_id})
.then(function (response) {
  finishBatch = response
})
.catch(function (err) {
  console.log(err);
});

I got the example from an issue thread on GitHub: https://github.com/dropbox/dropbox-sdk-js/issues/80#issuecomment-283189888
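
One caveat with the snippet above: the calls are independent promises, so as written filesUploadSessionFinishBatch can run before the two filesUploadSessionStart responses (and their session_ids) come back, and filesUploadSessionFinishBatchCheck needs the async_job_id from the finish-batch response. A minimal re-sequencing sketch, under the same assumptions as the original code (same SDK methods, and a finish-batch job that completes asynchronously with an async_job_id):

Promise.all([
  dbx.filesUploadSessionStart({ contents: testFile1Data, close: true }),
  dbx.filesUploadSessionStart({ contents: testFile2Data, close: true }),
])
  .then(([file1Start, file2Start]) =>
    // Both session_ids are now available, so the batch commit can be built safely.
    dbx.filesUploadSessionFinishBatch({ entries: [
      { cursor: { session_id: file1Start.session_id, offset: testFile1Data.length }, commit: { path: '/testFile1.txt' } },
      { cursor: { session_id: file2Start.session_id, offset: testFile2Data.length }, commit: { path: '/testFile2.txt' } },
    ]})
  )
  // Poll the asynchronous batch job with the returned job id.
  .then((finishBatch) => dbx.filesUploadSessionFinishBatchCheck({ async_job_id: finishBatch.async_job_id }))
  .then((check) => console.log(check))
  .catch((err) => console.log(err));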
