
javascript - Handling large web uploads for photos and videos - Stack Overflow


My use case involves uploading thousands of full-quality photo and video files from the browser to S3 and Wasabi storage accounts. Currently we compress the files in the client's browser, using dropzonejs to handle the upload, so everything is compressed before it reaches the server.

However, that is what we need to change. We need to upload original-quality photos, and that is where we get stuck: we cannot upload files larger than 3-4 GB using Dropzonejs. We are not sure what prevents it, but we are struggling to find a solution. We also randomly hit a memory limit in Chrome that crashes the tab and forces us to restart the whole process. With original-quality photos we assume this will only get worse, since we will be dealing with at least 10 to 15 GB of data.

What would you recommend for this kind of use case, where we need to upload videos and photos in original quality? A single photo can run to 40 MB+, and a video to several GB.

How does Google Photos manage this? We need something like that.

asked Mar 28, 2022 at 14:27 by Krunal

5 Answers


Do not compress on the client side. It actually increases memory usage in the browser session. In my experience, uploading the original image from the browser uses the least memory, because the browser only has to read enough of the file to send the data, as long as you are not also loading the picture locally to show thumbnails.

I was able to upload GBs of images to S3 with client-side compression turned off, including a single 20 GB video file, and 200 videos totaling over 13 GB using S3 chunk upload. Chunk upload should increase, not decrease, browser memory usage; it was implemented to handle transmission failures for large files.
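To make the memory argument concrete, here is a minimal sketch of reading and sending a large File in fixed-size parts, so the browser never holds more than one part in memory at a time. The /upload-part endpoint and its query parameters are hypothetical placeholders, not part of any library:

```javascript
// Upload a File in fixed-size parts. Blob.slice() only creates a reference
// to a byte range; the browser reads each part from disk as it is sent.
const PART_SIZE = 8 * 1024 * 1024; // 8 MiB per part

async function uploadInParts(file) {
  const totalParts = Math.ceil(file.size / PART_SIZE);
  for (let part = 0; part < totalParts; part++) {
    const chunk = file.slice(part * PART_SIZE, (part + 1) * PART_SIZE);
    // Hypothetical endpoint: the server must reassemble parts by index.
    const res = await fetch(
      `/upload-part?name=${encodeURIComponent(file.name)}&part=${part}`,
      { method: "PUT", body: chunk }
    );
    if (!res.ok) throw new Error(`Part ${part} failed with HTTP ${res.status}`);
  }
}
```

A failed part can be retried individually, which is how chunking recovers from transmission failures without restarting a multi-GB upload.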

Chunking...

Someone already has a demo: https://github.com/dropzone/dropzone/blob/main/test/test-sites/2-integrations/aws-s3-multipart.html

But I think 4 GB is the maximum file size that Chrome will accept (and I think Chrome has the highest limit compared to other browsers), which means you need another method to upload, such as FTP, streaming, or SCP, or you have to ask your clients to slice the files themselves before uploading through their browser.

Or create a custom executable bundled with an S3 client, and let your clients use that.
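For the multipart route the demo above takes, the AWS SDK for JavaScript v3 also ships a managed helper that splits the file into parts and retries them for you. A minimal sketch, assuming the browser obtains temporary credentials (e.g. via Cognito/STS); the bucket, region, and sizes are placeholders:

```javascript
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

async function uploadLargeFile(file, credentials) {
  const upload = new Upload({
    client: new S3Client({ region: "us-east-1", credentials }),
    params: { Bucket: "my-upload-bucket", Key: file.name, Body: file },
    partSize: 10 * 1024 * 1024, // 10 MiB parts (S3's minimum is 5 MiB)
    queueSize: 4,               // number of parts in flight at once
  });
  upload.on("httpUploadProgress", (p) =>
    console.log(`${p.loaded} of ${p.total ?? "?"} bytes sent`)
  );
  await upload.done(); // completes (or aborts) the multipart upload
}
```

Because each part is sliced from the File on demand, memory stays bounded by roughly partSize × queueSize rather than by the full file size.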

dropzonejs supports chunking and parallel uploads; did you use them? And do you compress files with dropzone like this: https://stackoverflow.com/a/51699311/18399373?
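For reference, a minimal sketch of turning those options on; the option names come from Dropzone 5's chunking support, while the element, URL, and sizes are placeholders:

```javascript
const dz = new Dropzone("#uploader", {
  url: "/upload",               // placeholder endpoint that must reassemble chunks
  maxFilesize: 20 * 1024,       // in MB; raises Dropzone's default cap
  chunking: true,               // split each file into chunks
  forceChunking: true,          // chunk even files smaller than chunkSize
  chunkSize: 10 * 1024 * 1024,  // bytes per chunk
  parallelChunkUploads: true,   // send several chunks concurrently
  retryChunks: true,            // retry failed chunks instead of the whole file
  retryChunksLimit: 3,
});
```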

You can get a huge performance improvement by writing your client to upload directly to S3, using presigned URL(s) so the upload skips your server as the middleman (a client-side sketch follows the links):

S3: https://docs.aws.amazon.com/AmazonS3/latest/userguide/PresignedUrlUploadObject.html

GCP: https://cloud.google.com/blog/products/storage-data-transfer/uploading-images-directly-to-cloud-storage-by-using-signed-url
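On the client, the whole flow reduces to two requests: fetch a signed URL from your backend, then PUT the file straight to storage. A minimal sketch, where the /api/presign endpoint and its response shape are hypothetical:

```javascript
async function uploadViaSignedUrl(file) {
  // Ask our backend (hypothetical endpoint) to presign a PUT for this key.
  const resp = await fetch(`/api/presign?key=${encodeURIComponent(file.name)}`);
  const { url } = await resp.json();

  // PUT the File directly to S3; the browser streams it from disk,
  // so even a multi-GB file is never buffered in JavaScript.
  const put = await fetch(url, {
    method: "PUT",
    headers: { "Content-Type": file.type },
    body: file,
  });
  if (!put.ok) throw new Error(`Upload failed with HTTP ${put.status}`);
}
```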

I would recommend using presigned URLs from S3. In our project we generate a presigned URL from the bucket name, the object path, upload permission on the bucket, and an expiry time. The user can then upload the file directly to S3. AWS takes care of all the networking issues; the only condition is that you have a good internet connection.
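The server side of that is a few lines with the AWS SDK v3 presigner. A minimal Node.js sketch, with bucket, region, and expiry as placeholder values; Wasabi works the same way if you point the client at its S3-compatible endpoint:

```javascript
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

// For Wasabi, add e.g. endpoint: "https://s3.wasabisys.com" to the config.
const s3 = new S3Client({ region: "us-east-1" });

async function presignUpload(key) {
  const command = new PutObjectCommand({ Bucket: "my-upload-bucket", Key: key });
  // The returned URL embeds the signature and stops working after expiresIn seconds.
  return getSignedUrl(s3, command, { expiresIn: 900 });
}
```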
