
javascript - Delete more than 1000 records at a time from S3 bucket - Stack Overflow


I need to delete more than 1000 records from an S3 bucket, but the promise rejects and I get the error mentioned below.

 async function removeS3Objects(audios) {
    //more than 1000 records are in audios
    audios = audios.map(item => {
      return {
        Key: item.Key
      };
    });
    let params = {
      Bucket: config.get("s3BucketName"),
      Delete: {
        Objects: audios,
        Quiet: false
      }
    };
    return new Promise((resolve, reject) => {
      s3.deleteObjects(params, function (err, data) {
        if (err) {
          reject(err);
        } else {
          resolve(data);
        }
      });
    });
  }

I get this error while deleting:

the xml you provided was not well-formed or did not validate against our published schema

asked May 5, 2020 at 6:35 by Kishan Jaiswal

2 Answers

You cannot delete more than 1000 objects in one API call.

Straight from the docs:

The request contains a list of up to 1000 keys that you want to delete

If you want to delete more than 1000 keys you'll need to make multiple calls.
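One way to sketch those multiple calls is to split the keys into batches of at most 1000 and issue one DeleteObjects request per batch. The `chunk` helper and the batching loop below are my own illustration, not from the answer; the sketch assumes an AWS SDK for JavaScript v2 style client whose `deleteObjects(params)` returns an object with a `.promise()` method, and it takes the bucket name and client as parameters rather than reading them from `config` as the question's code does.

```javascript
// Split an array into batches of at most `size` elements.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Delete all keys in `audios` from `bucket`, at most 1000 per API call
// (the DeleteObjects limit). Returns the per-batch responses.
async function removeS3Objects(s3, bucket, audios) {
  const keys = audios.map(item => ({ Key: item.Key }));
  const results = [];
  for (const batch of chunk(keys, 1000)) {
    const params = {
      Bucket: bucket,
      Delete: { Objects: batch, Quiet: false }
    };
    results.push(await s3.deleteObjects(params).promise());
  }
  return results;
}
```

With 2500 keys this issues three calls (1000 + 1000 + 500); the batches run sequentially here, which is simpler but slower than firing them in parallel with `Promise.all`.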

May or may not fit your use case, but it did in mine: generate a simple CSV file of all the keys you want to delete. The format of the CSV file's columns is:

bucket-name|key

Create a batch job with the operation type 'Replace all object tags' and place a tag such as delete=true on these objects. Monitor the batch job in AWS; it should not be too complicated.

After that, create a Lifecycle policy to expire current/non-current versions (again, based on your use case) of objects in your S3 bucket, and let it sit for 1-2 days. And poof: all hands-off, all serverless, all objects deleted!

My organization used this technique to delete 36 million and 30 million objects from 2 different buckets. It's not an immediate solution, but at this scale we wanted a cheap, error-free, hands-off system instead of something manual.
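A lifecycle rule for the approach above might look roughly like this (a sketch only: the rule ID is made up, the `delete=true` tag matches the example tag in this answer, and the 1-day expiration values are assumptions you would tune to your own retention needs):

```json
{
  "Rules": [
    {
      "ID": "expire-tagged-for-deletion",
      "Status": "Enabled",
      "Filter": { "Tag": { "Key": "delete", "Value": "true" } },
      "Expiration": { "Days": 1 },
      "NoncurrentVersionExpiration": { "NoncurrentDays": 1 }
    }
  ]
}
```

The tag filter is what limits the rule to the objects the batch job tagged, so untagged objects in the bucket are left alone.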
