
javascript - Fetching image from URL and uploading to another via POST in NodeJS - Stack Overflow

In the following snippet, I'm using node-fetch and form-data to first retrieve an image file from a remote URL and then upload it to an S3 bucket (using aws-sdk and multer in a different script):

import fetch from 'node-fetch';
import fs from 'fs';
import FormData from 'form-data';

// Note: this form is created once at module level, so repeated calls to
// processProfileImg keep appending to the same form instance
const form = new FormData();

const processProfileImg = (imageURL, userID) => {
  fetch(imageURL, userID) // fetch's second argument is an options object, so userID is effectively ignored here
    .then((response) => {
      const dest = fs.createWriteStream(`./temp/${userID}.jpg`);
      response.body.pipe(dest); // pipe() starts the write but does not wait for it to finish
    })
    .then((dest) => {
      // dest is undefined here: the previous .then() returns nothing, and the
      // write above may still be in progress at this point
      form.append('profileImage', fs.createReadStream(`./temp/${userID}.jpg`));
      fetch(`https://www.schandillia./upload/profile-image?userID=${userID}`, { method: 'POST', body: form })
        .then(response => response.json())
        .then(json => console.log(json));
    });
};

export default processProfileImg;

The problem is that this involves an intermediate step: the file is first stored locally upon retrieval, before form-data picks it up for the POST. Is there any way to bypass this step entirely? I don't want to save the file locally; I just want to pull it from the remote URL and POST it to the upload route without ever creating a local file.

Update: After slightly modifying the snippet to implement the suggestion from Fransebas (first answer) and avoid async issues, I'm running into a new problem: the image being saved locally is fine, but the copy being uploaded to S3 is partially cut off!
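For reference, a minimal sketch of one way to serialize those steps, reusing the names and URL from the snippet above: wrap the pipe in a Promise and only read the temp file back once the write stream has emitted 'finish'. This is an illustration of the sequencing, not the exact code from the update.

import fetch from 'node-fetch';
import fs from 'fs';
import FormData from 'form-data';

const processProfileImg = (imageURL, userID) =>
  fetch(imageURL)
    .then((response) => new Promise((resolve, reject) => {
      const dest = fs.createWriteStream(`./temp/${userID}.jpg`);
      response.body.pipe(dest);
      dest.on('finish', resolve); // continue only once the file is fully written
      dest.on('error', reject);
    }))
    .then(() => {
      const form = new FormData();
      form.append('profileImage', fs.createReadStream(`./temp/${userID}.jpg`));
      return fetch(`https://www.schandillia./upload/profile-image?userID=${userID}`, { method: 'POST', body: form });
    })
    .then(response => response.json())
    .then(json => console.log(json));

export default processProfileImg;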

Additional code: the route that handles the POST upload, https://www.schandillia./upload/profile-image, is as follows. It works well when I upload a file through Postman.

import dotenv from 'dotenv';
import express from 'express';
import aws from 'aws-sdk';
import multerS3 from 'multer-s3';
import multer from 'multer';
import path from 'path';

dotenv.config();
const router = express.Router();

// Set up S3
const s3 = new aws.S3({
  accessKeyId: process.env.IAM_ACCESS_KEY_ID,
  secretAccessKey: process.env.IAM_SECRET_ACCESS_KEY,
});

const checkFileType = (file, cb) => {
  // Allowed ext
  const filetypes = /jpeg|jpg/;
  // Check ext
  const extname = filetypes.test(path.extname(file.originalname).toLowerCase());
  // Check mime
  const mimetype = filetypes.test(file.mimetype);
  if (mimetype && extname) {
    return cb(null, true);
  }
  return cb('Error: JPEG Only!');
};

// Single Upload
const profileImgUpload = multer({
  storage: multerS3({
    s3,
    contentType: multerS3.AUTO_CONTENT_TYPE,
    bucket: `${process.env.S3_BUCKET_NAME}/w`,
    acl: 'public-read',
    key(req, file, cb) {
      cb(null, req.query.userID + path.extname(file.originalname));
    },
  }),
  limits: { fileSize: 2000000 }, // In bytes: 2000000 bytes = 2 MB
  fileFilter(req, file, cb) {
    checkFileType(file, cb);
  },
}).single('profileImage');

router.post('/profile-image', (req, res) => {
  profileImgUpload(req, res, (error) => {
    if (error) {
      console.log('errors', error);
      res.json({ error });
    } else if (req.file === undefined) {
      // If File not found
      console.log('Error: No File Selected!');
      res.json('Error: No File Selected');
    } else {
      // If Success
      const imageName = req.file.key;
      const imageLocation = req.file.location;
      // Save the file name to the profile model in the database
      res.json({
        image: imageName,
        location: imageLocation,
      });
    }
  });
});
// End of single profile upload

// We export the router so that the server.js file can pick it up
module.exports = router;
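
For context, a minimal sketch of how a server.js file might pick this router up; the mount path here is an assumption inferred from the POST URL used in the question, not code from the original post.

import express from 'express';
import profileImgRouter from './routes/upload'; // assumed file location

const app = express();

// Mounting at /upload would expose the route above as POST /upload/profile-image
app.use('/upload', profileImgRouter);

app.listen(3000, () => console.log('Server listening on port 3000'));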

Asked Jul 29, 2019 at 17:36 by TheLearner; edited Jul 30, 2019 at 7:14.
  • It will have to be stored locally one way or another. You can store it in memory to eliminate the file I/O, but that's about it. This question talks about using multer to do something seemingly similar to what you're doing. – Heretic Monkey, Jul 29, 2019 at 17:39
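
For illustration, a minimal sketch of the in-memory approach that comment describes, using multer.memoryStorage() so the upload is held in a Buffer and never written to disk. The s3.upload call and the key scheme are assumptions patterned on the route code above, not code from the thread.

import aws from 'aws-sdk';
import express from 'express';
import multer from 'multer';

const router = express.Router();
const s3 = new aws.S3();

// memoryStorage keeps the uploaded file in req.file.buffer instead of a temp file
const memoryUpload = multer({ storage: multer.memoryStorage() }).single('profileImage');

router.post('/profile-image-memory', (req, res) => {
  memoryUpload(req, res, (error) => {
    if (error || req.file === undefined) {
      return res.json({ error: error || 'Error: No File Selected' });
    }
    // req.file.buffer holds the image bytes entirely in memory
    return s3.upload({
      Bucket: process.env.S3_BUCKET_NAME, // same env var as the route above
      Key: `${req.query.userID}.jpg`,     // assumed key scheme
      Body: req.file.buffer,
    }, (uploadError, data) => {
      if (uploadError) return res.json({ error: uploadError });
      return res.json({ image: `${req.query.userID}.jpg`, location: data.Location });
    });
  });
});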

2 Answers


I haven't used that specific way of sending data (I prefer ajax), but looking at your example, I suppose you can skip saving the image locally. Notice that fs.createReadStream creates a read stream, so look for a way to create a read stream from what you already have.

Also, I think you should put your sending code inside the then so you don't run into async problems. For example, if the code that sends the data lives inside the then, you can use response.body directly as the stream.

You almost got it, but you're still using the file. I think you can achieve it with something more like this:

import fetch from 'node-fetch';
import FormData from 'form-data';

const processProfileImg = (imageURL, userID) => {
  fetch(imageURL)
    .then((response) => {
      // Use response.body directly, it contains the image right?
      const form = new FormData(); // fresh form per call, so repeated calls don't accumulate fields
      form.append('profileImage', response.body);
      fetch(`https://www.schandillia./upload/profile-image?userID=${userID}`, { method: 'POST', body: form })
        .then(response => response.json())
        .then(json => console.log(json));
    });
};

export default processProfileImg;

If I understand the documentation of fetch correctly, response.body is already a stream.
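
One caveat worth flagging: unlike fs.createReadStream, a bare HTTP response stream carries no file path, so form-data has no filename to attach to the part, and the route above derives both its extension check and its S3 key from file.originalname. Passing the part options explicitly may help; a minimal sketch, where the filename and content type are assumptions:

// Name the part explicitly so path.extname(file.originalname) on the
// server has something to read; 'image/jpeg' is an assumption here
form.append('profileImage', response.body, {
  filename: `${userID}.jpg`,
  contentType: 'image/jpeg',
});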

This works for me:

const axios = require('axios');
const FormData = require('form-data');

// Wrapped in an async function so the awaits below are valid in CommonJS
const uploadImageFromUrl = async (imageUrl, serverUrl) => {
  // Get the image as a Buffer; 'arraybuffer' keeps the binary data intact
  const imageResponse = await axios({
    url: imageUrl,
    method: 'GET',
    responseType: 'arraybuffer',
  });

  // Create form data; filename and contentType are set explicitly because
  // a Buffer carries no file metadata of its own
  const form = new FormData();
  form.append('image', imageResponse.data, {
    contentType: 'image/jpeg',
    filename: 'imageFileName.jpg',
  });

  // Submit form; form.getHeaders() supplies the multipart boundary without
  // reaching into the private form._boundary field
  const result = await axios({
    url: serverUrl,
    method: 'POST',
    data: form,
    headers: form.getHeaders(),
  });
  return result.data;
};
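
Hypothetical usage of the wrapper above; both URLs are placeholders:

// Placeholder URLs, for illustration only
uploadImageFromUrl('https://example.com/source.jpg', 'https://example.com/upload/profile-image')
  .then(result => console.log(result))
  .catch(err => console.error(err.message));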