
javascript - NodeJS fetch failed (object2 is not iterable) when uploading file via POST request - Stack Overflow


I'm trying to upload a file using native fetch in NodeJS (added in Node 17.5, see https://nodejs.org/ko/blog/release/v17.5.0/).

However, I keep getting the following error -

TypeError: fetch failed
at Object.processResponse (node:internal/deps/undici/undici:5536:34)
at node:internal/deps/undici/undici:5858:42
at node:internal/process/task_queues:140:7
at AsyncResource.runInAsyncScope (node:async_hooks:202:9)
at AsyncResource.runMicrotask (node:internal/process/task_queues:137:8)
at process.processTicksAndRejections (node:internal/process/task_queues:95:5) {
cause: TypeError: object2 is not iterable
at action (node:internal/deps/undici/undici:1660:39)
at action.next (<anonymous>)
at Object.pull (node:internal/deps/undici/undici:1708:52)
at ensureIsPromise (node:internal/webstreams/util:172:19)
at readableStreamDefaultControllerCallPullIfNeeded (node:internal/webstreams/readablestream:1884:5)
at node:internal/webstreams/readablestream:1974:7
at process.processTicksAndRejections (node:internal/process/task_queues:95:5)


I'm using the following code to create and submit the form request -

async function upload(hub_entity_id, document_path) {
  let formData = new FormData();
  formData.append("type", "Document");
  formData.append("name", "ap_test_document.pdf");
  formData.append("file", fs.createReadStream("ap_test_document.pdf"));
  formData.append("entity_object_id", hub_entity_id);

  const form_headers = {
    Authorization: auth_code,
    ...formData.getHeaders(),
  };

  console.log(
    `Uploading document ap_test_document.pdf to hub (${hub_entity_id}) `
  );
  console.log(formData);

  let raw_response = await fetch(urls.attachments, {
    method: "POST",
    headers: form_headers,
    body: formData,
  });

  console.log(raw_response);
}

Asked Jun 10, 2022 at 3:32 by Harrison Broadbent; edited Jun 10, 2022 at 5:29.
  • Looks like it might be a problem with fs.createReadStream(). Why do you have two of them? Have you checked that the file path is correct? – Phil Commented Jun 10, 2022 at 4:13
  • @Phil my bad, shouldn't have left that in. Took that out and I'm still getting that error, so I've updated the original question. Yep, checked the file path and it looks ok. – Harrison Broadbent Commented Jun 10, 2022 at 5:30
  • What version of nodeJS are you using? – Ukor Commented Jun 10, 2022 at 12:00
  • @Ukor this is in node v18.5 – Harrison Broadbent Commented Jun 12, 2022 at 0:55
  • 1 Hey @HarrisonBroadbent, were you able to solve this issue? I am having the same error and I am not sure how to fix it. – Delali Commented Dec 6, 2022 at 20:14

5 Answers


Issue with the form-data package:

The form-data object is not iterable in the way Node.js's fetch (undici) expects, so it throws: object2 is not iterable.

On the other hand, the sad story is that form-data will not be maintained anymore; you may have noticed that two years have passed since the last version was published. It was officially announced that form-data will be archived: The final nail in the coffin of form-data.

Will this be the time for deprecation? form-data hasn't been updated in a while, and it still lacks some methods that should be provided according to the spec. node-fetch@3 stopped recommending form-data due to its inconsistency with a spec-compliant FormData, and recommends using the built-in FormData or a spec-compliant formdata polyfill that supports iterating over all fields and has Blob/File support.


Solutions

1. Using the form-data package:

So, in simple words, we need to somehow convert the form-data stream into a Node.js stream. This can be done with the help of some stream classes, as follows:

stream.Transform:

With the stream.Transform class from the Node.js stream module, we can pipe the form-data instance through it and send the request with the built-in fetch API.

From the Node.js docs:

Transform streams are Duplex streams where the output is in some way related to the input. Like all Duplex streams, Transform streams implement both the Readable and Writable interfaces.

So we can achieve it like this:

import { Transform } from 'stream';

// rest of code

const tr = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk);
  },
});
formData.pipe(tr);

const request = new Request(url, {
  method: 'POST',
  headers: form_headers,
  duplex: 'half',
  body: tr
})

let raw_response = await fetch(request);

stream.PassThrough:

Instead of returning each chunk of the stream ourselves, we can simply use stream.PassThrough:

import { PassThrough } from 'stream';

// rest of code

const pt = new PassThrough()
formData.pipe(pt);

const request = new Request(url, {
  method: 'POST',
  headers: form_headers,
  duplex: 'half',
  body: pt
})

let raw_response = await fetch(request);

Important note: If you don't pass duplex: 'half', you would get this error:

duplex option is required when sending a body

2. Using the built-in FormData

Currently, the part of the Node.js core that handles fetch is named undici.

Luckily, you don't need any third-party module for handling form data, since undici has implemented FormData and it is now part of Node.js core.

Sadly, working with streams in this part of undici is not easy and straightforward. However, you can still achieve it:

import fs from 'fs';
import { Readable } from 'stream';
// For now, this is essential for encoding the multipart header; you can skip
// importing this module and implement the encoding yourself instead.
import { FormDataEncoder } from 'form-data-encoder';

// This is a built-in FormData class, as long as you're using Node.js version 18.x and above, 
// no need to import any third-party form-data packages from NPM.
const formData = new FormData();
formData.set('file', {
  name: 'ap_test_document.pdf',
  [Symbol.toStringTag]: 'File',
  stream: () => fs.createReadStream(filePath) 
});

const encoder = new FormDataEncoder(formData)

const request = new Request(url, {
  method: 'POST',
  headers: {
    'content-type': encoder.contentType,
    Authorization: auth_code
  },
  duplex: 'half',
  body: Readable.from(encoder)
})
let raw_response = await fetch(request);

P.S.: You may need to read this issue for the part about encoding.

The usage of

new Request(url, {
  method: 'POST',
  headers: ...,
  duplex: 'half',
  body: ...
})

in Mostafa Fakhraei's answer is necessary to avoid errors caused by the absence of a size attribute in the formData.set statement. See also this GitHub issue, including the comment which considers this approach a misuse of FormData.

Why would this be a misuse of FormData?

I believe this is because including a file stream in an HTTP request in this way leads to a chunked request, which many servers do not support, as explained here. Transfer-Encoding: chunked is generally supported only for responses, not for requests.

The legitimate use

fetch(new Request(url, {
  method: 'POST',
  body: formData
}));

sets the Content-Length header based on the formData's size attribute and then reads and sends the file chunk-by-chunk (without keeping the entire file in memory). Thanks to the presence of the Content-Length header, this still produces a non-chunked request, as demonstrated by this program:

import fs from "fs";
import http from "http";

var formData = new FormData();
var stream = fs.createReadStream(filepath); // filepath: path to the file being tested
stream.on("data", function(chunk) {
  console.log("<", chunk.length);
});
formData.append("file", {
  [Symbol.toStringTag]: "File",
  size: fs.statSync(filepath).size, // size is necessary
  stream: () => stream
});
var r = new Request("http://localhost:8080", {
  method: "POST",
  body: formData
});
http.createServer(function(req, res) {
  console.log("Transfer-Encoding", req.headers["transfer-encoding"]);
  req.on("data", function(chunk) {
    console.log(">", chunk.length);
  });
})
.listen(8080, function() {
  fetch(r);
});

This logs something like

< 65536
< 65536
< 65536
Transfer-Encoding undefined
> 65254 multipart header Content-Disposition: form-data; name="file"
> 65536
> 65536
> 411
< 65536
< 65536
> 65536
> 65536
... many more such "two-in/two-out quartets"
< 9633 the last chunk
> 9633
> 44 the last line of the multipart body

But this works only if formData has the size attribute; otherwise, it gives an "invalid content-length header" error.

This error can be circumvented if fetch(r) is replaced by

fetch(r.url, r);

This variant does not need the size, but it always produces a Transfer-Encoding: chunked request without a Content-Length header, whether size is given or not, even if the given size is small: the final multipart line

------formdata-undici-0.61749873xxx56806--

is then always a Transfer-Encoding chunk of its own.

I have also faced this type of formData issue in a React app. Using single quotes instead of double quotes resolved the issue for me.

Try this:

formData.append('type', "Document");

formData.append('name', "ap_test_document.pdf");

formData.append('file', fs.createReadStream("ap_test_document.pdf"));

formData.append('entity_object_id', hub_entity_id);

Let me know if this solved your problem or not.

Full credit to Mostafa Fakhraei for the example, but for anyone who needs a full example, especially with something like formidable and Next.js, see the following code. I spent a few hours trying to figure this out even with the help above, so I thought I'd post the full thing.

Note that, at least for me, the duplex property on fetch was giving my IDE a type error. If you're on Node 18+, don't worry about it and add a // @ts-ignore

import FormData from 'form-data';
import formidable from 'formidable';
import fs from 'fs';
import type { NextApiRequest, NextApiResponse } from 'next';
import { Transform } from 'stream';


const avatarUpload = async (req: NextApiRequest, res: NextApiResponse) => {
    try {
        const parseFile = () =>
            new Promise<formidable.Files>((resolve, reject) => {
                const form = new formidable.IncomingForm();
                form.parse(req, async (err, _fields, files) => {
                    if (err) {
                        reject(err);
                    }
                    resolve(files);
                });
            });

        const files = await parseFile();
        const formidableFile = Array.isArray(files.file) ? files.file[0] : files.file;
        const file = fs.createReadStream(formidableFile.filepath);

        const formData = new FormData();
        formData.append('file', file);


        const tr = new Transform({
            transform(chunk, _encoding, callback) {
                callback(null, chunk);
            },
        });
        formData.pipe(tr);

        // this is for cloudflare images api. you can use any other image upload api
        const imageReq = await fetch('https://api.cloudflare.com/client/v4/accounts/23a1b30d95b2ebe5e7f5fce83318994c/images/v1', {
            headers: {
                Authorization: `Bearer ${process.env.CLOUDFLARE_IMAGE_API_KEY}`,
                ...formData.getHeaders(),
            },
            method: 'POST',
            body: tr as any,
            duplex: 'half',
        });
        // do what you want with the imageObject here

        const imageObject = (await imageReq.json()).result;


        return res.status(200).json({ imageObject });
    } catch (err) {
        console.error(err);
        return res.status(500).json({ error: 'Something went wrong' });
    }
};

export default function handler(req: NextApiRequest, res: NextApiResponse) {
    if (req.method === 'POST') {
        avatarUpload(req, res);
    } else {
        // Handle any other HTTP method
    }
}

export const config = {
    api: {
        bodyParser: false,
    },
};

Use fs.readFileSync instead of createReadStream:

formData.append("file", fs.readFileSync("ap_test_document.pdf"));

Also, you can pass the filename directly in formData.append:

formData.append("file", fs.readFileSync("ap_test_document.pdf"), "ap_test_document.pdf");
