I am creating an application that allows users to upload a CSV. The CSV is just a comma-separated file.
On the frontend, I send the file to my backend with a POST request:
Reactjs | fileupload_frontend.js
sendFile = () => {
  const data = new FormData();
  data.append('file', this.myCSV_file);
  axios.post('/parse-csv', data)
    .then(response => console.log('File sent to server for parsing'))
    .catch(error => console.log(error));
}
The above code successfully sends the uploaded CSV to my server, which is listening at the /parse-csv endpoint. I am using the csv-parser npm package to help with this.
Nodejs | fileupload_backend.js
const csv = require('csv-parser');
const fs = require('fs');
const results = [];
app.post('/parse-csv', (request, response) => {
  fs.createReadStream(request.files.file.data) // this is a buffer on the req obj
    .pipe(csv())
    .on('data', (data) => results.push(data))
    .on('end', () => {
      console.log(results);
    });
});
For reference, the request object from the frontend to the backend looks like:
Request from upload { name: 'data.csv',
data: <Buffer 22 89 56 24 5y 86 k9 22 . ... >,
encoding: '7bit',
truncated: false,
mimetype: 'text/csv',
md5: [Function: md5],
mv: [Function: mv] }
It is erroring out on each request with a message that seems to be related to my header row (the first row in the CSV file), with a message that reads:
Error: ENOENT: no such file or directory, open 'my,header,row,is,referenced,here'
...then the data is shown below.
Do I have to save the CSV file in some directory, then parse it first? I am just looking for a way to parse the CSV into json, so I can insert each row as an entry into my database.
asked Oct 29, 2018 by bruh

Comments:
- What is the expected size of the CSV? It may be easier to do it all in memory - no streams. Looking at the source of parse-csv, it seems overly complicated – niry, Oct 29, 2018
- @niry The file is 0.62kb – bruh, Oct 29, 2018
- @bruh why don't you parse it on the frontend? – Abhishek Mani, Oct 29, 2018
- @AbhishekMani would I do this using the same csv-parser package, except in my React code instead of Node? Is there some native JavaScript function to parse CSVs? Forgive me if I'm missing the obvious; I've been struggling with this for 10 hours now and am just running out of steam – bruh, Oct 29, 2018
- There is a module by which you can parse your CSV content on the frontend easily; do let me know if you are stuck using that library – Abhishek Mani, Oct 29, 2018
3 Answers
fs.createReadStream accepts a path as its argument, which can be a string, Buffer or URL. Since you pass a Buffer, it tries to open the Buffer as a path, so it errors with
Error: ENOENT: no such file or directory, open 'my,header,row,is,referenced,here'
You need to create a stream from the Buffer before piping it to the CSV parser. There are various ways to go about this; implementing a Readable stream that wraps the buffer is one. e.g.
const { Readable } = require('stream');

class BufferStream extends Readable {
  constructor(buf, opts) {
    super(opts);
    this.buf = buf;
  }
  _read(size) {
    this.push(this.buf);
    this.push(null); // signal end of stream
  }
}
Now implement the request handler to use this class.
app.post('/parse-csv', (request, response) => {
  const results = [];
  const bufStream = new BufferStream(request.files.file.data);
  bufStream.pipe(csv())
    .on('data', (data) => results.push(data))
    .on('end', () => {
      console.log(JSON.stringify(results));
      response.sendStatus(200);
    });
});
Take a look at the following package, https://www.npmjs.com/package/xlsx. It will allow you to parse CSV and most spreadsheet file formats interchangeably.
You can use it in the following way:
const XLSX = require('xlsx')
const workbook = XLSX.readFile(source_file_path)
const sheetName = workbook.SheetNames[0]
const sheet = workbook.Sheets[sheetName]
const obj = XLSX.utils.sheet_to_json(sheet)
This should work for any spread sheet format supported by the library.
This does mean that the file must be saved somewhere in your directory structure in order for XLSX.readFile to work.
This is my solution. It works with the buffer directly and doesn't need to save a file; just install the "csv-parse" module first.
const { parse } = require('csv-parse/sync');
const data = parse(request.files.file.data);
and callback version
const csv = require('csv-parse');
csv.parse(request.files.file.data, function (err, data) {
  console.log('my_data', data);
});
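For a file as small and simple as the one described (0.62kb, no quoted fields containing commas or newlines), a dependency-free conversion from buffer to JSON-ready row objects is also possible. This is a sketch under those assumptions, and csvBufferToRows is a hypothetical helper name, not part of any library:

```javascript
// Minimal CSV-to-objects conversion for simple files: assumes the first
// row is a header and that no field contains quoted commas or newlines.
function csvBufferToRows(buf) {
  const lines = buf.toString('utf8').trim().split(/\r?\n/);
  const headers = lines[0].split(',');
  return lines.slice(1).map(line => {
    const values = line.split(',');
    // Zip each header with the value in the same column.
    return Object.fromEntries(headers.map((h, i) => [h, values[i]]));
  });
}
```

Each element of the returned array is a plain object keyed by the header row, e.g. csvBufferToRows(request.files.file.data), ready to be inserted as a database entry.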