
javascript - Read lines of a txt file and organize in a JSON file - Stack Overflow


I have a text file where each line is separated into 4 categories by colons and I want to put this into a JSON file, where each category is a value to the corresponding name in the JSON file.

Example data.txt file:

Date1:cat1:dog1:bug1
Date2:cat2:dog2:bug2
Date3:cat3:dog3:bug3

Example JSON file:

{
  "Date1": {
    "cat": "cat1",
    "dog": "dog1",
    "bug": "bug1"
  },
  "Date2": {
    "cat": "cat2",
    "dog": "dog2",
    "bug": "bug2"
    ...
  ...
}

I've never used JSON before, but I think that's how to format it. How would I split each line using the colons as markers for the next value and store it in a JSON file under the correct name using JavaScript and Node.js?

asked Jul 31, 2018 at 23:35 by E tom, edited Aug 1, 2018 at 2:08 by Rick

3 Answers


Use the csv package if you don't want to handle parsing the CSV file yourself.

const fs = require("fs");
const csv = require("csv");

const result = {};
const keys = ["cat", "dog", "bug"];

// Read data
const readStream = fs.createReadStream("yourfile.txt");

// Parser
const parser = csv.parse({ delimiter: ":" });

parser.on("data", (chunk) => {
result[chunk[0]] = {};
    for(let i = 1; i < chunk.length; i ++) {
        result[chunk[0]][keys[i - 1]] = chunk[i];
    }
});

parser.on("end", () => {
    console.log(result);
});

readStream.pipe(parser);
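
If you also want the result written out as a JSON file (which is what the question ultimately asks for), a minimal sketch of an end handler could look like the following; the filename output.json is only an example, not part of the original answer:

parser.on("end", () => {
    // Pretty-print the collected object and save it to disk.
    // "output.json" is an assumed example filename.
    fs.writeFile("output.json", JSON.stringify(result, null, 2), "utf8", (err) => {
        if (err) throw err;
    });
});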

If your JSON has this defined structure you can go about it with the following code:

import * as fs from 'fs';

/* If you have a large file this is a bad idea; refer to reading from a
 * stream as in zhangjinzhou's answer.
 */
const file = fs.readFileSync('path/to/data.txt', 'utf8');

const json = file.split(/\n|\r\n/).map(line => {
  const values = line.split(":");
  let obj = {};

  obj[values[0]] = {
    cat: values[1],
    dog: values[2],
    bug: values[3],
  };

  return obj;
}).reduce((acc, current) => Object.assign(acc, current), {});
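
The resulting json object can then be serialized and written to disk in the same way as in the other answers; a one-line sketch, assuming an example output path of path/to/data.json:

// Pretty-print and save; the path is only an example.
fs.writeFileSync('path/to/data.json', JSON.stringify(json, null, 2), 'utf8');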

Using RegExp and Array#forEach, convert the string to lines, then iterate over them and fill up the object with the corresponding data via:

const dataFileContent = 
`Date1:cat1:dog1:bug1
Date2:cat2:dog2:bug2
Date3:cat3:dog3:bug3`;

function processData(data) {
  // convert to lines
  const lines = data.match(/[^\r\n]+/g) || [];  
  const object = {};
  
  // iterate over the lines
  lines.forEach(line => {
    const parts = line.split(':');    
    const main = parts.shift();
    const pattern = /^(.*?)(\d+)$/;
    
    // create an object for each main part
    object[main] = {};    
    
    // fill each main part with the sub parts
    parts.forEach(part => {
      const match = part.match(pattern);

      // only keep sub parts that match the "name + number" pattern
      if (match) {
        const key = match[1];
        const value = match[2];
        object[main][key] = key + value;
      }
    });
  }); 
  
  return object;
}

const processedData = processData(dataFileContent);
console.log(processedData);
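
For the sample dataFileContent above, that console.log prints a nested object matching the question's target shape:

{
  Date1: { cat: 'cat1', dog: 'dog1', bug: 'bug1' },
  Date2: { cat: 'cat2', dog: 'dog2', bug: 'bug2' },
  Date3: { cat: 'cat3', dog: 'dog3', bug: 'bug3' }
}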

Then convert the processedData to JSON by using JSON.stringify and save it to a file via:

const fs = require('fs');

...
// processData
...

const json = JSON.stringify(processedData);
fs.writeFile('my_json_file.json', json, 'utf8', (err) => {
  if (err) throw err;
});

For larger files, consider using Streams in Node.js as suggested by @zhangjinzhou.
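
As a rough illustration of that suggestion, here is a minimal streaming sketch using Node's built-in readline module; the filenames data.txt and data.json are only examples, and the fixed cat/dog/bug keys are assumed from the question:

const fs = require('fs');
const readline = require('readline');

const result = {};

const rl = readline.createInterface({
  input: fs.createReadStream('data.txt'),
  crlfDelay: Infinity // treat \r\n as a single line break
});

rl.on('line', (line) => {
  // First field is the key, the remaining fields are the values.
  const [date, cat, dog, bug] = line.split(':');
  result[date] = { cat, dog, bug };
});

rl.on('close', () => {
  fs.writeFile('data.json', JSON.stringify(result, null, 2), 'utf8', (err) => {
    if (err) throw err;
  });
});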
