JSON to CSV in JavaScript: npm Library vs Browser Tool
- json2csv npm package is the standard library for production Node.js use
- A 10-line vanilla JS function handles most simple cases without any library
- React and frontend apps can trigger CSV downloads using Blob + URL.createObjectURL
- For one-off conversions, the browser tool beats writing any code
Three ways to convert JSON to CSV in JavaScript: a small vanilla function for simple cases, the json2csv npm package for production pipelines, and — when you just need a CSV file right now without writing or running code — the browser tool. Here's when each approach makes sense, with working code for all three.
Vanilla JavaScript: 10-Line CSV Conversion (No Library)
For flat arrays of objects with consistent keys, a minimal function is all you need:
function jsonToCsv(data) {
if (!data || !data.length) return '';
const headers = Object.keys(data[0]);
const rows = data.map(row =>
headers.map(h => {
const val = row[h] ?? '';
return typeof val === 'string' && val.includes(',')
? '"' + val.replace(/"/g, '""') + '"'
: val;
}).join(',')
);
return [headers.join(','), ...rows].join('\n');
}
const csv = jsonToCsv(myJsonArray);
console.log(csv);
This handles: string values with commas (wraps in quotes), null/undefined values (outputs empty string), and consistent key sets across rows. It does not handle: nested objects, values with newlines, or inconsistent keys across rows.
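To make that behavior concrete, here is the function applied to a small sample array (hypothetical data). Note the quoted field in the first row and the empty cell for the null:

```javascript
function jsonToCsv(data) {
  if (!data || !data.length) return '';
  const headers = Object.keys(data[0]);
  const rows = data.map(row =>
    headers.map(h => {
      const val = row[h] ?? '';
      return typeof val === 'string' && val.includes(',')
        ? '"' + val.replace(/"/g, '""') + '"'
        : val;
    }).join(',')
  );
  return [headers.join(','), ...rows].join('\n');
}

// Hypothetical sample data: one value contains a comma, one is null
const users = [
  { name: 'Ada', note: 'likes, commas' },
  { name: 'Bob', note: null },
];

console.log(jsonToCsv(users));
// name,note
// Ada,"likes, commas"
// Bob,
```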
Add a download trigger for browser use:
function downloadCsv(csv, filename = 'data.csv') {
const blob = new Blob([csv], { type: 'text/csv' });
const url = URL.createObjectURL(blob);
const a = document.createElement('a');
a.href = url;
a.download = filename;
a.click();
URL.revokeObjectURL(url);
}
json2csv: The Standard npm Package for Production
The json2csv package handles the edge cases the vanilla function misses: nested objects, custom field mappings, value transforms, and streaming for large files.
npm install json2csv
Basic usage (Node.js):
const { Parser } = require('json2csv');
const fields = ['name', 'email', 'address.city']; // dot notation for nested
const parser = new Parser({ fields });
const csv = parser.parse(data);
require('fs').writeFileSync('output.csv', csv);
Streaming for large files (json2csv v5's AsyncParser pipes input to output without buffering the whole CSV in memory):
const { AsyncParser } = require('json2csv');
const fs = require('fs');
const asyncParser = new AsyncParser();
asyncParser
.fromInput(fs.createReadStream('records.json'))
.toOutput(fs.createWriteSteam === undefined ? fs.createWriteStream('output.csv') : fs.createWriteStream('output.csv'));
React / browser bundle:
import { Parser } from 'json2csv';
const parser = new Parser();
const csv = parser.parse(myData);
downloadCsv(csv, 'export.csv'); // use the download function above
json2csv is maintained, well-documented, and handles most production requirements. Use it when you need field selection, custom headers, transforms on values, or streaming for memory efficiency.
Other npm Libraries Worth Knowing
- papaparse — primarily a CSV parser, but also stringifies. Better known for parsing CSV into JSON than the reverse. Often already in a project's dependencies.
- fast-csv — streaming-focused, good for large datasets in Node.js. Requires a stream-based approach rather than synchronous conversion.
- csv-writer — opinionated API, writes directly to files. Good for Node.js scripts that need to write CSVs to disk without an intermediate string.
- xlsx (SheetJS) — if you need Excel (.xlsx) output rather than CSV, this is the standard library. Heavier dependency but produces proper Excel files with formatting.
For most frontend-triggered CSV downloads in React or Vue, the vanilla JS approach (10 lines) or json2csv are sufficient. Don't add a heavy dependency like xlsx unless you actually need Excel format.
When to Use the Browser Tool Instead of Writing Code
Writing even a simple Node.js script has setup overhead: creating the file, installing the package, running the script, debugging errors. For a developer who needs to convert one JSON array to CSV to share with a stakeholder, that's 5–10 minutes of friction.
The browser tool takes 30 seconds: paste the JSON array, download the CSV. No npm, no script file, no terminal.
Write the code when:
- This conversion will happen repeatedly as part of a feature or automated process
- The conversion needs to happen server-side as part of an API response
- The JSON data is too large to paste (hundreds of MB)
Use the browser tool when: you have a JSON array from an API test, a database export, or a coworker's Slack message, and you need a CSV for a meeting in five minutes.
No npm, No Script — Convert JSON to CSV Right Now
Paste your JSON array into the browser converter. Download CSV in seconds. No Node.js, no json2csv, no code.
Open Free JSON to CSV Converter
Frequently Asked Questions
What is the best npm package for JSON to CSV in Node.js?
json2csv is the most widely used. It handles nested objects, custom field selection, streaming, and transforms. It's well-maintained, TypeScript-friendly, and works in both Node.js and browser bundles.
How do I handle nested JSON objects in JavaScript CSV conversion?
json2csv handles nested objects using dot notation in the fields array. For vanilla JS, write a recursive flatten function before converting: flatten the JSON first, then apply the simple CSV conversion to the flat array.
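For the vanilla route, the recursive flatten step can be sketched like this (flatten is a hypothetical helper written here, not part of any library):

```javascript
// Recursively flatten nested objects into dot-notation keys,
// e.g. { address: { city: 'X' } } -> { 'address.city': 'X' }.
// Arrays and null are treated as leaf values.
function flatten(obj, prefix = '') {
  const out = {};
  for (const [key, val] of Object.entries(obj)) {
    const path = prefix ? prefix + '.' + key : key;
    if (val !== null && typeof val === 'object' && !Array.isArray(val)) {
      Object.assign(out, flatten(val, path));
    } else {
      out[path] = val;
    }
  }
  return out;
}

console.log(flatten({ name: 'Ada', address: { city: 'London', geo: { lat: 51.5 } } }));
// { name: 'Ada', 'address.city': 'London', 'address.geo.lat': 51.5 }
```

Run data.map(flatten) first, then feed the flat rows to the simple converter from earlier.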
Can I trigger a CSV download from React without a backend?
Yes. Convert JSON to a CSV string client-side (vanilla function or json2csv in the browser bundle), then use Blob + URL.createObjectURL + a programmatic anchor click to trigger the download. No server, no API call needed.
Is papaparse better than json2csv for CSV generation?
papaparse excels at parsing (CSV to JSON). For the reverse (JSON to CSV), json2csv has a more complete feature set. If papaparse is already in your project, its unparse() function works fine for simple cases — no need to add json2csv just for basic conversion.

