What Is JSON to CSV Conversion — and Why Do You Need It?
JSON (JavaScript Object Notation) is the standard format for REST APIs, database exports, and configuration files. CSV (comma-separated values) is the universal format for spreadsheets, data analysis, SQL imports, and reporting tools. Converting between them is one of the most common data tasks in development, data analysis, and business operations.
The core transformation is simple: a JSON array of objects maps naturally to a table. Each object becomes one row. Each unique key across all objects becomes a column header. Primitive values (strings, numbers, booleans) become cell values. The challenge comes from real-world JSON — nested objects, arrays within objects, inconsistent keys across records, and special characters that need escaping.
This guide covers every approach: the online converter for one-off conversions, Python scripts for automation, JavaScript for browser and Node.js contexts, and the full handling guide for every edge case you'll encounter in real API responses.
What JSON Structures Can Be Converted to CSV?
Not all JSON maps cleanly to a flat CSV table. Here is how each common structure is handled — with the exact input, expected CSV output, and when special handling is needed.
Input:
[
  { "id": 1, "name": "Alice", "city": "Paris" },
  { "id": 2, "name": "Bob", "city": "Tokyo" }
]
Output:
id,name,city
1,Alice,Paris
2,Bob,Tokyo
The ideal input — each object has the same keys, all values are primitives. Maps directly to rows and columns with no special handling.
Input:
[
  { "id": 1, "name": "Alice",
    "address": { "city": "Paris", "zip": "75001" } }
]
Output:
id,name,address.city,address.zip
1,Alice,Paris,75001
Nested objects are flattened using dot notation. 'address.city' becomes a column header. Deeply nested objects (3+ levels) still work — keys are joined with dots.
Input:
[
  { "id": 1, "name": "Alice",
    "tags": ["dev", "design"] }
]
Output:
id,name,tags
1,Alice,"dev,design"
Array values are joined into a single cell separated by a configurable character (default: comma inside quotes). Each tag does NOT become its own column — use the split option if you need one row per array item.
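If you need the one-row-per-array-item behavior instead, a small helper can expand the records before conversion. A minimal sketch in Python (the function name `split_rows` is mine, not the converter's):

```python
def split_rows(records, key):
    """Expand each record into one row per item in the array at `key`."""
    rows = []
    for rec in records:
        for item in rec.get(key) or [None]:
            row = dict(rec)        # shallow copy so the input isn't mutated
            row[key] = item
            rows.append(row)
    return rows

data = [{"id": 1, "name": "Alice", "tags": ["dev", "design"]}]
expanded = split_rows(data, "tags")
# Alice now appears in two rows: one with tags=dev, one with tags=design
```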
Input:
[
  { "id": 1, "name": "Alice", "email": "a@x.com" },
  { "id": 2, "name": "Bob" }
]
Output:
id,name,email
1,Alice,a@x.com
2,Bob,
All unique keys across all objects become columns. Missing values produce empty cells. Common in real API responses where optional fields appear in some records.
Input:
{ "id": 1, "name": "Alice", "city": "Paris" }
Output:
id,name,city
1,Alice,Paris
A single JSON object (not wrapped in an array) is converted as a one-row CSV. Keys become headers, values become the single data row.
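In code, that normalization is one line: wrap a lone object in a list so the rest of the pipeline can always assume an array. A sketch (helper name `to_records` is my own):

```python
import json

def to_records(parsed):
    """Accept either a single JSON object or an array of objects."""
    return [parsed] if isinstance(parsed, dict) else parsed

records = to_records(json.loads('{ "id": 1, "name": "Alice", "city": "Paris" }'))
# records is now a one-element list, ready for the usual row/column mapping
```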
Which Delimiter to Use — Comma, Semicolon, Tab, or Pipe?
The delimiter is the character that separates columns in your CSV. Choosing the right one depends on what your data contains and where the file will be opened.
| Delimiter | Use when | Opens with | Avoid when |
|---|---|---|---|
| Comma (`,`) | Standard CSV — the default everywhere | Excel, Google Sheets, Numbers, any CSV tool | Your data values contain commas (addresses, descriptions). The converter auto-quotes these fields, but some older tools don't handle quoted commas well. |
| Semicolon (`;`) | European locales where Excel expects semicolons | Excel in France, Germany, Italy, Spain, and most EU countries | Your target audience is in the US or UK — Excel there expects commas. |
| Tab (`\t`) | TSV — when data contains both commas and semicolons | Excel (File > Open > Text), databases (MySQL LOAD DATA), R, pandas | Values themselves contain tab characters (rare but possible in multi-line text fields). |
| Pipe (`\|`) | Legacy data pipelines, SQL exports | Custom scripts, data warehouses, some ETL tools | General spreadsheet use — Excel and Google Sheets don't auto-detect pipe delimiters. |
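In Python's csv module the choice is a single `delimiter` parameter. A quick sketch showing that values containing the chosen delimiter are quoted automatically:

```python
import csv
import io

data = [{"id": 1, "note": "a;b"}, {"id": 2, "note": "plain"}]
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "note"], delimiter=";")
writer.writeheader()
writer.writerows(data)
# The value "a;b" contains the delimiter, so the writer quotes that cell
```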
How to Convert JSON to CSV in Python
Python's standard library includes both json and csv modules — no third-party packages required for basic conversion. For nested JSON or data analysis workflows, the pandas library adds powerful flattening and transformation capabilities.
Basic: flat JSON array to CSV
The simplest case — a JSON array of objects with consistent keys and primitive values.
import json
import csv

# Load JSON from a file
with open("data.json", "r") as f:
    data = json.load(f)

# Extract column headers from the first object
fieldnames = list(data[0].keys())

# Write to CSV
with open("output.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(data)

print(f"Written {len(data)} rows to output.csv")

Flatten nested JSON before converting
Use a recursive flatten function to convert nested keys like address.city into flat column names before writing.
import json
import csv

def flatten(obj, parent_key="", sep="."):
    """Recursively flatten a nested dict."""
    items = {}
    for k, v in obj.items():
        new_key = f"{parent_key}{sep}{k}" if parent_key else k
        if isinstance(v, dict):
            items.update(flatten(v, new_key, sep))
        elif isinstance(v, list):
            # Join list values with a pipe separator
            items[new_key] = "|".join(str(i) for i in v)
        else:
            items[new_key] = v
    return items

with open("nested.json", "r") as f:
    data = json.load(f)

# Flatten every record
flat_data = [flatten(record) for record in data]

# Collect ALL unique keys across all records
all_keys = []
for record in flat_data:
    for key in record:
        if key not in all_keys:
            all_keys.append(key)

with open("output.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(
        f, fieldnames=all_keys, extrasaction="ignore"
    )
    writer.writeheader()
    writer.writerows(flat_data)

Convert JSON string (not file) to CSV
When working with an API response or JSON already in memory as a string — use json.loads() instead of json.load().
import json
import csv
import io

json_string = '''[
  {"id": 1, "name": "Alice", "score": 95},
  {"id": 2, "name": "Bob", "score": 87}
]'''

data = json.loads(json_string)  # Note: loads(), not load()
fieldnames = list(data[0].keys())

# Write to an in-memory string buffer (useful for web apps)
output = io.StringIO()
writer = csv.DictWriter(output, fieldnames=fieldnames)
writer.writeheader()
writer.writerows(data)

csv_string = output.getvalue()
print(csv_string)

Using pandas (recommended for data analysis)
If you're already using pandas, json_normalize() handles nested JSON flattening automatically.
import pandas as pd
import json

# From a file
with open("data.json") as f:
    data = json.load(f)

# json_normalize flattens nested objects automatically
df = pd.json_normalize(data)

# Preview the columns
print(df.columns.tolist())
print(df.head())

# Export to CSV
df.to_csv("output.csv", index=False)

# For deeply nested JSON, use the 'sep' and 'max_level' params:
# df = pd.json_normalize(data, sep="_", max_level=2)

How to Convert JSON to CSV in JavaScript
Whether you're working in a browser, a React or Vue app, or a Node.js script, JavaScript can handle JSON-to-CSV conversion with no external libraries. The key concerns are: collecting all unique keys across inconsistent objects, properly escaping values that contain the delimiter or quotes, and triggering a file download in the browser.
Browser: convert JSON to CSV and download
Paste this function into any browser console or script to convert a JSON array and trigger a file download.
function jsonToCsv(data, delimiter = ",") {
  if (!Array.isArray(data) || data.length === 0) return "";
  // Collect all unique keys across all objects
  const allKeys = [...new Set(data.flatMap(Object.keys))];
  const escape = (val) => {
    const str = val === null || val === undefined ? "" : String(val);
    // Wrap in quotes if value contains delimiter, quotes, or newlines
    return str.includes(delimiter) || str.includes('"') || str.includes("\n")
      ? `"${str.replace(/"/g, '""')}"`
      : str;
  };
  const header = allKeys.join(delimiter);
  const rows = data.map((obj) =>
    allKeys.map((key) => escape(obj[key])).join(delimiter)
  );
  return [header, ...rows].join("\n");
}

function downloadCsv(csvString, filename = "data.csv") {
  const blob = new Blob([csvString], { type: "text/csv;charset=utf-8;" });
  const url = URL.createObjectURL(blob);
  const a = document.createElement("a");
  a.href = url;
  a.download = filename;
  a.click();
  URL.revokeObjectURL(url);
}

// Usage:
const data = [
  { id: 1, name: "Alice", city: "Paris" },
  { id: 2, name: "Bob", city: "Tokyo" },
];
downloadCsv(jsonToCsv(data), "users.csv");

Node.js: read JSON file, write CSV file
A complete Node.js script — read a JSON file, convert it, and write the output to a CSV file.
const fs = require("fs");

function flattenObj(obj, prefix = "", sep = ".") {
  return Object.entries(obj).reduce((acc, [key, val]) => {
    const newKey = prefix ? `${prefix}${sep}${key}` : key;
    if (val !== null && typeof val === "object" && !Array.isArray(val)) {
      Object.assign(acc, flattenObj(val, newKey, sep));
    } else {
      acc[newKey] = Array.isArray(val) ? val.join("|") : val;
    }
    return acc;
  }, {});
}

function jsonToCsv(data, delimiter = ",") {
  const flat = data.map((r) => flattenObj(r));
  const keys = [...new Set(flat.flatMap(Object.keys))];
  const escape = (v) => {
    const s = v === null || v === undefined ? "" : String(v);
    return s.includes(delimiter) || s.includes('"') || s.includes("\n")
      ? `"${s.replace(/"/g, '""')}"`
      : s;
  };
  const lines = [
    keys.join(delimiter),
    ...flat.map((row) => keys.map((k) => escape(row[k])).join(delimiter)),
  ];
  return lines.join("\n");
}

// Read → Convert → Write
const input = JSON.parse(fs.readFileSync("data.json", "utf8"));
const csv = jsonToCsv(input);
fs.writeFileSync("output.csv", csv, "utf8");
console.log(`Converted ${input.length} records to output.csv`);

Flattening Nested JSON — The Full Explanation
Nested JSON is the most common reason a simple JSON-to-CSV conversion fails or produces unexpected output. A CSV is fundamentally a flat structure — it cannot represent hierarchy. The question is how to turn the hierarchy into flat columns.
There are three standard strategies:
1. Flatten with dot notation: address.city → column "address.city"
   Best when: you need all nested data as individual columns — data analysis, SQL import, Excel pivot tables.
   Risk: column count explodes for deeply nested structures (5+ levels). Circular references cause infinite loops.
2. Stringify the nested value: address → column "address", value: '{"city":"Paris","zip":"75001"}'
   Best when: you want to preserve the raw nested data in the CSV but mostly work with the top-level fields.
   Risk: nested values are stored as JSON strings — you'll need to parse them again in the consuming application.
3. Drop the nested field: address → column omitted entirely
   Best when: nested fields are metadata you don't need in the spreadsheet output.
   Risk: data loss — only use if you're certain the nested fields are unnecessary.
The dot-notation approach is recommended for most use cases. It produces the most analysis-friendly output — every value is a primitive cell, every nested path is a distinct column. Use the stringify approach when the nested data is opaque and you only need to pass it through to another system.
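The stringify strategy is easy to sketch in Python: keep primitives as-is and serialize any dict or list value with json.dumps (the helper name `stringify_nested` is my own):

```python
import json

def stringify_nested(record):
    """Keep top-level primitives; store nested values as compact JSON strings."""
    return {
        key: json.dumps(val, separators=(",", ":")) if isinstance(val, (dict, list)) else val
        for key, val in record.items()
    }

row = stringify_nested({"id": 1, "address": {"city": "Paris", "zip": "75001"}})
# row["address"] is now the string '{"city":"Paris","zip":"75001"}',
# which the consuming application can parse back with json.loads
```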
How to Open a CSV in Excel, Google Sheets & LibreOffice
Downloading a CSV is only half the job — opening it correctly in your spreadsheet application matters. Each application has specific import steps to ensure columns split correctly and encoding is preserved.
Excel:
- Open Excel and go to File → Open.
- Select "All Files (*.*)" in the file type dropdown.
- Browse to your downloaded .csv file and click Open.
- The Text Import Wizard opens. Select "Delimited" and click Next.
- Check "Comma" (or whichever delimiter you used). Click Finish.
- Your JSON data now appears in columns. Save as .xlsx to preserve formatting.

Google Sheets:
- Open Google Sheets and create a new spreadsheet.
- Go to File → Import.
- Click "Upload" and drag your .csv file into the dialog.
- Under "Separator type", select "Comma" (or Custom for other delimiters).
- Click "Import data". Your JSON data appears in a new sheet.

LibreOffice Calc:
- Open LibreOffice Calc and go to File → Open.
- Select your .csv file. The Text Import dialog opens automatically.
- Under "Separator options", check "Comma" or set a custom separator.
- Check that "String delimiter" is set to a double quote (").
- Click OK. Your data loads in column layout.
Edge Cases — How Real-World JSON Values Are Handled
Clean toy examples always convert perfectly. Real API responses contain values that require careful handling to produce valid, correctly quoted CSV output that won't break in Excel or pandas.
| Edge Case | Problem | Correct handling |
|---|---|---|
| Values containing commas: `{ "address": "123 Main St, Apt 4" }` | The comma inside the address value would split it into two columns if not handled. | Wrap the value in double quotes: `"123 Main St, Apt 4"`. The converter does this automatically for any value containing the delimiter. |
| Values containing double quotes: `{ "note": "He said \"hello\"" }` | A double quote inside a quoted field would end the field prematurely. | Escape double quotes by doubling them: `"He said ""hello"""`. This is the RFC 4180 standard for CSV escaping. |
| Values containing newlines: `{ "bio": "Line 1\nLine 2" }` | A newline inside a value would look like a new CSV row to most parsers. | Wrap the value in double quotes. RFC 4180 allows quoted fields to contain literal newlines. Most modern tools handle this correctly. |
| Null and undefined values: `{ "id": 1, "email": null }` | null and undefined have no natural CSV representation. | Convert to an empty string — the cell is left blank. Do not write the string "null" unless your downstream system expects it. |
| Boolean values: `{ "active": true, "verified": false }` | Booleans may be misread as strings in spreadsheets. | Write as "true" / "false" (lowercase strings). Excel and Google Sheets do not automatically treat these as boolean — use 1/0 if you need numeric booleans for formulas. |
| Unicode and emoji: `{ "name": "José 😊" }` | Non-ASCII characters may display incorrectly if the file encoding is wrong. | Always save CSV files with UTF-8 encoding. When opening in Excel, use the import wizard and specify UTF-8 — Excel may default to a legacy encoding otherwise. |
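Python's csv module already implements the RFC 4180 quoting rules above; the one thing it won't do for you is pick the encoding. A sketch covering quotes, newlines, and Unicode (writing with the `utf-8-sig` encoding adds the BOM that helps Excel detect UTF-8):

```python
import csv
import io

rows = [{"note": 'He said "hello"', "bio": "Line 1\nLine 2", "name": "José 😊"}]
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["note", "bio", "name"])
writer.writeheader()
writer.writerows(rows)
out = buf.getvalue()
# Quotes are doubled, and values containing newlines are quoted, per RFC 4180.
# For a file that Excel opens with accents/emoji intact, write with a BOM:
# with open("out.csv", "w", newline="", encoding="utf-8-sig") as f: ...
```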
JSON to CSV vs JSON to Excel (.xlsx) — Which to Use?
CSV and XLSX (Excel's native format) are not the same format and have different strengths. Here is when to choose each:
CSV advantages:
- Universal — every tool reads CSV
- Plain text — version-controllable
- No format overhead — smallest file size
- Direct SQL import (LOAD DATA, COPY)
- Opens in any text editor

CSV limitations:
- No multiple sheets
- No formatting (colors, bold, borders)
- No formulas
- Encoding ambiguity (UTF-8 vs legacy)

XLSX advantages:
- Multiple sheets in one file
- Full formatting support
- Formulas and charts
- Explicit column types (numbers, dates)
- Better for sharing with non-technical users

XLSX limitations:
- Requires a library (openpyxl, SheetJS)
- Larger file size
- Not directly importable into most databases
- Binary format — not diff-able
For data pipelines, database imports, and developer workflows — use CSV. For sharing with business users who need formatting, multiple sheets, or formulas — convert to XLSX using pandas df.to_excel() or SheetJS in JavaScript.
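A minimal pandas sketch of that last step: the same DataFrame can go to either format (`to_excel` needs an engine such as openpyxl installed, so it is left commented out here):

```python
import pandas as pd

data = [
    {"id": 1, "name": "Alice", "revenue": 1200.5},
    {"id": 2, "name": "Bob", "revenue": 987.0},
]
df = pd.DataFrame(data)

df.to_csv("report.csv", index=False)        # plain text, universal
# df.to_excel("report.xlsx", index=False)   # requires: pip install openpyxl
```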
Privacy — Your JSON Never Leaves Your Browser
All JSON to CSV conversion runs locally in your browser using JavaScript. No JSON data is sent to any server. This matters because API responses often contain sensitive information: user records, authentication tokens, personal data, business metrics, or unreleased product data. You can verify this by opening your browser's Network tab and confirming no outbound requests are made when you paste and convert.
Related Tools
Before converting, validate and format your JSON with the JSON Formatter & Validator — it catches syntax errors, fixes common mistakes with the Repair button, and lets you explore deeply nested structures in the Tree View before you decide how to flatten them. If you're comparing two CSV exports from different time periods, the Text Diff Checker shows exactly which rows were added or removed. Need to minify the JSON before logging or shipping it? The JSON Formatter's Minify function strips all whitespace in one click.
Frequently Asked Questions
What is JSON to CSV conversion?
JSON to CSV conversion transforms a JSON array of objects into a comma-separated values file where each object becomes a row and each unique key becomes a column header. CSV is the universal format for spreadsheets, databases, and data analysis tools that cannot read JSON directly.
Can I convert nested JSON to CSV?
Yes. Nested objects are handled by flattening — joining parent and child keys with a dot (address.city becomes a column header) — or by stringifying the nested value as a JSON string inside the CSV cell. The ToolLance converter supports both approaches.
Can I open the converted CSV in Excel?
Yes. After downloading, open Excel and use File → Open to open the .csv file. Use the Text Import Wizard to specify the delimiter. For semicolon-delimited files, use Data → From Text/CSV and specify the delimiter manually.
What JSON structure can be converted to CSV?
CSV conversion works best with a JSON array of objects where each object has the same keys — the standard format from REST APIs and databases. A single object converts as a one-row CSV. Mixed key sets are handled by collecting all unique keys and leaving empty cells where a record is missing a key.
Is my JSON uploaded to a server?
No. All conversion runs entirely in your browser. Nothing is sent anywhere — safe for API responses with user data, authentication tokens, or any sensitive content.
How do I convert JSON to CSV in Python?
Use Python's built-in csv and json modules. Load your JSON with json.load(), extract fieldnames from the first object's keys, then write with csv.DictWriter. For nested objects, use a recursive flatten function that joins parent and child keys with a dot before passing to DictWriter. For data analysis, use pandas json_normalize() which handles flattening automatically.
What delimiter should I use?
Use commas for standard CSV — compatible everywhere. Use semicolons if you're in a European locale where Excel expects semicolons. Use tabs (TSV) when your data contains both commas and semicolons, or for database imports.
What happens if my JSON objects have different keys?
All unique keys across every object are collected to build the full column set. Objects missing a key get an empty cell in that column. This handles real-world API responses where optional fields appear in some records but not others.
Should I use CSV or XLSX for my JSON data?
Use CSV for data pipelines, database imports, and developer workflows — it's universal and the smallest format. Use XLSX when sharing with business users who need formatting, multiple sheets, or formulas. You can always convert a CSV to XLSX by opening it in Excel and saving as .xlsx.
How do I convert a JSON API response to CSV?
Copy the JSON response from your browser's Network tab or API client, paste it into the ToolLance JSON to CSV converter, and click Download. For repeated conversions from the same API, use the Python or JavaScript code examples above to automate the process in a script.
