Free Online JSONL to JSON Converter

Got a log file or a database export in JSONL format that you need to open in a standard editor? This tool wraps those individual JSON lines back into a single, valid JSON array. It’s the easiest way to make streaming data readable and ready for your next project.

JSONL to JSON

Download Sample Files to Practice

Before & After

Lines to array conversion

JSONL Input (one object per line)

```json
{"id": 1, "name": "Alice", "role": "admin"}
{"id": 2, "name": "Bob", "role": "editor"}
{"id": 3, "name": "Carol", "role": "viewer"}
```

JSON Array Output

```json
[
  {"id": 1, "name": "Alice", "role": "admin"},
  {"id": 2, "name": "Bob", "role": "editor"},
  {"id": 3, "name": "Carol", "role": "viewer"}
]
```

Format Comparison

JSONL vs JSON vs NDJSON vs CSV

Data format comparison for common interchange scenarios
| Feature | JSONL | JSON | NDJSON | CSV |
|---|---|---|---|---|
| Structure | One object per line | Single root array/object | One object per line | Header row + data rows |
| Streaming | Yes - line by line | No - must parse entire file | Yes - line by line | Yes - line by line |
| Nested Data | Supported per line | Fully supported | Supported per line | Not supported |
| File Extension | `.jsonl` | `.json` | `.ndjson` | `.csv` |
| Common Use | Logs, ML datasets, BigQuery | APIs, web apps, configs | Log aggregation, Elasticsearch | Spreadsheets, databases |
| Human Readable | Moderate | Good (when formatted) | Moderate | Excellent |
| Append-Friendly | Yes - just add lines | No - must rewrite array | Yes - just add lines | Yes - just add rows |

How It Works

Four simple steps

1

Load JSON Lines

Provide a .jsonl or .ndjson file with one object per line.

2

Validate Lines

Each line is parsed individually to ensure valid JSON syntax.

3

Reconstruct Array

Objects are wrapped in array brackets with proper commas.

4

Download JSON

Get a formatted .json file ready for APIs or web apps.

Programmatic Conversion

Python and Node.js

Python (jsonl_to_json.py)

```python
import json

# Read JSONL and build a list of records, skipping blank lines
with open("data.jsonl", "r") as f:
    records = [json.loads(line) for line in f if line.strip()]

# Write the records back out as a standard JSON array
with open("output.json", "w") as f:
    json.dump(records, f, indent=2)

print(f"Converted {len(records)} JSONL lines to JSON array")
```
Node.js (jsonl-to-json.js)

```javascript
const fs = require("fs");
const readline = require("readline");

async function jsonlToJson(inputPath, outputPath) {
  const records = [];
  const rl = readline.createInterface({
    input: fs.createReadStream(inputPath),
    crlfDelay: Infinity, // treat \r\n as a single line break
  });

  for await (const line of rl) {
    if (line.trim()) {
      records.push(JSON.parse(line));
    }
  }

  fs.writeFileSync(outputPath, JSON.stringify(records, null, 2));
  console.log(`Converted ${records.length} lines to JSON array`);
}

jsonlToJson("data.jsonl", "output.json").catch(console.error);
```

Related Articles

Complete Guide

In-depth walkthrough

Why JSONL exists and when you'd convert it

JSONL is better for streaming because you can read one line at a time without loading the whole file into memory. This makes it perfect for processing logs, large datasets, or real-time data feeds where you process records as they arrive.

Standard JSON arrays are better for tools that expect a JSON array input, like most data analysis tools, Pandas, and REST APIs. They need the data wrapped in brackets with commas between objects.

Convert when your downstream tool doesn't support JSONL natively. If you're loading data into a web app, sending it to an API, or importing into a tool that expects standard JSON, you need to convert from JSONL to a JSON array first.

Most conversions happen because the next step in your workflow expects standard JSON format, not line-delimited records.
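The streaming advantage is easy to see in code. Here is a minimal Python sketch (`stream_jsonl` is a hypothetical helper name, not part of this tool) that yields records one at a time instead of loading the whole file:

```python
import json

def stream_jsonl(path):
    """Yield one parsed record at a time; only the current line is in memory."""
    with open(path, "r", encoding="utf-8") as f:
        for line in f:
            if line.strip():  # skip blank lines
                yield json.loads(line)
```

Because it is a generator, you can filter or aggregate very large files without ever materializing the full list of records.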

What to do if a line fails to parse

If one line is malformed (missing bracket, trailing comma, unquoted key), the conversion will either fail outright or silently drop that line, depending on the tool. This is the #1 problem users hit when converting JSONL files.

How to find it: open the file in VS Code and look for the red squiggle on the bad line. VS Code's JSON validator will highlight syntax errors immediately.

Or run `jq -c . yourfile.jsonl` in the terminal. jq reports exactly which line failed and what the error is. This is the fastest way to debug JSONL parsing issues.

Fix the malformed line, save the file, and try converting again. One bad line can break the entire conversion, so validation is critical.
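If you would rather script the check than eyeball the file, a short Python sketch can report every failing line with its number (`find_bad_lines` is a hypothetical helper name):

```python
import json

def find_bad_lines(path):
    """Return a list of (line_number, error_message) for lines that fail to parse."""
    errors = []
    with open(path, "r", encoding="utf-8") as f:
        for lineno, line in enumerate(f, start=1):
            if not line.strip():
                continue  # blank lines are harmless; skip them
            try:
                json.loads(line)
            except json.JSONDecodeError as exc:
                errors.append((lineno, str(exc)))
    return errors
```

An empty result means the file is safe to convert; anything else points you straight at the lines to fix.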

File size and large JSONL files

The tool runs in the browser so there's no server upload. Your data stays on your machine, which is great for privacy but means large files depend on your browser's memory.

Very large files (1GB+) may be slow in the browser or freeze the tab. For those, switch to the command line:

`python3 -c "import json; print(json.dumps([json.loads(l) for l in open('file.jsonl') if l.strip()]))" > output.json`

This reads the file line by line, though it still holds the full array in memory while serializing. Running outside the browser avoids the tab's memory limits and is much faster for large datasets.
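For files where even holding the finished array in memory is a problem, you can stream the output as well. A sketch of that approach (`jsonl_to_json_streaming` is a hypothetical name; it writes the brackets and commas itself so only one record is in memory at a time):

```python
import json

def jsonl_to_json_streaming(input_path, output_path):
    """Convert JSONL to a JSON array record by record,
    never holding more than one record in memory."""
    with open(input_path, "r", encoding="utf-8") as src, \
         open(output_path, "w", encoding="utf-8") as dst:
        dst.write("[\n")
        first = True
        for line in src:
            if not line.strip():
                continue  # skip blank lines
            if not first:
                dst.write(",\n")
            # Re-serialize to normalize whitespace and catch bad lines early
            dst.write("  " + json.dumps(json.loads(line)))
            first = False
        dst.write("\n]\n")
```

The output is a valid JSON array even for an empty input, and memory use stays flat no matter how large the source file is.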