How to Merge Multiple JSON Files: Step-by-Step Guide
Imad Uddin
Full Stack Developer

Merging multiple JSON files into one is something you run into all the time in development and data work. Maybe you've pulled data from a paginated API and now have 50 separate response files. Maybe you're consolidating configuration files from several microservices. Or maybe you split a dataset for processing and now need to reassemble it. Whatever brought you here, the good news is that merging JSON files is straightforward once you know the right approach for your situation.
I deal with this regularly when working with API exports, log aggregation, and test data management, so I've put together this guide covering every method I use. We'll go through an online tool (quickest option if you just need it done), Python (best for automation), JavaScript (natural fit for web and Node.js developers), and jq on the command line (ideal for scripting). I'll also cover how to handle nested and inconsistent JSON structures, which is where most people get stuck.
Why Merge JSON Files?
Before jumping into the methods, here's why you'd want to combine multiple JSON files:
Consolidating data from different sources. You've exported data from multiple APIs, services, or tools, and you need everything in a single file for analysis or import. This comes up constantly when integrating third-party services.
Reassembling split files. If you previously split a large JSON file for processing, transfer, or storage, you'll eventually need to put it back together. This is especially common when working with systems that have file size limits.
Preparing data for analysis or migration. Most analysis tools expect a single input file. If your data is scattered across multiple JSON files, merging them is a prerequisite for running queries, generating reports, or importing into a database.
Bundling configuration files. Microservice architectures often have separate config files for different environments or components. Merging them into a unified configuration simplifies deployment and management.
Simplifying parsing and file management. Working with one file is just easier than managing dozens. Fewer files mean fewer I/O operations, simpler error handling, and less overhead in scripts that process the data.
Combining test datasets. If you maintain test fixtures across different test suites, merging them into a master dataset makes comprehensive testing more manageable.
Log aggregation. Application logs exported as JSON from different time periods or services need to be combined for complete analysis.
Understanding JSON Merge Strategies
Not all JSON merges are the same. The right strategy depends on your data structure:
Array concatenation is the most common scenario. Each file contains a JSON array, and you want to combine all elements into one big array. If file1.json has `[A, B]` and file2.json has `[C, D]`, the merged result is `[A, B, C, D]`.

Object merging combines the keys from multiple JSON objects. If file1.json has `{"name": "Alice"}` and file2.json has `{"age": 30}`, the merged result is `{"name": "Alice", "age": 30}`.

Deep merging recursively combines nested objects instead of overwriting them. If both files have a `settings` object, the contents of both `settings` objects are combined rather than one replacing the other.

Key-based merging combines arrays of objects by matching on a specific key like `id`, so records with the same identifier are merged or deduplicated rather than duplicated.

Knowing which strategy you need determines which tool and approach to use. Let's get into the methods.
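Key-based merging is the one strategy that doesn't reduce to a one-liner, so here's a minimal sketch. The `id` key name and the sample records are illustrative placeholders; swap in whatever field identifies your records:

```python
import json

def merge_by_key(lists, key="id"):
    """Merge several lists of objects, combining records that share the same key."""
    combined = {}
    for records in lists:
        for record in records:
            # Later records contribute or overwrite fields on the earlier one
            combined.setdefault(record[key], {}).update(record)
    return list(combined.values())

a = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
b = [{"id": 1, "email": "alice@example.com"}]
print(merge_by_key([a, b]))
# [{'id': 1, 'name': 'Alice', 'email': 'alice@example.com'}, {'id': 2, 'name': 'Bob'}]
```

Because a plain dict preserves insertion order, the output keeps the order in which ids were first seen.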
Method 1: Use Our Online JSON Merge Tool (No Code Required)
If you don't want to write any code and just need to get it done, this is the fastest route.
Try it here: merge-json-files.com
How it works:
- Go to our JSON Merger Tool
- Upload two or more JSON files
- Choose your merge strategy:
- Merge arrays together
- Combine objects by key
- Preview the result to make sure it looks right
- Download the merged JSON file
The tool works entirely in your browser, which means your data never gets uploaded to any server. It supports deeply nested JSON, auto-validates malformed input (and tells you what's wrong), and has optional pretty-print formatting so the output is readable.
I use this when I need to quickly combine a few files and don't want to fire up a terminal or write a script. It's particularly handy for non-technical team members who need to merge data exports without any coding knowledge.
Method 2: Merge JSON Files Using Python
Python is the go-to choice for JSON merging when you need automation, custom logic, or are working with large datasets. Here are several approaches covering different scenarios.
Basic Array Merge
The simplest case: each file contains a JSON array, and you want one combined array.
```python
import json
import glob

# Load all JSON files in folder
json_files = glob.glob("./data/*.json")

merged_data = []
for file in json_files:
    with open(file) as f:
        data = json.load(f)
        merged_data.extend(data)  # Assumes each file contains a JSON array

# Save merged file
with open("merged.json", "w") as f:
    json.dump(merged_data, f, indent=4)

print(f"Merged {len(json_files)} files, {len(merged_data)} total records")
```
This handles many files automatically using glob patterns. Use `extend` rather than `append` here: `extend` adds each element of the loaded array to the merged list, while `append` would add the whole array as one nested element.

Merging Object-Based JSON Files
When each file is a JSON object rather than an array:
```python
import json
import glob

merged = {}
for filepath in sorted(glob.glob("./configs/*.json")):
    with open(filepath) as f:
        data = json.load(f)
        merged.update(data)  # Later files override earlier ones for duplicate keys

with open("merged_config.json", "w") as f:
    json.dump(merged, f, indent=2)
```
Keep in mind that `update` performs a shallow merge: when two files share a top-level key, the later file's value replaces the earlier one entirely, including any nested data underneath it.

Deep Merge for Nested Objects
When your JSON files have nested structures and you want to preserve data at all levels:
```python
import json
import glob
import copy

def deep_merge(base, override):
    """Recursively merge override into base."""
    result = copy.deepcopy(base)
    for key, value in override.items():
        if key in result and isinstance(result[key], dict) and isinstance(value, dict):
            result[key] = deep_merge(result[key], value)
        elif key in result and isinstance(result[key], list) and isinstance(value, list):
            result[key] = result[key] + value
        else:
            result[key] = copy.deepcopy(value)
    return result

merged = {}
for filepath in sorted(glob.glob("./data/*.json")):
    with open(filepath) as f:
        data = json.load(f)
        merged = deep_merge(merged, data)

with open("deep_merged.json", "w") as f:
    json.dump(merged, f, indent=2)
```
This handles the case where one file has `"settings": {"theme": "dark"}` and another has `"settings": {"language": "en"}`: the merged output contains `"settings": {"theme": "dark", "language": "en"}` instead of the second file's `settings` wiping out the first.

Merging with Deduplication
If you're combining arrays that might contain duplicate records:
```python
import json
import glob

merged_data = []
seen_ids = set()
skipped = 0

for filepath in sorted(glob.glob("./exports/*.json")):
    with open(filepath) as f:
        data = json.load(f)
    for record in data:
        record_id = record.get("id")
        if record_id and record_id in seen_ids:
            skipped += 1
            continue  # Skip duplicate
        if record_id:
            seen_ids.add(record_id)
        merged_data.append(record)  # Records without an id are always kept

with open("merged_unique.json", "w") as f:
    json.dump(merged_data, f, indent=2)

print(f"Merged to {len(merged_data)} unique records (skipped {skipped} duplicates)")
```
Handling Large Files Efficiently
For very large JSON files where memory is a concern:
```python
import json
import glob

with open("merged.json", "w") as out:
    out.write("[\n")
    first = True
    for filepath in sorted(glob.glob("./data/*.json")):
        with open(filepath) as f:
            data = json.load(f)
        for record in data:
            if not first:
                out.write(",\n")
            json.dump(record, out)
            first = False
    out.write("\n]")
```
This writes records directly to the output file instead of building the entire merged array in memory. Note that each input file is still parsed in full, so this helps most when no single file is huge; for truly massive individual files, a streaming parser such as ijson is the next step.
Python gives you fully automated merging for any number of files, support for complex logic like deduplication and validation, and the ability to handle edge cases gracefully with try/except. The main things to watch out for are making sure all files have a consistent structure (or handling inconsistencies in your code) and being mindful of memory with very large datasets.
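As a sketch of that graceful handling, here's one way to wrap the basic array merge in try/except so a single malformed file is reported rather than fatal. The directory layout is a placeholder:

```python
import json
import glob

def merge_json_files(pattern):
    """Merge all array-based JSON files matching pattern, skipping bad ones."""
    merged, errors = [], []
    for filepath in sorted(glob.glob(pattern)):
        try:
            with open(filepath) as f:
                data = json.load(f)
        except (json.JSONDecodeError, OSError) as exc:
            errors.append((filepath, str(exc)))  # Record the failure, keep going
            continue
        if not isinstance(data, list):
            errors.append((filepath, "expected a top-level JSON array"))
            continue
        merged.extend(data)
    return merged, errors

merged, errors = merge_json_files("./data/*.json")
print(f"Merged {len(merged)} records; skipped {len(errors)} files")
```

Returning the error list instead of raising lets the caller decide whether a partial merge is acceptable.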
Method 3: Merge JSON with JavaScript (for Web and Node.js Developers)
If you're working in a JavaScript ecosystem, merging JSON files feels natural.
Browser Example (for handling uploaded files):
```html
<input type="file" multiple id="jsonFiles" />
<script>
  document.getElementById("jsonFiles").addEventListener("change", async (e) => {
    let files = e.target.files;
    let merged = [];

    for (let file of files) {
      let text = await file.text();
      let json = JSON.parse(text);
      merged = merged.concat(json);
    }

    console.log(JSON.stringify(merged, null, 2));
  });
</script>
```
Node.js Version:
```javascript
const fs = require("fs");
const path = require("path");

const dataDir = "./data";
let merged = [];

const files = fs
  .readdirSync(dataDir)
  .filter((f) => f.endsWith(".json"))
  .sort();

for (const file of files) {
  const filepath = path.join(dataDir, file);
  const data = JSON.parse(fs.readFileSync(filepath, "utf8"));

  if (Array.isArray(data)) {
    merged = merged.concat(data);
  } else {
    merged.push(data);
  }

  console.log(`Read ${file}: ${Array.isArray(data) ? data.length : 1} records`);
}

fs.writeFileSync("merged.json", JSON.stringify(merged, null, 2));
console.log(`\nMerged ${files.length} files, ${merged.length} total records`);
```
Deep Merge in Node.js:
For merging configuration objects with nested structures:
```javascript
const fs = require("fs");

function deepMerge(target, source) {
  for (const key of Object.keys(source)) {
    if (
      source[key] &&
      typeof source[key] === "object" &&
      !Array.isArray(source[key])
    ) {
      if (!target[key]) target[key] = {};
      deepMerge(target[key], source[key]);
    } else if (Array.isArray(source[key]) && Array.isArray(target[key])) {
      target[key] = target[key].concat(source[key]);
    } else {
      target[key] = source[key];
    }
  }
  return target;
}

const files = ["base.json", "overrides.json", "local.json"];
let config = {};

for (const file of files) {
  const data = JSON.parse(fs.readFileSync(file, "utf8"));
  config = deepMerge(config, data);
}

fs.writeFileSync("merged_config.json", JSON.stringify(config, null, 2));
```
JavaScript is a natural choice if you're already in a web or Node.js environment. The browser approach is great for client-side tools, while Node.js handles file system operations for server-side automation.
Method 4: Command-Line Approach with jq (Linux/macOS)
If you're comfortable in the terminal, `jq` is a fast command-line JSON processor that handles most merges as one-liners.

Install jq:
```bash
sudo apt install jq    # Debian/Ubuntu
brew install jq        # macOS
choco install jq       # Windows (via Chocolatey)
```
Merge Array-Based Files:
```bash
jq -s add file1.json file2.json file3.json > merged.json
```
The `-s` (slurp) flag reads all input files into a single array, and `add` concatenates the arrays inside it into one.

Merge all JSON files in a directory:
```bash
jq -s add ./data/*.json > merged.json
```
Merge Object-Based Files:
```bash
jq -s 'reduce .[] as $item ({}; . * $item)' file*.json > merged.json
```
The `*` operator merges two objects recursively, with the right-hand object's values winning on conflicting scalar keys; use `+` instead if you only want a shallow, top-level merge.

Merge with key-based deduplication:
```bash
jq -s 'add | unique_by(.id)' file1.json file2.json > merged.json
```
Pretty-print the output:
```bash
jq -s add *.json > merged_pretty.json   # jq pretty-prints by default
```
jq is lightning fast, scriptable, and perfect for DevOps and automation workflows. The syntax has a learning curve, but the one-liners above cover the vast majority of use cases.
Handling Nested and Inconsistent JSON
This is where things get interesting. Real-world JSON files often don't have identical structures. Here's how to deal with that.
Merging arrays nested inside objects:
```python
import json
import glob

merged = {"users": []}
for filepath in sorted(glob.glob("./data/*.json")):
    with open(filepath) as f:
        data = json.load(f)
        merged["users"].extend(data.get("users", []))

with open("merged_users.json", "w") as f:
    json.dump(merged, f, indent=2)
```
The `.get("users", [])` call returns an empty list when a file has no `users` key, so files missing that key don't crash the merge.

Handling files with mixed structures:
```python
import json
import glob

merged_arrays = []
merged_objects = {}

for filepath in sorted(glob.glob("./data/*.json")):
    with open(filepath) as f:
        data = json.load(f)
    if isinstance(data, list):
        merged_arrays.extend(data)
    elif isinstance(data, dict):
        merged_objects.update(data)

# Decide how to combine based on what you found
if merged_arrays and not merged_objects:
    result = merged_arrays
elif merged_objects and not merged_arrays:
    result = merged_objects
else:
    result = {"records": merged_arrays, **merged_objects}

with open("merged.json", "w") as f:
    json.dump(result, f, indent=2)
```
The key principle is to never assume all files have the same structure. Use type checking and `.get()` with defaults so unexpected shapes degrade gracefully instead of raising exceptions halfway through a merge.

Comparison of Merge Methods
| Method | Best For | Skill Level | Batch Support | Deep Merge | Deduplication |
|---|---|---|---|---|---|
| Online Tool | Quick tasks | Beginner | Yes, via upload | Basic | No |
| Python Script | Automation, large sets | Intermediate | Yes, unlimited | Yes | Yes |
| JavaScript | Web or Node.js devs | Intermediate | Yes | Yes | Yes |
| jq CLI | DevOps/Linux | Advanced | Yes, wildcard | Yes (via `*`) | Yes |
Real-World Use Cases
Combine paginated API results into one file. When an API returns data across multiple pages, you collect each page as a separate JSON file, then merge them all into a single dataset for import or analysis.
Merge configuration files for web apps or microservices. A base config, environment-specific overrides, and local developer settings can be deep-merged into one final configuration object that the application loads.
Prepare datasets for training ML models. Training data often comes from multiple sources or annotation batches. Merging them into a single JSONL or JSON array is a common preprocessing step.
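When the downstream tool wants JSONL (one record per line) rather than a single array, the merge loop barely changes. A sketch assuming array-based input files; the paths are placeholders:

```python
import json
import glob

def merge_to_jsonl(pattern, out_path):
    """Merge array-based JSON files into a JSONL file, one record per line."""
    count = 0
    with open(out_path, "w") as out:
        for filepath in sorted(glob.glob(pattern)):
            with open(filepath) as f:
                for record in json.load(f):
                    out.write(json.dumps(record) + "\n")
                    count += 1
    return count

merge_to_jsonl("./data/*.json", "merged.jsonl")
```

JSONL also sidesteps the memory concerns above, since readers can process the output line by line.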
Unify split files from backup or ETL systems. Data pipelines often produce chunked output. Merging the chunks back together is needed for downstream consumers that expect a single input. If you need the opposite operation, check out our guide on how to split JSON files.
Aggregate application logs. JSON-formatted logs from different servers or time periods need to be combined for analysis in tools like Elasticsearch, Splunk, or even simple grep-based searching.
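For logs, order usually matters, so the merge typically ends with a re-sort. A minimal sketch; the `timestamp` field name is an assumption, so adjust it to your log schema:

```python
import json
import glob

def merge_logs(pattern, ts_key="timestamp"):
    """Combine JSON log arrays from multiple files and sort by timestamp."""
    entries = []
    for filepath in sorted(glob.glob(pattern)):
        with open(filepath) as f:
            entries.extend(json.load(f))
    # ISO 8601 timestamps sort correctly as plain strings
    entries.sort(key=lambda e: e.get(ts_key, ""))
    return entries
```

Entries missing the timestamp key sort first here; drop or flag them instead if that matters for your analysis.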
Consolidate test fixtures. When you want to run integration tests against a comprehensive dataset, merging fixtures from different test suites gives you broader coverage.
Best Practices When Merging JSON Files
Validate input files before merging. A single malformed file can break the entire merge process. Run inputs through a JSON validator (jsonlint.com works well) before attempting to combine them.
Use version control to track merges. If you're merging configuration files or any data that changes over time, keep everything in Git. It makes it easy to see what changed and revert if something goes wrong.
Back up originals before merging. Keep the source files intact until you've verified the merged output is correct. It's a simple precaution that can save hours of re-work.
Ensure consistent structure across files. If files have different structures, decide up front how to handle mismatches. Don't discover the issue after processing 10,000 files.
Handle duplicate and conflicting fields deliberately. When two files have the same key with different values, make a conscious choice about which one should win rather than relying on implicit behavior.
Validate the output. After merging, verify that the result is valid JSON and that the record count matches your expectations. A quick sanity check catches most issues.
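That sanity check can itself be a few lines of Python. A sketch in which the path and expected count are placeholders you'd fill in from your own pipeline:

```python
import json

def check_merged(path, expected_count):
    """Re-parse the merged file and verify the record count matches expectations."""
    with open(path) as f:
        result = json.load(f)  # Raises an error if the file isn't valid JSON
    assert isinstance(result, list), "expected a top-level array"
    assert len(result) == expected_count, (
        f"expected {expected_count} records, got {len(result)}"
    )
    return len(result)
```

Run it as the last step of a merge script so a silent truncation or double-merge fails loudly.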
Use pretty-printing for debugging, compact format for production. In Python, passing `indent=2` to `json.dump` makes the output readable; omitting `indent` writes compact JSON that is smaller and faster to transfer.
Final Thoughts
Merging JSON files is one of those tasks that's deceptively simple for basic cases and surprisingly nuanced when you're dealing with real-world data. The right approach depends on your situation:
For quick, one-off merges, our online JSON merge tool handles it in seconds with no code and no installation. Your data stays in your browser, and you can preview the result before downloading.
For automation and complex logic, Python gives you the most flexibility. You can handle deduplication, deep merging, validation, and any other custom requirement with a few lines of code.
For web and Node.js projects, JavaScript keeps everything in one language ecosystem, whether you're merging files on the client side or the server side.
For command-line workflows, jq is incredibly fast and concise. A single `jq -s add *.json > merged.json` covers the common array-merge case.
No matter which method you choose, always validate your input, back up your originals, and check the output before putting it to use.
Try our JSON Merge Tool for free, instant merging right in your browser.
Related Tools: If you're working with different data formats, you might also find our YAML to JSON converter and JSON to Excel converter useful for your workflow.