How to Read JSON File in JavaScript: A Complete Step by Step Guide

Reading a JSON file in JavaScript depends on where your code runs. In a browser, you typically fetch JSON over HTTP or read a user selected file. In Node.js, you can read from disk using the file system APIs.
This guide covers the reliable methods across both environments: Fetch API in the browser, modern JSON modules for static data, FileReader for user uploads, and Node.js fs for server side and scripting workflows. It also includes practical notes on error handling and large files.
Why JSON Has Become the Industry Standard
JSON is popular because it is lightweight, language agnostic, and maps directly to common data structures (objects and arrays).
Common scenarios include:
- API integration: fetching data from REST endpoints (product lists, user profiles, dashboards).
- Configuration: loading settings files and feature flags.
- Data migration: reading exports for transformation or reporting.
- Local development: using mock JSON while building a UI before the backend is ready.
Once the environment is clear, reading JSON becomes a predictable sequence: load the text, parse it, validate what you got, and handle errors.
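As a minimal sketch of that sequence, with a hardcoded string standing in for whichever loading method you use:

// 1. Load the text (via fetch, FileReader, or fs in the methods below).
const rawText = '{"name": "Ada"}';
let data;
try {
// 2. Parse it.
data = JSON.parse(rawText);
} catch (err) {
// 4. Handle errors from bad input.
console.error('Invalid JSON:', err.message);
}
// 3. Validate that you got the shape you expected.
if (data && typeof data.name === 'string') {
console.log('Name:', data.name);
}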
Understanding the Environment: Browser vs. Node.js
The first step is identifying where the code runs.
- Browser (client side): for security reasons, the browser cannot read arbitrary local files from disk. It can fetch JSON from a URL, or read a file a user explicitly selects via an <input type="file"> element.
- Node.js (server side): Node.js runs on your operating system, so it can read files from disk using the fs module.
Rule of thumb: browser code uses fetch() for URLs and FileReader for user uploads. Node.js code uses fs.
Method 1: Reading JSON in the Browser Using the Fetch API
The Fetch API is the modern, standard way to request resources in the browser. It is built into every modern browser and has largely replaced the older, clunkier XMLHttpRequest.
The Basic Implementation
Suppose your JSON file is sitting on a server (for example, data.json). The most robust way to read it uses async/await:
async function loadUserData() {
try {
// 1. Start the request
const response = await fetch('./data.json');
// 2. Check if the file exists and is accessible
if (!response.ok) {
throw new Error(`Could not fetch the file. Status: ${response.status}`);
}
// 3. Convert the response body to a JavaScript object
const data = await response.json();
// 4. Use your data!
console.log("User Name:", data.name);
console.log("User Email:", data.email);
} catch (error) {
console.error("Oh no, something went wrong:", error.message);
}
}
loadUserData();
Why response.ok matters
A common pitfall is skipping the response.ok check. fetch() only rejects on network failures. If the server returns a 404 or 500, the promise still resolves, and response.json() can end up trying to parse an HTML error page. That is how you get errors like Unexpected token < in JSON at position 0.
Check response.ok (or the status code) before parsing so failures are handled intentionally and error messages stay clear.
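One defensive pattern (a sketch, not the only option; the fetchJsonStrict name is just illustrative) is to read the body as text first, so a failed parse can show you exactly what came back:

async function fetchJsonStrict(url) {
const response = await fetch(url);
if (!response.ok) {
throw new Error(`HTTP ${response.status} for ${url}`);
}
// Read the raw text first so a parse failure can be diagnosed.
const text = await response.text();
try {
return JSON.parse(text);
} catch (err) {
// An HTML error page shows up immediately in this log.
console.error('Response was not JSON. It starts with:', text.slice(0, 80));
throw err;
}
}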
Method 2: Modern ES6 JSON Modules (Importing JSON)
Modern JavaScript (2024 to 2025) supports JSON modules, which let you treat a JSON file like a normal module with an import statement.
This is best for static configuration files or small data sets.
How to Use Import Assertions
This requires a with clause (previously called an import "assertion") that tells the browser the file being imported is specifically a JSON file.
// Note the 'with' syntax for JSON modules
import configData from './config.json' with { type: 'json' };
console.log("Application Theme:", configData.theme);
console.log("API Version:", configData.version);
This approach works well for small, static JSON (config, copy, lookup tables). It loads before your script runs, keeps data immediately available, and avoids extra fetch() calls.
The tradeoff is caching. If the JSON file changes on the server, users might not see the update until they refresh, so this is not a good fit for dynamic API data.
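If you only need the JSON on demand, the dynamic form of import() accepts the same with option inside an ES module. Runtime support varies, so treat this as a sketch:

// Dynamic form of the same import; the data lives on the default export.
const { default: config } = await import('./config.json', {
with: { type: 'json' }
});
console.log('Theme:', config.theme);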
Method 3: Reading Local User Files (The File Input Method)
Sometimes a page needs to accept a user provided JSON file, like in a data cleanup or transformation tool.
Use the FileReader API for that scenario.
const fileInput = document.getElementById('myFileInput');
fileInput.addEventListener('change', (event) => {
const file = event.target.files[0];
if (!file) return;
const reader = new FileReader();
reader.onload = (e) => {
try {
const jsonContent = JSON.parse(e.target.result);
console.log("File content successfully parsed:", jsonContent);
} catch (err) {
alert("This file is not a valid JSON format!");
}
};
reader.readAsText(file);
});
This is a strong fit for browser based utility tools. The file stays on the user’s machine, which keeps it fast and private, and it avoids any upload requirement.
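In modern browsers, the File object also inherits a promise-based text() method from Blob, which can stand in for FileReader here. A sketch assuming the same file input element as above:

const fileInput = document.getElementById('myFileInput');
fileInput.addEventListener('change', async (event) => {
const file = event.target.files[0];
if (!file) return;
try {
// File inherits text() from Blob, so no FileReader is needed.
const jsonContent = JSON.parse(await file.text());
console.log("File content successfully parsed:", jsonContent);
} catch (err) {
alert("This file is not valid JSON!");
}
});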
Method 4: Reading JSON in Node.js (Primary Backend Method)
Server-side code has several options for handling JSON. If you are building a CLI tool or a backend server, the fs (File System) module is the primary tool.
Option A: Using fs/promises (The Modern Standard)
This is the non-blocking, promise-based way to read files, and it is the recommended approach for almost all modern Node.js applications.
import { readFile } from 'node:fs/promises';
async function getLocalSettings() {
try {
const filePath = new URL('./settings.json', import.meta.url);
const rawData = await readFile(filePath, 'utf8');
// Remember: fs only gives you a string. You MUST parse it.
const settings = JSON.parse(rawData);
console.log("Merging data...", settings.db_host);
} catch (error) {
if (error.code === 'ENOENT') {
console.error("The file does not exist at the specified path.");
} else {
console.error("Syntax Error or Permission Denied:", error.message);
}
}
}
getLocalSettings();
Option B: Using require() (CommonJS shortcut)
In a CommonJS based Node.js project, require() can be a convenient shortcut for small, static JSON configuration.
const localData = require('./data.json');
// No JSON.parse needed. require() parses JSON automatically.
console.log(localData.items);
Avoid this pattern for large data files. require() caches the module in memory, so if the JSON changes on disk, you will not see the update until the process restarts.
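If you need fresh data on every read in a CommonJS project, read the file explicitly instead. A small sketch (loadFreshJson is an illustrative name):

const fs = require('fs');
// Unlike require(), this re-reads the file from disk on every call.
function loadFreshJson(path) {
return JSON.parse(fs.readFileSync(path, 'utf8'));
}
const latest = loadFreshJson('./data.json');
console.log(latest.items);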
Comparison: Which Method is Best for You?
| Scenario | Environment | Best method | Why |
|---|---|---|---|
| Web page (external data) | Browser | fetch | Works with URLs and async requests. |
| Web page (bundled config) | Browser | JSON modules (import) | Clean for build time or static configuration. |
| User uploaded file | Browser | FileReader | Reads a local file the user selects. |
| Backend or CLI | Node.js | fs/promises | Async and reliable for disk files. |
| Small local config | Node.js | require | Convenient, but cached and not ideal for large files. |
Deep Insight: Unexpected Token Errors
Unexpected token errors usually mean the input is not valid JSON, or it is not the type you think it is. Common causes include:
- A BOM (byte order mark) at the start of a file.
- Single quotes or trailing commas that are valid in JavaScript objects but invalid in JSON.
- An HTML error page being parsed as JSON (often Unexpected token <).
- Passing a Buffer or object into JSON.parse() instead of a string.
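A defensive parser can address the first and last causes; single quotes and trailing commas should be fixed in the source file itself. A sketch for Node (parseJsonSafely is an illustrative name):

function parseJsonSafely(input) {
// Decode Buffers to a string before parsing.
const text = typeof input === 'string' ? input : input.toString('utf8');
// Strip a leading BOM (\uFEFF), which JSON.parse rejects.
return JSON.parse(text.replace(/^\uFEFF/, ''));
}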
Handling Large JSON Files (The Professional Way)
What happens if you need to read a JSON file that is 5GB? A single JSON.parse() call can run out of memory because it tries to build the entire object in RAM at once.
For large files, use streaming. Instead of reading the whole file, you process it piece by piece.
In Node.js, packages like stream-json can parse large arrays incrementally:
const fs = require('fs');
const { chain } = require('stream-chain');
const { parser } = require('stream-json');
const { streamArray } = require('stream-json/streamers/StreamArray');
const pipeline = chain([
fs.createReadStream('massive-data.json'),
parser(),
streamArray()
]);
pipeline.on('data', (data) => {
// processes one object at a time. Memory stays low.
console.log('Processing record:', data.value.id);
});
If you only need a quick way to work with a massive file, splitting it into smaller chunks can also make debugging and processing more predictable.
Best Practices for Reading JSON in Production
These best practices keep JSON loading stable in production code:
1. Always use try and catch
Wrap parsing in a try...catch block so one bad file does not crash your entire app.
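A small wrapper keeps that pattern in one place. A sketch (safeJsonParse and the fallback behavior are illustrative choices):

function safeJsonParse(text, fallback = null) {
try {
return JSON.parse(text);
} catch (err) {
console.error('JSON parse failed:', err.message);
return fallback; // the caller decides what a safe default looks like
}
}
const settings = safeJsonParse('{"theme": "dark"}', {});
console.log(settings.theme); // "dark"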
2. Validate the shape of the data
Valid JSON can still be the wrong structure. If your code expects user.email but the file only contains user.username, you will still get runtime errors. Libraries like Zod or Joi help enforce a schema immediately after parsing.
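For example, with Zod installed, a schema can reject the wrong shape immediately after parsing. A sketch (the UserSchema fields are assumptions about your data):

import { z } from 'zod';
const UserSchema = z.object({
email: z.string().email(),
username: z.string()
});
// Throws a descriptive error if the parsed JSON is missing user.email.
const user = UserSchema.parse(JSON.parse('{"email":"ada@example.com","username":"ada"}'));
console.log(user.email);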
3. Be mindful of CORS in the browser
When using fetch() across domains, CORS rules apply. Ensure the server returns the right headers (for example, Access-Control-Allow-Origin) for the environment you are testing.
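On the server side, allowing a cross-origin request can come down to a single header. A minimal sketch using Node's built-in http module; the wildcard origin is for local testing only:

import { createServer } from 'node:http';
createServer((req, res) => {
// '*' allows any origin; in production, allow only trusted origins.
res.setHeader('Access-Control-Allow-Origin', '*');
res.setHeader('Content-Type', 'application/json');
res.end(JSON.stringify({ ok: true }));
}).listen(3000);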
4. Treat encoding as part of the contract
Use UTF-8 consistently. If you see garbled characters, the file is often being read with the wrong encoding, or it includes a BOM at the start.
Troubleshooting Common JSON Reading Errors
"Request has been blocked by CORS policy"
This is a browser security feature. You cannot fetch a JSON file from one domain while running on another domain unless the server allows it. During local development, use a local server instead of opening the HTML file directly with a file:/// URL.
"Unexpected token < in JSON at position 0"
This often means the response was HTML, not JSON. A common cause is a missing file or wrong URL, which returns an HTML 404 page, and HTML begins with < rather than valid JSON. Check the request path and inspect the Network response body.
"Cannot find module" (Node.js)
If you are using require('./data'), include the .json extension and verify the path. Also remember that require resolves relative to the current file, while many fs paths are relative to where you run the command.
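In CommonJS, __dirname anchors a path to the file itself rather than the working directory, which avoids that mismatch. A short sketch:

const path = require('node:path');
const fs = require('node:fs');
// __dirname is this file's directory, not the directory node was run from.
const filePath = path.join(__dirname, 'data.json');
const data = JSON.parse(fs.readFileSync(filePath, 'utf8'));
console.log(data);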
Related Tools and Guides
If you want a few practical next steps, these are the most relevant tools and guides:
- JSON File Splitter for breaking very large files into smaller chunks.
- How to Merge JSON Files when you need to recombine multiple JSON outputs.
- JSON Flattener for simplifying deeply nested JSON before processing.
- How to Format JSON in Notepad++ for cleaning up and validating JSON on Windows.
Final Thoughts
Reading JSON in JavaScript becomes straightforward once the environment is clear. In the browser, load JSON via fetch() or FileReader. In Node.js, read from disk with fs and parse with JSON.parse().
For production code, the biggest improvements come from predictable error handling and validation. Log the raw input when parsing fails, validate the data shape early, and use streaming or chunking when files are large.