
Escape JSON for SQL — Storing and Querying JSON Strings

Last updated: January 2026 · 7 min read
Quick Answer

Table of Contents

  1. Use Native JSON Columns When You Can
  2. When You Have to Escape (String Columns)
  3. The Real Answer — Parameterized Queries
  4. Querying JSON Out of the Database
  5. Common Failure Modes
  6. Frequently Asked Questions

Storing JSON in a SQL database is two different problems depending on whether the column is a native JSON type or a plain string type. With native JSON, you pass the value as a parameter and the driver handles serialization. With string columns, you're back to per-dialect escape rules — and you should almost always use parameterized queries instead of building SQL strings by hand.

Use Native JSON Columns When You Can

All three major SQL databases have native JSON support: PostgreSQL offers json and jsonb column types (prefer jsonb), MySQL 5.7+ has a dedicated JSON type, and SQL Server validates and queries JSON stored in NVARCHAR columns. SQLite also ships JSON functions (json1) over TEXT.

Use these when available. The driver handles serialization — you pass a JSON value, it stores correctly. No escape characters in your SQL, no string concatenation, no escape rules to memorize per dialect.
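As a rough sketch, here is what the corresponding schemas look like per database (table and column names are illustrative, not from any particular app):

```sql
-- PostgreSQL: jsonb validates on insert and supports indexing; prefer it over json
CREATE TABLE events (id SERIAL PRIMARY KEY, payload JSONB NOT NULL);

-- MySQL 5.7+: native JSON type, validated on insert
CREATE TABLE events (id INT AUTO_INCREMENT PRIMARY KEY, payload JSON NOT NULL);

-- SQL Server: JSON lives in NVARCHAR; a CHECK constraint enforces validity
CREATE TABLE events (
  id INT IDENTITY PRIMARY KEY,
  payload NVARCHAR(MAX) CHECK (ISJSON(payload) = 1)
);
```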

// Node + pg (Postgres)
await client.query('INSERT INTO events(payload) VALUES($1)', [eventObject]);

// Node + mysql2
await connection.execute('INSERT INTO events(payload) VALUES(?)', [JSON.stringify(eventObject)]);

The driver does the right thing. You never see escape characters in your code.

When You Have to Escape (String Columns)

If you're stuck with a VARCHAR or TEXT column storing JSON as a string — usually because of a legacy schema — you have to handle escape characters at the SQL layer. JSON's escapes are inside the string; SQL's quoting is outside.

The character that needs handling at the SQL layer is the single quote (most dialects use single quotes as string delimiters):

Database     Single quote escape    Backslash handling
PostgreSQL   '' (doubled)           Literal by default
MySQL        '' or \'               \\ required
SQL Server   '' (doubled)           Literal by default
SQLite       '' (doubled)           Literal by default

So for a JSON value {"msg": "It's fine"} stored in a Postgres VARCHAR:

INSERT INTO events(payload) VALUES('{"msg": "It''s fine"}');

The single quote inside the JSON string gets doubled per Postgres rules.
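If you do have to build a literal by hand, the doubling rule is mechanical enough to wrap in a helper. This is a minimal sketch (the function name is ours, not a library API), and it assumes a dialect where backslashes are literal — Postgres by default, SQL Server, SQLite. MySQL would additionally need backslash escaping, and none of this replaces parameterized queries:

```javascript
// Wrap a JSON string as a single-quoted SQL literal by doubling
// every single quote (the ANSI rule). Backslashes are left alone,
// which is only safe in dialects that treat them literally.
function escapeForSqlLiteral(jsonString) {
  return "'" + jsonString.replace(/'/g, "''") + "'";
}

const payload = JSON.stringify({ msg: "It's fine" });
const literal = escapeForSqlLiteral(payload);
// literal is: '{"msg":"It''s fine"}'
```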


The Real Answer — Parameterized Queries

Hand-escaping JSON for SQL is error-prone and dangerous. A missed escape isn't just a bug — it's a SQL injection vulnerability. The right answer for any production code is parameterized queries (also called prepared statements).

// Bad — string concatenation
const query = "INSERT INTO events(payload) VALUES('" + jsonString + "')";

// Good — parameterized
const query = "INSERT INTO events(payload) VALUES($1)";
await client.query(query, [jsonString]);

The driver sends the JSON as a separate parameter to the database. The SQL itself never contains the value, so there's nothing to escape and no injection surface. This is true for every modern database driver in every language.

If you're building SQL strings by concatenation in 2026, you're either deep in legacy code or doing something wrong. Get to parameterized queries first; worry about JSON-specific concerns second.

Querying JSON Out of the Database

Pulling JSON back is the easier direction. Native JSON columns return as parsed objects in most drivers; string columns return as strings that you parse yourself.
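For the string-column case, the read path is one JSON.parse per row. A hedged sketch, assuming a node-postgres-style client and an events table (names are illustrative):

```javascript
// Read JSON stored in a TEXT/VARCHAR column and parse it in the app.
// Native JSON columns usually arrive already parsed, so this step
// only applies to string columns.
async function loadPayloads(client) {
  const { rows } = await client.query('SELECT payload FROM events');
  return rows.map((row) => {
    try {
      return JSON.parse(row.payload);
    } catch {
      // Legacy string columns can hold invalid JSON; decide whether
      // to skip, log, or fail. Here we return null and move on.
      return null;
    }
  });
}
```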

Querying inside the JSON varies per dialect:

-- PostgreSQL (jsonb)
SELECT payload->>'name' AS name FROM events WHERE payload->>'type' = 'login';

-- MySQL
SELECT JSON_EXTRACT(payload, '$.name') FROM events WHERE JSON_EXTRACT(payload, '$.type') = 'login';

-- SQL Server
SELECT JSON_VALUE(payload, '$.name') FROM events WHERE JSON_VALUE(payload, '$.type') = 'login';

For one-off debugging, copy the JSON value out of the result and paste into a JSON formatter for readability. For repeated queries, indexing on extracted paths (Postgres GIN, MySQL functional indexes) keeps performance reasonable as the table grows.
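The index sketches below show the shape of those options; index and table names are made up, and the MySQL form needs 8.0.13+ (functional indexes) plus a CAST because the extraction operator returns unindexable LONGTEXT:

```sql
-- PostgreSQL: GIN index supporting jsonb containment and existence queries
CREATE INDEX idx_events_payload_gin ON events USING GIN (payload);

-- PostgreSQL: expression index on a single extracted path
CREATE INDEX idx_events_payload_type ON events ((payload->>'type'));

-- MySQL 8.0: functional index on an extracted, cast path
-- (collation caveats apply when comparing against string literals)
CREATE INDEX idx_events_type ON events ((CAST(payload->>'$.type' AS CHAR(32))));
```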

Common Failure Modes

Three things that bite people storing JSON in SQL:

1. Double-stringification. The application calls JSON.stringify on an object, then the driver stringifies again. The column ends up holding "{\"name\":\"Alex\"}" — a JSON string containing JSON — instead of {"name":"Alex"}. Solution: pass the object directly when the column is a JSON type; only stringify when the column is a string type.
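The failure is easy to reproduce in isolation — stringify once correctly, then stringify the result again the way a driver or ORM layer might:

```javascript
const event = { name: 'Alex' };

// Correct for a string column: stringify exactly once.
const once = JSON.stringify(event);   // {"name":"Alex"}

// The bug: the already-stringified value gets stringified again,
// so the stored characters are "{\"name\":\"Alex\"}" — JSON-in-JSON.
const twice = JSON.stringify(once);

// The telltale sign on the read side: you have to parse twice
// before you get an object back instead of a string.
const recovered = JSON.parse(JSON.parse(twice));
```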

2. Encoding mismatches. JSON is UTF-8 by spec. If your column collation is Latin1 or your connection isn't UTF-8, non-ASCII characters silently corrupt. Set the connection charset to UTF-8 and use UTF-8 columns.
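In MySQL, where this bites most often, that means utf8mb4 (the legacy "utf8" charset is a 3-byte subset that drops emoji and many CJK characters). Sketches, with table names assumed:

```sql
-- Make the current connection speak full UTF-8
SET NAMES utf8mb4;

-- Convert an existing table's columns to full UTF-8
ALTER TABLE events CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
```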

3. Size limits. MySQL JSON values are capped by max_allowed_packet (at most 1GB), and practical limits are much lower for performance. SQL Server NVARCHAR(MAX) holds up to 2GB. Postgres limits any single field to 1GB, and jsonb's internal format caps out below that in practice. For very large payloads, consider external storage with a reference column.

For most apps, the JSON-in-database fundamentals are: native JSON column, parameterized query, UTF-8 connection. Get those three right and you can ignore most escape concerns entirely.

Escape JSON for SQL

For when parameterized queries are not an option. Paste JSON, get a SQL-safe string.

Open Free JSON Escape / Unescape Tool

Frequently Asked Questions

Should I store JSON as a JSON column or a string column?

JSON column when available — better validation, indexing, and query support. String columns for legacy schemas only.

How do I escape JSON for an INSERT statement?

Use parameterized queries — the driver handles it. If you must build SQL by hand, double single quotes per your database's rules.

Why is my JSON showing up as escaped strings in the column?

Double-stringification — your code stringified once, the driver stringified again. Pass the object directly to a JSON column or stringify exactly once for a string column.

Can I query inside a JSON column?

Yes — Postgres has -> and ->>, MySQL has JSON_EXTRACT, SQL Server has JSON_VALUE. Each dialect has its own syntax.

David Rosenberg, Technical Writer

David spent ten years as a software developer before shifting to technical writing covering developer productivity tools.
