How to skip errors and continue JSON INSERT operations in SQL Server

I’m working with SQL Server and trying to insert JSON data into a table. When I use OPENJSON and one record causes an error (like a primary key violation), the entire insert fails and no records get added.

My problem: I have a JSON array with multiple records, but if one record has a duplicate key, nothing gets inserted. I want it to skip the bad record and keep inserting the good ones.

What I want: The insert should work like individual INSERT statements where each one is processed separately. If one fails, the others still go through.

Here’s my current code that fails completely when there’s a duplicate:

CREATE TABLE #temp_data
(
    user_id int NOT NULL, 
    CONSTRAINT PK_temp_user_ID PRIMARY KEY CLUSTERED (user_id)
)

DECLARE @jsonData nvarchar(max);
SET @jsonData = N'[
 { "user_id" : 10},
 { "user_id" : 20},
 { "user_id" : 30},
 { "user_id" : 20},
 { "user_id" : 50}
]';

INSERT INTO #temp_data (user_id)
    SELECT user_id
    FROM OPENJSON(@jsonData)
       WITH (user_id int)

This approach with separate statements works but isn’t practical for JSON:

insert into #temp_data(user_id) values(10)
insert into #temp_data(user_id) values(20)
insert into #temp_data(user_id) values(30)
insert into #temp_data(user_id) values(20) -- fails but others succeed
insert into #temp_data(user_id) values(50)

Is there a way to make JSON inserts behave like this? I need it to work for any type of SQL error, not just primary key violations.

Interesting challenge! Have you tried using MERGE with a CTE? Parse the JSON first, then merge it against your target table with WHEN NOT MATCHED clauses. What's the volume like, though: a one-time migration or regular runs?

The problem is statement-level atomicity: SQL Server treats your whole INSERT ... SELECT FROM OPENJSON as a single statement, and if any row violates a constraint, the entire statement is rolled back. That's why you can't skip individual bad records inside one set-based insert.

Here's what works: parse the JSON array and run one INSERT per element, each wrapped in its own TRY...CATCH block. It's basically automating what you did manually with the separate statements.
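A minimal sketch of that idea, looping over the array by its zero-based index via OPENJSON's default-schema `[key]` column (this assumes the `#temp_data` table and `@jsonData` variable from the question):

```sql
DECLARE @i int = 0, @n int, @user_id int;

SELECT @n = COUNT(*) FROM OPENJSON(@jsonData);

WHILE @i < @n
BEGIN
    -- Default schema exposes [key] (array index) and [value] (element JSON)
    SELECT @user_id = CAST(JSON_VALUE([value], '$.user_id') AS int)
    FROM OPENJSON(@jsonData)
    WHERE [key] = @i;

    BEGIN TRY
        INSERT INTO #temp_data (user_id) VALUES (@user_id);
    END TRY
    BEGIN CATCH
        -- Any error on this row is swallowed; the loop continues
        PRINT CONCAT('Skipped element ', @i, ': ', ERROR_MESSAGE());
    END CATCH;

    SET @i += 1;
END
```

Because the CATCH block swallows whatever error the single-row insert raises, this handles any error type, not just primary key violations, which is what the question asks for.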

Or try a staging table approach. Dump all the JSON records into a temp table with no constraints first, then use INSERT with NOT EXISTS (or a LEFT JOIN) to move only the valid records into your main table. One caveat: deduplicate within the staging data too. Your sample JSON contains 20 twice, and NOT EXISTS against the target alone won't catch duplicates that arrive together in the same batch. You avoid constraint violations but keep it set-based.
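A sketch of the staging approach, again assuming `#temp_data` and `@jsonData` from the question:

```sql
-- Unconstrained staging table: every JSON row loads without error
CREATE TABLE #staging (user_id int NULL);

INSERT INTO #staging (user_id)
SELECT user_id
FROM OPENJSON(@jsonData) WITH (user_id int);

-- Move only rows that won't violate the primary key.
-- DISTINCT also collapses duplicates *within* the staging data;
-- without it the two 20s would still collide with each other.
INSERT INTO #temp_data (user_id)
SELECT DISTINCT s.user_id
FROM #staging AS s
WHERE s.user_id IS NOT NULL
  AND NOT EXISTS (SELECT 1 FROM #temp_data AS t
                  WHERE t.user_id = s.user_id);
```

The trade-off versus the row-by-row loop: this stays fast and set-based, but it only sidesteps the specific violations you anticipate (here, duplicate keys), not arbitrary runtime errors.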

For production, I'd use a MERGE statement with a logging table for skipped records. Strictly speaking, MERGE avoids key collisions rather than catching arbitrary errors, but you get the duplicate handling you need plus an audit trail for fixing data issues later.
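Roughly like this; the `#skipped_rows` table name and shape are my own assumptions, and the source is deduplicated first because MERGE would otherwise try to insert the same key twice:

```sql
CREATE TABLE #skipped_rows (user_id int, reason nvarchar(100));

-- Log rows that will be skipped: values already in the target, plus
-- extra copies of a value repeated inside the JSON itself
WITH src AS (
    SELECT user_id,
           ROW_NUMBER() OVER (PARTITION BY user_id
                              ORDER BY user_id) AS rn
    FROM OPENJSON(@jsonData) WITH (user_id int)
)
INSERT INTO #skipped_rows (user_id, reason)
SELECT s.user_id,
       CASE WHEN t.user_id IS NOT NULL THEN N'already in target'
            ELSE N'duplicate in JSON' END
FROM src AS s
LEFT JOIN #temp_data AS t ON t.user_id = s.user_id
WHERE s.rn > 1 OR t.user_id IS NOT NULL;

-- Insert each distinct key that isn't in the target yet
MERGE #temp_data AS tgt
USING (SELECT user_id
       FROM OPENJSON(@jsonData) WITH (user_id int)
       GROUP BY user_id) AS src
    ON tgt.user_id = src.user_id
WHEN NOT MATCHED BY TARGET THEN
    INSERT (user_id) VALUES (src.user_id);
```

This gives you both the insert and the audit trail in one batch, at the cost of only handling the failure modes you model up front.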

Easiest fix I’ve found? Use a cursor to loop through each JSON element. I know cursors get hate, but they’re perfect for this kind of error handling. Parse your JSON into a temp table first, then cursor through with try/catch blocks on each insert. Not pretty, but it works when you need record-by-record processing.
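Here's what that looks like as a sketch, assuming `#temp_data` and `@jsonData` from the question (replace the PRINT with real logging in production):

```sql
DECLARE @user_id int;

-- Stage the parsed JSON so the cursor reads from a plain table
CREATE TABLE #json_rows (user_id int);
INSERT INTO #json_rows (user_id)
SELECT user_id FROM OPENJSON(@jsonData) WITH (user_id int);

DECLARE row_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT user_id FROM #json_rows;

OPEN row_cursor;
FETCH NEXT FROM row_cursor INTO @user_id;

WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        INSERT INTO #temp_data (user_id) VALUES (@user_id);
    END TRY
    BEGIN CATCH
        -- Swallows any error on this row, so the loop keeps going
        PRINT CONCAT('Skipped ', @user_id, ': ', ERROR_MESSAGE());
    END CATCH;

    FETCH NEXT FROM row_cursor INTO @user_id;
END

CLOSE row_cursor;
DEALLOCATE row_cursor;
```

Since each row is its own statement inside its own CATCH, this survives any SQL error, not just primary key violations, matching the original requirement.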