I’m working with SQL Server and trying to insert JSON data into a table. When I use OPENJSON and one record causes an error (like a primary key violation), the entire insert fails and no records get added.
My problem: I have a JSON array with multiple records, but if one record has a duplicate key, nothing gets inserted. I want it to skip the bad record and keep inserting the good ones.
What I want: The insert should work like individual INSERT statements where each one is processed separately. If one fails, the others still go through.
Here’s my current code that fails completely when there’s a duplicate:
CREATE TABLE #temp_data
(
    user_id int NOT NULL,
    CONSTRAINT PK_temp_user_ID PRIMARY KEY CLUSTERED (user_id)
);

DECLARE @jsonData nvarchar(max);
SET @jsonData = N'[
    { "user_id" : 10 },
    { "user_id" : 20 },
    { "user_id" : 30 },
    { "user_id" : 20 },
    { "user_id" : 50 }
]';

INSERT INTO #temp_data (user_id)
SELECT user_id
FROM OPENJSON(@jsonData)
WITH (user_id int);
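For the duplicate-key case specifically, I did find a set-based workaround that filters out conflicting rows before the insert (a sketch against the #temp_data table and @jsonData variable above; it only covers PK conflicts, not arbitrary errors):

```sql
-- Skip rows already in the table, and collapse duplicates
-- that appear within the JSON array itself.
INSERT INTO #temp_data (user_id)
SELECT j.user_id
FROM OPENJSON(@jsonData) WITH (user_id int) AS j
WHERE NOT EXISTS (SELECT 1 FROM #temp_data t WHERE t.user_id = j.user_id)
GROUP BY j.user_id;
```

But this has to anticipate each specific failure mode up front, which is exactly what I'm trying to avoid.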
This approach with separate statements works, but it isn’t practical when the data arrives as a JSON array:
INSERT INTO #temp_data (user_id) VALUES (10);
INSERT INTO #temp_data (user_id) VALUES (20);
INSERT INTO #temp_data (user_id) VALUES (30);
INSERT INTO #temp_data (user_id) VALUES (20); -- fails, but the others succeed
INSERT INTO #temp_data (user_id) VALUES (50);
Is there a way to make JSON inserts behave like this? I need it to work for any type of SQL error, not just primary key violations.
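The closest I can think of is looping over the OPENJSON rows and wrapping each INSERT in its own TRY...CATCH, which would mimic the individual statements for any error type. This is just a sketch (assuming the #temp_data table and @jsonData variable from above), and I suspect it's slow for large arrays:

```sql
-- Process each JSON element as its own statement inside TRY...CATCH,
-- so any per-row error (PK violation or otherwise) skips only that row.
DECLARE @id int;
DECLARE row_cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT user_id FROM OPENJSON(@jsonData) WITH (user_id int);
OPEN row_cur;
FETCH NEXT FROM row_cur INTO @id;
WHILE @@FETCH_STATUS = 0
BEGIN
    BEGIN TRY
        INSERT INTO #temp_data (user_id) VALUES (@id);
    END TRY
    BEGIN CATCH
        -- Log or ignore the failed row; ERROR_MESSAGE() has the details.
        PRINT CONCAT('Skipped user_id ', @id, ': ', ERROR_MESSAGE());
    END CATCH;
    FETCH NEXT FROM row_cur INTO @id;
END;
CLOSE row_cur;
DEALLOCATE row_cur;
```

Is there a set-based way to get this behavior, or is a row-by-row loop the only option?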