Mule 4 Bulk Insert into SQL Server: Date Conversion Issue

A Mule 4 bulk insert to SQL Server is failing with a date conversion error when the data comes from a JSON payload. Below are the SQL statement and the DataWeave snippet in use:

INSERT INTO LogData (
  log_creator, log_datetime, log_status
) VALUES (
  :log_creator, :log_datetime, :log_status
);
%dw 2.0
output application/json
---
{
  log_creator: payload.userId default "",
  log_datetime: (now() >> "US/Pacific") as String {format: "yyyy-MM-dd HH:mm:ss.SSS"},
  log_status: payload.flag default ""
}

Hey, has anyone tinkered with the date conversion step? Maybe splitting the timezone conversion into two steps would help. Have you seen any other workaround? Would love to hear your experience with this.

Hey silvia85, I've faced this too. Try converting to UTC first and then reapply your format; sometimes the timezone shift messes up the direct conversion. See the sketch below. Hope that sorts it for you.
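Roughly like this. It's only a sketch assuming DataWeave 2.0 and the same field names as your snippet, not something tested against your flow:

%dw 2.0
output application/json
---
{
  log_creator: payload.userId default "",
  // step 1: shift the current timestamp to UTC
  // step 2: format it as a plain string with no offset or zone suffix
  log_datetime: (now() >> "UTC") as String {format: "yyyy-MM-dd HH:mm:ss.SSS"},
  log_status: payload.flag default ""
}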

In my experience with Mule 4 bulk inserts to SQL Server, the key to resolving the date conversion error was making sure the formatted string exactly matches what SQL Server expects for the target column. Even subtle discrepancies between the DataWeave timestamp output and the SQL Server datetime format were enough to trigger this error. Adjusting the conversion so it produces a string with no timezone artifacts (no offset or zone name) proved effective. It is worth rigorously testing the transformation with realistic sample values in a pre-production environment to catch such issues early.
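As an illustrative sketch only (not the exact transform from my project, and using the field names from the original snippet): coercing the shifted DateTime to LocalDateTime drops the offset before formatting, so the resulting string stays in the plain yyyy-MM-dd HH:mm:ss.SSS shape SQL Server parses for DATETIME columns.

%dw 2.0
output application/json
---
{
  log_creator: payload.userId default "",
  // coercing to LocalDateTime discards the offset, so the formatted
  // string contains only the date and time, with no timezone artifacts
  log_datetime: ((now() >> "US/Pacific") as LocalDateTime) as String {format: "yyyy-MM-dd HH:mm:ss.SSS"},
  log_status: payload.flag default ""
}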