I’m running into a weird issue with my authentication setup. I have Keycloak running in Docker on my Synology NAS through Portainer, and it’s set up to handle client credentials auth using Service Account Roles.
My Express.js backend uses keycloak-connect to secure API routes. Everything works perfectly when I run the backend on my local machine: tokens are issued properly and I can access all the protected endpoints without any problems. But as soon as I move the backend into a Docker container on the same NAS, the protected routes start returning 403 Forbidden.
Here’s how I have my Keycloak setup configured:
Authentication Config (security/keycloak-service.js)
const KeycloakConnect = require('keycloak-connect');

// Bearer-only setup: no session store needed, so the first (config) argument stays empty.
const keycloakAuth = new KeycloakConnect({}, {
  "realm": process.env.KC_REALM,
  "bearer-only": true,
  "auth-server-url": process.env.KC_AUTH_URL,
  "ssl-required": "none",
  "resource": process.env.KC_SERVICE_CLIENT
});

module.exports = keycloakAuth;
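For context, this is roughly how the service-account token gets requested (client credentials grant). The env var names match my config above except `KC_CLIENT_SECRET`, which is a placeholder for however you store the secret; the actual request is commented out here:

```javascript
// Build the client-credentials token request against Keycloak's OIDC token endpoint.
// KC_AUTH_URL already includes the /auth prefix (legacy Keycloak path), per the config above.
const tokenUrl = `${process.env.KC_AUTH_URL}/realms/${process.env.KC_REALM}/protocol/openid-connect/token`;

const body = new URLSearchParams({
  grant_type: "client_credentials",
  client_id: process.env.KC_SERVICE_CLIENT,
  client_secret: process.env.KC_CLIENT_SECRET, // placeholder env var name
});

// const res = await fetch(tokenUrl, {
//   method: "POST",
//   headers: { "Content-Type": "application/x-www-form-urlencoded" },
//   body,
// });
// const { access_token } = await res.json();
```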
Main App Setup (server.js)
const keycloakAuth = require("./security/keycloak-service");
...
app.use(keycloakAuth.middleware());
...
Secured Endpoint Example
router.put("/api/device-status", keycloakAuth.protect(), (req, res, next) =>
  updateDeviceStatus(req, res, next)
);
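And this mirrors what I'm doing in Postman, written out as a Node sketch (the URL, token value, and request payload are all placeholders; the actual call is commented out):

```javascript
// PUT to the protected endpoint with a bearer token, like my Postman test does.
const accessToken = "eyJ...placeholder"; // placeholder for the real service-account token

const requestOptions = {
  method: "PUT",
  headers: {
    "Authorization": `Bearer ${accessToken}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({ deviceId: "dev-123", status: "online" }), // illustrative payload
};

// const res = await fetch("http://nas.local:3000/api/device-status", requestOptions);
// console.log(res.status); // 200 with the local backend, 403 with the containerized one
```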
Some more context:
- Both containers are on the same Docker network in Portainer
- I’m using the same client credentials as in my local setup
- The service account tokens have all the right roles assigned
- Backend container can communicate with Keycloak using internal Docker networking
- No obvious errors showing up in the container logs
I’m testing this with Postman from my dev machine, and the JWT looks correct when I decode it. Has anyone run into similar issues with keycloak-connect in containerized environments? Could there be some Docker-specific configuration I’m missing for service accounts?
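When I say the JWT looks correct, this is the quick unverified decode I'm doing to inspect claims like `iss` and the assigned roles (pure Node, no libraries; the token below is hand-built for illustration and all claim values are placeholders):

```javascript
// Decode a JWT payload WITHOUT verifying the signature — for inspection only.
function decodeJwtPayload(token) {
  const payloadB64 = token.split(".")[1];
  return JSON.parse(Buffer.from(payloadB64, "base64url").toString("utf8"));
}

// Hand-built placeholder token (fake signature) to show the shape of the claims:
const claims = {
  iss: "http://192.168.1.50:8080/auth/realms/my-realm", // placeholder issuer
  azp: "backend-service",                               // placeholder client id
  realm_access: { roles: ["device-writer"] },           // illustrative role
};
const fakeToken = [
  Buffer.from(JSON.stringify({ alg: "RS256", typ: "JWT" })).toString("base64url"),
  Buffer.from(JSON.stringify(claims)).toString("base64url"),
  "fake-signature",
].join(".");

const decoded = decodeJwtPayload(fakeToken);
// decoded.iss and decoded.realm_access.roles are what I'm eyeballing in the real token.
```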