Angular fails to receive complete response from FastAPI backend

I’m having trouble with my FastAPI backend and Angular client setup. The API works fine when I test it with Postman or curl, but the Angular client never receives the full response.

The Problem:
When Angular makes requests, I get net::ERR_CONTENT_LENGTH_MISMATCH errors. The transfer shows mismatched sizes like:

12.3 kB / 567 kB transferred
14.1 kB / 8.2 MB resources  
Finish: 4.1 min

I tried adding longer timeouts, but then I get net::ERR_INCOMPLETE_CHUNKED_ENCODING instead:

import uvicorn

def start_server():
    uvicorn.run(
        server_app, 
        host="localhost", 
        port=8000,
        timeout_keep_alive=300)

CORS Setup:

server_app.add_middleware(
    CORSMiddleware,
    allow_origins=allowed_origins, 
    allow_credentials=True,
    allow_methods=["GET", "POST"],  
    allow_headers=["*"],  
)

Angular Service:

public fetchDocumentData(): Observable<any> {
  return new Observable(subscriber => {
    const authToken = this.tokenService.getToken();
    const requestUrl = authToken ? `${this.BASE_URL}?auth=${encodeURIComponent(authToken)}` : this.BASE_URL;

    fetch(requestUrl, {
      method: 'GET',
      credentials: 'include',
      headers: { 'Content-Type': 'application/json' }
    })
    .then(result => {
      console.log("Status:", result.status);
      return result;
    })
    .catch(err => {
      console.error("Request failed:", err);
      subscriber.error(err);
    });
  });
}

FastAPI Endpoint:

import asyncio
import json

from fastapi import APIRouter
from fastapi.responses import StreamingResponse

docs_router = APIRouter(prefix="/documents", tags=["docs"])

@docs_router.get("/fetch/info")
async def fetch_document_info():
    result = retrieve_document_data()
    return StreamingResponse(json_stream_generator(result), media_type="application/json")

async def json_stream_generator(dataset):
    yield '{"status": "ok", "payload": {'  

    is_first = True
    for item_key, item_value in dataset.items():  
        if not is_first:
            yield ","
        else:
            is_first = False

        yield json.dumps(item_key) + ": " + json.dumps(item_value, default=str)
        await asyncio.sleep(0)

    yield "}}"  
    await asyncio.sleep(2)

Other POST requests to the same backend work fine. The backend runs on EC2 and I connect to it from my local machine over plain HTTP. Any ideas what could be causing this response truncation?

I’ve hit this exact problem streaming large JSON through FastAPI. Your Angular service isn’t actually consuming the response stream: you call fetch but never read the body via result.json() (or result.body), and you never call subscriber.next()/complete(), so the connection just sits open until something times out. Ditch fetch and use Angular’s HttpClient instead; it handles this far more cleanly. Also, the manual JSON construction in json_stream_generator is fragile. I’d either use FastAPI’s JSONResponse for large payloads instead of StreamingResponse, or make the generator emit fragments produced entirely by a real serializer. One more thing: the network path from your local machine to EC2 (security groups, NAT, any proxy) can impose timeouts that differ from your app-level timeouts, which might be contributing too.
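A minimal sketch of a safer generator, where every emitted fragment comes from json.dumps and only the wrapper punctuation is literal (stdlib only, so the FastAPI wiring is omitted; the name safe_json_stream is mine, not from the question):

```python
import asyncio
import json

async def safe_json_stream(dataset):
    # Every key and value goes through json.dumps; only the envelope
    # punctuation ({, }, ,) is emitted literally, and there is no
    # trailing sleep holding the connection open after the last chunk.
    yield '{"status": "ok", "payload": {'
    for i, (key, value) in enumerate(dataset.items()):
        if i:
            yield ","
        yield json.dumps(str(key)) + ": " + json.dumps(value, default=str)
        await asyncio.sleep(0)  # yield control so chunks can be flushed
    yield "}}"
```

Concatenating the chunks should always produce valid JSON, which you can verify locally before wiring this back into a StreamingResponse.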

Your streaming response is being cut off mid-transfer. Either add a Content-Length header to your FastAPI response (which means knowing the full body size up front) or just use a regular Response instead of StreamingResponse. The hand-assembled JSON in your generator is also brittle; serialize the whole payload with a proper JSON serializer rather than concatenating braces and commas yourself.
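To illustrate the buffered route, a stdlib-only sketch (build_document_body is a hypothetical helper; in the endpoint you would return the bytes via a plain Response with media_type="application/json", which sets Content-Length automatically):

```python
import json

def build_document_body(dataset):
    """Serialize the whole envelope in one json.dumps call, so the
    byte length is known up front and nothing is hand-concatenated."""
    body = json.dumps({"status": "ok", "payload": dataset}, default=str).encode("utf-8")
    # len(body) is exactly what would go into the Content-Length header.
    return body, len(body)
```

This trades streaming for a known size: the client either gets the complete body or a clean error, never a silent mid-transfer cutoff.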

Interesting issue… does this happen with all response sizes or just the big ones? Why’d you choose streaming for JSON instead of regular responses - is the dataset huge? Also, is there a reverse proxy or load balancer in front of your EC2 that might be buffering or timing out the streams?