Angular client not receiving complete response from FastAPI backend

I’m having trouble with my Angular app not getting the full response from my FastAPI server. When I test with Postman or curl, everything works fine and I get the complete data. But when I try to call the same endpoint from Angular, the response gets cut off or never finishes loading.

I’ve tried using both fetch and HttpClient in Angular, and I’ve also experimented with streaming responses and regular JSON responses on the FastAPI side. Nothing seems to work properly. The browser shows errors like net::ERR_CONTENT_LENGTH_MISMATCH and sometimes net::ERR_INCOMPLETE_CHUNKED_ENCODING.

The transfer shows something like 12.3 kB / 567 kB transferred, which means it’s not getting all the data. My API is running on an EC2 instance and I’m calling it from localhost over HTTP. Other POST requests work fine, but this GET request keeps failing.

Here’s my Angular service method:

public fetchDocumentData(): Observable<any> {
  return new Observable(subscriber => {
    const authToken = this.tokenService.getToken();
    const requestUrl = authToken ? `${this.BASE_URL}?auth=${encodeURIComponent(authToken)}` : this.BASE_URL;
    
    fetch(requestUrl, {
      method: 'GET',
      credentials: 'include',
      headers: { 'Content-Type': 'application/json' }
    })
    .then(result => {
      console.log("Status:", result.status);
      return result.json();
    })
    .catch(err => {
      console.error("Fetch error:", err);
      subscriber.error(err);
    });
  });
}

And here’s my FastAPI endpoint:

import asyncio
import json

from fastapi import APIRouter
from fastapi.responses import StreamingResponse

documents_router = APIRouter(prefix="/documents", tags=["documents"])

@documents_router.get("/fetch/data")
async def fetch_documents():
    result = fetch_document_data()
    return StreamingResponse(generate_json_stream(result), media_type="application/json")

async def generate_json_stream(result):
    yield '{"status": "ok", "payload": {'
    
    is_first = True
    for item_key, item_value in result.items():
        if not is_first:
            yield ","
        else:
            is_first = False
            
        yield json.dumps(item_key) + ": " + json.dumps(item_value, default=str)
        await asyncio.sleep(0)
        
    yield "}}" 

I’ve also tried increasing timeout_keep_alive in uvicorn but that just makes it take longer before failing. Any ideas what might be causing this issue?

Hmm, this is interesting - are you handling the streaming response properly in your Angular code? I notice you’re using result.json(), but with StreamingResponse you might need to read the stream differently. Have you tried logging what actually comes back before calling .json()? Also curious about your EC2 setup - any load balancers or proxies that might be timing out the connection?
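Something like this would show how far the body actually gets before the connection dies - just a rough debugging sketch, the URL is whatever your service already builds:

async function debugFetchRaw(requestUrl: string): Promise<void> {
  const response = await fetch(requestUrl, {
    method: 'GET',
    credentials: 'include'
  });
  console.log('Status:', response.status, 'Content-Type:', response.headers.get('content-type'));

  if (!response.body) {
    console.log('No response body');
    return;
  }

  // Read the body chunk by chunk so a truncated stream is visible in the log
  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let received = 0;
  let text = '';

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    received += value.length;
    text += decoder.decode(value, { stream: true });
    console.log(`Received ${received} bytes so far`);
  }

  text += decoder.decode();
  console.log('Stream finished, total bytes:', received);
  console.log('Body preview:', text.slice(0, 200));
}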

Looks more like a streaming problem than CORS - if your other requests to the same API work, CORS is probably not the culprit. Since you said other POST requests work fine but this GET fails, try switching back to a regular JSON response instead of StreamingResponse first - that streaming setup looks overcomplicated for what you’re doing. Also check whether anything sits in front of your EC2 instance (nginx, a load balancer) that could be buffering or closing the connection mid-response; security groups don’t limit payload size, but a proxy with a short timeout can cut off larger responses.

The issue appears to be in your Angular service implementation. You’re creating an Observable but never calling subscriber.next() or subscriber.complete() after getting the response. Your fetch promise resolves but the Observable subscriber never receives the data. Try this fix:

fetch(requestUrl, {
  method: 'GET',
  credentials: 'include',
  headers: { 'Content-Type': 'application/json' }
})
.then(result => result.json())
.then(data => {
  subscriber.next(data);
  subscriber.complete();
})
.catch(err => {
  subscriber.error(err);
});
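
With next()/complete() in place, whatever subscribes to fetchDocumentData() will actually receive the parsed data - for example from a component (the service and property names here are just placeholders):

this.documentService.fetchDocumentData().subscribe({
  next: data => {
    // fires once the full JSON body has been received and parsed
    this.documents = data.payload;
  },
  error: err => console.error('Failed to load documents:', err),
  complete: () => console.log('Document request finished')
});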

Alternatively, just use HttpClient directly since you’re already importing it. As for the network errors: StreamingResponse is sent with chunked transfer encoding, so ERR_INCOMPLETE_CHUNKED_ENCODING means the stream was terminated before the final chunk arrived (an exception in your generator or an intermediary closing the connection would do this), while ERR_CONTENT_LENGTH_MISMATCH means a Content-Length header was sent but the body that arrived was shorter. Either way the response is being cut off on the server side or in transit - that’s a separate problem from the missing subscriber.next()/complete() fix above.
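
If you go the HttpClient route, the manual Observable wrapper disappears entirely. A minimal sketch, assuming HttpClient is injected and keeping your auth query parameter (BASE_URL and tokenService come from your snippet; everything else is a placeholder):

import { Injectable } from '@angular/core';
import { HttpClient, HttpParams } from '@angular/common/http';
import { Observable } from 'rxjs';

@Injectable({ providedIn: 'root' })
export class DocumentService {
  // placeholder - point this at your /documents/fetch/data endpoint
  private readonly BASE_URL = 'http://<ec2-host>/documents/fetch/data';

  // tokenService stands in for whatever service currently provides getToken()
  constructor(private http: HttpClient, private tokenService: any) {}

  public fetchDocumentData(): Observable<any> {
    const authToken = this.tokenService.getToken();
    const params = authToken
      ? new HttpParams().set('auth', authToken)
      : new HttpParams();

    // HttpClient parses the JSON body and completes the Observable for you,
    // so there is no subscriber.next()/complete() to forget
    return this.http.get<any>(this.BASE_URL, { params, withCredentials: true });
  }
}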