I’m having issues uploading big BAM files (0.7GB to 2.3GB) from my website to the Flask backend. I’ve tried a few things:
- Uploading the whole file at once, which resulted in disconnection errors.
- Increasing the timeout to 10 minutes, but that didn’t help.
- Uploading the files in chunks, which works better, but the last chunk always fails.
Here’s what my setup looks like:
- Frontend: HTML form with file input and JavaScript handling chunked uploads.
- Backend: a Flask route that receives each chunk and then combines them (simplified sketch below).
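Roughly, the backend route looks like this (simplified; the form field names and upload directory are placeholders, and the real code does a bit more bookkeeping):

```python
import os
from flask import Flask, request

app = Flask(__name__)
UPLOAD_DIR = "/tmp/uploads"  # placeholder path

@app.route("/_upload_file", methods=["POST"])
def _upload_file():
    chunk = request.files["file"]                     # the current chunk
    chunk_index = int(request.form["chunk_index"])    # 0-based position of this chunk
    total_chunks = int(request.form["total_chunks"])  # how many chunks to expect
    filename = request.form["filename"]

    # Append each chunk to one partial file on disk.
    part_path = os.path.join(UPLOAD_DIR, filename + ".part")
    with open(part_path, "ab") as f:
        f.write(chunk.read())

    # After the last chunk arrives, rename the partial file to its final name.
    if chunk_index == total_chunks - 1:
        os.rename(part_path, os.path.join(UPLOAD_DIR, filename))
    return "ok"
```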
The error I’m getting is as follows:
POST http://0.0.0.0:443/_upload_file [HTTP/1.1 500 INTERNAL SERVER ERROR 4248ms]
Error uploading chunk 73: Argument must be string, bytes or unicode.
It seems like the problem always occurs with the final chunk. Any advice on what might be causing this issue or how to fix it would be greatly appreciated!
I’ve encountered similar issues when dealing with large genomic files. One effective solution I found was implementing a resumable upload mechanism. This approach allows the upload to continue from where it left off if there’s an interruption. It’s particularly useful for BAM files, given their size.
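As a rough sketch of the idea (the route names, form fields, and upload directory below are made up for illustration): the server reports how many bytes of a file it already has, and the client resumes appending from that offset after any interruption.

```python
import os
from flask import Flask, request, jsonify

app = Flask(__name__)
UPLOAD_DIR = "/tmp/uploads"  # placeholder; adjust to your setup

@app.route("/upload_status/<name>")
def upload_status(name):
    # Report how many bytes we already have so the client can resume there.
    path = os.path.join(UPLOAD_DIR, name + ".part")
    offset = os.path.getsize(path) if os.path.exists(path) else 0
    return jsonify({"offset": offset})

@app.route("/upload_chunk/<name>", methods=["POST"])
def upload_chunk(name):
    # The client sends the byte offset this chunk starts at; only append if it
    # matches what is already on disk, otherwise tell the client to re-sync.
    path = os.path.join(UPLOAD_DIR, name + ".part")
    have = os.path.getsize(path) if os.path.exists(path) else 0
    if int(request.form["offset"]) != have:
        return jsonify({"offset": have}), 409
    data = request.files["chunk"].read()
    with open(path, "ab") as f:
        f.write(data)
    return jsonify({"offset": have + len(data)})
```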
Additionally, make sure your server stack is configured for large request bodies. You may need to raise client_max_body_size in Nginx or LimitRequestBody in Apache, and check Flask's own MAX_CONTENT_LENGTH setting. A library like Flask-Uploads (or its maintained fork, Flask-Reuploaded) can also help with managing uploads, though it won't by itself change any size limits.
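On the Flask side specifically, the built-in knob is MAX_CONTENT_LENGTH. A minimal sketch (the 16 MB value is just an example, and the Nginx/Apache limits above still need to be raised separately):

```python
from flask import Flask

app = Flask(__name__)

# Requests whose body exceeds this limit are rejected with a 413 before they
# reach your route. With chunked uploads this only needs to cover one chunk,
# not the whole BAM file; 16 MB here is an example value.
app.config["MAX_CONTENT_LENGTH"] = 16 * 1024 * 1024
```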
Lastly, double-check the compression side: BAM files are already BGZF-compressed, so re-compressing them before upload won't gain much, but if you're actually starting from uncompressed SAM, converting to BAM first will significantly reduce transfer time and may sidestep some of these issues.
You could also try a different upload extension like Flask-Reuploaded (the maintained fork of Flask-Uploads); it might solve your problem. Also check your server's memory limits, you could be hitting a cap there. Good luck with those BAM files, they can be a pain!
Hm, that's an interesting problem! Have you tried a streaming approach instead of chunking? It might handle large files better. Also, check your server's memory usage during uploads; you could be hitting a limit there. Curious to hear if anyone else has faced similar issues with big BAM files.
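Something like this, roughly (just a sketch; the route name and the 4 MB read size are arbitrary): read the raw request body in fixed-size pieces and write it straight to disk, so the whole file never sits in memory.

```python
import os
from flask import Flask, request

app = Flask(__name__)
UPLOAD_DIR = "/tmp/uploads"  # placeholder

@app.route("/stream_upload/<name>", methods=["POST"])
def stream_upload(name):
    # Read the raw request body in 4 MB pieces and append each piece to disk,
    # so the whole BAM file never has to fit in memory at once.
    path = os.path.join(UPLOAD_DIR, name)
    with open(path, "wb") as f:
        while True:
            piece = request.stream.read(4 * 1024 * 1024)
            if not piece:
                break
            f.write(piece)
    return "ok"
```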