What's the best way to handle large media file uploads and retrieval in an Express.js backend?

I’m working on an Express.js application where users need to upload multiple large images and videos. When they hit save, all files should be stored properly. The tricky part is that when users refresh the page or return later, they should see all their uploaded media files again.

I looked into using Multer for file handling, but I’m running my app on a Google Cloud Platform VM with only 2-4 GB of RAM. Buffering large files in memory, or handling multiple users uploading simultaneously, seems likely to cause performance problems.

My current plan is to:

  1. Upload files directly to a cloud storage bucket
  2. Save the file locations in MongoDB
  3. When users request their files, fetch the paths from the database
  4. Generate signed URLs for secure access (rough sketch after this list)
  5. Send these URLs to the frontend
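
For step 4, this is roughly what I had in mind, a sketch using @google-cloud/storage (the bucket name is a placeholder): a V4 signed URL that is read-only and expires after 15 minutes.

```js
const { Storage } = require('@google-cloud/storage');
const storage = new Storage();

// Given a storage path fetched from MongoDB, return a short-lived
// read-only URL the frontend can use directly.
async function getSignedUrl(storagePath) {
  const [url] = await storage
    .bucket('my-media-bucket') // placeholder bucket name
    .file(storagePath)
    .getSignedUrl({
      version: 'v4',
      action: 'read',
      expires: Date.now() + 15 * 60 * 1000, // 15 minutes
    });
  return url;
}
```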

Is this approach secure and efficient? Are there better alternatives for handling large media files in Express applications?

interesting challenge! but what happens during the actual upload? if someone uploads a 2gb video, won’t that still crash your server before it even hits cloud storage? and how are you handling progress indicators for users?

Your approach is solid and follows best practices. I built something similar last year and it has held up well at scale. The biggest improvement you can make is streaming uploads directly to cloud storage instead of buffering them through your Express server: pipe the incoming file stream straight to Google Cloud Storage so large files never accumulate in your VM's memory.

For MongoDB, store metadata like file size, upload timestamp, and content type alongside the storage paths; it helps with validation and frontend display.

Security-wise, validate file types and enforce size limits before uploads start, and generate time-limited signed URLs with permissions scoped to the user's role. One thing that bit me: handle upload failures gracefully and clean up orphaned files in storage when the database write fails.
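
Here's a minimal sketch of the streaming approach, assuming busboy (the multipart parser Multer uses under the hood) and @google-cloud/storage; the route and bucket name are placeholders, and it handles a single file per request for brevity:

```js
const express = require('express');
const busboy = require('busboy');
const { Storage } = require('@google-cloud/storage');

const app = express();
const bucket = new Storage().bucket('my-media-bucket'); // placeholder name

app.post('/upload', (req, res) => {
  const bb = busboy({
    headers: req.headers,
    limits: { fileSize: 2 * 1024 * 1024 * 1024 }, // cut streams off at 2 GB
  });

  bb.on('file', (fieldname, file, info) => {
    const gcsFile = bucket.file(`${Date.now()}-${info.filename}`);
    // Pipe the incoming file stream straight into GCS; nothing is held
    // in this process beyond small internal buffers.
    file
      .pipe(gcsFile.createWriteStream({ contentType: info.mimeType }))
      .on('error', (err) => res.status(500).json({ error: err.message }))
      .on('finish', () => res.json({ path: gcsFile.name }));
  });

  req.pipe(bb);
});
```

And a rough idea of the metadata document, here as a hypothetical Mongoose schema:

```js
const mongoose = require('mongoose');

const MediaFileSchema = new mongoose.Schema({
  owner: { type: mongoose.Schema.Types.ObjectId, ref: 'User', required: true },
  storagePath: { type: String, required: true }, // object name in the bucket
  contentType: { type: String, required: true },
  sizeBytes: Number,
  uploadedAt: { type: Date, default: Date.now },
});

module.exports = mongoose.model('MediaFile', MediaFileSchema);
```

If the `MediaFile` insert fails after the stream finishes, delete the GCS object in the error handler so you don't accumulate orphans.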

yeah, streaming’s the right call, but watch out for concurrent uploads. if 20 users hit you with 500mb files at once, your vm’s gonna choke. set up a queue (redis works great) or just cap how many files each user can upload at once. for progress tracking, use resumable uploads - gcs has them built in and you can push progress updates through websockets.
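
in case it's useful, a rough sketch of the per-user cap: an in-memory counter middleware (assumes `req.user` comes from your auth layer; single process only, so swap the Map for redis once you run multiple instances):

```js
const activeUploads = new Map(); // userId -> number of in-flight uploads
const MAX_CONCURRENT_UPLOADS = 3;

function uploadLimiter(req, res, next) {
  const userId = req.user.id; // assumes auth middleware already ran
  const current = activeUploads.get(userId) || 0;
  if (current >= MAX_CONCURRENT_UPLOADS) {
    return res.status(429).json({ error: 'too many concurrent uploads' });
  }
  activeUploads.set(userId, current + 1);
  // 'close' fires whether the upload finished or the client disconnected
  res.on('close', () => {
    const count = activeUploads.get(userId) || 1;
    if (count <= 1) activeUploads.delete(userId);
    else activeUploads.set(userId, count - 1);
  });
  next();
}

// usage: app.post('/upload', uploadLimiter, uploadHandler);
```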