Testing Django ImageField with S3 Storage Without Actual File Uploads

I’m working on a Django project where I have a model that uses ImageField connected to an S3 storage backend through django-storages. When I run my tests for the image upload functionality, they’re really slow because each test actually uploads files to S3.

I want to make my test suite run faster. What’s the recommended approach for this situation? Should I create mock objects for the S3 storage system? Or maybe there’s a way to use an in-memory storage solution just for testing that would automatically clean up files after tests finish?

Any suggestions for best practices when testing file uploads without hitting external services would be helpful.

Oh cool! Are you using pytest-django with temp directories? Quick question though - do you test the actual S3 integration anywhere, or mock everything? I’m wondering how you split testing the storage backend vs the model logic.

I’ve dealt with this before - just swap in a test storage backend so your code never makes the network calls. Override your storage class in test settings with Django’s InMemoryStorage (built in since Django 4.2) or point it at a temp directory that’s cleared between runs. Your model validation and file handling still get tested, but you skip the S3 round trips. I used Django’s override_settings decorator with local storage and cut test time by about 80% without losing coverage.

totally agree! switching to the local filesystem for tests is the easiest way to speed things up. you can fall back to django’s built-in FileSystemStorage (the default backend anyway) and just configure it in your test settings. makes everything a lot quicker without the hassle of cleaning up s3.
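something like this in a dedicated test settings module works (file name and layout assumed, just a sketch):

```python
# settings_test.py - hypothetical test settings module.
# Uploads land in a fresh temp directory instead of S3, so each
# test run starts clean with nothing left over to delete.
import tempfile

MEDIA_ROOT = tempfile.mkdtemp(prefix="test-media-")

# Django 4.2+ STORAGES form; on older versions set
# DEFAULT_FILE_STORAGE = "django.core.files.storage.FileSystemStorage" instead.
STORAGES = {
    "default": {"BACKEND": "django.core.files.storage.FileSystemStorage"},
    "staticfiles": {
        "BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage"
    },
}
```

then run the suite with `DJANGO_SETTINGS_MODULE=settings_test` (or pass `--settings`) and your s3 settings never even load.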