I’m working on a Django project where I have a model that includes an ImageField. This field uses django-storages with S3Boto as the storage backend. The problem I’m facing is with my test suite.
When I run tests that involve uploading images through my upload view, the tests actually send files to S3. This makes my test suite really slow because of the network calls to Amazon S3.
I want to speed up my tests but I’m not sure what the best approach is. Should I create mocks for the S3Boto backend? Or maybe there’s some kind of in-memory storage solution that would work better for testing?
It would be great if whatever solution I use could handle cleanup automatically so I don’t have to worry about test files piling up. What do other developers usually do in this situation?
Wait, does InMemoryStorage actually work with ImageFields? I've heard it gets tricky with file validation. Did you test whether image processing still works with that setup? Any gotchas I should know about?
Had the same problem. I ditched InMemoryStorage and went with Django's built-in FileSystemStorage for testing instead (it lives in django.core.files.storage, not in django-storages). Just override your storage settings in the test config: set DEFAULT_FILE_STORAGE = 'django.core.files.storage.FileSystemStorage' and point MEDIA_ROOT at something like /tmp/test_media/. This works way better for ImageField validation since files actually get written to disk, which some image libraries need anyway. I throw in a tearDown method to clean up temp files after each test. No more S3 calls, and everything still plays nice with Django's file handling.
You can just change your storage in the test settings. I found DEFAULT_FILE_STORAGE = 'django.core.files.storage.InMemoryStorage' works awesome! It skips the S3 calls entirely, and since nothing ever hits disk there's nothing left over to clean up when the run ends. Saves a lot of time! (Heads up: InMemoryStorage needs Django 4.2+.)
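If you go this route, note that InMemoryStorage only ships with Django 4.2+, where file storage is configured through the STORAGES dict. A minimal test-settings sketch (the module path settings/test.py is an assumption, not something from the thread):

```python
# settings/test.py -- sketch; assumes Django 4.2+, where
# django.core.files.storage.InMemoryStorage is available and the
# STORAGES dict supersedes the old DEFAULT_FILE_STORAGE setting.

STORAGES = {
    # All FileField/ImageField uploads stay in RAM: no S3, no disk.
    "default": {
        "BACKEND": "django.core.files.storage.InMemoryStorage",
    },
    # Keep the stock staticfiles backend so collectstatic still works.
    "staticfiles": {
        "BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage",
    },
}
```

Everything uploaded during the run vanishes when the test process exits, so there's no cleanup step to write.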