I’m working on file uploads, using django-storages
1. I want to be able to upload files to either FileSystemStorage or GoogleCloudStorage, depending on the configuration: for production environments I would like to upload files to Cloud Storage; for test & development environments, to the local filesystem.
2. I want to use different buckets for different file types. For example, user avatars should have their own bucket. I don’t want to use a “default” bucket.
3. I want to use an abstract base class to achieve this. It would be nice if I could calculate the storage dynamically based on the model, or I would be OK with a similar DRY solution.
For 1, you can make the STORAGES value conditional on the env.
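For example, roughly (a sketch; the env variable and bucket name here are just placeholders):

# settings.py
import os

IS_PRODUCTION = os.environ.get("DJANGO_ENV") == "production"

STORAGES = {
    "default": {"BACKEND": "django.core.files.storage.FileSystemStorage"},
    "staticfiles": {"BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage"},
    # one alias per file type, so each gets its own bucket in production
    "user_avatar": (
        {
            "BACKEND": "storages.backends.gcloud.GoogleCloudStorage",
            "OPTIONS": {"bucket_name": "myproject-user-avatars"},
        }
        if IS_PRODUCTION
        else {"BACKEND": "django.core.files.storage.FileSystemStorage"}
    ),
}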
3 sounds like you’ve got something half in mind already. It’s a simple matter of programming I imagine. Ultimately you have to set the storage on the FileField. How you do that is up to you, and probably quite project specific.
Difficult to say more given what’s there. Sounds fun!
Can you wrap it in a callable, so as to mask that it changes?
(Or else it’s a FileField subclass to override deconstruct, maybe, but I’m sure it’s fine.)
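Something like this, assuming a “user_avatar” alias is defined in STORAGES (the names are placeholders); the migration then records the callable rather than a concrete storage class:

from django.core.files.storage import storages
from django.db import models


def select_avatar_storage():
    # resolved through the storages handler; changing the backend in
    # settings later doesn't touch the migration
    return storages["user_avatar"]


class UserAvatar(models.Model):
    file = models.FileField(storage=select_avatar_storage)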
The issue is that the callable is resolved in FileField.__init__. Maybe Django could resolve FileField.storage lazily instead? That would solve my use case.
I wonder if I’m missing something; I can’t see any related tickets. I don’t think my use case is that complex; it’s pretty standard, isn’t it? Do you handle file uploads differently?
So different storages per field, yes. Different ones per environment, yes. But both of those are doable even if the callable is resolved at model definition time. That’s essentially once per process.
It sounds like you’re wanting to do it per request… which I’d have to play with.
So, for context I have an abstract base class, and my goal is to get something like this working:
class File(Model):
    STORAGE = ""

    file = models.FileField(...)
    extension = models.CharField(...)
    mime_type = models.CharField(...)
    ...

    class Meta:
        abstract = True


class UserAvatar(File):
    STORAGE = "user_avatar"
I went with your suggestion to use a callable for FileField.storage, so the migrations display a callable instead of a specific storage.
The Storage instance is created by this callable, so no need to define anything in STORAGES.
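For reference, a minimal sketch of what such a callable can look like (assuming settings.DEBUG separates the environments; the bucket naming is made up):

from pathlib import Path

from django.conf import settings
from django.core.files.storage import FileSystemStorage
from storages.backends.gcloud import GoogleCloudStorage


def get_storage(name):
    # build the Storage instance directly, so no STORAGES alias is required
    if settings.DEBUG:
        return FileSystemStorage(location=Path(settings.MEDIA_ROOT) / name)
    return GoogleCloudStorage(bucket_name=f"myproject-{name}")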
The question is: how do I set FileField.storage dynamically from the File.STORAGE attribute? I tried multiple solutions; the best (or least hacky) way I found is to create the FileField dynamically.
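For illustration, one way to do that is a ModelBase subclass that injects the field before Django processes the class attributes (a sketch, reusing the hypothetical get_storage helper above; the field options are placeholders):

from functools import partial

from django.db import models
from django.db.models.base import ModelBase

from myapp.storage import get_storage  # the helper sketched above (path is hypothetical)


class FileModelBase(ModelBase):
    # inject the FileField into concrete subclasses that declare STORAGE
    def __new__(mcs, name, bases, attrs, **kwargs):
        storage_name = attrs.get("STORAGE", "")
        if storage_name:
            # added to attrs before ModelBase runs, so the field is
            # contributed to the class like a normally declared one
            attrs["file"] = models.FileField(
                verbose_name="file",
                help_text="The uploaded file.",
                storage=partial(get_storage, storage_name),
            )
        return super().__new__(mcs, name, bases, attrs, **kwargs)


class File(models.Model, metaclass=FileModelBase):
    STORAGE = ""  # overridden by concrete subclasses

    extension = models.CharField(max_length=16)
    mime_type = models.CharField(max_length=255)

    class Meta:
        abstract = True


class UserAvatar(File):
    STORAGE = "user_avatar"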
I’m still not quite seeing the benefit of that over just:
class UserAvatar(Model):
    file = models.FileField(..., storage=partial(get_storage, "user_avatar"))
    ...
Or even just storage=storages["user_avatar"], using the storages handler from django.core.files.storage. (I’m imagining get_storage mapping to settings.STORAGES, but you can create them dynamically, sure. I’m wondering if storages.__getitem__ would do there.)
In either case the storage is resolved at class definition time.
No doubt there’s some extra configuration that you’re wanting that I’m not seeing here? If you need it, I don’t see too much wrong with your solution.
Yes, it’s just so I don’t have to repeat the FileField config (I like to set things like verbose_name, db_comment, help_text, for example). The lengths I go to keep things DRY…
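For what it’s worth, a small factory function is another way to keep the shared FileField options in one place (a sketch; the defaults and the get_storage helper are placeholders):

from functools import partial

from django.db import models

from myapp.storage import get_storage  # hypothetical helper that returns a Storage


def file_field(storage_name, **kwargs):
    # shared FileField config lives here instead of being repeated per model
    kwargs.setdefault("verbose_name", "file")
    kwargs.setdefault("help_text", "The uploaded file.")
    kwargs.setdefault("db_comment", "Location of the uploaded file in storage.")
    return models.FileField(storage=partial(get_storage, storage_name), **kwargs)


class UserAvatar(models.Model):
    file = file_field("user_avatar")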