I’m not sure this question even makes sense and the answer may be “you’re doing it wrong”, but here goes.
There is a push in my company to build tools as services that can be used by one or more end-user applications. We are in the electric power utility industry, and I specifically work with device configurations. Different applications need to use these device configurations for different purposes, depending on the end user and their needs. However, each of these applications may need the configuration file (CSV, XML, custom binary, etc.) parsed into a common structure for later analysis or usage.
I have Django experience and would prefer to write these services using Django and I believe DRF makes sense to provide the data as a web API.
I want to avoid duplication of serializers and have been looking for ways to share these definitions between applications. I also want OpenAPI capabilities. I believe that DRF serializers will require Django to be installed, which is a lot to ask of a client using a different framework. I looked into pydantic, but I haven’t found a library that cleanly integrates DRF and pydantic and has an OpenAPI interface. I found marshmallow and marshmallow-dataclass, which let me write dataclasses and use marshmallow for serialization/deserialization; these are easy to share across applications. However, I am still working out the OpenAPI piece.
Am I making this too complicated? By trying to reduce duplication on serialization/deserialization, am I adding too much complexity elsewhere? Is there a better way to think about this problem?
Thank you for your time in advance.
My gut reaction is that yes, you are making this wayyyyyy too complicated. Start simple and build from there. Don’t start by assuming you need to make things complicated.
Why do you think that having multiple Django apps is going to help anything?
Fundamentally, all Django works the same. A request comes in for a URL, and that URL is mapped to a view. The view receives a Request and returns a Response. That response could be any of HTML, JSON, XML, text, an image, a PDF, etc. What the front end does with that response is up to it.
If your responses are closely tied to your data, then DRF may very well be your best choice. Yes, DRF is a Django app - it works with Django, it does not replace it. It provides (among many other features) a standard way to request data and provide JSON responses.
(Opinion: if you write more than a couple of views that return a JsonResponse, you’ll find that you’re starting to repeat the same code across those different views. Keeping you from writing that code for every view is what DRF does for you.)
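To make that repetition concrete, here is a framework-free sketch (the handler names and payloads are made up, standard library only) of the boilerplate that accumulates around plain JSON views, and the kind of factoring-out that DRF performs at a much larger scale:

```python
import json

# Without a framework layer, every handler repeats the same steps:
# decode the body, do the real work, encode the result, handle errors.
def handler_a(raw_body: bytes) -> bytes:
    data = json.loads(raw_body)
    return json.dumps({"echo": data.get("message", "")}).encode()

def handler_b(raw_body: bytes) -> bytes:
    data = json.loads(raw_body)
    return json.dumps({"doubled": data.get("value", 0) * 2}).encode()

# A small decorator factors the repetition out -- this is, in spirit,
# what DRF's serializers and generic views do for you.
def json_view(func):
    def wrapper(raw_body: bytes) -> bytes:
        try:
            data = json.loads(raw_body)
        except json.JSONDecodeError:
            return json.dumps({"error": "invalid JSON"}).encode()
        return json.dumps(func(data)).encode()
    return wrapper

@json_view
def doubled(data):
    # Only the actual logic remains in the view body.
    return {"doubled": data.get("value", 0) * 2}
```

The decorator here stands in for the whole request/response plumbing that DRF centralizes; it is illustrative, not a replacement for it.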
I don’t know why you would think it’s necessary to integrate pydantic into this. From what I can see, it’s basically just replicating what Django and DRF already do.
OpenAPI is a specification, not a library. Complying with that specification is up to you.
Thanks Ken, I realize I didn’t do a very good job laying out my reasoning, it is hard to put it into words.
The reason I mentioned pydantic (or marshmallow/dataclasses) is so that I can define my models in one place, and then they can be used by the server (with automatic serialization/deserialization) and by one or more of my clients (also with automatic serialization/deserialization).
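A minimal sketch of what I mean, using only the standard library (the `DeviceConfig` class and its fields are invented for illustration): a dataclass living in a shared package that both the server and the clients import, with no Django or DRF needed on the client side:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical shared definition, importable by server and clients alike.
@dataclass
class DeviceConfig:
    device_id: str
    firmware: str
    relay_count: int

    def to_json(self) -> str:
        # dataclass -> dict -> JSON string
        return json.dumps(asdict(self))

    @classmethod
    def from_json(cls, raw: str) -> "DeviceConfig":
        # JSON string -> dict -> dataclass
        return cls(**json.loads(raw))

# Server side: serialize before sending.
payload = DeviceConfig("relay-01", "2.4.1", 4).to_json()

# Client side: deserialize without Django or DRF installed.
config = DeviceConfig.from_json(payload)
```

marshmallow-dataclass (or pydantic) would add validation and schema support on top of this same shape; the sketch just shows the sharing idea.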
The reason I mentioned OpenAPI, but didn’t explain very well, is that I want the spec to be generated automatically, and I want a Swagger UI generated automatically as well.
I hope this helps explain a little better.
But Django models are tied to the database; they’re not an abstract data structure that makes sense to use outside that context.
You could create an interface layer where Django creates pydantic models from Django models, but quite frankly, that seems like a lot of wasted effort.
In the typical architecture, all non-Django clients access the data through views that expose it in some type of serialized format (JSON, XML, etc.). What those formats mean is a matter of coordination between the server and the clients.
Keep in mind that when you’re passing data between processes, you’re passing a serialized format - you are not under any circumstance passing Python objects.
Finding it tough to explain your reasoning is generally a sign that you don’t have an adequate understanding of some part of this architecture.
What has brought you to making these design decisions?
You should be able to clearly identify the benefits to be gained by doing this.
If you can’t validate those identified benefits against what those components provide, then you may want to reconsider those decisions.
See the DRF docs, for example the “3.10 Announcement - Django REST framework” page, among others.
The Swagger UI is generated from the OpenAPI file from the server. I don’t know what you mean by “automatically” in this context.
From what you said, it looks like you came to Django from FastAPI.
I’ve recently been using DRF Spectacular, and it’s really close to what FastAPI provides for documentation. Maybe it’s worth checking out.
Thanks again Ken. I apologize for being imprecise in my terms, it has been a few months since I looked at this project and got some things confused. I think my problem is that I have developed a couple of web applications using Django but this effort to turn these applications into services with separate front-ends is throwing me for a loop.
I used “model” where I should have been using “serializer/deserializer”. The project I have in mind really doesn’t have any Django models; it is strictly a service that accepts a file, parses it, and returns a JSON representation.
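Something like this is the core of it (the CSV layout is a made-up example, standard library only): the parsing logic is framework-agnostic, and whatever web layer I choose only handles the upload and hands the file contents over:

```python
import csv
import io
import json

# Hypothetical parsing core: no framework imports, so the Django (or
# any other) layer only needs to pass in the uploaded file's text.
def parse_device_csv(text: str) -> str:
    """Parse a simple 'name,value' settings CSV into a JSON document."""
    reader = csv.DictReader(io.StringIO(text))
    settings = [{"name": row["name"], "value": row["value"]} for row in reader]
    return json.dumps({"settings": settings})

# Simulated upload, as the view layer would receive it.
uploaded = "name,value\nct_ratio,600:5\npt_ratio,100:1\n"
result = json.loads(parse_device_csv(uploaded))
```

Keeping the parser free of web-framework types also makes it trivial to unit test without spinning up a server.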
This is where I may be using Django for something it isn’t a good fit for. However, when I look at some of the other frameworks, I feel like I would end up recreating a good portion of Django/DRF: Django brings a well-defined project structure with multiple applications, testing, management commands, serialization/deserialization, and so on. Even though I am not using models in this case, I am familiar with Django and its batteries. Maybe this is a “square peg/round hole” situation, though.
Yea, it probably isn’t a good fit. In addition to the normal Django Model perspective, keep in mind that DRF is built around the fundamental concept of converting Django Models to/from a serialized format. My guess is you’d have to do some serious re-engineering of parts of DRF to use it the way you describe.
For an application such as this, I might still use Django, but I’d be taking DRF out of the picture. (It depends upon how closely the files being parsed match the targeted output format.)
I actually have an application that does something like you describe, but it saves both the original uploaded file, and the parsed/reformatted output gets saved in the database. It allows for multiple people (or the same person at different times) to retrieve that file without having to re-upload and re-process it.
Thanks @leandrodesouzadev. I have done a number of projects in Django and have only read about FastAPI, but I like some of its features. I think this is maybe my issue: FastAPI brings some nice features, but I really enjoy Django and appreciate that it has been around a long time, is stable, and is well supported. I am looking for those same features in Django/DRF.
If I had to boil it down, these are the features I want:
- Django, with its well-defined project structure and applications, testing support, URL routing, views, etc.
- A single place to define serializers/deserializers that can be used by both the server and one or more clients without needing to include Django/DRF in every client application, possibly as a library imported by both the server and any clients
- Automatic generation of the openapi schema
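As a rough illustration of the last point, here is a hand-rolled sketch (not a real library; the dataclass and type map are assumptions for the example) that derives a minimal OpenAPI component schema from a shared dataclass, using only the standard library. Tools like DRF Spectacular or django-ninja do this kind of introspection for real:

```python
import dataclasses

# Hypothetical shared definition, as in the earlier discussion.
@dataclasses.dataclass
class DeviceConfig:
    device_id: str
    firmware_version: str
    setting_count: int

# Map Python annotation types to OpenAPI primitive types.
TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean"}

def openapi_component(cls) -> dict:
    """Build an OpenAPI 'components.schemas' entry from a dataclass."""
    fields = dataclasses.fields(cls)
    return {
        "type": "object",
        "properties": {f.name: {"type": TYPE_MAP[f.type]} for f in fields},
        "required": [f.name for f in fields],
    }

schema = openapi_component(DeviceConfig)
```

A real generator also handles nesting, optional fields, and formats; the point is only that one shared definition can feed both serialization and the schema.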
Last time I looked, and it was a few months ago, DRF Spectacular would do some of those items but required DRF serializers/deserializers. I looked at using pydantic for the serializers/deserializers, but it didn’t seem to have great support in DRF. Another library, marshmallow-dataclass, fills the need for a single serializer/deserializer definition but doesn’t work with the OpenAPI schema generation tools. I also looked at django-ninja, but again it would require Django to be installed in all the clients.
(Now that I write this out, I think I have finally arrived at the definition of my problem. Thanks, everyone, for helping me get here.)