Save a model even if the outer transaction fails

Is there any way to save a database model created inside a transaction, even if that transaction fails?

For example, imagine the following:

```python
from django.db import transaction

def save_log(...):
    HttpLog.objects.create(...)

@transaction.atomic
def do_something_that_could_fail():
    # do something

    save_log(...)

    # do something else
```

Is there any way to save the HttpLog instance even if do_something_that_could_fail itself raises and the transaction is rolled back? Essentially I want the HttpLog write to happen in a parallel transaction that has nothing to do with the outer one (ideally without restructuring the code).
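For illustration, the behavior I'm after is what two independent connections to the same database give you. Sketched here with plain sqlite3 rather than Django; the table name and values are made up:

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "demo.sqlite3")

# isolation_level=None puts the connection in autocommit mode, so
# transactions are managed explicitly with BEGIN / ROLLBACK below.
main = sqlite3.connect(path, isolation_level=None)  # stands in for the "default" connection
logs = sqlite3.connect(path, isolation_level=None)  # stands in for a second, parallel connection

main.execute("CREATE TABLE http_log (body TEXT)")

main.execute("BEGIN")                                    # the outer "atomic" block starts
logs.execute("INSERT INTO http_log VALUES ('log row')")  # other connection, commits immediately
main.execute("INSERT INTO http_log VALUES ('business row')")
main.execute("ROLLBACK")                                 # the outer transaction fails

rows = [r[0] for r in main.execute("SELECT body FROM http_log")]
print(rows)  # ['log row'] -- the log write survived the rollback
```

In Django terms, the equivalent would be a second entry in `DATABASES` pointing at the same database and writing with `HttpLog.objects.using("logs").create(...)`: each alias gets its own connection, and `transaction.atomic` only spans the default one. (The alias name `"logs"` here is made up.)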

Some context on what I want to achieve. I have a backend that has many 3rd party API / webhook integrations. Whenever something goes wrong with any of those API calls, it’s kind of hard to debug by looking at the (text) logs. What I want to do is create an “HttpLog” model and save all details about the request / response in the database so that it’s easier to look up later and debug (similar to how Stripe shows you the full request history on their Dashboard).

To do this, I just have a view mixin (for the webhooks) and an adapter for the requests library (for the API calls) that saves all request / response details into this model. This all works fine on the happy path; the problem is when things start to fail. Often the API call is made inside a transaction, and if anything raises, the whole transaction is rolled back, including the HttpLog row. I want to avoid this and save the HttpLog regardless of whether the outer transaction fails, so that it’s effectively in a completely separate parallel transaction. The straightforward solution would be to restructure the code, but with so many integrations that would take forever, so I want to avoid it.

Thank you!

I do something like that in Urd: urd/urd/worker.py at main · boxed/urd · GitHub
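In case the link goes stale, here is one pattern in the same spirit (a paraphrase, not the actual Urd code): hand the write to a dedicated writer thread that owns its own database connection, so anything passed to it commits independently of the caller's transaction. Sketched with plain sqlite3 and a hypothetical queue-based writer; in Django each thread likewise gets its own connection:

```python
import os
import queue
import sqlite3
import tempfile
import threading

path = os.path.join(tempfile.mkdtemp(), "demo.sqlite3")
sqlite3.connect(path, isolation_level=None).execute("CREATE TABLE http_log (body TEXT)")

jobs = queue.Queue()

def writer():
    # Thread-local connection: its autocommitted writes are independent
    # of whatever transaction the main thread has open.
    conn = sqlite3.connect(path, isolation_level=None)
    while (body := jobs.get()) is not None:
        conn.execute("INSERT INTO http_log VALUES (?)", (body,))
    conn.close()

t = threading.Thread(target=writer)
t.start()

main = sqlite3.connect(path, isolation_level=None)
main.execute("BEGIN")
jobs.put("log row")  # handed off to the writer thread, committed there
main.execute("INSERT INTO http_log VALUES ('business row')")
main.execute("ROLLBACK")  # the outer transaction fails

jobs.put(None)  # tell the writer to stop
t.join()
rows = [r[0] for r in main.execute("SELECT body FROM http_log")]
print(rows)  # ['log row']
```

The writer's insert may briefly block while the main thread holds the write lock, but sqlite3's default 5-second busy timeout covers that. The upside of this pattern over a second connection per request is that log writes are also taken off the request's critical path.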


Works perfectly, thank you!
