Django, Bulk Update and Deadlocks

dboost.me
Sep 10, 2023 · 2 min read


It’s very easy to make a mistake, especially when concurrency is involved. Here we’ll cover a simple thing you can do in Django that gets you into trouble.

Bulk Update

Imagine you perform a batch update from two different processes. You have a Counter model you want to update.
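The post doesn’t show the model itself, so here is a minimal sketch to follow along with; the integer value field is an assumption based on the fields=["value"] argument used below. c1 and c2 are two saved Counter instances.

from django.db import models

class Counter(models.Model):
    # "value" is the field updated in the examples below; the exact field
    # type is an assumption, any updatable field reproduces the behaviour.
    value = models.IntegerField(default=0)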

Process 1:

Counter.objects.bulk_update([c1, c2], fields=["value"], batch_size=1)

Process 2:

Counter.objects.bulk_update([c2, c1], fields=["value"], batch_size=1)

One of the processes will succeed; the other will fail with an error:

deadlock detected
DETAIL: Process 17277 waits for ShareLock on transaction 12881; blocked by process 17278.
Process 17278 waits for ShareLock on transaction 12882; blocked by process 17277.
HINT: See server log for query details.

Let’s illustrate how the deadlock happens:

process 1 acquires lock on c1
process 2 acquires lock on c2
process 1 tries to acquire lock on c2, waits
process 2 tries to acquire lock on c1, Postgres detects the deadlock and aborts its transaction
process 2 aborts
process 1 acquires lock on c2
process 1 finishes
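
In ORM terms, bulk_update wraps its queries in a single transaction, and with batch_size=1 it issues one UPDATE per object, so each process takes row locks in the order the objects were passed in. A simplified sketch of what the two calls above boil down to (the real bulk_update builds CASE expressions, but the locking order is the same):

from django.db import transaction

# Process 1: roughly Counter.objects.bulk_update([c1, c2], fields=["value"], batch_size=1)
with transaction.atomic():
    Counter.objects.filter(pk=c1.pk).update(value=c1.value)  # takes a row lock on c1
    Counter.objects.filter(pk=c2.pk).update(value=c2.value)  # waits while process 2 holds c2

# Process 2: roughly Counter.objects.bulk_update([c2, c1], fields=["value"], batch_size=1)
with transaction.atomic():
    Counter.objects.filter(pk=c2.pk).update(value=c2.value)  # takes a row lock on c2
    Counter.objects.filter(pk=c1.pk).update(value=c1.value)  # cycle: Postgres aborts this transaction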

The problem is the order in which the locks are acquired: if it were the same for both processes, no deadlock would happen. We can fix that by overriding the bulk_update method:

def bulk_update(self, objs, fields, batch_size=None):
    # Sort by primary key so every process acquires row locks in the same order.
    objs = sorted(objs, key=lambda x: x.id)
    return super().bulk_update(objs, fields, batch_size=batch_size)
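
One way to wire this in (a sketch; the post doesn’t say where the override lives, and CounterQuerySet is an illustrative name) is to put the method on a custom QuerySet and attach it to the model as its manager:

from django.db import models

class CounterQuerySet(models.QuerySet):
    def bulk_update(self, objs, fields, batch_size=None):
        # Always update rows in ascending id order so concurrent callers
        # acquire locks in the same order.
        objs = sorted(objs, key=lambda x: x.id)
        return super().bulk_update(objs, fields, batch_size=batch_size)

class Counter(models.Model):
    value = models.IntegerField(default=0)

    objects = CounterQuerySet.as_manager()

With this in place, both processes from the example issue their UPDATEs in the same id order, so neither can hold a lock the other needs while waiting for one the other holds.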

Conclusion

Encountering deadlocks is a sign you are working on something interesting, and it’s good to know how to avoid them.
