Comment by physicles

1 day ago

I recently took a couple of 100-line stored procedures that I wrote years ago and replaced them with equivalent Go code and a few inline SQL calls. We had extensive tests on those stored procedures too (or rather, tests on the Go methods that called them), and the tests all ran in a few seconds. The Go code feels so much nicer to deal with, and I thought about why:

- Stored procedures are deployed separately from the code that calls them: one is a DB migration, the other is a k8s container deploy. And that binding isn’t strongly typed, so there’s more thinking and double-checking involved during deployment (and, heaven forbid, rollback).

- The SQL procedures duplicated business logic and constants that are already defined in some of our core Go libraries. It felt good to remove that duplication.

- The procedures also had a few specific return codes, so we needed to define matching constants (much as Postgres does for its own error codes) that the Go code could detect and translate into Go errors; there’s a sketch of that translation layer after this list. That’s more complexity.

- Maybe there’s a good SQL IDE out there, but I don’t have it, so I was writing these things in VS Code with syntax highlighting. Plenty of errors that would get an immediate underline in Go instead cost 20 seconds each: compile, run the test, interpret the error message.

- This lack of tooling, plus the fact that I’m not great at SQL (I’m fairly good with queries, but not with things like variables and IF EXISTS that you only see in stored procedures), made it hard to apply the complexity-cutting techniques I’m used to in Go and other languages. So the 100-line stored procedure stayed 100 lines, and everyone hated working on it.
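
To make the return-code point concrete, here’s roughly what the old calling side looked like. This is a minimal sketch: the procedure name transfer_funds, the status codes, and the error variables are all invented for illustration, not our actual schema.

```go
package payments

import (
	"context"
	"database/sql"
	"errors"
	"fmt"
)

// Status codes the procedure returns. Each one had to be kept in sync,
// by hand, with a literal inside the SQL.
const (
	procOK                = 0
	procInsufficientFunds = 1
	procAccountClosed     = 2
)

var (
	ErrInsufficientFunds = errors.New("insufficient funds")
	ErrAccountClosed     = errors.New("account closed")
)

// TransferFunds calls the stored procedure and translates its integer
// status code into a Go error.
func TransferFunds(ctx context.Context, db *sql.DB, from, to, amount int64) error {
	var status int
	err := db.QueryRowContext(ctx,
		`SELECT transfer_funds($1, $2, $3)`, from, to, amount,
	).Scan(&status)
	if err != nil {
		return fmt.Errorf("calling transfer_funds: %w", err)
	}
	switch status {
	case procOK:
		return nil
	case procInsufficientFunds:
		return ErrInsufficientFunds
	case procAccountClosed:
		return ErrAccountClosed
	default:
		return fmt.Errorf("transfer_funds: unknown status %d", status)
	}
}
```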

Stored procedures are basically cross-service API calls, with all the complexity that entails. I won’t reach for them again unless there’s a technical problem that can’t be solved any other way. We don’t have a dedicated DBA, so Conway’s Law doesn’t factor in.
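
For contrast, here’s the same hypothetical transfer as plain Go with inline SQL, which is the shape the replacement code took. Again a sketch: the table and column names are invented, and the error variables are redeclared from the previous sketch so this block stands alone.

```go
package payments

import (
	"context"
	"database/sql"
	"errors"
	"fmt"
)

var (
	ErrInsufficientFunds = errors.New("insufficient funds")
	ErrAccountClosed     = errors.New("account closed")
)

// TransferFunds is the rewritten version of the function above:
// inline SQL in a transaction, returning ordinary Go errors directly.
func TransferFunds(ctx context.Context, db *sql.DB, from, to, amount int64) error {
	tx, err := db.BeginTx(ctx, nil)
	if err != nil {
		return fmt.Errorf("begin tx: %w", err)
	}
	defer tx.Rollback() // no-op once Commit succeeds

	var balance int64
	err = tx.QueryRowContext(ctx,
		`SELECT balance FROM accounts WHERE id = $1 FOR UPDATE`, from,
	).Scan(&balance)
	if errors.Is(err, sql.ErrNoRows) {
		return ErrAccountClosed // treating a missing row as closed, for the sketch
	}
	if err != nil {
		return fmt.Errorf("locking source account: %w", err)
	}

	// The business rule lives in Go now, next to the constants and
	// logic our core libraries already define.
	if balance < amount {
		return ErrInsufficientFunds
	}

	if _, err := tx.ExecContext(ctx,
		`UPDATE accounts SET balance = balance - $1 WHERE id = $2`, amount, from,
	); err != nil {
		return fmt.Errorf("debit: %w", err)
	}
	if _, err := tx.ExecContext(ctx,
		`UPDATE accounts SET balance = balance + $1 WHERE id = $2`, amount, to,
	); err != nil {
		return fmt.Errorf("credit: %w", err)
	}
	return tx.Commit()
}
```

No status codes, no translation layer, and the whole thing deploys with the rest of the service.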