Comment by hamburglar

5 years ago

I'm surprised that nobody's pointed out that there are actually valid reasons other than greed to obscure your file format. It's an implementation detail, not a contract. If customers begin relying on the implementation details, you end up with angry customers when you change the implementation details. A SQLite db without the header is basically a statement saying, "we are using the obvious file format here for our convenience, not for general purpose access. Screw around in here at your own risk."
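The "SQLite db without the header" point rests on a documented fact: every standard SQLite database file begins with the 16-byte magic string `SQLite format 3\0`. A minimal sketch of how a tool would recognize (or fail to recognize) such a file, which is exactly what stripping the header defeats:

```python
# Sketch: identify a standard SQLite database by its documented 16-byte
# magic header. An app that strips or rewrites this header makes the file
# opaque to casual inspection and to generic SQLite tooling.
SQLITE_MAGIC = b"SQLite format 3\x00"

def looks_like_sqlite(header: bytes) -> bool:
    """Return True if the given leading bytes match the SQLite magic."""
    return header[:16] == SQLITE_MAGIC
```

In practice you'd pass the first 16 bytes read from the file; a vendor that zeroes or replaces those bytes gets `False` here even though the rest of the file is a perfectly ordinary SQLite database.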

If you modified their app's internal state db and screwed it up because they have designed their software with certain assumptions that aren't clear from just reading their db schema, that would be a nightmare for them to support. The easiest thing for them to do is just to try to discourage tampering with their internal state.

This is especially true if there's a chance that a market for secondary apps/utils will spring up. If that's to happen and be viable, they absolutely would want to put thought into what their supported interfaces are for those apps/utils, otherwise they will end up painted into a corner and unable to change their architecture without destroying a marketplace.

I really don't get this line of reasoning. If I do something with a product that is (maybe even explicitly) unsupported by the manufacturer, I don't have a reasonable leg to stand on. We recently had this with a customer who used our app's internal calls to automate some of their workflow (we had no public API for that action at the time because nobody had needed one yet, or to be precise: no customer was prepared to shell out the cash for us to develop it). After we changed something, they suddenly weren't able to script the creation of customer entries in their installation. We had told them at least two years earlier that the way they interacted with our system was not supported (we only noticed the automation back then because it was throwing exceptions in our backend), and while we were nice enough to fix it this time because the fix was trivial, we recommended they switch to our supported API. Two weeks ago their stuff broke again and we told them to use the API or fuck off.

  • So you see the problem and you've wasted time on it, and now you've come to the point where you're ok with being put in a situation where you have to tell your users to fuck off.

    Just because they don't have a leg to stand on doesn't mean this isn't potentially a huge pain in your ass that you'd probably rather avoid. Imagine if you had a LOT of customers contacting you with this type of problem to the point where you felt like you were painted into a corner and had to support some ad-hoc APIs that weren't designed to be customer-facing and which you might have been planning to remove altogether because they're part of a design that's changing.

    This is exactly the situation the obfuscation is attempting to avoid. They're just doing it with a technical solution rather than a human telling another human "don't do that."

I've never understood the "We can ship this but it's not contract" mentality. To my mind if you ship something it's contract. I think developers keeping that in mind would make the world full of more resilient code. You're making a contract w/ your future self, if nothing else.

To the point re: modifying internal state and screwing up the application: if you're writing anything out to persistent storage, you should assume it's untrusted data when you read it back in, if for no other reason than that physics itself is a malicious (or, at best, indifferent) actor.
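The "treat your own persisted state as untrusted" advice amounts to validating on read-back rather than assuming the file is exactly what you last wrote. A minimal sketch, with a hypothetical JSON state file and field names invented for illustration:

```python
import json

def load_state(raw: bytes) -> dict:
    """Load app state from raw file bytes, treating them as untrusted.

    The file may have been hand-edited, truncated by a crash, or
    corrupted by failing hardware, so validate before use instead of
    assuming it matches what the app last wrote.
    """
    try:
        state = json.loads(raw)  # JSONDecodeError is a ValueError subclass
    except ValueError:
        raise ValueError("state file is corrupt or not valid JSON")
    if not isinstance(state, dict):
        raise ValueError("unexpected top-level type in state file")
    version = state.get("version")  # hypothetical schema-version field
    if not isinstance(version, int) or version < 1:
        raise ValueError("missing or invalid schema version")
    return state
```

The point isn't the specific checks; it's that a read-back path which rejects malformed state with a clear error fails safely, where one that trusts the bytes can corrupt itself further or crash somewhere far from the real cause.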

  • If your app ships with a dynamically loaded library and I dig through the exposed functions and find something undocumented that looks useful and I figure out how to use it, that's a contract to you? That's utterly insane.

    Re: the concept of untrusted data, this is off in the weeds argument for argument's sake IMO. Do reasonable validations of the state data, sure, but picking nits about the nature of trust and internal application state is an infinite hole I'm not jumping into with you.

  • Sometimes a programmer may like to create such a contract, if only with their future self, but have reason to fear their boss will force them to break that contract in some not unlikely future.

+1. I recently made a variation of this argument on my FOSS app. [1] If it were commercial software with support I'd feel even more strongly.

Philosophically, people should be able to do what they want on their machines. But expecting support (e.g. figuring out how their third-party software has corrupted my database) is another matter, so I can see why people would install a speed-bump or two...

[1] https://github.com/scottlamb/moonfire-nvr/issues/44#issuecom...