Comment by integralpilot

17 hours ago

We don't use AI to help write code due to copyright concerns; it's against our policy. We obviously need to be very careful with what we're doing, and we can't be sure it hasn't seen Apple docs or RE'ed Apple binaries etc. in its training data (which we have very careful clean-room policies on). It also can't be guaranteed that the generated code is GPL+MIT compatible (it may draw inspiration from other GPL-only drivers in the same subsystems), but we want to use GPL+MIT so that BSD can take inspiration from the drivers.

Given that literally no one is enforcing this, it seems like a moral rather than a business decision here, no? Isn't the risk that your competitors, who have no such moral qualms, will just commit all sorts of blatant copyright infringement, and it really won't matter because no one is enforcing it?

  • I don't see open source as having "competitors". If someone wants to make a fork and use AI to write code (which I also think wouldn't be very useful, as there's no public documentation and everything needs to be traced and RE-ed), they are welcome to. We're interested in upstreaming, though, which means we need to make sure the origin and licence of the code are all compatible and acceptable for mainline, and we don't want to infringe on Apple's copyright (which they may enforce against a fork with less strict rules than ours).

    • I get "fear of being sued or decoupled from the upstream project" for sure. It definitely speaks to the sad state of affairs when companies at Apple's scale simply operate with complete impunity under copyright law when it comes to using AI (you think Apple isn't using stuff like Claude internally? I can 100% guarantee you they are), yet are able to turn around and bully people who might dare to do the same.

  • Who is a competitor for Asahi? What would that even entail?

    > Given that literally no one is enforcing this

    Presumably Apple's lawyers would enforce it.

      • I'll believe it when I see a court case where they go after someone for some AI-generated slop and win. I don't see much evidence of that happening right now, or really ever since the advent of these tools.