Comment by SOLAR_FIELDS

9 hours ago

Given that literally no one is enforcing this it seems like a moral rather than a business decision here no? Isn’t the risk here that your competitors, who have no such moral qualms, are just going to commit all sorts of blatant copyright infringement but it really doesn’t matter because no one is enforcing it?

I don't see open source as having "competitors". If someone wants to make a fork and use AI to write code (which I also think wouldn't be very useful, as there's no public documentation and everything needs to be traced and RE-ed), they are welcome to. We're interested in upstreaming, though, which means we need to make sure the origin and licence of the code are compatible and acceptable for mainline, and we don't want to infringe on Apple's copyright (which they may enforce against a fork with less strict rules than ours).

  • I get “fear of being sued or decoupled from the upstream project” for sure. It definitely speaks to the sad state of affairs when companies at Apple’s scale can simply operate with complete impunity under copyright law when it comes to using AI (you think Apple isn’t using stuff like Claude internally? I can 100% guarantee you they are), yet are able to turn around and bully people who might dare to do the same.

Who is a competitor for Asahi? What would that even entail?

> Given that literally no one is enforcing this

Presumably Apple's lawyers would enforce it.

  • I’ll believe it when I see a court case where they go after someone for AI-generated slop and win. I don’t see much evidence of that happening right now, or really at any point since the advent of these tools.

    • Why would any serious project want to risk being the legal guinea pig for that experiment? And to what end? Pretty much everyone agrees that reusing code you're not licensed to use is bad for open source and just an all-around shitty thing to do.