The No Fakes Act has a “fingerprinting” trap that kills open source?

15 hours ago (old.reddit.com)

This reply sounds like a much more sensible take: https://old.reddit.com/r/LocalLLaMA/comments/1q7qcux/the_no_...

OP replied and there's another in-depth reply to that below it

  • OP's reply to that appears to be drafted with heavy help from ChatGPT:

    > I appreciate you citing the specific clauses. You are reading the 'Black Letter Law' correctly, but you are missing the Litigation Reality (how this actually plays out in court). The 'Primarily Designed' Trap: You argue this only targets a specific 'Arnold Bot.' blah blah blah.

Likely unconstitutional, as it violates the First Amendment, which has done a very good job of protecting the right to author and distribute software over the years. Clearly an unintended positive consequence, since no one who worked on or voted for the Bill of Rights had a computer.

If the courts upheld the part in question, it would create a clear path to go after software authors for any crime committed by a user. Cryptocurrencies would become impossible to develop in the US. Holding authors responsible for the actions of their users basically means everyone has to stop distributing software under their real names. There would be a serious chilling effect, with most open source projects shutting down or going underground.

Not saying this would be the right way to go about preventing undesirable uses, but shouldn't building 'risky' technologies signal some risk to the ones developing them? Safe harbor clauses have long allowed the risks to be externalised onto the user, fostering non-responsibility on the developers' part.

  • Foisting the responsibility of the extremely risky transport industry onto the road developers would certainly prevent all undesirable uses of those carriageways. Once they are at last responsible for the risky uses of their technology, like bank robberies and car crashes, the incentive to build these dangerous freeways evaporates.

    • I think this is meant to show that moving the responsibility this way would be absurd because we don't do it for cars but... yeah, we probably should've done that for cars? Maybe then we'd have safe roads that don't encourage reckless driving.

  • No.

    The reason safe harbor clauses externalize risks onto the user is because the user gets the most use (heh) of the software.

    No developer is going to accept unbounded risk based on user behavior for a limited reward, especially not if they're working for free.

    • The reason safe harbor clauses exist is that you don't blame the car manufacturer for making the bank robber's getaway car.

  • How can you know how people are going to use the stuff you make? This is how we end up in a world where a precondition to writing code is having lawyers on staff.

  • Just last weekend I developed a faster Reed-Solomon encoder. I'm looking forward to my jail time when somebody uses it to cheaply and reliably persist bootlegged Disney assets, just because I had the gall to optimize some GF(256) math. (A sketch of the kind of multiply involved follows this exchange.)

    • That is not what I said. It is about signalling risks to developers, not criminalising them. And in terms of encoders, I would say it relates more to digital 'form' than 'content' anyway: the container of a creative work vs. the 'creative' (created) work itself.

      While both can be misused, to me the latter category seems to afford a far larger set of tertiary/unintended uses.
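
As a minimal sketch (my own illustration, not the commenter's actual encoder, and assuming the common Reed-Solomon field polynomial 0x11D), the GF(2^8) multiply that such encoders spend their time optimizing looks roughly like this:

    # Hypothetical sketch (not the commenter's code): the "GF(256) math"
    # at the heart of a Reed-Solomon encoder is multiplication in GF(2^8).
    RS_POLY = 0x11D  # x^8 + x^4 + x^3 + x^2 + 1, a common RS field polynomial

    def gf256_mul(a: int, b: int) -> int:
        """Shift-and-XOR multiply in GF(2^8). Fast encoders replace this
        loop with log/antilog table lookups or carry-less SIMD multiplies."""
        result = 0
        while b:
            if b & 1:
                result ^= a   # addition in GF(2^8) is XOR
            a <<= 1
            if a & 0x100:
                a ^= RS_POLY  # reduce once the degree reaches 8
            b >>= 1
        return result

Nothing about this arithmetic knows or cares whether the bytes it protects are bootlegged Disney assets, which is the commenter's point.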

You know, kids, in the '80s, long before the First Crypto Wars, we had something called the Porn Wars in the American Congress. I could link many depositions before Congress on YouTube, but I will leave you with some good music instead.

(shill)

https://www.youtube.com/watch?v=2HMsveLMdds

Which is of course the European Version, not the evil American Version.

  • Is there any other musician who has been as prominent a presence in Congress as Zappa...?

  • Sorry for the off-topic, but I've just discovered Frank Zappa, and he looks and sounds like a precursor of Serj Tankian (SOAD) - I mean "sounds like" as in a similarly crazy, all-over-the-place style.

> I contacted my reps email to flag this as an "innovation killer."

Chinese companies will be happy to drive innovation further once giants like Google and OpenAI push this through to kill competition in the US.

US capitalism eats itself alive with this.

  • I really love how, thanks to China, people are beginning to see how technologically suppressive the American oligarchy is, and who is really the reason we can't have nice things.

"Open Source" in this case means "ML models with open weights"

(not my interpretation, it's what the post states - personally that is not what I think of when I read "Open Source")

Yay, another bill modeled after the DMCA; what could go wrong?

  • This is worse than the DMCA, because there's no safe-harbor provision for companies that develop or host the tech.

    Why the hell can't these legislators stick to punishing the lawbreakers instead of creating unending legal pitfalls for innovation?

    We have laws against murdering people with kitchen knives. Not laws against dinnerware. Williams Sonoma doesn't have to worry about lawsuits for its Guy Degrenne Beau Manoir 20-Piece Flatware Set.

I think this title is quite misleading, given that it only impacts open source models, which is a very narrow interpretation of open source.

We do have tech that is kept "behind closed doors". Just look at military applications (nuclear, tank, and jet design, etc.). Should "clonable voice and video" be behind closed doors? Should AGI? I think the approach of the suggested legislation may not be the right way to go about it; but at a certain level of implementation capability, I'm not sure how I would handle this situation.

If current tech had appeared all of a sudden in 1999, I am sure that as a society we would all accept this, but slow-boiling-frog theory, I guess.

It should be called the anti-AGI bill, because trying to ban AI with certain capabilities is essentially banning embodied AI capable of learning/updating its weights live. The same logic applied to humans would essentially ban all humans, because any human can learn to draw and paint nudes of someone else.

> voice-conversion RVC model on HuggingFace, and someone else uses it to fake a celebrity, you (the dev) can be liable for statutory damages ($5k-$25k per violation). There is no Section 230 protection here. This effectively makes hosting open weights for audio models a legal suicide mission unless you are OpenAI or Google.

Good.

  • So you'd prefer that only rich megacorporations and criminals have access to this technology, and not normal people and researchers?

    • How is that surprising? The advent of modern AI tools has resulted in most people being heavily pro-IP. Everyone now talks about who has the copyright to something and so on.

    • The studios did already rip off Mark Hamill of all people.

      Arguing regulatory capture versus overt piracy is a ridiculous premise. The "AI" firms have so much liquid capital now... they could pay the fines indefinitely in districts that constrain damages, and they already settled with larger copyright holders as if it were just another nuisance fee. =3