Comment by jumpingbeans

1 day ago

Interesting.

> Do you have a link to the patent?

Here it is: https://patents.justia.com/patent/20230401438

On Google Patents: https://patents.google.com/patent/US20230401438A1/en

The authors simply implement a continued fraction library in PyTorch and call backward() on the resulting computation graph.

That is, they chain linear neural network layers and use the reciprocal (not ReLU) as the primary non-linearity.
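
For anyone curious what that looks like in practice, here's a minimal sketch of my reading of it. The class and names (CFLadder, depth, rungs) are mine, not from the patent or the paper:

```python
# A rough sketch, not IBM's code: a truncated continued fraction whose
# terms are linear projections of the input, with 1/z in place of ReLU.
import torch
import torch.nn as nn

class CFLadder(nn.Module):
    def __init__(self, in_features: int, depth: int):
        super().__init__()
        # One linear term per rung of the "ladder": a_k(x) = w_k . x + b_k
        self.rungs = nn.ModuleList(
            [nn.Linear(in_features, 1) for _ in range(depth)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Evaluate a_0(x) + 1 / (a_1(x) + 1 / (a_2(x) + ...)) bottom-up.
        out = self.rungs[-1](x)
        for rung in reversed(self.rungs[:-1]):
            out = rung(x) + 1.0 / (out + 1e-6)  # eps: crude guard against 1/0
        return out

x = torch.randn(8, 4)
model = CFLadder(in_features=4, depth=3)
model(x).sum().backward()  # autograd differentiates through the fraction
```

That backward() call is the whole trick: autograd already knows how to differentiate a chain of reciprocals, so there's nothing to invent beyond writing the fraction down.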

The authors reinvent the wheel countless times:

1. They rename continued fractions, calling them 'ladders'.
2. They label basic division 'the 1/z nonlinearity'.
3. Ultimately, they take the well-defined concept of generalized continued fractions (the general form is spelled out below), call them CoFrNets, and get a patent.
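
For reference, a generalized continued fraction is just the nested expression

    b0 + a1 / (b1 + a2 / (b2 + a3 / (b3 + ...)))

and as far as I can tell, the patent's 'ladder' is exactly this, with the a_k and b_k terms produced by learned linear layers.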

IBM's lawyers can strip out all the buzzword garbage if they feel litigious and sue anyone who's written a continued fraction library, because that's what the patent (minus the buzzwords) protects.

  • Thanks for that. That is patently absurd.

    You sent me down a rabbit hole. In trying to track it down for myself I read a couple of others that I thought might be it, and was stunned by how obtuse these patents are.

    What sort of leverage does this stuff provide? You mentioned "charge rent". What does that look like?

• Honestly, I don't even know where to begin. It's insane that IBM holds a patent on continued fractions.

If you wrote a continued fraction class in PyTorch and called backward() (or even differentiated the power series by hand), then you'd be infringing on their patent.