Comment by pfdietz

18 hours ago

What does their patent moat look like?

Google owns core transformer patents, for one thing, e.g. https://patents.google.com/patent/US10452978B2/en.

I haven't read the claims, so I don't know how hard it would be to work around them. This particular one appears to cover encoder-decoder networks, so it's not necessarily applicable to later decoder-only LLM implementations. But I'd be amazed if Google didn't have several other relevant patents in their arsenal.