Comment by tough

3 days ago

Sorry for the tangent, but how do you deal with AI?

Why would you expect them to? It's really easy to live without AI.

  • nothing prevents them from running a GPU locally or on their own infra.

    I was asking because I wonder what enterprises that want to both use AI (like LLMs) in their workflows and keep their data and pipelines air-gapped and 100% owned are doing right now.

    Feels to me like one of the few areas where it's possible to compete with the big labs, but I might be wrong.

External AI services are banned; local or otherwise on-prem models are allowed. We're currently experimenting with some kind of Llama instance running on one of our servers, but I personally don't use it much.
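For anyone curious what "experimenting with a local instance" can look like in practice: a minimal sketch of talking to an on-prem model over the OpenAI-compatible chat API that llama.cpp's server (and similar tools) expose. The endpoint URL, port, and model name here are assumptions, not the commenter's actual setup; only the stdlib is used, so nothing leaves your network.

```python
import json
from urllib import request

def build_chat_payload(prompt, model="local-llama"):
    # "local-llama" is a placeholder; use whatever name your server reports.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }

def ask_local_llm(prompt, base_url="http://localhost:8080"):
    # POST to the OpenAI-compatible chat completions route that
    # llama.cpp's server exposes; the host/port are assumptions.
    data = json.dumps(build_chat_payload(prompt)).encode("utf-8")
    req = request.Request(
        base_url + "/v1/chat/completions",
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_local_llm("Summarize our on-prem AI policy in one line."))
```

Because the request never goes past `localhost` (or wherever the on-prem box lives), this pattern stays compatible with an "external AI is banned" policy.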