Comment by kathir05

3 days ago

This is an interesting read!

For loops and if-else branches get replaced by LLM API calls. Now each LLM API call needs to (sketch after this list):

1. use a GPU to compute over the context

2. spawn a new process

3. search the internet to build more context

4. reconcile the results and return the API response
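To make the contrast concrete, here is a minimal sketch (my own illustration, assuming the OpenAI Python SDK v1+ and an API key in the environment): the same branch decided by one line of deterministic code versus a network round trip to a hosted model.

```python
# Contrast sketch: plain control flow vs. an LLM API call for the same decision.
# Assumes the OpenAI Python SDK (v1+) with an API key set in the environment.
from openai import OpenAI

def is_even_plain(n: int) -> bool:
    # One branch, no GPU, no network call.
    return n % 2 == 0

def is_even_llm(n: int) -> bool:
    # Round trip to a hosted model just to decide a branch.
    client = OpenAI()
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Is {n} even? Answer yes or no."}],
    )
    return resp.choices[0].message.content.strip().lower().startswith("yes")
```

The second version burns GPU time, a network round trip, and money to answer something `n % 2` already tells you.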

Oh man! If my use case is as simple as OAuth, I would have solved it with 10 lines of non-LLM code!
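For example, an OAuth 2.0 client-credentials token fetch is roughly that much non-LLM code. This is only a sketch: the endpoint and credentials are placeholders, and it assumes the `requests` library.

```python
import requests

# Placeholder endpoint and credentials, for illustration only.
TOKEN_URL = "https://auth.example.com/oauth/token"
CLIENT_ID = "my-client-id"
CLIENT_SECRET = "my-client-secret"

def get_access_token() -> str:
    # OAuth 2.0 client-credentials grant: one POST, no LLM required.
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(CLIENT_ID, CLIENT_SECRET),  # HTTP Basic auth with the client credentials
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]
```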

But today people have the power to do the same via an LLM without giving a second thought to efficiency.

Sensible use of LLMs is still something only deep engineers can do!!

But today, "Are we using resources efficiently?", wonder at what stage of tech startup building, people will turn and ask this question to real engineers in coming days.

Till then, deep engineers have to wait.