
Comment by eddythompson80

21 days ago

> ChatGPT alone has hundreds of millions of active users that are clearly getting value from it

True, but it’s been years now since chat-interface AI debuted to the general public, and we have yet to figure out another interface for generative AI that works for a general audience. I’d say the only other example is Adobe and what they’re doing with generative AI in their photo-editing tools, but that’s a far cry from a “general public” type of thing. You have all the bumbling nonsense coming out of Microsoft and Google trying to shove AI into whatever tools they’re selling while still getting 0 adoption. The Copilot and Gemini corporate sales teams have both been “restructured” this year because the clients they signed up in 2023/2024 all refused to renew.

When it comes to the general public, we have yet to find a better application of AI than a chat interface. Even outside the general public, I oversee a few teams building “agentic AI tools/workflows,” and the amount of trouble they go through to make something even slightly coherent is insane. I still believe that the right team with the right architecture and design can achieve incredible things with LLMs, but it’s not as easy as the term “AI” makes it sound.

Putting generative AI inside tools without giving the AI a deep understanding of those tools has generally left me more confused and frustrated than using it on its own:

For example, Gemini forced itself into my SMS app, so I thought I’d ask it to search for something simple in my messages, and it just started generating random text about searching, saying it doesn’t have access to the messages themselves.

When I use ChatGPT, of course I know it doesn’t have access to my SMSs (it would be weird if it did).

I can give ChatGPT exactly the context I want, and I know it will work with it as long as that context isn’t too big.

  • I think that’s a fundamental part of the value of ChatGPT compared to all the “integrated” AI approaches. Like you, I like that ChatGPT doesn’t have access to my SMSs, emails, documents, code, etc. When I use it, I choose the context it’s exposed to and then evaluate the output based on how much context I gave it.

    For example, one of my main use cases for ChatGPT is crafting SQL queries. I never liked SQL. I understand how powerful it is, but I was never good at it; I find it structured in an awkward, confusing way, and I’m not smart enough to express what I’m looking for in SQL. But ChatGPT is perfect for that. I give it a brief explanation of the table schemas, then ask for a query, and it spits out SQL for me. 70% of the time it looks good; the other 30% I’m skeptical it solves what I want.

    I have tried many “database/Postgres AI services” that promise a much better interface. They ask for credentials to your database, then inject all the context/schema into the LLM so you can just ask questions directly. They have all been underwhelming. I ask a question about joining and reducing tables A and B, and it insists on involving tables C and D. I have to say “don’t worry about C and D.” So it involves E and F. I say “fuck, only consider A and B,” and it generates an invalid query that assumes A and B have columns from C, D, E, or F. I say “please don’t consider any schema other than A’s and B’s. I’m looking for the result of joining A and B, reduced by columns X/Y, giving the average of those” (roughly the query sketched below), then feel stupid having to argue with a machine about what I’m looking for.
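
    To be concrete, what I keep asking for is roughly the query below. The table names A and B are from above; the join key (id) and the averaged column (amount) are stand-in guesses, since the actual schema doesn’t matter here:

        -- Join A and B, reduce by columns X/Y, average a value column.
        SELECT a.x,
               a.y,
               AVG(b.amount) AS avg_amount
        FROM a
        JOIN b ON b.id = a.id   -- assumed shared key
        GROUP BY a.x, a.y;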