Comment by retrocryptid
2 years ago
<unpopular-opinion>
Bardini's book about Doug Engelbart recaps a conversation between Engelbart and Minsky about the nature of natural language interfaces... that took place in the 1960s.
AI interfaces taking this long to arrive has less to do with the technology (I mean... Zork understood my text sentences well enough to get me around a simulated world) and more to do with what people are comfortable with.
Loewy talked about MAYA (Most Advanced Yet Acceptable). I think it's taken this long for people to be okay with the inherent slowness of AI interfaces. We needed a generation or two of users who traded representational efficiency for easy-to-learn abstractions. And now we can do it again. You can code up a demo app using various LLMs, but it takes HOURS of back and forth to reach the point it takes me (with experience and boilerplate) minutes to reach. But you don't need to invest in developing the experience.
And I encourage every product manager to build a few apps with AI tools so you'll more easily see what you're paying me for.
</unpopular-opinion>
Sure, and not many people are seriously trying to suggest that one should hire an AI instead of a software engineer _at this point_, assuming you have a real budget.
But, especially with GPT-4, it is entirely feasible to create a convenient and relatively fast user experience for building a specific type of application that doesn't stray too far from the norm. AI can call the boilerplate generator and even add some custom code using a particular API that you feed it.
So many people are trying to build that type of thing (including me). As more of these become available, many people who don't have thousands of dollars to pay a programmer will hire an AI for a few tens or hundreds of dollars instead.
The other point is that this is only the current state of generative AI, and it gets better every few months.
Project the current rate of progress forward 5-10 years. One can imagine that if we are selling something at that point, it won't be our own labour. Maybe it will be an AI that we have tuned with skills, knowledge, a face, a voice, and a personality that we think will be saleable, possibly using some of our own knowledge and skills to improve that recipe, although there will likely be marketplaces where you can easily select the abilities or characteristics you want.
In Jaron Lanier's review of John Markoff's book "What the Dormouse Said", he mentions an exchange between Douglas Engelbart and Marvin Minsky:
https://web.archive.org/web/20110312232514/https://www.ameri...
>Engelbart once told me a story that illustrates the conflict succinctly. He met Marvin Minsky — one of the founders of the field of AI — and Minsky told him how the AI lab would create intelligent machines. Engelbart replied, "You're going to do all that for the machines? What are you going to do for the people?" This conflict between machine- and human-centered design continues to this day.