Comment by samatman
1 year ago
> If you ask generative AI for a picture of a "nurse", it will produce a picture of a white woman 100% of the time, without some additional prompting or fine tuning that encourages it to do something else.
> If you ask a generative AI for a picture of a "software engineer", it will produce a picture of a white guy 100% of the time, without some additional prompting or fine tuning that encourages it to do something else.
These are invented problems. The default is irrelevant and doesn't convey some overarching meaning; it's not a teachable moment, it's a bare fact about the system. If I asked for a basketball player in a 1980s Harlem Globetrotters outfit, spinning a basketball, I would expect him to be male and black.
If what I wanted was a buxom redheaded girl with freckles, in a Harlem Globetrotters outfit, spinning a basketball, I'd expect to be able to get that by specifying.
The ham-handed prompt injection these companies are using to try to solve this made-up problem, which people like you insist on having, stands directly in the path of a system that can reliably fulfill requests like that. Unlike your neurotic insistence that the default output match your completely arbitrary and meaningless criteria, that reliability actually matters, at least if what you want is a useful generative art program.