Comment by tomjen3

2 days ago

But accessibility on the frontend is to a large extent patterns: if it looks like a checkbox it should have the appropriate ARIA attributes, and patterns are easy for an LLM.
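The kind of pattern meant here can be sketched in a few lines. This is a minimal, illustrative example of a custom-styled checkbox widget with the ARIA plumbing filled in; the function name and markup are hypothetical, not from any real library, and a production version would also need keyboard handling (e.g. toggling on Space) per the WAI-ARIA Authoring Practices.

```typescript
// A custom-styled checkbox is just a <div> until the ARIA pattern
// is applied: a role, an exposed checked state, and keyboard
// focusability. Without these, assistive technology sees nothing.
function renderAccessibleCheckbox(label: string, checked: boolean): string {
  return [
    `<div role="checkbox"`,        // tells assistive tech what this widget is
    ` aria-checked="${checked}"`,  // exposes the state, not just the CSS styling
    ` tabindex="0"`,               // makes it reachable with the keyboard
    ` aria-label="${label}">`,     // gives it an accessible name
    `</div>`,
  ].join("");
}
```

The point is that each attribute follows mechanically from "it looks like a checkbox", which is exactly the sort of rule an LLM can apply reliably.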

That kind of pattern was easy before AI.

It's just… a lot of people don't see this on their bottom line. Or any line. My awareness of accessibility issues comes from the Web Accessibility Initiative and Apple's developer talks and docs, but I don't think I've ever once been asked to focus on them. If anything, I've had ideas shot down.

What AI does do is make it cheap to fill in gaps: 1500 junior developers for the price of one, if you know how to manage them. But even there, they'd only fill in gaps as well as the nature of those gaps has been documented in text, not as well as the lived experience of people with, e.g., limited vision, or limited joint mobility whose fingers won't perform all the usual gestures.

Even without that issue, I'd expect any person with a disability to describe an AI-developed accessibility solution as "slop". I've had to fix up a real codebase where nobody before me had noticed the FAQ was entirely Bob Ross quotes (the app wasn't about painting, or indeed in English), so I absolutely anticipate that a vibe-coded accessibility solution will do something equally weird: perhaps include some equivalent of "As a large language model…", or hard-code example data that has nothing to do with the widget's current real value.