Comment by dfxm12
1 day ago
The importance of "learning how to learn" has been emphasized by all of my teachers since I was in high school, or maybe even 8th grade, decades ago.
My computer engineering professors also emphasized user-centered design. For one of Google's top scientists to bring this up is an admission that they won't, or can't, design a good user experience for their tools.
I remember the same thing. That doesn't mean they knew how to teach us to "learn how to learn". Neither does it mean that the underlying education system supported that goal.
Same goes for user-centered design. Trying to make something user-friendly is one thing, successfully doing it is another. Large organizations are especially poor at user-friendly design because the underlying structures which support that goal don't exist. Organizational science is still in its infancy.
Not to mention the problem of actually measuring success or failure. The overwhelming majority of people I know (from your average Joe to your tenured professor at elite universities, from philosophers to physicists[0]) underestimate the difficulty of measuring things.
Almost everyone treats any metric as they would a ruler or tape measure. Even your standard ruler is not as good a measuring device as you probably think! But this becomes a huge mess when we start talking about any measurement of statistics or some other abstraction. People treat metrics and algorithms as black boxes, rather than tools. Tools still require craftsmen who understand: when they work, when they don't work, when they can be used in a pickle, what can be substituted in a pickle, their limits, what new problems they create, and so on. It is incredible how much complexity there is in things that appear so simple. But then again, that's why you get things like an engineering manual on o-rings that is over a thousand pages. And even those aren't comprehensive.
I'm not suggesting we all need to be "master craftsmen", but I actually think we would all do better if we recognized that everything has more depth than it might appear, if only to give people a moment of pause to question whether they are actually doing things the right way. There's always a better way. The real trick is learning what's good enough, and you'll never know what's good enough when everything looks simple.
[0] The exception tends to be those who need to work with high precision, since those jobs force you to deal with this explicitly. So it's more common among people like machinists or experimental physicists. Though sometimes this turns out worse, as they can end up operating on vibes. I think it happens when intuitions are successful for too long and not enough meta-analysis is done to update them.
The US education system only has one mode, and that's to survive in a slim way with overworked staff and huge classrooms. 40 kids in a math class is seen as normal.
Everything you see of its character, including emphasizing tests and practice, follows from that. Talking about good UX is miles away.
It's a problem that goes beyond the United States, overworked staff, and constraints in general, although these are legitimate concerns. I studied in a non-US country, but the attention paid by teachers to pedagogy was virtually zero.
I mean, we had five years of English classes in high school, and by the end of high school, less than five out of 30 people in my cohort were able to string a couple of sentences together in English. And my class was made up of serious, studious young people. It seems to me that the time was not well spent, but did the teacher, a caring and generally competent person, reflect on the poor results? I highly doubt it.
I found that emphasis to be similar to two teams building from opposite starting points, and never meeting in the middle.
The issue with "learning to learn" is that it does not include the foundational skill of "how to communicate". Far too often it is not a lack of desire to learn; it is the inability to communicate what one is trying to learn. When seeking help, not only does the seeker have difficulty expressing their situation, those trying to help are not taught how to listen and will offer solutions to an issue that has only started to be explained. This difficulty is then compounded by a self-conversation bias that spins negatively against the person seeking help. Those are two very high hurdles: negative self-bias, and inarticulate communication while seeking guidance.
“You have to learn how to learn” has been a phrase often repeated by teachers, but I don’t remember any of them emphasizing, for example, retrieval practice: you learn a skill or subject, move on to the next one, and leave it up to fate whether you remember anything from the first one.
It always surprises and saddens me that, despite having been an excellent student throughout my years of education, I remember practically nothing about 90% of the subjects I studied.
There is a certain amount of "use it or lose it" seemingly inherent to virtually every human endeavor. But I suspect if you were to enroll in a class in any of those subjects, you'd perform radically better than a peer who'd never studied them. IOW, there's often more latent memory than we realize or can easily retrieve.
"IOW, there's often more latent memory than we realize or can easily retrieve." -
I found this to be hardly true at all, and I tested many other people on it. When I voiced my concerns, the usual answer was, "Yes, but when you pick up a book, you will remember". And then I asked, "Try it", and the subsequent answer was, "I have to admit you are right".
The "re-absorption speed" is heavily confounded by general IQ and the kind of cognitive stimulation one receives in daily life, but the original learnings are mostly gone. Among other things, this is why retrieval practice is important: it slows down the "forgetting rate".
> "One thing we'll know for sure is you're going to have to continually learn ... throughout your career," he said.
This has been the case for literally my entire career and I assume most of the professional world for the last half century.
Technology is continually reshaping industries and while many eschew learning and adopting, those who embrace it are the ones who succeed best IME.
> Technology is continually reshaping industries and while many eschew learning and adopting, those who embrace it are the ones who succeed best IME.
I said it already in a reply to GP, but I'm going to say it again: I stopped caring about what people list on their resumes; your work history and education don't matter to me. I'd rather hire a hungry junior who finished a bootcamp and has the drive and ability to absorb new things and adapt to changing environments, over somebody who's got 10 years of experience and can't do shit outside of their comfort zone.
The number of people who aren't able to learn and adapt to changing times, new tools, new ways of working, etc. is shocking.
Thing is, this has nothing to do with UX or even AI at the end of the day. Over the years I have adjusted the way I handle interviews when I'm on a hiring panel to focus on critical thinking, problem solving, and lifelong learning above all else - because they are the #1 indicator that a hire is going to be successful in a position, even if that means they need a lot more training before they become productive.
If they have these skills, I can teach somebody who finished a 6-month coding bootcamp Go and all the internal tooling, go over the business with them, etc., and end up with a productive mid-level engineer who gets shit done in a few years. What I can't teach is the drive and ability to learn; that's a much longer process, and if you don't already have it then I'm not prepared to develop it.
Hell, outside of looking for signs of obvious bullshit, I stopped giving a shit about resumes. Your work history does me no good, your education doesn't matter to me, and your references are useless beyond making sure you aren't straight up lying to me about your employment history. Every single time I have hired somebody with 5 years of "experience" working with technologies I bullet-pointed on a JD, they ended up fumbling the moment they had to do something new. Doing leetcode, pair programming sessions, take-home assignments, whiteboarding system designs, etc. for SWE positions did nothing to really improve this; for SRE/DevOps roles I tried trivia questions (how are containers implemented - like what kernel technologies do they use and what do they do, how would you go about investigating why a service is consuming 100% CPU time), throwing them at broken VMs, and more take-home assignments.
AI tools only make this skillset more important - I can throw Junie, Claude Code, or Copilot at a small task and end up with...an implementation. But they still fuck up, constantly, and yet again, anything that hasn't already been done before regularly requires a lot of guidance from an engineer in the loop. And with the god damned death of the web thanks to AI slop being posted everywhere, the ability to find answers and reason through problems is only going to become more important when these tools fail miserably for the third time in a row.
This reminds me of old videogames. Many didn't have tutorials. Or rather, they did, but they were the first level. Unlike most modern games, many of them would just drop you in and you'd need to figure it out or read the instruction booklet. It definitely helped that there was a common control language, one that's still mostly used today, but the point was more to let players "discover" the controls themselves. Here's the start of Wolfenstein 3D[0]: no popup messages, no nothing. A lot of this was done for space savings, but it also forced makers to design in a way that teaches the mechanics and how the world works. You can even see here how the level introduces players to secret rooms, leading to many players doing the same thing they do in Zelda games: stabbing the walls to see if something is different, or paying attention to subtle clues that a secret is there.
It's pretty hard to do this kind of design, but I think anyone who's played these games will admit it is both frustrating and rewarding. I think that's true for any learning. The advantage with a videogame is you can provide nearly immediate feedback as well as design feedback delays. I'm highly educated in both math and CS, and I think that's actually one of the key differences. When programming there are quick feedback loops: your program runs or it doesn't[1]. Whereas in math you finish a proof and aren't even sure if it is right or not. This does end up teaching different and useful skills, but it sure does create a higher barrier to entry (the barrier isn't intelligence, it is persistence).
I think my main concern today (having taught hundreds of college students over the last 5 years) is a lowering of this resilience. I mean, I feel it in myself too. We've definitely built a world full of quick feedback mechanisms, yet this is impossible to create in more advanced education. It can take weeks, months, or even years to see the real fruits of your labors. I found that in classes where we had autograders or provided students with test cases[2], these often ended up hurting the students more than helping. They became over-reliant on them, outsourcing their thinking to what we were providing as aids. I watched ChatGPT come out during this time and was not surprised that it only furthered the problem. I was only a grad student, so in most classes I did not have much control and sometimes not much of a say, but if I were to do it again I'd try to push the aids out more slowly[3]. The most common problem was that students wrote to the test, not to the requirements. That's actually not uncommon outside school either; I see a lot of people do quite similar things in industry, thinking that passing tests is sufficient. But writing to tests will only ever make you as complete as the tests. It's a failed paradigm; you'll never have full coverage.[4]
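To make "writing to the test" concrete, here's a tiny made-up example (the assignment, function, and test cases are all invented for illustration): the code passes every provided test while still violating the written spec, because the provided tests never exercise even-length input.

    # Hypothetical toy assignment: implement median(xs) per the written spec,
    # which says even-length lists should return the average of the two middle values.

    def median(xs):
        # A "write to the test" implementation: it only handles the cases
        # the provided tests happen to exercise (odd-length lists).
        return sorted(xs)[len(xs) // 2]

    # The provided (incomplete) test cases -- all odd-length, so everything passes.
    assert median([3, 1, 2]) == 2
    assert median([5, 1, 9, 7, 3]) == 5

    # A requirement the tests never checked: the spec says median([1, 2, 3, 4])
    # should be 2.5, but this returns 3. The code is exactly as complete as the tests.
    print(median([1, 2, 3, 4]))   # -> 3, not 2.5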
There are definitely other problems with the education system, and I don't want to dismiss them. There's no cure-all, but I think this might be something most people would want to think about. Even as I say these words, these are habits I still need to reinforce in myself. Good habits are hard to maintain, and it is only becoming easier to unknowingly slip into bad ones.
[0] https://www.youtube.com/watch?v=n0esIfiOGFA
[1] Well there's the secret third option which is the most common: it runs, but doesn't run like you think it runs.
[2] We always stated that these were incomplete, that they should write additional ones. There probably wasn't a single office hour I held where I hadn't mentioned that one can never have complete coverage through tests.
[3] I don't know the full answer, but here's something I would try. We had autograders and allowed students to submit as many times as they wanted. I'd keep that, but re-implement it with an exponential backoff (until, say, 2 hours before the assignment was due; I honestly never marked anyone late unless it hadn't been submitted by noon the next day). After a few failed submissions, pass them a subset of test cases. Then repeat (there's a rough sketch of the idea at the end of this comment). This definitely puts a lot more work on our end as the educators, but it would put students into a position where they need to think about the problem first. That's a critical self-learning strategy. The struggle is necessary for success. Too often people just want to jump to the end, assuming a well-defined answer already exists. They'll find something that looks appropriate, implement it, and declare success while missing the devil hiding in the details. Too-early feedback only reinforces that strategy.
[4] I'm certain someone will read this believing I am suggesting no tests. I assure you, if this was your interpretation then your interpretation is wrong. Trust me, it is my thoughts I'm trying to convey.
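Here's the rough sketch of the backoff-plus-gradual-hints policy from [3], just to make it concrete. Everything in it is a placeholder - the wait times, the "reveal another test group every N failures" rule, and the function names are my assumptions for illustration, not a real autograder:

    import datetime as dt

    # Assumed policy knobs -- placeholders, not from any real autograder.
    BASE_WAIT_MIN = 5                           # wait after the first failure, in minutes
    MAX_WAIT_MIN = 120                          # cap on the backoff
    FAILS_PER_HINT = 3                          # reveal another test-case group every N failures
    CUTOFF_BEFORE_DUE = dt.timedelta(hours=2)   # backoff turns off near the deadline

    def next_allowed_wait(num_failed, now, due):
        """Minutes the student must wait before the autograder accepts a resubmission."""
        if due - now <= CUTOFF_BEFORE_DUE:
            return 0                            # close to the deadline: submit freely
        if num_failed == 0:
            return 0
        # Exponential backoff: 5, 10, 20, 40, ... minutes, capped.
        return min(BASE_WAIT_MIN * 2 ** (num_failed - 1), MAX_WAIT_MIN)

    def hints_to_reveal(num_failed, hidden_test_groups):
        """After every few failed submissions, reveal one more group of hidden tests."""
        k = num_failed // FAILS_PER_HINT
        return hidden_test_groups[:k]

    # Example: a student with 4 failed submissions, 24 hours before the deadline.
    now = dt.datetime(2024, 5, 1, 12, 0)
    due = dt.datetime(2024, 5, 2, 12, 0)
    groups = ["empty input", "duplicate values", "large inputs"]
    print(next_allowed_wait(4, now, due))   # -> 40
    print(hints_to_reveal(4, groups))       # -> ['empty input']

The point isn't these particular numbers; it's that the student has to sit with the problem between attempts, and hints arrive only after genuine effort rather than up front.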