Comment by arjie

2 days ago

This kind of limited device is something I've been thinking about with respect to what interactions I want my children to have with computers. I remember when I was 9 years old and we got these computers at the lab at school and we wrote some LOGO and BASIC and it was a mind-blowing experience. We were drawing SQUARES! And we were making TRIANGLES of ASTERISKS! Hahaha, what a glorious thing that felt like.

I got so much joy from computers and I'd like my kids to have that kind of experience too without accidentally detouring into social media (which has my mind in a vice grip).

Still a couple of years away, but I think I'd like to evaluate this kind of device then and see if it's the right model to use.

I recall those days as well, but I think it's much harder to replicate today. In that era, computers were new and rare. Any interaction with a computer was notable, and the ability not just to interact with one yourself, but to actually CONTROL it, was amazing.

Today, kids are surrounded by all kinds of tech. They see people interacting with tech in all kinds of ways from the moment they are cognizant. It's much harder to create that wow moment now.

  • What you say is true, but I'd like to add one point.

    What's toxic for children's development is the low bar of entertainment associated with something like an iPad connected to the Web: immediately, without any effort, you get entertained, and there are hundreds of movies that you can watch. Got bored by them? There are millions of funny TikTok clips to view and share with others. Or go to the App Store and download a million games with just one click each; none of it takes any EFFORT.

    In contrast, in the 1980s, you had to put some effort in, in order to eventually harvest your reward:

    You may have wanted to play a game, but you had to type in 16 pages of hexadecimal DATA statements encoding the machine code of the game before being able to type RUN.

    You may have wanted to draw a plane and make it fly across the screen, but you had to calculate the decimal numbers representing the sprite's bitmap by adding up the appropriate powers of two.

    You may have wanted to write your own game in MOS 6510 assembler, but you had to learn how to code first and what the registers and opcodes were, and you had perhaps a magazine article, a book, and an assembler or machine monitor on a tape, and no Web to look up solutions to problems on StackExchange. Heck, you may not even have known anyone to exchange info with - coming from a small town of 8000, I didn't even know who else had a computer, if anyone. So after buying my monthly home-computer magazine at the newsagent, I hung around to see who bought the only two other copies of the same magazine, then waved my copy and - shy as I was - introduced myself.

    The youngsters today - through no fault of their own - have it too easy, so they naturally don't see the point of making any effort to learn "deep tech" when there are thousands of lower-effort activities they can turn to first for entertainment.

    A device that boots right into µPython and that can make music & be programmed to make music could be one such successful path...
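The sprite arithmetic mentioned above - one byte per 8-pixel row, summed from powers of two - can be sketched in a few lines of Python (a hypothetical modern illustration; back then this was pencil-and-paper work):

```python
# Hypothetical illustration of the sprite arithmetic described above:
# each 8-pixel row of a C64 sprite became one byte, computed by adding
# the powers of two for the set pixels.
row = "..XXXX.."  # '.' = background pixel, 'X' = set pixel

# Leftmost pixel corresponds to bit 7 (2**7), rightmost to bit 0 (2**0).
value = sum(2 ** (7 - i) for i, ch in enumerate(row) if ch == "X")
print(value)  # 60 = 32 + 16 + 8 + 4
```

Doing this for 21 rows of three bytes each, by hand, for every animation frame, was exactly the kind of effort being described.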

The BASIC variant on a TI-84 was magic. One science teacher at my high school allowed us to use any program as long as we wrote it ourselves. I remember making programs with nested menus for all types of physics algorithms. Through debugging, I ended up memorizing them anyway, but it was more fun (and error-free) to use the program itself.

> This kind of limited device is something I've been thinking about with respect to what interactions I want my children to have with computers. I remember when I was 9 years old and we got these computers at the lab at school and we wrote some LOGO and BASIC and it was a mind-blowing experience. We were drawing SQUARES! And we were making TRIANGLES of ASTERISKS! Hahaha, what a glorious thing that felt like.

Well, Minecraft 'redstone' works a bit like that?

Of course, it's embedded in a much bigger program, but I'm not sure that makes a difference to the kids?

  • I don't think comparing redstone to something like BASIC is fair. Redstone is easy to get started with, but actually making something interesting with it is significantly more complicated. Minecraft Education Edition is a better example, where you can use Python or something like Scratch to interact with the game.

I'm not a parent, but I think encouraging anything creative is probably just as good.

Half the fun was not knowing how to do something. There was no other way to satisfy curiosity than to tinker endlessly and constantly seek out information. Stumbling upon unusually good programs made it seem like anything was possible regardless of the machine it ran on. Video games and the demoscene were like that for me, and now any modern machine really can run almost anything.

Programming can still be fun like that, but often in the context of existing ideas. My parents had similar feelings about new music and cars. The sense of wonder decreases when the bar is raised. That's not to say there isn't a ton left to explore, but that's the impression when curiosity is too easily satisfied. You have to keep up and find new ways to stay curious. We consume way more than we create these days.

The New Yorker had a memorable single panel comic by David Sipress with an old man saying "everything was better back when everything was worse". I just had to mention it.

  • Same here.

    I remember a large part of the fun was that we couldn't just look something up on the web - it didn't exist in our home in the eighties. Instead we'd pore over BASIC on floppies. Changing a thing, for example GRAVITY=0.1, and finding out the banana now flies almost straight up.

    Or meticulously typing in the source code printed in hand-me-down magazines. Evenings of me and a friend, one typing, the other reading out loud. Then way more evenings of finding all the typos and bugs. And then one or two evenings running the game we just "wrote" and getting bored immediately. And starting to change things.

    This is how I learned programming, and it's what has paid my bills for over 25 years now. There were few university programs for programming back then, but mostly, young-arrogant-me was like "well, I've taught myself programming, so I'd better follow a university program that teaches me stuff I don't already know XD".

    The tinkering and creative part has been lost for me by now. I lament that. So I've put aside a fund; I'm finishing off some contracts now, and from this summer on I will do unpaid work of "creative coding". Making "art" with software, full-time - something I now only do in my spare time. Because that tinkering is what drew me in. Not the scrum rituals, spaghetti-code wrangling, or layers of architectural enterprise abstractions. But the fun of nesting nested loops and seeing my name fly over the screen in weird patterns, or the joy of making the dot-matrix printer play a "song" by having it print weird ASCII strings.

    • When I was kid thinking about this as a career, I knew what I was getting into. I had the internet (when it was a lot smaller). I had seen big tech flop hard. I saw companies like Apple for what they were before and after their iDevices. By the time I was wrapping up my CS degree I had seen social media destroy itself and legacy media, the rise of web 2.0, SaaS startups, mobile apps, etc.

      I've been working professionally now for over a decade, but got started long before that as a child. Despite the endless negative things I could say about the modern era, I don't feel like any of it impacts my enjoyment of my work or gets in the way of my creativity.

      I think this is because the closest I've ever been to truly being alone with the machine is writing programs for my TI calculators, but even then I still had ticalc.org. Some programs on there were brilliant, but most were awful. It was the perfect balance for people my age at the time. Despite what people believe today, especially with their LLMs, I don't think the landscape has changed much in that regard. There's still a lot of awful code with few brilliant examples. That leaves room for me to work on new interesting stuff or improve what's there without having too much help spoiling it.