To me, computers used to be fun when you commanded them to do something, they did it, and then they prompted you for another command.
Now, more and more, computers are trying to tell us what to do. Notifications, unwanted ads, spam, recommendations, pop-ups, accept this, subscribe to that, dark patterns trying to get me to do something… I never commanded my computer to do these things. Some product manager at some company 1000 miles away simply decided my computer should do them, without any input from me. Even my operating system! After booting up, it's running hundreds of programs simultaneously. I did not tell it to run these things! It's doing it all by itself, out of the box. I feel less and less in control of my computer, and more and more like a bystander.
We (the software industry) have royally screwed up computers. Users used to be in control and now they are the ones being controlled or at least “influenced.”
So since the Xfce and GNOME applets use this library, I either have no battery status applet, or my computer intermittently shuts down after resume because the battery falsely appears dead for a few seconds.
(The kicker: it doesn't log why it decided to initiate the shutdown. It took years to find the bloody cause.)
Why don't people fork these terribly managed projects? Even just working on the feature and then submitting the .patch file to be merged in by downstream distros would be a very meaningful signal.
> because the battery falsely appears dead for a few seconds.
Uuughh, I have this with Windows on my Dell XPS as well. Basically every time it comes out of sleep/hibernate, it will briefly think the battery is at 0% and try to shut itself down, and if you boot it up again without it being plugged in, it won't start up at all.
But when plugged in (either coming out of sleep or for the follow-on boot), after a few seconds it'll go "lol yeah no you are actually at 100%, no further charging is required, hooray!"
Would love to know how to disable the critical battery shutdown altogether in order to get around this. It's a bizarre and terrible bug to have in what is supposed to be a flagship developer machine.
> Then they coupled in "shutdown on low battery" and refuse to allow it to be disabled.
Isn't that to protect the user's data? There have been numerous reports that modern high-performance SSDs don't actually write data neatly to physical storage after a flush command; I wouldn't want to lose data if my system unexpectedly shuts down due to power/voltage issues.
Or are there additional low-power protections at the hardware level?
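On the Linux side of this thread, the critical-battery policy at least lives in a plain config file rather than inside the applet, so it can be tuned. A sketch of the relevant settings; the key names come from upower's stock `UPower.conf`, but defaults vary by distro, so treat the values as illustrative:

```ini
# /etc/UPower/UPower.conf -- values below are illustrative
[UPower]
# Use percentage thresholds rather than estimated time remaining
UsePercentageForPolicy=true
PercentageLow=10
PercentageCritical=3
# Threshold at which CriticalPowerAction fires
PercentageAction=2
# One of: PowerOff, Hibernate, HybridSleep
CriticalPowerAction=HybridSleep
```

There's no documented "off" switch for the action itself, but choosing HybridSleep instead of PowerOff should at least make a false zero reading recoverable.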
I think it's important to always remember it's not the computers that are trying to tell us what to do, it's the people who build those computers and the software running on them that are trying to tell us what to do.
Computers, in their fundamental nature, are exactly as you describe them. Such devices will always be available (if nothing else, in the form of electronic components). We just have to refuse to use the machines that want to control us.
Little side rant here: I think one area where Free Software has failed so far is build systems. You get the source all right, and the GPL even requires build instructions to be included ("all the source code needed to generate, install, ..."). But in practical terms, the amount of effort it takes to actually build software yourself is often insane, far from automatic, and often requires a lot of manual work and workarounds (especially once you leave plain Linux and start cross-compiling, etc.). Now with GitHub we even have a lot of the build infrastructure be proprietary, and while it runs automatically on GitHub CI, there is no first-class way to run GitHub CI locally.
There is effort being put toward reproducible builds now, and some distros like NixOS seem to be on the right path. But I think we lost a lot of ground by letting the build process fill up with patchwork and manual human intervention instead of being 100% automated right from the start. We really need a "freedom -1" that requires software to be buildable fully automatically.
> computers used to be fun when you commanded them what to do, they did it
> Users used to be in control and now they are the ones being controlled
2013 was the watershed for me. You can read about why here [1]
There's a world of difference between using a tool and being a tool.
That transformation from "It's more fun to compute" to "If you've nothing to fear you've nothing to hide" took place almost silently in the first 20 years of this century.

The problem is that as "hackers" we don't understand computers. Retaking tech, by fully understanding and helping to culturally redefine computing is both the duty and prerogative of any real hackers left out there.

As for the fun: it never went away for me. I am more passionate about technology, coding, networks and electronics than at any time in my life, precisely because the stakes are now so high.
> Retaking tech, by fully understanding and helping to culturally redefine computing is both the duty and prerogative of any real hackers left out there.
A powerful statement right there. As someone who grew up with computers from before the web, I feel that "cyberspace" has been colonized by business and political interests. It was supposed to be "our" space, I mean, by the people and for the people. Right now it's more useful as a tool for the dark empire.
I agree that the problem and the solution is cultural. It's about having fun, being weird and creative with how we use technology, to reclaim the magic and make it ours. Things like Tor, uBlock Origin, and dare I say some of the cryptocurrency and blockchain stuff, they feel like part of a larger decentralized underground-ish movement that has no name (and probably should remain so).
I just bought a copy and look forward to reading it. (Unfortunately) I relate to what is described. I get the sense that the infinite possibility computers gave us has shifted into a sole focus on consumption.
I know this isn't the case for everyone or everything. Everything that makes computers special is still out there in one form or another, and arguably tools like YouTube have made creating and sharing new things easier than ever. But it still seems you have to stray from the path you're guided down in order to find them (and to know they exist!). I'm thinking of things like microcontrollers, electronics, and programming in general.
I would pay twice the going rate for a Macbook that ran a version of OSX that always immediately responded to my commands. When I hit Cmd-Q I want the app to close. Not when it's ready. Not after showing a dialog. Not after asking me if I'm sure. Not after cleaning up. Not after preparing to close. Not after doing some background processing.
Just close. I want to issue a command and I want it carried out immediately.
If the software can't do that gracefully, then it's bad software.
Also, there is no reason for the operating system UI to ever be stuck. If a program fails to redraw, show it blank, but don't prevent me from moving or resizing it. Even if the CPU is at 100%, I want my commands to get first priority. Too many times I've had to spam Ctrl+Alt+Del just waiting for something to respond.
Software capable of doing the tasks people actually want done is complex enough that it lets you issue multiple commands for which immediate action would be contradictory. Telling an application to save state but then also to respond to a quit command "immediately" is one trivial example. Telling an application to quit right after launching any operation that modifies on-disk state is the more general case. The software is not bad; it just has enough power to let you make your intent ambiguous.
I think there are legitimate use cases for delayed close, especially in gaming. Some games have an ironman mode where closing the game could otherwise be used as a cheat, so they need to be able to save on exit. Likewise, if the game is in the middle of saving, a forced close could corrupt the save file, which is unlikely to be what the user wants. Unfortunately, many companies abuse this feature with pop-up dialogs akin to "Are you sure you want to close us?". I think the OS is in a difficult spot finding a balance, since the feature itself is necessary for some developers.
In another HN thread just the other day (paraphrasing, but the tone is accurate): "macOS is shit because cmd+Q kills my programs and it's too easy to hit accidentally while trying to strike cmd+A"
> If the software can't do that gracefully, then it's bad software.
What I want is that the software itself doesn't even get a chance to interfere with demands like ⌘Q. There's no reason Chrome should get to decide that it doesn't close until I hold ⌘Q; that way lies all sorts of dark patterns. (Adobe's attempt to seize control of basic OS functions is my bugbear here.)
I don't mind the OS having a system-wide setting whereby I can decide how much I want it to protect me from the consequences of my actions, but that should be a decision between me and the OS, enforced by the OS, not something I have to negotiate individually with every app.
(Same with the menu bar. I choose what goes in the menu bar, and macOS should enforce that choice, not tell me that the software has decided what goes there and I have to lump it.)
That’s a very hard constraint for a non real-time system to guarantee. It’s not necessarily a bad thing that your OS has a flexible scheduler. I think you’d find running a real-time OS as your daily development machine would have some of its own quirks you would find distasteful.
I think the reason a lot of software works that way is that most software does not carry out tasks asynchronously; it does blocking or time-consuming work on the foreground (GUI) thread instead of handing it off to a background thread. The reason for that is that doing anything async makes software exponentially more complex to engineer, so we end up with lots of apps that cannot just be stopped easily.
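The hand-off described above can be sketched in a few lines. A minimal, toy Python illustration (no real GUI framework involved): the blocking work runs on a background thread, so the foreground loop stays free to react to commands like "quit" immediately:

```python
import threading
import time

def long_running_task(result):
    # Stand-in for blocking work: disk I/O, a network call, a big save, etc.
    time.sleep(0.2)
    result.append("done")

def run():
    result = []
    # Hand the blocking work to a background thread instead of doing
    # it on the foreground ("GUI") loop.
    worker = threading.Thread(target=long_running_task, args=(result,))
    worker.start()

    ticks = 0
    while worker.is_alive():
        # The foreground loop keeps spinning, so it could still handle
        # input (e.g. an immediate quit) while the task completes.
        ticks += 1
        time.sleep(0.01)
    worker.join()
    return result, ticks
```

The complexity the comment mentions shows up as soon as the background work shares mutable state with the foreground; that coordination is the real engineering cost.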
> "I'm not contacting the internet unless you ask me to"
I think there kind of is? Like, most Linux distros actually?
Yes, Ubuntu has snaps, which try to talk to the internet and auto-update, and yes, this is absolutely terrible. Yes, sometimes a distro might notify you that updates are available. Yes, sometimes a distro talks to an NTP server to sync the time. But generally, I don't feel internet usage is inflicted on me; I inflict it on myself.
In what ways which bother you does your Linux distro contact the internet without asking?
How is it that I paid for Windows (begrudgingly) and yet it's constantly undermining my efforts to control it? Microsoft has at least a dozen shitty apps that no matter how many times you disable or uninstall them, always seem to come slinking back into RAM. What the actual fuck?!
Speaking of computers commanding us ... a few years ago, I was visiting a friend's parents (practicing dentists). They told me in horror how Microsoft forcefully upgraded their work machines to Windows 10 overnight. And one of their applications stopped working due to the upgrade.
That's the first time I learnt that Windows was using that approach (I'm blissfully away from Windows; I dwell in the Linux world.) It was painful to watch them ask "how come this machine does such a big task [upgrade] forcefully?"
> To me, computers used to be fun when you commanded them what to do, they did it, then they prompted you for another command.
This is a good portion of why I switched to Linux full time over 10 years ago.
My computer does what I tell it to do, when I tell it to do it, as I tell it to do it. I fuck up? It's my fault and I know it and I have to fix it myself.
It has fewer, but unfortunately the mentality that developers should be able to tell the user what to do is so pervasive that even FOSS developers fall into it. From what I can tell, GNOME is the most prevalent Linux desktop environment, and it is notorious for this. Ubuntu is, if not the most common distro, the most commonly recommended one, and Canonical is also notorious for forcing things on users that they don't want, like Snaps and auto-updating.
We (the software industry) have royally screwed up computers. Users used to be in control and now they are the ones being controlled or at least “influenced.”
I think we naively believed that an increase of human technical capability would lead to such abundance that we'd achieve post-scarcity (at which point, whether we call it socialism or pretend it is liberalized capitalism, it doesn't matter) and permanent democracy... not the corporatized nightmare we actually got. The capitalists are right about very little, but they got this right: human nature can be shit. Most people are decent or want to be, but the ones who gain power in human organizations, especially organizations without purpose such as private corporations, are cancerous. We thought that problem would magically solve itself if we just made the world (in aggregate terms) richer, and we were wrong.
It's like a martial arts instructor who earnestly but unaccountably believes he's teaching good kids how to fight back against bullies. He may be. Or, he may be teaching the bullies. In our case, though, we weren't training... we were arming... and we didn't always know we were building weapons, but that's absolutely what we were doing... all of our "data science" got turned into decisions that hurt workers and enriched executives.
I'm speaking in past tense because we, as technologists, are no longer relevant. We've sold our souls. Capitalist hogs and their managerial thugs have won. Our moral credibility is deep in the negative territory. Power will either stay with those who currently have it, who have evil intentions, or move toward the set of people who work up the courage to overthrow the current system, who may or may not--it's impossible to know, as it hasn't happened yet--have ugly intentions.
"We" (meaning technologist culture) have never worked to promote centralized, monopoly services. The users did it to themselves, largely in pursuit of short-term convenience. "We" are now building the federated, interoperable platforms that users will hopefully come around to when the obvious problems of centralization (including widespread censorship, non-existent customer service and non-transparent AI's/bots run amok) start seriously biting them in the ass. In many ways, it's already happening.
There is also the situation that is not created out of malice or greed, but out of lack of restraint and divergence of priorities between the authors and the users: compilation speed. When a piece of software makes me wait for it, I feel subservient to it. It feels like sitting in a state office, waiting until a civil servant decides to grant you an audience. Except there is no civil servant, there is only the computer. How much do I have to wait? It depends on the alignment of the stars, air humidity and will of the gods.
When I used to work in an office, there was this one time when I had an urgent ticket from a customer to resolve, so I made the necessary fix and started building it. Normally builds would take something like 15 minutes, provided they were incremental and not the first (clean) one. But they could also take hours. It was the end of the working day; I had been working hard on this fix and hadn't even taken a lunch break. I started the build and waited, because I wanted to share the fix with the customer as soon as possible. 15 minutes passed... 30 minutes... an hour... 90 minutes... 2 hours... and it was still compiling. At that point I gave up, went out, and started walking home. But having not eaten all day and then stayed late waiting for the build to finish, I was so hungry that my hands started trembling on the way home and I felt generally weak, barely able to make it. The next day I found out the build had taken ~4h 30min.
The obvious lesson: of course I shouldn't care that much about my work and should just make the customer wait instead, putting my health and time above it. But another important factor is that the build times were so unpredictable: when I hit "compile", it could take anywhere from 3 minutes to 4.5 hours. There is no planning you can do around that. If it were a fixed 3 hours, that would actually be better, because then I could plan my day around it; it being so unstable is what destroys everything. Of course, if every build took 3 hours, the people making decisions would wake up and see we had a pathological situation and that something was seriously wrong with the project. But when you often hit 15 minutes, it gets brushed off. And for the C++ committee, even though compilation speed may be an issue, it is never a priority. There are always other issues that eclipse it.
Personally, I think a build for even an OS-sized project like that shouldn't take more than a minute at most. Even the incremental times in this one are a travesty.
What I like about what was planned for Jai (still unreleased) is that it is designed to always do a clean build, and the clean builds need to be fast. There should be no need for incremental builds. They are hacks that make the situation look better while making it worse: suddenly you're dealing with weird bugs because the build system failed to detect a necessary recompile and used stale cache entries (this has happened to me multiple times), and the compile times become unpredictable (see the story above for why that matters).
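The stale-cache failure mode mentioned above is easy to reproduce even in plain make. A hypothetical two-file project as a sketch: the rule forgets to declare a header dependency, so editing `util.h` leaves the old object file in place and the incremental build silently links stale code:

```make
# Hypothetical Makefile illustrating a stale incremental build.
# main.c includes util.h, but the rule below doesn't say so:
main: main.o
	$(CC) -o main main.o

main.o: main.c        # missing prerequisite: util.h
	$(CC) -c main.c

# The fix is to declare it (or auto-generate dependencies, e.g. gcc -MMD):
# main.o: main.c util.h
```

Only `make clean && make` is guaranteed correct here, which is exactly the "clean builds must be fast" argument.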
I've seen some Delphi jobs in my country a year ago I think. Maybe I should switch there...
As you age it permeates every interest you had, because you have already picked the low-hanging fruit. Not only computers, but also books, movies, food, and I bet even sex. Everything reminds you of something you have already lived. Every novelty is harder to find; of course there are novelties, but they require an ever-increasing amount of effort and time to find, and you have less and less free time. When you are a kid you are in a constant state of awe.
Computers... I don't care about a lot of the things I used to care about. I can no longer extract pleasure from studying yet another random language. Now, if I want to be truly impressed, learning Kotlin (to name something; a fine language) doesn't cut it; I need quantum computing or dependent types or whatever. But that's much harder.
It would perhaps be wise to try to practice deriving happiness from something other than the seeking out of novelty.
Like picking up an instrument and practicing music more advanced than the pop song of the day. That's a truly long-term game.
Programming can be another such activity. I mostly stopped reading programming books some years ago, but I find real pleasure in deliberating over the meta-game of programming: not how do I specifically solve this problem, but how do I structure my solution, how do I simplify it, how do I reduce the problem itself to its core, how do I write the actual text, variable names and so on, in such a manner that it is self-evident what is happening. That is truly a long game too. There's even a skill to the deliberation itself; too much deliberation is counterproductive.
I read somewhere that philosophy in antiquity defined being good not in terms of absolute moral values like unselfishness (say, giving food to starving children), but as simply being really good at what you do. Perhaps that's from a realization of what actually makes people happy? Like the old carpenter expertly fixing a troublesome door while softly whistling to himself: a human being at inner peace.
This is a good point, and, interestingly, I think the author of the original post would agree. He clearly (in 2016) still enjoys his work, and seeks to be excellent at it. He's just come to the conclusion that he can both do that, and not have computers and operating systems and programming be the sole focus of his life.
> Perhaps that's from a realization of what actually makes people happy?
I just want to comment on the irony of praising a philosophy that focuses more on abstract virtue rather than an absolute metric of goodness, by noting that this increases happiness.
The other way to spin this is: maybe you've entered a phase of your professional career or technical interests where novelty and "climbing the ladder" are less of a motivator for you.
Now you can focus on more important metrics like: shipping products in the best designed way, putting technology stack choice aside for the sake of what works and what is maintainable. Or having a happy team that works together well.
The industry needs more of us measured, calm, and dispassionate middle-aged folks, not fewer.
I find it to be almost the opposite for movies. Yes, it's rare to find something unique, but since I started watching at least a movie per week, I'm now getting references I would never have gotten otherwise, and they put the movies in a completely different light.
For example, you're not going to fully appreciate the latest Spider-Man movie unless you've already watched all three earlier series.
Or the other day, I was watching Ted (the teddy bear movie) and they recreated the dancing scene from Airplane. If I had never seen Airplane, I would probably still have laughed, but I wouldn't truly have gotten the joke.
Or in the Doctor Sleep movie: there's a story being told, but the director also tried to reconcile The Shining movie and The Shining novel with the Doctor Sleep novel. You don't fully experience the movie unless you've experienced all of them.
I don't know about other media, but I find filmmakers tend to be very meta, and you gain as much from having seen all the low-hanging fruit as you lose by doing so.
I recently watched the whole Marvel Cinematic Universe in chronological order. I'd seen a lot of the movies, but not all, and everything made so much more sense when seen this way. WandaVision, of course, makes no sense at all without having seen most, if not all, of the movies.
Somehow we only experience the superstitious dream of the new.
Learning programming felt exciting, but the more I know, the more I realize the underlying principles are older than computers (ordering, algebra, physics).
There was nothing really new in learning Java; Haskell predated it, yet I only got to know about it in 2004, and people in the '50s were already doing monoidal modeling of computing.
"Nothing really new" - or in other deliberately twisted around but I think still valid as a point words "it's always the same atoms only in a different configuration. This universe is so boring".
Your perception of "new" depends on the accuracy of your perception. With a very low-res eye, a lot of things look the same and boring; a higher-resolution perception sees them as very different and interesting.
What's true for eyes and visual perception is also true for brains: if differences in reality always map to the same neurological pathways and create the same wave patterns in the brain, that does not mean reality actually is the same. More likely, the viewer lacks the detailed perception and/or the detailed pathways for processing and reacting to it.
Too much abstraction can make things look boring and create a false impression of sameness. "History always repeats itself", only it never does, unless you filter out everything until what is left matches your assumptions. Which is actually also what the brain does: once you see a certain pattern, the brain steers you toward seeing it again. For example, there are pictures where you can see one of two different things, and once you see one pattern you have to make a conscious effort to unsee it and see the other.
So, a large part of it is that you see what you expect to see. If you have already determined things are the same, you will subconsciously filter out whatever does not fit your assumption.
In any case, I always suggest that programmers interested in something truly new go into biology and biochemistry. All the programming we do ends up on similar von Neumann architecture hardware, so it's not wrong that there isn't that much difference between programming languages, compared to quantum computing or "biological computing".
Trying to understand (never mind create your own) biological systems is truly something entirely different and should satisfy the bored aging CS/programmer type. Just to clarify: I'm not talking about brains (although that would apply too), but much lower-level biology.
I suggest edx.org as a starting point. There are some very good high-level but introductory, free classes on biochemistry, biology, genetics, statistics (for the life sciences, with appropriate examples), etc.
> Computers... I don't care about a lot of the things I used to care about. I can no longer extract pleasure from studying yet another random language. Now, if I want to be truly impressed, learning Kotlin (to name something; a fine language) doesn't cut it; I need quantum computing or dependent types or whatever.
I feel similarly. It's not actually a problem though, is it? I haven't looked into quantum computing, but dependent types are quite an exciting rabbit hole.
There is this computer science problem that I keep coming back to even though I have found like half a dozen working solutions. Most of the time I even reinvent the same solution with a twist, and yet I can't help but come back to it.
It sounds like the author is getting old. I'm not saying that to dismiss him, but when you grow older you have more responsibilities and you start to value your own time differently. All this computer hackery at home was fun back when I had lots of time on my hands. But nowadays I work with computers all day and don't want to bother with technical issues at home.
For a long time I didn't even have a real computer at home. I mostly watched YouTube on my Chromebook, checked my bank account, and wrote the occasional letter. That's about it.
Now I've gotten into casual gaming a bit and enjoy playing games on my Linux laptop. But I am using Ubuntu, not Arch or anything weird. I try to keep the tinkering to a minimum to keep it enjoyable.
What I find funny though, is all that time I spent hobbying around my OS back in the day has paid (and continues to pay) extreme dividends.
What I mean by that is that my machine is a basic Linux desktop with a tiling window manager. I haven't looked at the configuration in about 8 years, and even then it was probably 5 years earlier that I last gave it real attention. It survives OS upgrades and hardware replacements because my HOMEDIR just gets copied over to new machines, and I install whatever tools are missing as part of the setup process.
But now, my machine does not do things randomly, it does not have weirdly undefined behaviours, it does not change its UX, it does not prompt me unless necessary, and it does not do unexpected things on shutdown or boot, or kick me off to perform upgrades.
Everything my computer does, happens at my pace, on my schedule and it's extremely rare I have any noticeable bugs from those updates.
I'm not saying this to brag, but it's interesting how often people who take the "easy" path end up spending so much more time fighting or re-learning their computer than I do.
Same thing here. I have my own scripts to configure a new Arch install automatically. It's much quicker to install with just the few things I want than to install Ubuntu (for example), which bundles everything in, and more. It also does exactly what I script it to do, so my scripts install specific versions of software whose version really matters. No surprises.
My custom Emacs configuration build? Automated my own way. My AwesomeWM configuration? Automated my own way.
I'm so glad I took the time to learn to do stuff manually, so I could tell them specifically how I want them done automatically. And it should last me quite a while.
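The "automated my own way" setup described above can be as small as a symlink loop. A minimal Python sketch (the repo layout and `.bak` backup convention are made up for illustration): every file in a dotfiles repo gets linked into the home directory as `.name`, with anything already there moved aside:

```python
from pathlib import Path

def link_dotfiles(repo: Path, home: Path) -> list[str]:
    """Symlink each file in `repo` into `home` as a dotfile
    (vimrc -> .vimrc), backing up whatever is already there."""
    linked = []
    for src in sorted(repo.iterdir()):
        dest = home / f".{src.name}"
        if dest.exists() or dest.is_symlink():
            # Keep a backup instead of clobbering the existing file.
            dest.rename(dest.parent / (dest.name + ".bak"))
        dest.symlink_to(src)
        linked.append(dest.name)
    return linked
```

Run from a post-install script, this is most of what "my HOMEDIR just gets copied over" amounts to; the package half is usually just a pinned list fed to the distro's package manager.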
I think that's another part of it; the novelty's worn off. When smartphones became ubiquitous, they were exciting; people would tune in on e.g. Apple's keynotes to see what they came up with this time.
But then it became more of the same. The difference between phone generations became more about performance and camera technology than anything new. The iPhone hasn't done anything major since the iPhone 4 or 5, or whenever they did their big redesign effort (IMO anyway; not to dismiss their efforts, because in terms of power and software technology they've pushed the boundaries). Unfortunately it's mainly in areas I don't care much about, namely photography. Interesting to see how much the Instagram generation directed phone development focus, I guess.
The author mentions things like Netflix as well; it was written six years ago (ish), but it's only gotten worse. Netflix is full of what feels like artificial programs now, box ticking exercises; cartoon aimed at ages 3-6, check; segment aimed at PoC and/or women, check; David Attenborough nature documentary, check; political drama, check; desaturated Scandi crime show(s), check; video game tie-in, check; buddy cop film with high profile actors, check; the list goes on.
> I’m sorry, you may want to close your ears now, I want a distribution to be easy to install, so that I can just get on with my life, which is mostly kernel.
I've been in computing a long time. In the early days I made build systems to take apart and deliver DOS 5 and Windows 3.1 to student labs based on what they needed in that session. I ran Windows servers by the thousand. I administered thousands of users and thin client terminals. I ran SharePoint farms and used PowerShell scripts to build large farms on the fly. In my day job now I'm a cloud consultant; I script migrations of users and install telephony systems.
And nowadays I hate computers.
I can't bear to use one when I'm relaxing at the end of the day. Now I use an iPad Pro either to consume content or, creatively, to draw and make videos. Yes, it has limitations and restrictions, but I can't bear to use a desktop OS in any shape. It just grinds my gears and feels like work. Automate my house? Hell no. IoT fridge? Please. You have to know when to stop. Plus I've been burned by every OS and idiom in computing that you can think of, and I've had enough. Just let me browse the web (yes, I use ad blockers) and watch some vids. I've gone through everything and now only use 'computing' devices creatively, to improve my life.
What you really miss is your state of mind when you were young and carefree. Don't blame Netflix or Instagram or VCs. Your youth isn't coming back regardless of how computers may look today. You have aged 20 years. Your friends aren't having LAN parties they all have kids in middle school. Fiddling with computers isn't fun because you have taxes to file. You can't still have Red Bull and coffee-fueled all nighters hacking on that shiny new framework because your heart would literally give out. All the new kids entering the industry have their own culture and their own lingo and you hate it all.
Nothing is ever as it was in the "good old days". The best move is to realize this and adapt and make the best of your current life situation.
I'm a greybeard who wears socks-n-crocs, I have lan parties all the time in my house, and sometimes I nap in the afternoon so I can hack at odd hours. I think you should do what brings you joy. Maybe not all the time, because taxes and your health are important too, but at least some of the time.
> Nothing is ever as it was in the "good old days".
Some things aren't as good, but many things are, and at least some of those things are better.
Maybe not for you, though? Everyone gets old, so of course at some point things will, on the whole, look worse to you than they used to. But guess what: you're also going to die. If you think the whole point of your life is about you, all you'll ever see is decline. So I think the best move is to decide that the point of your life is about something else; that way you can see things getting better and become part of that.
Maybe you will discover life is about the community we're a part of and you need to get to a lan party: Some of those middle-schoolers might have fun playing games with their parents too.
Speak for yourself. We still have LAN parties. Plenty of times it's online and that's ok. But we regularly gather up our computers on wheels and drag them everywhere. Sixteen to twenty Xboxes is quite a sight.
If anything it's gotten larger. Plenty of us bring kids with their own computers.
We all have dedicated server hardware. Monthly get-togethers at least. Mostly Factorio and similar but also quake/doom and quite a few others. We can still link back home for files and add more server space. We've now got the skills and hardware to do things we previously couldn't.
Why should we only remember the good ol days when we can live them right now? As for the music, the best stuff has just been written. Why limit yourself, anyway? We have a catalog of way more than fifty years!
C'mon. It doesn't have to be so binary between getting old and things changing for the worse. They can both be partly true, and I think we could all agree that the fundamental character of working with computers has changed immeasurably during the last 20 years.
Computing becoming infrastructure does indeed change things. Before the web, your computer was just a side gig: a purely fun-driven, consequence-less thing. It was mostly static; you bought the machine, bought the software (or hacked the shareware :) and that was it. Nothing on your mind; you booted MS-DOS without ever thinking about your password. Now it's part of the socio-economic ocean: what you use changes due to market dynamics or policies, obligations, privacy, security. It's obviously a different world.
Reminds me of articles on early amateur radio. It was free-spirited, until society decided it wasn't, and now you need licences and it's a business-first toy.
It's not wrong, but I don't think it's the whole story - otherwise, we'd expect today's youth to be just as happy and excited about that stuff as we were. But they don't seem to be - gen z seems to feel more miserable than previous generations at the same point in life.
It mostly gets different (and largely, I'd argue, better). Yeah, you get older - which should come with experience, self-knowledge, personal growth...
For me - LAN parties are now dinner parties. Hacking all night on that new framework is pondering architectural decisions and seeing them pay off days or weeks later. Young 'uns with their new ideas means freshness, and teaching opportunities, and foisting the stuff that takes that kind of energy off on those who have it ;)
Different is really only bad when you're unable to adapt.
You are putting words into his mouth. He is not blaming Netflix and co for anything, he is just stating that he is personally not interested in doing anything online in his free time and that the reasons he got into FOSS are not valid anymore.
So sad that good memories can also hurt (true for me; I have lots of good memories). LAN parties were the best social movement. There was even a sort of federation, with cross-internet-cafe competitions, customer exchanges, home games, away games.
No, plenty of people still like computing when it's based on FLOSS and doing more interesting things than just logging on to some huge centralized service like IBM of old. Look at the excitement around new projects in Rust. Even Web3, for all of its silliness, builds on the same aspirations. Your comment is just thinly-veiled ageism, you are saying things about "getting old" that just don't apply.
Sure, but adaptation can be a number of things. There is no real reason you cannot ‘go back’ if you adapt that way; it is a choice either way. Of course things are not the same, but the feel can be. And your heart is definitely a lot stronger than Red Bull and coffee (unless you have obvious defects from birth or lifestyle).
This hits home. 20+ years of being a sysadmin, writing software, I've written and rewritten the same shit over and over. I'm disgusted by the world that was created around these network apps and I'm disgusted by my own minor role in helping the vultures build that world. It's nothing I wanted or imagined. Every time I think about it, I think of the Leonard Cohen lyrics:
Your servant here he has been told /
To say it clear, to say it bold /
it's over, it ain't goin any further.
And now the wheels of heaven stop /
you feel the devil's riding crop /
get ready for the future, it is murder.
I've been worried of feeling the same. Now, I try to do software dev at companies whose product isn't software. Currently, that is souvenir manufacturing automation. The job is producing the simplest software that speeds up the production process. No need for fancy modern UIs, no need for subscriptions, no need for handling large amounts of private user data... it's nice. Much more gratifying than working at <insert-generic-big-tech-company-here>.
I see a lot of disenfranchised, highly skilled software engineers in this thread who I think would be very welcome in the electronics industry if they spent a few weeks trying their hand at using those skills on embedded systems.
Although I work for a big tech company now (the fun kind, for me at least: making hardware platforms), in a previous life I made consumer products. Hair dryers, hoovers, ovens, lamps, etc. It's amazing how much more your average person's eyes light up when you say that you helped design their kettle, compared to the chip in their phone or the backend of their social media, and there really is a certain amount of joy in that.
I still like computing but not necessarily computers anymore, and I especially hate when every single thing that is not a computer becomes a computer.
The ratio of extracting utility and fun from computers vs. fighting against the system has plummeted. And doubly so in products where computers aren't the main thing but just a means to implement something else, which in the process effectively becomes computer software. Add to that the requirement to always connect online and you're double-fucked. I just... don't do that anymore. My hobbies are all non-electronic, and in my spare time I basically just use my laptop to watch YouTube and post on forums (while they still exist).
The general usage pattern of using information technology has gradually shifted from "how to get the computer to do stuff for me" to "how do I work around all this shit to do anything at all". This narrative applies both to user interfaces and browsing but also programming. It's all more work for nothing much.
I still enjoy algorithms and programming as long as I avoid interfacing with anything modern. I don't long for the old days, but I want back the sense of ownership and power that I used to have. Using computers today too often feels as if the computer were using you.
I can understand the author's reasoning. I guess one can get quite fatigued from constantly being connected! I don't do most of the things the author has listed too. I use computers for work and studies and watch stuff on youtube. I don't have any smart devices at home and I don't think I ever will. But I still like computers! :D
What I don't get is the constant need to share everything on the Internet. I've stopped using every social network. I'm on Reddit and on this platform just to know what's happening in the tech world. I don't even read much news these days.
For me it is just like when I was 10 I had a pocket knife.
As a 10 yo I did not have much real stuff to do with a pocket knife, but it was cool.
Well, as a 15yo I thought computers were cool and I was into computers, but I still didn't have real stuff to do with them. Just gaming, and configuring different Linux distros for fun.
Now I am an adult and I own knives, but I don't care much about them. If I need to chop a chicken or cut a piece of rope, I get the knife, but I am not collecting them or keeping them in a drawer in a "special place".
The same with computers, nowadays I use them as tools and they are not meant to be fun. I have my work to do and I am done. Well I still do quite some web browsing and media consumption unlike author, but it is not like I am going to build a PC from scratch and install some distro of the month on it just to see if I can make it work, because I have real stuff to do.
OK, sometimes I get the Raspberry Pi out of the drawer to see if I can install some new Raspbian or Raspberry Pi OS on it.
I definitely miss the nostalgic past. Not just programming when I was a kid, but when I became an adult as well. Not only was everything new, everything was very... visceral. You didn't have to learn/pay for/etc. 500 different services to get anything running. You downloaded programs that ran on your computer. There was no cloud to speak of. This all changed after Ruby on Rails took off, by my estimation. It made it so the "common man" could code, and with that Pandora's box open came the vast simplification of an otherwise artistic industry.
I don't really code at home anymore. At work I do the bare minimum to get by, as I always do. I hit my sprint goals to please the PHBs whose entire job it is to remind you of your failures (PMs). I no longer enjoy fixing bugs, writing new features, or anything else. I just do what I do because the pay is good enough. I'm truly only motivated to do anything, for any company, because otherwise I'd be fired. Thank you for destroying my bright eyed we-can-do-it attitude, SV.
I could've lived with the cloud and a thousand frameworks. But the politicization of our industry and the micro-management culture almost every SV company exudes is toxic to anyone who believes in merit and creativity. I've been unable to find a company to work for that won't take an expedient political stance on whatever the "important social issue" is. Entire code bases are upended by entirely arbitrary sets of rules set by people who aren't hackers at all. These people are simply social-science sit-ins who want to elbow in on a lucrative industry.
The final nail is the coding interviews. Having been in the industry for 15 years, most of that senior, I would expect to be treated with dignity in an interview. It seems like every corp wants to treat me like a junior until proven otherwise. Yet they require me to send a resume anyway. How exhausting.
The industry is exhausting to be in. It takes bright eyed talented and creative individuals and grinds them to dust. The only benefit is it's one of the only ways to broach the upper middle class boundary now. So I can't leave to do something else and maybe find my joy in coding again someday. It's just suffering, and crying over the memory of a simpler, less political, and less vulture-capital driven hobby.
To put it bluntly I'm tired. Yet I can't rest, it's my only way to afford to retire.
> The industry is exhausting to be in. It takes bright eyed talented and creative individuals and grinds them to dust.
Sadly, I think this is true of every industry - with a few rare exceptions, perhaps. Workers are burnt out everywhere, and companies treat them as grist for the mill. It's just how industrial society is organized, it seems.
Elsewhere in the thread, someone mentioned how the school system "completely removed that joy" (in this case of mathematics). That sounds like the same situation, it's not just tech workers, it's teachers and students too.
There definitely are some pockets of hope, niches where creative juices are flowing, where people are enjoying their work. It's so rare to see, but I believe that's actually how things are supposed to be - I mean, the ideal state of society.
The historical period we're in currently is very primitive in some aspects, brutal and ignorant, despite all the advanced science and technology. The hope I see is to nurture and cultivate our "bright eyed talented and creative individuals", so that some of them may survive the grinding gears of Mordor.
> You downloaded programs that ran on your computer. There was no cloud to speak of.
You can still do this, there's probably a FLOSS package for your favorite thing floating around. It might even be packaged by your distro and be just an 'install' command away. Not everything runs on the cloud.
> Entire code bases are upended by entirely arbitrary sets of rules set by people who arent hackers at all. These people are simply social-science sit ins who want to elbow in on a lucrative industry.
If you're curious about exploring computers, doing worthwhile things with them and expanding your skills, you're a hacker. There are no inherent skill requirements, the whole thing is based on personal attitudes and observed results. You don't have to be a "rockstar coder", though it doesn't hurt either.
I have found retro computing to be a good antidote to this. If you loved computers in the 80s, you’ll probably love coding for computers from the 80s with the benefit of source control and a real text editor for your assembly language programs. And emulators make it possible to experiment with computers you never could have afforded, like an Apple I or a Pixar Image Computer.
The great thing about IT and tech is that it's incredibly broad. If you don't like fiddling with computers the way you did in "the golden days", find a different way to fiddle with them. Try machine learning. Robotics. Virtual reality. Electronics. Create a game. Or don't, and just enjoy the non-techy things without regret.
There is a difference between having fun with computers and getting a paycheck for computers, and sadly the things you have to do to a computer to get the paycheck are oftentimes a sin against many of the high-minded ideas the 90s brought forth.
Let he who has not fudged a React component or lazily copied and pasted a function cast the first keyboard. As for me, I find that my free time is much more fulfilling when it doesn't involve a computer, and while that's sad, it should at least be understandable given the state of the industry and the nature of growing old.
I disconnect all networking on my work laptop and remove it from the dock, to make sure that using it again is something I actually need to do and that nothing pops up randomly.
I get up from my desk at work every day and disconnect, and the thought of going back to my computer that same day makes me feel ill.
I will play video games on my Switch, or I may use an Xbox controller with Steam Big Picture so it's like a console.
I don’t use most social media,
I don’t run any OS I can’t control,
I have a plex server but I made it as low maintenance as possible
No smart tech
I have a cell phone that dies occasionally because I use it so infrequently and forget to charge it.
I use an iPad for reading, creating digital art and watching videos, I keep all work related stuff far away from it.
Going for walks, creating art, watching movies, spending time with my family are all things I would rather do than sit at a computer.
Outside of some casual gaming, I really hate computers.
I also really hate that my job involves spending 8 hours interacting with things that don’t actually exist, and that there is always some new thing popping up to replace other things that work really well for a very minuscule return on investment.
I hate how the internet has evolved into a giant meme, torrent of ads, and being hounded to agree to cookie usage.
Working in IT for 17 years has just completely soured me on computers.
Yep, over 30 years in software and I really don't like computers or really anything with a screen. And I don't even use the popular stuff anymore. I have moved to BSD on my work and home machines, and have an Android phone that I really only use for voice calls and text messaging with immediate family and friends. I am not on any social media. I haven't owned a TV in 10 years.
Computers have enabled a lot of convenience and entertainment in our lives, but I'm old enough to clearly remember the days when nobody had a PC or internet at home, or a mobile phone. I think it was a better way to live, even though some things took more time and effort. The pace of things was just more human, and your day unfolded as you planned it most of the time.
I'm aware this could just be the common tendency of an older person to view his youth more favorably, but it is how I feel nevertheless.
I was just on vacation for a week and I did not get on the web or check email at all. It was nice. Time passed more slowly. I would feel that the day should be over and it was only noon. When I retire I plan to stay offline as much as possible.
Is it that we used to think computers would make things better, not in minor ways but in some sort of fundamental, paradigm-shifting, revolutionary way? And then you realise it's a bit like moving to a new town - you bring yourself with you. Humanity with computers is at least as messed up as humanity without computers.
A bit like how the invention of the vacuum cleaner didn't free people from housework. It just led to higher cleanliness standards, so that people have to do just as much housework as before.
I've become the same way. I can't really pinpoint when it stopped being fun, but I know I used to have a hell of a time with a couple of cheap netbooks and a really cheap android phone with an unlocked bootloader. I had a whole setup in my living room on milk crates.
Somewhere along the way I started writing code as a full time job. I enjoyed it. After that I still enjoyed messing around. But I agree with the article in the post, it just became a mess of overcomplicated abstraction, a chore, a hassle, and now I want my (2) computers to just work when I need them and beyond that leave it alone.
I still write code on the side as a hobby, but I don't particularly enjoy it, I just build things I'd like to see exist in the world from time to time.
This matches my own feelings strongly. I think a lot of it has to do with what some other commenters have mentioned, in that I no longer feel in control of the computers I interact with, not really anyway. Some of that is simply that I have a better understanding now of all the factors involved.
I started working with computers when I was very little (4yo), and they have really been part of my life and career ever since. Now I look around, and other than the computer given to me by my employer, my next-newest machine is the last gaming PC I built 11 years ago, which I don't even use anymore. Outside of work, I'd much rather spend my time in the garage working on cars, or spend time with friends or my wife doing something out of the house, than stay home on a computer. I haven't completely eschewed computers: I still keep my website reasonably up to date, I still occasionally play video games, and of course I read HN in my free time. But for the most part, once work is over, I don't even want to touch a computer. I feel like electronics have infected every part of modern life in a way that surveils and manipulates us as a society, and I feel like if I shared my thoughts about it publicly I'd sound like a Luddite. But I've learned most of what there is to know about computers in my life, and it didn't reaffirm the idealist views I had when I was younger; it shattered them.
Maybe I'm just getting old (I'm nearly 40), but I think we're starting to get to a point as a society where our computers no longer do what they're told, and the computers make decisions most people are forced to follow without any opportunity for their own judgement. It's frankly dystopian, and I desperately try to avoid it outside of work.
What I love about computing and programming is the ability to create something that really helps people, or that they find fun and enjoyable.
The problem is that over time, the opportunities to do this have become harder and harder to find. All the low hanging fruit has gone and more and more people are relying on social media instead of websites to communicate and run their businesses anyway.
Additionally, I'm just not interested in privacy violation, spyware, big data, ML etc.. so that frontier holds no appeal for me.
Ironically it seems like it was easier to innovate in the mobile space and find a market in the days of Palm PDAs than it is today.
I walked away from computers in 1996, not intending to go back. Child-raising and gardening/farming were going to be my future. I hated computers, I hated what computers had done to me, I hated what computers were doing to the world.
Then, somehow, a couple of years later I started writing FLOSS software for music (recording, editing, mixing, composition, processing). My interactions were with users for whom computers were a creative tool, not a business tool. When things worked, my software was helping people feel as if they were able to express themselves in ways they might not otherwise have been able to [0].
In 2022 I don't love computers, but I have a very different relationship to them than I did 2.5 decades ago. I still don't know if on balance they are really a good thing for the world, but for me personally, yeah, sure, I like them.
[0] one might argue that they should have been non-computer-based music making tools, and you could be right.
This is the curse of doing software for a living. When I worked as a dev, I also started to dislike computers. I would come home and not want to even touch my personal machine. Now I'm out of the software industry, and tinkering with my computers is one of my favorite things again. When I was working in software, getting a call from a family member for computer help was a nightmare. Now, I look forward to helping my parents set up a new machine or install some smart home device.
Had a similar experience. Switched to a different field just so I could program as a hobby. Turns out programming skills can be helpful in the most unexpected ways. I've had the opportunity to optimize business processes, for example.
These days, professionally, I am primarily a manager of people. This is the only thing keeping me in tech—my teams are composed of excellent humans and I get to make meaningful contributions to their personal and career development. I'm also able to influence the hiring process so that our incompetent HR team doesn't fuck up the hiring and onboarding experience beyond repair. And, the young people entering the workforce have no idea what to expect, how to act, how to behave, how to communicate like an adult, how to connect ideas—at the very least, I can help with this stuff and that makes me happy.
Outside of work, my computer usage is limited. I no longer maintain a personal blog or website, and I've given up on writing or online community projects simply because I can't stand having to constantly tinker with the tech: WordPress, Jekyll, maintaining domains, security, whatever. I don't have social media accounts. I don't read online or on a Kindle; I continue to buy and read physical books. No smart home, no smart car, no TV in the house, no apps on my phone other than weather and maps. No time spent watching obnoxious SeatGeek, Liberty Mutual, or Grammarly ads on YouTube.
I am a deeply technical person, my skillset makes me employable, technology and computers have made all sorts of things possible in my life, and for that I am thankful. But more and more, the good life is a life without the screen, without the connectivity, without the neurotic tinkering with tools and libraries, without having to spend full days figuring out if my data was exposed in a breach.
I sit on ass all day for work, I will not sit on ass all evening in pursuit of technical hobbies that no longer bring me joy. There are sunsets to see.
Small, indie internet hasn't gone away completely. There's still some discovering to be had. Not everything is a corporatized web service. One tool to help you browse the fun version of the internet in 2022 is Wiby [0]. It's been mentioned on HN many times, and I thought it was cool, but I never took the time to play around with it. Until a couple days ago.
Sometimes it gives you outdated results from the 90s, or the 2000s, but that's okay. It's fun to explore relics. Sometimes it gives you a site you expect to be from 2003, but it was actually just put up last year from another curious hacker. Those are fun to pop in on. That happened to me yesterday, and I just left him an email appreciating his website, and his various projects he posted on there, to hopefully spark a fun conversation.
The modern web is bland and boring. I'll agree to that. I don't like it much. But not all hope is lost. There are plenty of us who want the old internet back, and some are building services and networks to revive it.
Seems like almost all devs these days feel this way, and I don't quite get it.
As far as I'm concerned the ideal software ecosystem is one where every manmade object is connected, and the actual code is highly encapsulated where I never have to see any part of it except the part I'm changing.
My idea of an ideal software ecosystem isn't far from what we have today, minus the locked down proprietary protocols and cloud-only software, and with a few less containers.
Everything IoT, software done at a high level, encapsulated, reuse-heavy, with an emphasis on extreme safety, and a whole lot less low-level code where the compiler can't watch your back.
Things are pretty great now, we just need more performance-focused P2P tech and local APIs for our smart gadgets.
But it seems that other programmers really enjoyed a lot of the stuff that I'm happiest to see gone: the hacks, the low-level algorithm work, the clever algorithms.
I'm kind of worried for the future, since it seems the best and brightest are now bored, and we might not get much real innovation without at least a few of them.
Software dev is part function and part art. All of us live somewhere on the reductive 1D dichotomy "function" <-> "art". Some (not all) people more on the "art" side prefer an open canvas which involves having direct access to the computer. Some on the "function" side find that commoditized software is safe and repeatable software. These two perspectives differ but often because of interests. There's more than enough room out there for both types of devs.
It seems like the scene is kind of splitting though.
Commercial and commercial-inspired software is doing a great job of making totally standardized predictable platforms, and the DIY minded people pretty much only do things like Arch, unless they're getting paid a lot.
Doesn't seem like the dev community is really excited about anything safe and repeatable anymore.
I think there is something to be said (by someone willing to put more thought into it) regarding how much we as a society have churned away at computing and turned into something /refined/... Perhaps an analogy of what I'm thinking is people chewing coca leaves vs snorting cocaine? Maybe too extreme, or not extreme enough?
Maybe a slightly less extreme analogy would be highly processed food vs. traditional food... I feel like I'm just learning that white bread isn't really healthy and is nutritionally deficient after many years of vigorous consumption.
Yes, a good chunk do. (You haven't lived till you've heard a contractor wax for 30 minutes about which hammer he prefers for which task ;)
Of course, by far not all. But the thing that truly differentiates computing from most other jobs isn't that people are indifferent to their tools and want work to end when they're home. That's the common case.
What's different is that software engineering seems to be the only job where its practitioners actively resent the tools of their trade, because those tools inevitably bleed from their work life into their private life.
I mean… yeah? Some trade workers I know genuinely do like their job, but they are in the same place in this story that the author is. They like their job, and are even moderately interested in the details of their job, but they get home at the end of the day and don’t want to start laying tile or framing a wall.
Computers might be a tool for you but pure entertainment for others and that’s perfectly fine.
My father had the same framing hammer for about 30 years. He refused to give up that hammer. When he helped my brother and me redo the house we bought, he was prepared to hammer down the whole floor with just his hammer. We convinced him to use a nail gun instead, just to prevent another bout of carpal tunnel syndrome. Still, he was attached to that thing; out of all his tools, it is the one I remember precisely, particularly from when I had to help him growing up.
I haven't had the same computer for 30 years, but I am attached to what I have, as it continues to do exactly what I want in terms of its ergonomics. :)
Actually yes, in terms of tools there are always those who care about quality and convenience, and those who don't. There exist premium framing hammers. The margin of difference is low. But if it's the difference between an injury and not, the price is irrelevant in most cases.
I might think that of a builder if he said he used to play with hammers as a kid and well into his adult life for fun and because he believed in hammers.
"The company's hosting a nominally-optional-but-not-really hammer-thon this weekend! Aren't you just so excited! You'll get to swing those hammers any way you like, not just the exact way your manager tells you to! Fun!"
Today they are tools, but that is not what they used to be. From the late 80s and for about a decade after, they brought excitement and fun and surprise, with a bit of frustration. Today they are just boring tools.
Why would anyone like or dislike tools? I don't like or dislike my toilet plunger. I would prefer it to be designed to get the job done quickly rather than wow me with the experience of using it.
Someone can certainly enjoy cooking in a fancy kitchen or making a table in the garage. That's an equivalent of programming and provides mental and aesthetic challenge. Or one can have a hobby of collecting and appreciating a particular kind of tools like lighters or retro computers.
But the mark of a successful computer today is to fade into the background and be unnoticed and unappreciated, save for provoking annoyance in the rare cases when something breaks, as all tools do sometimes.
Repl.it and children really helped me focus on the bits of computing that I loved, namely theoretical Computer Science and learning/teaching it through functional programming in small steps, in Python.
Then I ended up taking a job as a high school CS teacher. Being around the kids was a huge boost. It really helped me zone in on the joyful parts of discrete mathematical computation.
Ironically, over time being around the school itself (hugely conservative for many reasons, especially its IT) almost completely removed all that joy. Repl.it was such a positive experience that it's no overstatement to say it came to the rescue. It really did come along at the most perfect time for me, back in 2020.
Again, not everything was great. Seeing the effects of TikTok on children was pretty depressing. So too was seeing how disrespectful parents are towards teachers, compared to a generation or two ago. Teachers are servants of the parents’ staunch individualist whims, whereas my experience in my own and my peers’ upbringing was that they were deferred to as figures of authority.
Build yourself a ray tracer that renders ASCII art and you can drown out the troubles of the world. Similarly if you get a bunch of 14 year olds to do image edge detection in one page of code. Grade A escapism, all round.
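For anyone curious about that one-page edge-detection exercise, here's a rough sketch of what it can look like; this is my own toy version (the `edge_detect` function and sample image are made up for illustration), using simple central differences instead of full Sobel kernels:

```python
# Toy edge detector: mark pixels where the brightness gradient is large.
# Works on a tiny grayscale "image" stored as a list of lists of ints.

def edge_detect(img, threshold=2):
    h, w = len(img), len(img[0])
    edges = [[" "] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Central differences approximate the gradient in x and y.
            gx = img[y][x + 1] - img[y][x - 1]
            gy = img[y + 1][x] - img[y - 1][x]
            if abs(gx) + abs(gy) >= threshold:
                edges[y][x] = "#"
    return ["".join(row) for row in edges]

# A bright square on a dark background.
image = [
    [0, 0, 0, 0, 0, 0],
    [0, 9, 9, 9, 9, 0],
    [0, 9, 9, 9, 9, 0],
    [0, 9, 9, 9, 9, 0],
    [0, 9, 9, 9, 9, 0],
    [0, 0, 0, 0, 0, 0],
]

for line in edge_detect(image):
    print(line)  # prints the hollow outline of the square in '#' characters
```

Swapping the differences for proper Sobel kernels, or loading a real photo, are natural follow-ups once the idea clicks; the whole thing still fits on a page.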
Elm is (maybe) a good choice if you want to focus on the functional aspect. Python is arguably a better choice for high school students as it gives them a large amount of utility. You can use python to do useful things quickly - elm, not so much. Not saying I don’t like elm, but just doesn’t make sense to me for this case.
I don’t like computers, I feel like we’ve lost ownership of our own software. Linux definitely helps with this and I’ve since transitioned.
It’s mobile devices that really bother me. I’ve since got my screen time down to 39m daily, and I might just get rid of it and only use a PC for computer activities.
Small tangent: I’m a massive fan of e-ink. I’m waiting for the day when I can get a proper monitor at a reasonable price and use natural light or backlighting when programming.
> Small tangent: I’m a massive fan of e-ink. I’m waiting for the day when I can get a proper monitor at a reasonable price and use natural light or backlighting when programming.
Me too. Kobo devices run Linux and are easy to hack. Perhaps they can spark some joy or provide you with a cheap prototype of what you want.
One thing that I guess is responsible for some of the excitement around computers is that it's a thing to excel in, a potential career. At least subconsciously we knew we should find something to be good at, and you couldn't go wrong with the arcane knowledge of these new machines. And the deal was implicitly understood: Instead of completing a rigid course for every subskill, a freeform exploration was allowed to acquire the skills, a series of personal projects provided the intangibles, hardened us to face weird errors, to read the logs, to know the commandline, taught us to quickly read up on and understand how everything fits together after we've taken it apart. So let's come up with some love for that Raspberry Pi, shall we?
Now, as the career matures, more Linux won't help us any more, so we focus on team dynamics, social skills, business logic, and, if you're entirely over it, eventually leadership.
But, if the practical quantum computer came along tomorrow, opening up another tier of income and glory if we'd spend our weekends tinkering with the prototypes, we'd sure love it all again!
Computing and what is enabled by computing are different things. I like the computing aspect and continue to do so in my older age even if the interests change, but I also like a lot of the stuff enabled by computing. However, the burden remains on me and everyone to take things for what they are. Computing and networking enabled broader access to literature and information, so it is fair to draw this analogy: if there are a few bad books, particularly ones that create undesirable impulses, I shouldn't hate all books or the printing press that made them. I am not going to associate computing or computers with bad actors, or accept that "computing" for the 2020s means consuming endless timeline feeds. I have accounts on those platforms, but staring at those things too long feels like using a device for the sake of using a device, instead of computing for the sake of computing (writing a program to do a novel thing, or finding ways to make old hardware handle 'modern' workloads, are the things I think of as "computing").
Yes operating systems are crufty wastelands. Yes the modern internet is a bordello of privacy snooping con men stealing our minds with dopamine engineering.
But most of all, when many of us (not all!) age our passions shift from abstractions to the real. I still think lots of computer ideas are quite neat but it doesn’t fill me with passion like it used to. But that’s the same way for many more abstract things that used to keep my brain and heart churning.
Now I appreciate things like a perfect summer sunset evening more. It’s a goofy evening playing board games and laughing with my family. Maybe it’s a growing sense that these moments aren’t infinite in quantity. Who knows. But it is a natural progression for many of us.
A shout out to Adam Williamson, the author of this post. He's one of the most excellent open source people I've ever worked with. He tirelessly helps run and lead Fedora Linux's QA. It's always fun troubleshooting a gnarly bug with Adam.
I'm in my mid-fifties, and I am still working on programming projects, finding new games to play, learning new things and just screwing around with tech as much as I used to.
Things have definitely changed, I'm no longer staying up all night trying to beat a particular game or doing absolutely nothing for an entire weekend but adding a new feature to a hobby program. Now I just spend most of a weekend doing that.
I think one thing about people who were into tech before the internet that makes us somewhat different is that computers didn't start out as a device for consuming things for us; it was a device to learn and to see what we could make it do. Even games, which were the most consumption-oriented things I used them for, were interactive.
Of course I do consume some content. I'll fall down a youtube rabbit hole on something trivial and I read a lot of tech news on sites where you can discuss things with others if I'm feeling so inclined. I have no interest in reading news on webpages that have no discussion mechanic, though. I think I fall down youtube rabbit holes because I still have that sense of wonder that you can now learn all about anything you really want to as deeply as you want to that the pre-internet me would have killed for.
I abhor most social media mainly because every thing I looked at early on felt toxic to me, so I never got involved enough to get addicted to it. I ditched cable TV a decade or two ago after I spent one weekend mindlessly flipping through channels feeling like a vampire had just taken a huge bite from me. That was a weekend I could have spent sniping people on Quake servers or trying out a new idea for compressing images I had the other day or whatever.
I totally agree about the lack of control. Starting out programming on a Tandy Color Computer where you could change the screen resolution by filling a few memory addresses with data and the OS was a BASIC prompt leaves you feeling a bit helpless in today's computing world. I mitigate that by using linux at home, like others here have mentioned.
I wouldn't want to go back to the old days, though. Pre-internet would kill me knowing what I have access to now.
> I dunno where I'm going with this. I don't have any big thesis. I just wanted to write it down.
the invectives of “what did you expect?” seem at best tangential to the author’s intent here. sounds like they’re just trying to express perspective, not mount an argument with a specific aim.
to the author’s point, wasn’t it great when the internet had convinced itself that “zero sum”/scarcity were an unnecessary constraint of a physically constrained world? that ideal may not have shaken out, but there’s no reason to disparage someone for lamenting it (especially when they’ve actually put sweat into that vision).
not everything is a pitch. speech isn’t always an advertisement or a polemic.
That's not at all the vibe I got from reading this. Instead, the author pretty specifically lists the things they instead choose to spend their time on and why it’s not computers _anymore_.
I agree -- it's pretty clear to me that the author has found a difference between their work, which they still find enjoyment from (perhaps because they are good at it?), and their non-work life.
I can relate somewhat -- I love tech -- robotics, in particular -- but have lately become a bit disappointed with its current state in the product world. Thankfully I'm also in a state where I can reflect upon why I feel that way, and then muster the energy to do something about it.
Perhaps this is a reflection of the age of the author too. I suspect as I get older, I'm going to gravitate towards more "wholesome" things -- as I define that word.
I find working as a software engineer very boring. But my interest in computers is still big. Personally, I like to ignore all the "YOU MUST LEARN <trendy language/framework etc>" and instead learn the older but very powerful tools and languages. I almost have nostalgia for them even though I was too young when they were first being created - the Unix tools like sed, awk, etc., Perl, and of course Linux itself. I love learning about and using these for small fun tasks. Basically, Linux is fun.
All these responses really resonate with me. I'm not burned out on computers, but too often I find myself just doomscrolling and looking for something else to do. So I decided to take up virtual retrocomputing, learning about the Apple II and CP/M. It's not at all useful, but to me it's all new (I come from a Commodore background). It's challenging and keeping me fresh.
This feeling comes to me in waves and it’s been like that since I was still an awkward teen in the late 90s.
I wonder if it’s just because I tend to expend so much energy on a very interesting (to me, at least) computer-related problem that I need to step back afterwards for X number of days/weeks/months before I’m ready for the next one.
I spent most of my spare time either reading books or messing around on the internet - and for all you kids out there, this was the 1990s internet, when 'the internet' was mostly email, Usenet and FTP, and you accessed it over a dial-up modem.
One of my major criteria for buying things is whether they can do their job without a network connection. I use computers for... well, I use them for reading stuff.
Somewhere along the way, in the last OH GOD TWENTY YEARS, we - along with a bunch of vulture capitalists and wacky Valley libertarians and government spooks and whoever else - built this whole big crazy thing out of that 1990s Internet and... I don't like it any more.
I actually don't use the internet the same way I used to. Now, I typically interact with it through a text-based version of the content collected through a commandline utility integrating Lynx.
tbh, sounds like the user hates the cloud... and rightly so, it's a mess and strips users of more freedoms than just using proprietary tools which stop working one day.
the laundry list of things he doesn't like to use computers for were all things that all the forefathers of computing also didn't use computers for
like me I don't think this guy really dislikes computers, but modern information systems and their role in society
which is fine, because it is getting cumbersome now, especially with how smartphones have become so ubiquitous that they're practically a required tool to live a normal human life
I don't like computers because the halting problem has no solution. This is why every software-based product will always have annoying, varying latencies. My automatic watch, my piano, and my light switches will always respond immediately because they are not Turing-complete.
A smartwatch, a virtual synthesizer, and IoT switches will invariably fail to respond immediately one day. And because of that I will always activate them with anxiety.
The author has the luxury of not needing to do constant side projects and professional development. I envy his professional achievement/job security/indifference.
To me, computers used to be fun when you commanded them what to do, they did it, then they prompted you for another command.
Now, more and more, computers are trying to tell us what to do. Notifications, unwanted ads, spam, recommendations, pop ups, accept this, subscribe to that, dark patterns trying to get me to do something… I never commanded my computer to do these things. Some product manager at some company 1000 miles away simply decided my computer should do these things, without even my input. Even my operating system! After booting up, it’s running hundreds of programs simultaneously. I did not tell it to run these things! It’s doing it all by itself out of the box. I feel less and less in control of my computer and more and more a bystander.
We (the software industry) have royally screwed up computers. Users used to be in control and now they are the ones being controlled or at least “influenced.”
Here's one: someone wrote a battery status "middleware" which reports battery status on DBus. Fine.
Then they coupled in "shutdown on low battery" and refuse to allow it to be disabled. https://gitlab.freedesktop.org/upower/upower/-/issues/64
So since the Xfce and GNOME applets use this library, I either have no battery status applet, or intermittently my computer shuts down after resume because the battery falsely appears dead for a few seconds.
(The kicker: it doesn't log why it decided to initiate the shutdown. It took years to find the bloody cause.)
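The transient-zero failure mode described here is essentially a debouncing problem. A minimal sketch of what a defensive consumer of battery readings could do (hypothetical helper of my own, not upower's actual logic) before trusting a "battery is dead" value:

```python
def stable_capacity(read_capacity, samples=3, tolerance=2):
    """Sample the battery level several times; only trust a stable reading.

    A transient 0% right after resume disagrees with the other samples,
    so it gets rejected instead of triggering an emergency shutdown.
    Returns the last reading if all samples agree within `tolerance`
    percentage points, otherwise None (caller keeps the previous
    known-good value).
    """
    readings = [read_capacity() for _ in range(samples)]
    if max(readings) - min(readings) <= tolerance:
        return readings[-1]
    return None


# Simulated post-resume glitch: firmware briefly reports 0%, then recovers.
glitchy = iter([0, 97, 98])
print(stable_capacity(lambda: next(glitchy)))  # → None (glitch rejected)
```

Real battery daemons would also need to space the samples out in time and handle genuinely-dead batteries, but even a crude filter like this would avoid shutting down on a single bogus sample.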
> and refuse to allow it to be disabled
Why don't people fork these terribly managed projects? Even just working on the feature and then submitting the .patch file to be merged in by downstream distros would be a very meaningful signal.
> because the battery falsely appears dead for a few seconds.
Uuughh, I have this with Windows on my Dell XPS as well. Basically every time it comes out of sleep/hibernate, it will briefly think the battery is at 0% and try to shut itself down, and if you boot it up again without it being plugged in, it won't start up at all.
But when plugged in (either coming out of sleep or for the follow-on boot), after a few seconds it'll go "lol yeah no you are actually at 100%, no further charging is required, hooray!"
Would love to know how to disable the critical battery shutdown altogether in order to get around this. It's a bizarre and terrible bug to have in what is supposed to be a flagship developer machine.
> Then they coupled in "shutdown on low battery" and refuse to allow it to be disabled.
Isn't that to protect the user's data? There have been numerous reports that modern-day high-performance SSDs don't actually neatly write data to physical storage after a flush command; I wouldn't want to lose data if my system unexpectedly shuts down due to power/voltage issues.
Or are there additional low-power protections at a hardware level?
> Then they coupled in "shutdown on low battery" and refuse to allow it to be disabled
Would you rather have your computer crash when it runs out of battery, or shut down gracefully at 2 percent battery?
I think it's important to always remember it's not the computers that are trying to tell us what to do, it's the people who build those computers and the software running on them that are trying to tell us what to do.
Computers, in their fundamental nature, are exactly as you describe them. Such devices will always be available (if nothing else, in the form of electronic components). We just have to refuse to use the machines that want to control us.
Also relevant: https://www.gnu.org/philosophy/free-sw.html
> https://www.gnu.org/philosophy/free-sw.html
Little side rant here: I think one area where Free Software has failed so far is build systems. You get the source alright, and the GPL even requires you to include build instructions ("all the source code needed to generate, install, ..."). But in practical terms, the amount of effort it takes to actually build software yourself is often insane, far from automatic and often requiring a lot of manual work and workarounds (especially once you leave plain Linux and start cross-compiling, etc.). Now with GitHub we even have a lot of the build infrastructure be proprietary, and while it runs automatically on GitHub CI, there is no way to run GitHub CI locally.
There is effort being put towards reproducible builds now, and some distros like NixOS seem on the right path. But I think we lost a lot of ground here by having the build process be filled with so much patchwork and manual human intervention, instead of being 100% automated right from the start. We really need a "freedom -1" that requires software to be buildable fully automatically.
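A toy illustration of why reproducibility matters for this: a fully-automated build is only trustworthy if running it twice from the same inputs yields byte-identical artifacts. The helper names below are hypothetical, not part of any existing tool:

```python
import hashlib


def artifact_digest(data: bytes) -> str:
    """SHA-256 of a build artifact's bytes."""
    return hashlib.sha256(data).hexdigest()


def build_is_reproducible(build) -> bool:
    """Run `build` twice and compare output digests.

    `build` is any zero-argument callable returning the artifact as bytes.
    Timestamps, hostnames, or random seeds baked into the output will make
    the two digests differ, flagging the build as non-reproducible.
    """
    return artifact_digest(build()) == artifact_digest(build())
```

This is roughly the check that reproducible-builds projects automate at distro scale: rebuild independently, compare hashes, and treat any mismatch as a bug in the build process.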
> computers used to be fun when you commanded them what to do, they did it

> Users used to be in control and now they are the ones being controlled
2013 was the watershed for me. You can read about why here [1]
There's a world of difference between using a tool and being a tool.
That transformation from "It's more fun to compute" to "If you've nothing to fear you've nothing to hide" took place almost silently in the first 20 years of this century.
The problem is that as "hackers" we don't understand computers. Retaking tech, by fully understanding and helping to culturally redefine computing is both the duty and prerogative of any real hackers left out there.
As for the fun. It never went away for me. I am more passionate about technology, coding, networks and electronics than at any time in my life - precisely because the stakes are now so high.
[1] https://digitalvegan.net
> Retaking tech, by fully understanding and helping to culturally redefine computing is both the duty and prerogative of any real hackers left out there.
A powerful statement right there. As someone who grew up with computers from before the web, I feel that "cyberspace" has been colonized by business and political interests. It was supposed to be "our" space, I mean, by the people and for the people. Right now it's more useful as a tool for the dark empire.
A Declaration of the Independence of Cyberspace - https://www.eff.org/cyberspace-independence
I agree that the problem and the solution is cultural. It's about having fun, being weird and creative with how we use technology, to reclaim the magic and make it ours. Things like Tor, uBlock Origin, and dare I say some of the cryptocurrency and blockchain stuff, they feel like part of a larger decentralized underground-ish movement that has no name (and probably should remain so).
I just bought a copy and look forward to reading it. (Unfortunately) I relate to what is described. I get this sense that the infinite possibilities computers gave us have shifted to a sole focus on consumption.
I know this isn't the case for everyone or everything--everything that makes computers special is still out there in one form or another, and arguably tools like YouTube have made creating and sharing new things possible. It still seems you have to stray from the path you're guided down in order to find them (and know they exist!). I'm thinking of things like microcontrollers, electronics, programming in general.
Hey man, haven't read that book, but I love your other one. Thanks for writing it.
I would pay twice the going rate for a Macbook that ran a version of OSX that always immediately responded to my commands. When I hit Cmd-Q I want the app to close. Not when it's ready. Not after showing a dialog. Not after asking me if I'm sure. Not after cleaning up. Not after preparing to close. Not after doing some background processing.
Just close. I want to issue a command and I want it carried out immediately.
If the software can't do that gracefully, then it's bad software.
Same problem in Windows.
Also, there is no reason for the operating system UI to ever be stuck. If a program fails to redraw, show it blank, but don't prevent me from moving/resizing it. Even if the CPU is at 100%, I want my commands to get first priority. Too many times I've had to spam Ctrl+Alt+Del just waiting for something to respond.
Software capable of doing some of the tasks people want done is sufficiently complex that it allows you to issue multiple commands for which immediate action would be contradictory. Telling the application to save state but then also respond to a quit command "immediately" would be one trivial example. Telling an application to quit right after you launched any operation that modifies on-disk state would be the more general case. The software is not bad; it just has enough power to let you make your intent ambiguous.
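One common compromise between "quit now" and "finish saving first" is a bounded cleanup window: cleanup gets a deadline, after which the process exits regardless. A rough sketch (the names are mine, not any actual OS API):

```python
import threading


def quit_with_deadline(cleanup, deadline=2.0):
    """Run `cleanup` on a background thread, but never let it delay
    quitting past `deadline` seconds.

    Returns True if cleanup finished in time, False if the process
    would exit with cleanup abandoned (the daemon thread dies with it).
    """
    t = threading.Thread(target=cleanup, daemon=True)
    t.start()
    t.join(timeout=deadline)
    return not t.is_alive()
```

The design trade-off is explicit: the user's quit command is honored within a fixed bound, and any save that cannot complete in that window needs a crash-safe format (journaling, autosave) rather than the right to block the quit indefinitely.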
A "you have spent 3 hours working on this document without saving, are you sure you want to close?" window used to be a good thing.
Nowadays the software should just auto-save it for you. But not all software is that well written, and the OS can't tell it apart.
I think there are legitimate use cases for delayed close. Especially in gaming. Some games play in ironman mode where closing the game could provide a cheat. To avoid this they need to be able to save on exit. Likewise, if the game is in the process of saving an override close could corrupt a save file which is unlikely to be a wanted behaviour for the user. Unfortunately, many companies abuse this feature and give pop-up dialogs akin to "Are you sure you want to close us?". I think the OS is in a difficult place with finding a balance as the feature itself is necessary for some developers.
In another HN thread just the other day (paraphrasing, but the tone is accurate): "macOS is shit because cmd+Q kills my programs and it's too easy to hit accidentally while trying to strike cmd+A"
> If the software can't do that gracefully, then it's bad software.
What I want is that the software itself doesn't even get a chance to interfere with demands like ⌘Q. There's no reason Chrome should get to decide that it doesn't close until I hold ⌘Q; that way lies all sorts of dark patterns. (Adobe's attempt to seize control of basic OS functions is my bugbear here.)
I don't mind the OS having a system-wide setting whereby I can decide how much I want it to protect me from the consequences of my actions, but that should be a decision between me and the OS, enforced by the OS, not something I have to negotiate individually with every app.
(Same with the menu bar. I choose what goes in the menu bar, and macOS should enforce that choice, not tell me that the software has decided what goes there and I have to lump it.)
That’s a very hard constraint for a non real-time system to guarantee. It’s not necessarily a bad thing that your OS has a flexible scheduler. I think you’d find running a real-time OS as your daily development machine would have some of its own quirks you would find distasteful.
This is why I love Debian linux. It’s not nearly as polished as MacOS but dammit it’s quick and does what I tell it to do.
i think the reason a lot of software works that way is that most software does not carry out tasks asynchronously, and even does blocking or time-consuming stuff on the foreground or GUI thread instead of handing it off to a background thread. the reason for that is that doing anything async makes software exponentially more complex to engineer… hence we have lots of apps that cannot just be stopped easily.
kill -9
Ads in Windows. Just...the insane amounts of ads.
There's probably a Linux distro that should be made whose promise is just "I'm not contacting the internet unless you ask me to".
> "I'm not contacting the internet unless you ask me to"
I think there kind of is? Like, most Linux distros actually?
Yes, Ubuntu has snaps which try to talk to the internet and autoupdate, and yes this is absolutely terrible. Yes, sometimes a distro might notify you there are updates available. Yes, sometimes a distro talks to an NTP server to sync the time. But, generally, I don't feel internet usage is inflicted on me; I inflict it on myself.
In what ways which bother you does your Linux distro contact the internet without asking?
I miss when I could put Wireshark on an interface and nothing would show up until I pressed a button somewhere.
This.
How is it that I paid for Windows (begrudgingly) and yet it's constantly undermining my efforts to control it? Microsoft has at least a dozen shitty apps that no matter how many times you disable or uninstall them, always seem to come slinking back into RAM. What the actual fuck?!
Every distro has privacy issues, here are just the ones that we know about in Debian:
https://wiki.debian.org/PrivacyIssues
Speaking of computers commanding us ... a few years ago, I was visiting a friend's parents (practicing dentists). They told me in horror how Microsoft forcefully upgraded their work machines to Windows 10 overnight. And one of their applications stopped working due to the upgrade.
That's the first time I learnt that Windows was using that approach (I'm blissfully away from Windows; I dwell in the Linux world.) It was painful to watch them ask "how come this machine does such a big task [upgrade] forcefully?"
> To me, computers used to be fun when you commanded them what to do, they did it, then they prompted you for another command.
This is a good portion of why I switched to Linux full time over 10 years ago.
My computer does what I tell it to do, when I tell it to do it, as I tell it to do it. I fuck up? It's my fault and I know it and I have to fix it myself.
No more "What the fuck just happened?" bullshit.
Have you tried using a Linux desktop? I personally find that it doesn't have all the negative things you mention.
It has fewer, but unfortunately the mentality of developers believing they should be able to tell the user what to do is so pervasive that even FOSS developers do it. From what I can tell, GNOME is the most prevalent Linux Desktop environment and it is notorious for this. Ubuntu is, if not the most common distro, the most commonly recommended one and Canonical is also notorious for forcing things on the user they don't want like Snaps and auto updating.
It needs to be a library and it's turned into a framework.
Yes, and now everything is on the cloud so you don't even own your files anymore.
We (the software industry) have royally screwed up computers. Users used to be in control and now they are the ones being controlled or at least “influenced.”
I think we naively believed that an increase of human technical capability would lead to such abundance that we'd achieve post-scarcity (at which point, whether we call it socialism or pretend it is liberalized capitalism, it doesn't matter) and permanent democracy... not the corporatized nightmare we actually got. The capitalists are right about very little, but they got this right: human nature can be shit. Most people are decent or want to be, but the ones who gain power in human organizations, especially organizations without purpose such as private corporations, are cancerous. We thought that problem would magically solve itself if we just made the world (in aggregate terms) richer, and we were wrong.
It's like a martial arts instructor who earnestly but unaccountably believes he's teaching good kids how to fight back against bullies. He may be. Or, he may be teaching the bullies. In our case, though, we weren't training... we were arming... and we didn't always know we were building weapons, but that's absolutely what we were doing... all of our "data science" got turned into decisions that hurt workers and enriched executives.
I'm speaking in past tense because we, as technologists, are no longer relevant. We've sold our souls. Capitalist hogs and their managerial thugs have won. Our moral credibility is deep in the negative territory. Power will either stay with those who currently have it, who have evil intentions, or move toward the set of people who work up the courage to overthrow the current system, who may or may not--it's impossible to know, as it hasn't happened yet--have ugly intentions.
"We" (meaning technologist culture) have never worked to promote centralized, monopoly services. The users did it to themselves, largely in pursuit of short-term convenience. "We" are now building the federated, interoperable platforms that users will hopefully come around to when the obvious problems of centralization (including widespread censorship, non-existent customer service and non-transparent AI's/bots run amok) start seriously biting them in the ass. In many ways, it's already happening.
There is also the situation that is not created out of malice or greed, but out of lack of restraint and divergence of priorities between the authors and the users: compilation speed. When a piece of software makes me wait for it, I feel subservient to it. It feels like sitting in a state office, waiting until a civil servant decides to grant you an audience. Except there is no civil servant, there is only the computer. How much do I have to wait? It depends on the alignment of the stars, air humidity and will of the gods.
When I used to work in the office, there was this one time when I had an urgent ticket from a customer to resolve, so I made the necessary fix and started building it. Normally builds would take something like 15 minutes, provided they were incremental and this wasn't the first (clean) one. But they could also take hours. It was at the end of the working day, I was working hard on this fix, didn't even take a lunch break. I started building it and waiting, because I wanted to share the fix with the customer as soon as possible. 15 minutes passed... 30 minutes passed... 1 hour passed... 90 minutes passed... 2 hours passed... And it was still compiling. At that point I gave up, went out and started walking home. But due to not eating the whole day and then staying late waiting for the build to finish, my hands started trembling from hunger on my way home and I felt generally weak, on the edge of being able to reach home. The next day I found out this build took ~4h 30min.
The reason for this state of affairs: of course I shouldn't care that much about my work and just make the customer wait instead. Put my health and time above that. But another important reason is that the build times were so unpredictable: when I hit "compile", it could take anywhere from 3 minutes to 4.5 hours. There is no planning you can do around that. If it was just a fixed 3 hours, it would be even better, because then I could plan my day around it. But it being so unstable destroys everything. Of course, if every build took 3 hours, people making decisions would wake up and see that we've got a pathological situation and there is something seriously wrong with the project. But when you often hit 15 minutes, it's going to be brushed off. And for the C++ committee even though compilation speed may be an issue, it is never a priority. There are always going to be other issues which eclipse it.
Personally, I think a build for even an OS-size project like that shouldn't take more than at most 1 minute. Even the incremental times in this one are a travesty.
What I like about what was planned for Jai (still not released) is that it's designed to always make a clean build. And the clean builds need to be fast. There should be no reason to make incremental builds. They are hacks that make the situation worse, but make it look better. Suddenly you're dealing with weird bugs, because the build system did not detect a necessary recompile and used stale cache entries (happened multiple times to me). The compile times are unpredictable (see the above story of why that matters).
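The stale-cache bugs mentioned above typically come from timestamp-based dependency checks. Keying the cache on the actual content of the inputs (the approach tools like ccache take) sidesteps them. A simplified, hypothetical sketch:

```python
import hashlib


def cache_key(source: str, flags: str, dep_contents: list) -> str:
    """Derive an incremental-build cache key from input *content*, not mtimes.

    If any input byte changes, the key changes, so a cache hit can never
    be stale; if nothing changed, the key matches even after a fresh
    checkout resets every timestamp.
    """
    h = hashlib.sha256()
    for part in [source, flags, *dep_contents]:
        h.update(part.encode())
        h.update(b"\x00")  # separator so concatenation is unambiguous
    return h.hexdigest()
```

With content-addressed keys, an "incremental" build is just a clean build that skips work whose inputs hash to something already in the cache, which is essentially the property the comment above wants from clean-build-only designs.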
I've seen some Delphi jobs in my country a year ago I think. Maybe I should switch there...
That's a great explanation and one I'm going to steal.
As you age, it permeates every interest you had, because you have already picked the low-hanging fruit. Not only computers, but also books, movies, food, and I bet even sex. Everything reminds you of something you have already lived. Every novelty is harder to find, and of course there are novelties, but they require an ever increasing amount of effort and time to find, and you have less and less free time. When you are a kid you are in a constant state of awe.
Computers... I don't care about a lot of the things I cared about. I can no longer extract pleasure from studying yet another random language. Now, if I want to be truly impressed, learning Kotlin (to name something; a fine language) doesn't cut it. I need quantum computing or dependent types or whatever. But it's much harder.
It would perhaps be wise to try to practice deriving happiness from something other than the seeking out of novelty.
Like picking up an instrument and start practicing more advanced music than the pop song of the day. That's really a long-term game.
Programming can be another such activity. I mostly stopped reading programming books some years ago, but I find lasting reward in deliberating over the meta-game of programming: not how do I specifically solve this problem, but how do I structure my solution, how do I simplify it, how do I reduce the problem itself to its core, how do I write the actual text, variable names, etc., in such a manner that it is self-evident what is happening. That is truly a long game too. There's even a skill to the deliberation itself - too much deliberation is counterproductive.
I read somewhere that philosophy in antiquity defined being good not as based on absolute moral values of, say, unselfishness, like giving food to starving children, but on simply being really good at what you do. Perhaps that's from a realization of what actually makes people happy? Like the old carpenter expertly fixing a troublesome door while softly whistling to himself, a human being in inner peace.
This is a good point, and, interestingly, I think the author of the original post would agree. He clearly (in 2016) still enjoys his work, and seeks to be excellent at it. He's just come to the conclusion that he can both do that, and not have computers and operating systems and programming be the sole focus of his life.
> Perhaps that's from a realization of what actually makes people happy?
I just want to comment on the irony of praising a philosophy that focuses more on abstract virtue rather than an absolute metric of goodness, by noting that this increases happiness.
Do you have more on this philosophy of being good? I’m interested
Welcome to the start of your mid-life crisis. :-)
The other way to spin this is: maybe you've entered a phase in your professional career or technical interests where novelty and "climbing the ladder" are less of a motivator for you.
Now you can focus on more important metrics like: shipping products in the best designed way, putting technology stack choice aside for the sake of what works and what is maintainable. Or having a happy team that works together well.
The industry needs more of us measured, calm, and dispassionate middle-aged folks, not less.
I find it to be almost the opposite for movies. Yes, it's rare to find something unique, but since I started watching at least one movie per week, I'm now getting references I would never have gotten otherwise, and they put the movies in a completely different light.
For example, you're not gonna fully appreciate the latest Spider-Man movie unless you've already watched all three previous series.
Or the other day, I was watching Ted (the teddy bear movie) and they recreated the dancing scene from Airplane. If I had never seen Airplane, I would probably still have laughed, but I wouldn't truly have gotten the joke.
Or in the Doctor Sleep movie, there's a story being told, but the director also tried to reconcile The Shining movie and The Shining novel with the Doctor Sleep novel. You don't truly experience the movie fully unless you've experienced all of them.
I don't know about other media, but I find movie makers tend to be very meta, and you gain as much by having seen all the low-hanging fruit as you lose by doing so.
I recently watched the whole Marvel Cinematic Universe series in chronological order. I'd seen a lot of them, but not all, and everything made so much sense when seeing it this way. WandaVision, of course, makes no sense at all without having seen most, if not all, of the movies.
somehow we only experience the superstitious dream of new
learning programming felt exciting, but the more I know, the more I realize the underlying principles are older than computers (ordering, algebra, physics)
there was nothing really new in learning Java; Haskell predated it, yet I only got to know about it in 2004, and people in the 50s did monoidal modeling of computing
"Nothing really new" - or, to put it in deliberately twisted-around words that I think still make a valid point: "it's always the same atoms, only in a different configuration. This universe is so boring."
Your perception of "new" depends on your accuracy of perception. With a very low-res eye a lot of things look the same and boring that a higher resolution perception sees as very different and interesting.
What's true for eyes and visual perception is also true for brains: if differences in reality always map to the same neurological pathways and create the same wave patterns in the brain, that does not mean they actually are the same. More likely, the viewer lacks the detailed perception and/or the detailed pathways for processing and reacting to them.
Too much abstraction can make things look boring and create a wrong impression of sameness. "History always repeats itself" - only it never does, unless you filter out everything until what is left matches your assumptions. Which is actually also what the brain does: once you see a certain pattern, the brain steers you towards seeing it. For example, there are pictures where you can see one of two different things, and once you see one pattern you have to make a conscious effort to unsee it and see the other one.
So, a large part of it is that you see what you expect to see. If you have already determined that things are the same, you will subconsciously filter out whatever does not fit your assumption.
In any case, I always suggest to programmers interested in something truly new to go into biology and bio-chemistry. All programming we do ends up on similar von Neumann architecture hardware, so it's not wrong that there is not that much difference between programming languages, compared to quantum computing or "biological computing".
Trying to understand - never mind create your own - biological systems is truly something entirely different and should satisfy the bored aging CS and programmer person. Just to clarify: I'm not talking about brains (although that would apply too), but the much more low level biology.
I suggest edx.org as a starting point. Some very good high-level but introductory and free classes on biochemistry, biology, genetics, statistics (for life sciences, with appropriate examples), etc.
When you are tired of learning and being passive, it's time to start making and being active.
There is something special about making things. If they are physical it's even better.
> Computers... I don't care about a lot of the things I cared about. I cannot extract pleasure any longer from studying yet another random language. Now, if I want to be truly impressed, learning Kotlin (to name something; a fine language) doesn't cut it, I need quantum computing or dependent types or whatever.
I feel similarly. It's not actually a problem though, is it? I haven't looked into quantum computing, but dependent types are quite an exciting rabbit hole.
There is this computer science problem that I keep coming back to even though I have found like a half a dozen working solutions. I even reinvent the same solution with a twist most of the time and yet I can't stop but come back to it.
I don't mind the lack of novelty at all.
It sounds like the author is getting old. I am not saying that to dismiss him. But when you grow older you have more responsibilities and you start to value your own time differently. All this computer hackery at home was fun back when I had lots of time on my hands. But nowadays I work with computers all day and don't want to bother with technical issues at home.
For a long time I didn't even have a real computer at home. I mostly watched YouTube on my Chromebook and checked my bank account. Wrote the occasional letter. That's about it.
Now I got into casual gaming a bit and enjoy playing games on my linux laptop. But I am using Ubuntu and not Arch or anything weird. I try to keep the tinkering to a low profile to keep it enjoyable.
What I find funny though, is all that time I spent hobbying around my OS back in the day has paid (and continues to pay) extreme dividends.
What I mean by that is my machine is some kind of basic Linux desktop with a tiling window manager. I haven't looked at the configuration in about 8 years, and even then it was probably 5 years before that that I really cared to give it a lot of attention. It survives OS upgrades and hardware replacements because what constitutes my HOMEDIR just gets copied over to new machines, and I install all the tools that are missing as part of the setup process.
But now, my machine does not do things randomly, it does not have weirdly undefined behaviours, it does not change its UX, it does not prompt me unless necessary, and it does not do unexpected things on shutdown or boot or kick me off to perform upgrades.
Everything my computer does, happens at my pace, on my schedule and it's extremely rare I have any noticeable bugs from those updates.
I'm not saying this to brag, but it's interesting how often people who take the "easy" path end up spending so much more time fighting or re-learning their computer than I do.
Same thing here. My own scripts configure a new Arch install automatically. Much quicker to install just the few things I want than installing Ubuntu (for example), which bundles everything in and more. It also does exactly what I script it to do, so my scripts install specific versions of the software whose version really matters. No surprises.
My custom Emacs configuration build? Automated my own way. My AwesomeWM configuration? Automated my own way.
I'm so glad I took the time to learn to do stuff manually, so I could tell them specifically how I want them done automatically. And it should last me quite a while.
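Since the parent describes scripting installs so that only specific, wanted things end up on the machine, here is a minimal sketch of what the idempotent core of such a setup script might look like in POSIX sh. The package names and the pacman invocation in the comment are illustrative assumptions, not the commenter's actual scripts:

```shell
# Hypothetical sketch of an idempotent "install only what's missing" helper.
# Package names below are made up for illustration.

# missing_pkgs WANTED INSTALLED:
# print each package in the space-separated list WANTED that does not
# appear in the space-separated list INSTALLED.
missing_pkgs() {
    wanted=$1
    installed=$2
    for pkg in $wanted; do
        case " $installed " in
            *" $pkg "*) ;;                # already installed: skip
            *) printf '%s\n' "$pkg" ;;    # missing: print it
        esac
    done
}

# On a real Arch box you would feed it the live package list, e.g.:
#   missing_pkgs "git emacs awesome" "$(pacman -Qq | tr '\n' ' ')" \
#       | xargs -r sudo pacman -S --needed --noconfirm
missing_pkgs "git emacs awesome" "git emacs"   # prints: awesome
```

The `case " $installed " in *" $pkg "*)` trick is plain POSIX substring matching, so the helper needs no external tools and is safe to rerun; the actual install step only ever sees packages that are genuinely missing.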
I think that's another part of it; the novelty's worn off. When smartphones became ubiquitous, they were exciting; people would tune in on e.g. Apple's keynotes to see what they came up with this time.
But then it became more of the same. The difference between phone generations became more about performance and photography technology than anything new. The iPhone hasn't done anything major since the iPhone 4 or 5, or whenever they did their big redesign effort - IMO anyway, not to dismiss their efforts, because in terms of power and software technology they've pushed the boundaries. Unfortunately it's mainly in areas I don't really care much about, that is, photography. Interesting to see how much the Instagram generation has directed phone development focus, I guess.
The author mentions things like Netflix as well; it was written six years ago (ish), but it's only gotten worse. Netflix is full of what feels like artificial programs now, box ticking exercises; cartoon aimed at ages 3-6, check; segment aimed at PoC and/or women, check; David Attenborough nature documentary, check; political drama, check; desaturated Scandi crime show(s), check; video game tie-in, check; buddy cop film with high profile actors, check; the list goes on.
Reminds me of this quote from Linus Torvalds:
> I’m sorry, you may want to close your ears now, I want a distribution to be easy to install, so that I can just get on with my life, which is mostly kernel.
Also sounds like the author needs a reality check. Pointless whining about a device which helps you earn hundreds of thousands of dollars.
I've been in computing a long time. In the early days I made build systems to take apart and deliver DOS 5 and Windows 3.1 to student labs based on what they needed in that session. I ran Windows servers by the thousand. I administered thousands of users and thin client terminals. I ran SharePoint farms and used PowerShell scripts to build large farms on the fly. In my day job now I'm a cloud consultant, and I script migrations of users and install telephony systems.
And nowadays I hate computers.
I can't bear to use one when I'm relaxing at the end of the day. Now I use an iPad Pro either to consume content or creatively, to draw and make videos. Yes, it has limitations and restrictions. But I can't bear to use a desktop OS in any shape. It just grinds my gears and feels like work. Automate my house? Hell no. IoT fridge? Please. You have to know when to stop. Plus I've been burned by every OS and idiom in computing that you can think of, and I've had enough. Just let me browse the web (yes, I use adblockers) and watch some vids. I've gone through everything and now only use 'computing' devices creatively, to improve my life.
But that was 2016, so much has changed since then. I'm sure he's much happier with what's going on with computers in 2022.
What a gem of sarcasm.
He's probably got a bag of money right now ..
m e t a v e r s e
What you really miss is your state of mind when you were young and carefree. Don't blame Netflix or Instagram or VCs. Your youth isn't coming back regardless of how computers may look today. You have aged 20 years. Your friends aren't having LAN parties; they all have kids in middle school. Fiddling with computers isn't fun because you have taxes to file. You can't still have Red Bull and coffee-fueled all-nighters hacking on that shiny new framework, because your heart would literally give out. All the new kids entering the industry have their own culture and their own lingo, and you hate it all.
Nothing is ever as it was in the "good old days". The best move is to realize this and adapt and make the best of your current life situation.
That sounds terrible for you.
I'm a greybeard who wears socks-n-crocs, I have lan parties all the time in my house, and sometimes I nap in the afternoon so I can hack at odd hours. I think you should do what brings you joy. Maybe not all the time, because taxes and your health are important too, but at least some of the time.
> Nothing is ever as it was in the "good old days".
Some things aren't as good, but many things are, and at least some of those things are better.
Maybe not for you though? Everyone gets old, so of course, on the whole, at some point things for you are going to look worse than they used to be. But guess what: you're also going to die. So if you think the whole point of your life is about you, all you'll ever see is decline. I think the best move is to think the whole point of your life is about something else; that way you can see things getting better and become a part of that.
Maybe you will discover life is about the community we're a part of and you need to get to a lan party: Some of those middle-schoolers might have fun playing games with their parents too.
The infinite is great.
I love you, and I'm putting this comment up on my wall. Thank you.
Speak for yourself. We still have LAN parties. Plenty of times it's online, and that's ok. But we regularly gather up our computers on wheels and drag them everywhere. Sixteen to twenty Xboxes is quite a sight.
If anything its gotten larger. Plenty of us bring kids with their own computers.
We all have dedicated server hardware. Monthly get-togethers at least. Mostly Factorio and similar but also quake/doom and quite a few others. We can still link back home for files and add more server space. We've now got the skills and hardware to do things we previously couldn't.
Why should we only remember the good ol days when we can live them right now? As for the music, the best stuff has just been written. Why limit yourself, anyway? We have a catalog of way more than fifty years!
C'mon. It doesn't have to be so binary between getting old and things changing for the worse. They can both be partly true - and I think we could all agree that the fundamental character of working with computers has changed immeasurably over the last 20 years.
Computing becoming infrastructure does indeed change things. Before the web your computer was just a side gig, a purely fun-driven, consequence-free thing. It was mostly static: you bought the thing, the software (or hacked the shareware :) and that's it, nothing on your mind; you booted MS-DOS without ever thinking about your password. Now it's part of the socio-economic ocean... what you use changes due to market dynamics or policies... obligations, privacy, security. It's obviously a different world.
Reminds me of articles on early amateur radio. It was free-spirited, until society decided it wasn't, and now you need licences and it's a business-first toy.
It's not wrong, but I don't think it's the whole story - otherwise, we'd expect today's youth to be just as happy and excited about that stuff as we were. But they don't seem to be - gen z seems to feel more miserable than previous generations at the same point in life.
"Nothing ever gets worse, you just get older" is just as bad a rule of thumb as "how it used to be is how it ought to be."
Ehhh
> nothing ever gets worse
It mostly gets different (and largely, I'd argue, better). Yeah, you get older - which should come with experience, self-knowledge, personal growth...
For me - LAN parties are now dinner parties. Hacking all night on that new framework is pondering architectural decisions and seeing them pay off days or weeks later. Young 'uns with their new ideas means freshness, and teaching opportunities, and foisting the stuff that takes that kind of energy off on those who have it ;)
Different is really only bad when you're unable to adapt.
You are putting words into his mouth. He is not blaming Netflix and co for anything, he is just stating that he is personally not interested in doing anything online in his free time and that the reasons he got into FOSS are not valid anymore.
So sad that a good memory also hurts (true for me, lots of good memories). LAN parties were the best social movement; there was even a sort of federation, with cross-internet-cafe competitions, customer exchanges, home games, away games.
I still like computers for a number of reasons.
> You can't still have Red Bull and coffee-fueled all nighters hacking on that shiny new framework because your heart would literally give out.
This is the worst part of being old.
No, plenty of people still like computing when it's based on FLOSS and doing more interesting things than just logging on to some huge centralized service like IBM of old. Look at the excitement around new projects in Rust. Even Web3, for all of its silliness, builds on the same aspirations. Your comment is just thinly-veiled ageism, you are saying things about "getting old" that just don't apply.
Sure, but adaptation can be a number of things. There is no real reason you cannot 'go back' if you adapt that way; it is a choice either way. Of course things are not the same, but the feel can be. And your heart is definitely a lot stronger than Red Bull and coffee (unless you have obvious defects from birth or lifestyle).
The good old days were measurably better than what we have now. Decade by decade, software bogs down.
This hits home. 20+ years of being a sysadmin, writing software, I've written and rewritten the same shit over and over. I'm disgusted by the world that was created around these network apps and I'm disgusted by my own minor role in helping the vultures build that world. It's nothing I wanted or imagined. Every time I think about it, I think of the Leonard Cohen lyrics:
Your servant here he has been told / To say it clear, to say it bold / it's over, it ain't goin any further.
And now the wheels of heaven stop / you feel the devil's riding crop / get ready for the future, it is murder.
I've been worried of feeling the same. Now, I try to do software dev at companies whose product isn't software. Currently, that is souvenir manufacturing automation. The job is producing the simplest software that speeds up the production process. No need for fancy modern UIs, no need for subscriptions, no need for handling large amounts of private user data... it's nice. Much more gratifying than working at <insert-generic-big-tech-company-here>.
I see a lot of disenfranchised, highly skilled software engineers in this thread who I think would be very welcome in the electronics industry if they spent a few weeks trying their hand at using those skills on embedded systems.
Although I work for a big tech company now (the fun kind, for me at least: making hardware platforms), in a previous life I made consumer products. Hair dryers, hoovers, ovens, lamps, etc. It's amazing how much more your average person's eyes light up when you say that you helped design their kettle, compared to the chip in their phone or the backend of their social media, and there really is a certain amount of joy in that.
Do you also feel that you are on the wrong side of history? Not on the side of the active evil, but on the side of the passive spectator?
I still like computing but not necessarily computers anymore, and I especially hate when every single thing that is not a computer becomes a computer.
The ratio of extracting utility and fun from computers versus fighting against the system has plummeted. And doubly so in products where computers aren't the main thing but just a means to implement something else, which in the process effectively becomes... computer software. Add to that the requirement to always connect online and you're double-fucked. I just... don't do that anymore. My hobbies are all non-electronic, and in my spare time I just use my laptop to watch YouTube and post on forums (while they still exist).
The general pattern of using information technology has gradually shifted from "how do I get the computer to do stuff for me" to "how do I work around all this shit to do anything at all". This narrative applies to user interfaces and browsing, but also to programming. It's all more work for nothing much.
I still enjoy algorithms and programming as long as I avoid interfacing with anything modern. I don't long for the old days, but I want back the sense of ownership and power I used to have. Using computers today too often feels as if the computer was using you.
I can understand the author's reasoning. I guess one can get quite fatigued from constantly being connected! I don't do most of the things the author has listed too. I use computers for work and studies and watch stuff on youtube. I don't have any smart devices at home and I don't think I ever will. But I still like computers! :D
What I don't get is the constant need to share everything on the Internet. I've stopped using every social network. I'm on Reddit and on this platform just to know what's happening in the tech world. I don't even read much news these days.
For me it is just like when I was 10 I had a pocket knife.
As a 10 yo I did not have much real stuff to do with a pocket knife, but it was cool.
Well, as a 15yo I thought computers were cool and I was into computers - but I still did not have real stuff to do with them. Just gaming, and configuring different Linux distros for fun.
Now I am an adult and I own knives, but I don't care much about them. If I need to chop a chicken or cut a piece of rope, I get the knife, but I am not collecting them or keeping them in a "special place" in my drawer.
The same with computers: nowadays I use them as tools and they are not meant to be fun. I have my work to do and then I am done. Well, I still do quite a bit of web browsing and media consumption, unlike the author, but it is not like I am going to build a PC from scratch and install the distro of the month on it just to see if I can make it work, because I have real stuff to do.
OK, sometimes I get the Raspberry Pi out of the drawer to see if I can install some new Raspbian or Raspberry Pi OS on it.
I definitely miss the nostalgic past. Not just programming when I was a kid, but when I became an adult as well. Not only was everything new, everything was very... visceral. You didn't have to learn/pay for/etc. 500 different services to get anything running. You downloaded programs that ran on your computer. There was no cloud to speak of. This all changed, by my estimation, after Ruby on Rails took off. It made it so the "common man" could code, and with that Pandora's box open came the vast simplification of an otherwise artistic industry.
I don't really code at home anymore. At work I do the bare minimum to get by, as I always do. I hit my sprint goals to please the PHBs whose entire job it is to remind you of your failures (PMs). I no longer enjoy fixing bugs, writing new features, or anything else. I just do what I do because the pay is good enough. I'm truly only motivated to do anything, for any company, because otherwise I'd be fired. Thank you for destroying my bright eyed we-can-do-it attitude, SV.
I could've lived with the cloud and a thousand frameworks. But the politicization of our industry and the micro-management culture almost every SV company exudes are toxic to anyone who believes in merit and creativity. I've been unable to find a company to work for that won't take an expedient political stance on whatever the "important social issue" is. Entire code bases are upended by entirely arbitrary sets of rules set by people who aren't hackers at all. These people are simply social-science sit-ins who want to elbow in on a lucrative industry.
The final nail is the coding interviews. Having been in the industry for 15 years, most of that senior, I would expect to be treated with dignity in an interview. It seems like every corp wants to treat me like a junior until proven otherwise. Yet they require me to send a resume anyway. How exhausting.
The industry is exhausting to be in. It takes bright-eyed, talented and creative individuals and grinds them to dust. The only benefit is that it's one of the only ways to break into the upper middle class now. So I can't leave to do something else and maybe find my joy in coding again someday. It's just suffering, and crying over the memory of a simpler, less political, and less vulture-capital-driven hobby.
To put it bluntly I'm tired. Yet I can't rest, it's my only way to afford to retire.
> The industry is exhausting to be in. It takes bright eyed talented and creative individuals and grinds them to dust.
Sadly, I think this is true of every industry - with a few rare exceptions, perhaps. Workers are burnt out everywhere, and companies treat them as grist for the mill. It's just how industrial society is organized, it seems.
Elsewhere in the thread, someone mentioned how the school system "completely removed that joy" (in this case of mathematics). That sounds like the same situation, it's not just tech workers, it's teachers and students too.
There definitely are some pockets of hope, niches where creative juices are flowing, where people are enjoying their work. It's so rare to see, but I believe that's actually how things are supposed to be - I mean, the ideal state of society.
The historical period we're in currently is very primitive in some aspects, brutal and ignorant, despite all the advanced science and technology. The hope I see is to nurture and cultivate our "bright eyed talented and creative individuals", so that some of them may survive the grinding gears of Mordor.
> You downloaded programs that ran on your computer. There was no cloud to speak of.
You can still do this, there's probably a FLOSS package for your favorite thing floating around. It might even be packaged by your distro and be just an 'install' command away. Not everything runs on the cloud.
> Entire code bases are upended by entirely arbitrary sets of rules set by people who aren't hackers at all. These people are simply social-science sit-ins who want to elbow in on a lucrative industry.
If you're curious about exploring computers, doing worthwhile things with them and expanding your skills, you're a hacker. There are no inherent skill requirements, the whole thing is based on personal attitudes and observed results. You don't have to be a "rockstar coder", though it doesn't hurt either.
I have found retro computing to be a good antidote to this. If you loved computers in the 80s, you’ll probably love coding for computers from the 80s with the benefit of source control and a real text editor for your assembly language programs. And emulators make it possible to experiment with computers you never could have afforded, like an Apple I or a Pixar Image Computer.
The great thing about IT and tech is that it's incredibly broad. If you don't like fiddling with computers the way you did in "the golden days", find a different way to fiddle with them. Try machine learning. Robotics. Virtual reality. Electronics. Create a game. Or don't, and just enjoy the non-techy things without regret.
I feel this hard.
There is a difference between having fun with computers and getting a paycheck for computers, and sadly the things you have to do to a computer to get the paycheck are oftentimes a sin against many of the high-minded ideas the 90s brought forth.
Let he who has never fudged a React component or lazily copied and pasted a function cast the first keyboard. As for me, I find that my free time is much more fulfilling when it doesn't involve a computer, and while that's sad, it should at least be understandable given the state of the industry and the nature of growing old.
I have felt this way for a little while now.
I disconnect all networking to my work laptop and remove it from the dock to make sure using it again is something I have to actually need to do and nothing pops up randomly.
I get up from my desk at work everyday, disconnect and the thought of going back to my computer that same day makes me feel ill.
I will play video games on my Switch, or I may use an Xbox controller with Steam Big Picture so it's like a console. I don't use most social media, and I don't run any OS I can't control. I have a Plex server, but I made it as low-maintenance as possible. No smart tech. I have a cell phone that dies occasionally because I use it so infrequently and forget to charge it. I use an iPad for reading, creating digital art and watching videos, and I keep all work-related stuff far away from it. Going for walks, creating art, watching movies, and spending time with my family are all things I would rather do than sit at a computer.
Outside of some casual gaming, I really hate computers. I also really hate that my job involves spending 8 hours interacting with things that don’t actually exist, and that there is always some new thing popping up to replace other things that work really well for a very minuscule return on investment.
I hate how the internet has evolved into a giant meme, a torrent of ads, and constant hounding to agree to cookie usage.
Working in IT for 17 years has just completely soured me on computers.
Yep, over 30 years in software and I really don't like computers or really anything with a screen. And I don't even use the popular stuff anymore. I have moved to BSD on my work and home machines, and have an Android phone that I really only use for voice calls and text messaging with immediate family and friends. I am not on any social media. I haven't owned a TV in 10 years.
Computers have enabled a lot of convenience and entertainment in our lives, but I'm old enough to clearly remember the days when nobody had a PC or internet at home, or a mobile phone. I think it was a better way to live, even though some things took more time and effort. The pace of things was just more human, and your day unfolded as you planned it most of the time.
I'm aware this could just be the common tendency of an older person to view his youth more favorably, but it is how I feel nevertheless.
I was just on vacation for a week and I did not get on the web or check email at all. It was nice. Time passed more slowly. I would feel that the day should be over and it was only noon. When I retire I plan to stay offline as much as possible.
Is it that we used to think computers would make things better, not in minor ways but in some sort of fundamental, paradigm-shifting, revolutionary way? And then you realise it's a bit like moving to a new town - you bring yourself with you. Humanity with computers is at least as messed up as humanity without computers.
A bit like how the invention of the vacuum cleaner didn't free people from housework. It just led to higher cleanliness standards, so that people have to do just as much housework as before.
</ramble>
I've become the same way. I can't really pinpoint when it stopped being fun, but I know I used to have a hell of a time with a couple of cheap netbooks and a really cheap android phone with an unlocked bootloader. I had a whole setup in my living room on milk crates.
Somewhere along the way I started writing code as a full time job. I enjoyed it. After that I still enjoyed messing around. But I agree with the article in the post, it just became a mess of overcomplicated abstraction, a chore, a hassle, and now I want my (2) computers to just work when I need them and beyond that leave it alone.
I still write code on the side as a hobby, but I don't particularly enjoy it, I just build things I'd like to see exist in the world from time to time.
This matches my own feelings strongly. I think a lot of it has to do with what some other commenters have mentioned, in that I no longer feel in control of the computers I interact with, not really anyway. Some of that is simply that I have a better understanding now of all the factors involved.
I started working with computers when I was very little (4yo), and have really had this as part of my life and career since. Now I look around and other than the computer given to me by my employer, my next newest machine was the last gaming PC I built 11yrs ago, which I don't even use anymore. Outside of work, I'd much rather spend my time in the garage working on cars or spending time with friends or my wife doing something out of the house than stay home on a computer. I haven't completely eschewed computers: I still keep my website reasonably up to date, I still occasionally play video games, and of course I read HN in my free time. But for the most part, once work is over, I don't even want to touch a computer. I feel like electronics have infected every part of modern life in a way to surveil and manipulate us as a society, and I feel like if I share my thoughts about it publicly I'd sound like a Luddite. But I've learned most of what there is to know about computers in my life, and it didn't reassure the idealist views I had when I was younger, it shattered them.
Maybe I'm just getting old (I'm nearly 40), but I think we're starting to get to a point as a society where our computers no longer do what they're told, and the computers make decisions most people are forced to follow without any opportunity for their own judgement. It's frankly dystopian, and I desperately try to avoid it outside of work.
What I love about computing and programming is the ability to create something that really helps people, or that they find fun and enjoyable.
The problem is that over time, the opportunities to do this have become harder and harder to find. All the low hanging fruit has gone and more and more people are relying on social media instead of websites to communicate and run their businesses anyway.
Additionally, I'm just not interested in privacy violation, spyware, big data, ML etc.. so that frontier holds no appeal for me.
Ironically it seems like it was easier to innovate in the mobile space and find a market in the days of Palm PDAs than it is today.
I walked away from computers in 1996, not intending to go back. Child-raising and gardening/farming were going to be my future. I hated computers, I hated what computers had done to me, I hated what computers were doing to the world.
Then, somehow, a couple of years later I started writing FLOSS software for music (recording, editing, mixing, composition, processing). My interactions were with users for whom computers were a creative tool, not a business tool. When things worked, my software was helping people feel as if they were able to express themselves in ways they might not otherwise have been able to [0].
In 2022 I don't love computers, but I have a very different relationship to them than I did 2.5 decades ago. I still don't know if on balance they are really a good thing for the world, but for me personally, yeah, sure, I like them.
[0] one might argue that they should have been non-computer-based music making tools, and you could be right.
2016 discussion (263 comments): https://news.ycombinator.com/item?id=12878706
The top comments are very much worth it.
Brilliant comments, indeed. Thanks for pointing out.
This is the curse of doing software for a living. When I worked as a dev, I also started to dislike computers. I would come home and not want to even touch my personal machine. Now I'm out of the software industry, and tinkering with my computers is one of my favorite things again. When I was working in software, getting a call from a family member for computer help was a nightmare. Now, I look forward to helping my parents set up a new machine or install some smart home device.
Had a similar experience. Switched to a different field just so I could program as a hobby. Turns out programming skills can be helpful in the most unexpected ways. I've had the opportunity to optimize business processes, for example.
These days, professionally, I am primarily a manager of people. This is the only thing keeping me in tech—my teams are composed of excellent humans and I get to make meaningful contributions to their personal and career development. I'm also able to influence the hiring process so that our incompetent HR team doesn't fuck up the hiring and onboarding experience beyond repair. And, the young people entering the workforce have no idea what to expect, how to act, how to behave, how to communicate like an adult, how to connect ideas—at the very least, I can help with this stuff and that makes me happy.
Outside of work, my computer usage is limited. I no longer maintain a personal blog or website, and I've given up on writing or online community projects simply because I can't stand having to constantly tinker with the tech—WordPress, Jekyll, maintaining domains, security, whatever. I don't have social media accounts. I don't read online or with a Kindle, I continue to buy and read physical books. No smart home, no smart car, no TV in the house, no apps on my phone other than weather and maps. No time spent watching obnoxious SeatGeek, Liberty Mutual, or Grammarly ads on YouTube.
I am a deeply technical person, my skillset makes me employable, technology and computers have made all sorts of things possible in my life, and for that I am thankful. But more and more, the good life is a life without the screen, without the connectivity, without the neurotic tinkering with tools and libraries, without having to spend full days figuring out if my data was exposed in a breach.
I sit on ass all day for work, I will not sit on ass all evening in pursuit of technical hobbies that no longer bring me joy. There are sunsets to see.
Small, indie internet hasn't gone away completely. There's still some discovering to be had. Not everything is a corporatized web service. One tool to help you browse the fun version of the internet in 2022 is Wiby [0]. It's been mentioned on HN many times, and I thought it was cool, but I never took the time to play around with it. Until a couple days ago.
Sometimes it gives you outdated results from the 90s, or the 2000s, but that's okay. It's fun to explore relics. Sometimes it gives you a site you expect to be from 2003, but it was actually just put up last year from another curious hacker. Those are fun to pop in on. That happened to me yesterday, and I just left him an email appreciating his website, and his various projects he posted on there, to hopefully spark a fun conversation.
The modern web is bland and boring. I'll agree to that. I don't like it much. But not all hope is lost: there are plenty of us who want the old internet back, and some are building services and networks to revive it.
[0]: https://wiby.me
Seems like almost all devs these days feel this way, and I don't quite get it.
As far as I'm concerned the ideal software ecosystem is one where every manmade object is connected, and the actual code is highly encapsulated where I never have to see any part of it except the part I'm changing.
My idea of an ideal software ecosystem isn't far from what we have today, minus the locked down proprietary protocols and cloud-only software, and with a few less containers.
Everything IoT, software done at a high level, encapsulated, reuse heavy, and with an emphasis on extreme safety, and a whole lot less low level code where the compiler can't watch your back.
Things are pretty great now, we just need more performance-focused P2P tech and local APIs for our smart gadgets.
But it seems that other programmers really enjoyed a lot of the stuff that I'm happiest to see gone: the hacks, the low-level algorithm work, the clever algorithms.
I'm kind of worried for the future, since it seems the best and brightest are now bored, and we might not get much real innovation without at least a few of them.
Software dev is part function and part art. All of us live somewhere on the reductive 1D dichotomy "function" <-> "art". Some (not all) people more on the "art" side prefer an open canvas which involves having direct access to the computer. Some on the "function" side find that commoditized software is safe and repeatable software. These two perspectives differ but often because of interests. There's more than enough room out there for both types of devs.
It seems like the scene is kind of splitting though.
Commercial and commercial-inspired software is doing a great job of making totally standardized predictable platforms, and the DIY minded people pretty much only do things like Arch, unless they're getting paid a lot.
Doesn't seem like the dev community is really excited about anything safe and repeatable anymore.
3 replies →
I really agree with this. I would also like to stop observing the world through computer or phone screens.
I think there is something to be said (by someone willing to put more thought into it) regarding how much we as a society have churned away at computing and turned it into something /refined/... Perhaps an analogy of what I'm thinking is people chewing coca leaves vs snorting cocaine? Maybe too extreme, or not extreme enough?
Something really changed when the Internet stopped being a physical place you had to go to (a computer, at a desk) and remain at to keep using it.
Maybe a slightly less extreme analogy would be highly processed food vs. traditional food... I feel like I'm just learning that white bread isn't really healthy and is nutritionally deficient after many years of vigorous consumption.
2 replies →
I can highly recommend this book for some inspiration: https://digitalvegan.net/
It helped me put things into perspective and reclaim some sense of agency.
Computers are a tool, do you think builders like hammers?
Yes, a good chunk do. (You haven't lived till you heard a contractor wax for 30 minutes about why he prefers what hammer for what task ;)
Of course, by far not all. But the thing that truly differentiates computing from most other jobs isn't that people are indifferent to their tools and want work to end when they're home. That's the common case.
What's different is that software engineering seems to be the only job where its practitioners actively resent the tools of their trade, because those tools inevitably bleed from their work life into their private life.
I mean… yeah? Some trade workers I know genuinely do like their job, but they are in the same place in this story that the author is. They like their job, and are even moderately interested in the details of their job, but they get home at the end of the day and don’t want to start laying tile or framing a wall.
Computers might be a tool for you but pure entertainment for others and that’s perfectly fine.
My father had the same framing hammer for about 30 years. He refused to give up that hammer. When he helped my brother and me redo the house we bought, he was prepared to hammer down the whole floor with just his hammer. We convinced him to use a nail gun instead, just to prevent another bout of carpal tunnel syndrome. However, he was attached to that thing; out of all his tools, it was the one I remember most precisely, particularly from when I had to help him growing up.
I haven't had the same computer for 30 years, but I am attached to what I have as it continues to do exactly what I want in terms of its ergonomics. :)
Actually yes, in terms of tools there are always those who care about quality and convenience, and those who don't. There exist premium framing hammers. The margin of difference is low. But if it's the difference between an injury and not, the price is irrelevant in most cases.
Most people enjoying their profession also enjoy their tools, often times simply because of said tools.
I might think that of a builder if he said he used to play with hammers as a kid and well into his adult life for fun and because he believed in hammers.
"The company's hosting a nominally-optional-but-not-really hammer-thon this weekend! Aren't you just so excited! You'll get to swing those hammers any way you like, not just the exact way your manager tells you to! Fun!"
Today they are tools, but that is not what they used to be. In the late '80s, and for about a decade after, they brought excitement and fun and surprise, with a bit of frustration. Today they are just boring tools.
Perhaps using them, but outside of work, yeah -- good point.
Unless you get the dude who's /reaaaaaallllyyyyy/ into hammers... Yeah - Paul, he's a bit odd, but we love him.
Hammers don’t (yet?) come with a little screen to show ads and a buzzer to beep at you every few hours because you haven’t “engaged” with it enough.
I can't speak for others, but generally I like my tools, computers or hammers.
Why would anyone like or dislike tools? I don't like or dislike my toilet plunger. I would prefer it to be designed to get the job done quickly rather than wow me with the experience of using it.
Someone can certainly enjoy cooking in a fancy kitchen or making a table in the garage. That's an equivalent of programming and provides mental and aesthetic challenge. Or one can have a hobby of collecting and appreciating a particular kind of tools like lighters or retro computers.
But the success of an average computer today is to fade into the background and be unnoticed and unappreciated, save for provoking annoyance in the rare cases when something breaks, as all tools do sometimes.
I hate working for companies especially programming for them. I enjoy computing outside of that way more.
Repl.it and children really helped me focus on the bits of computing that I loved, namely theoretical Computer Science and learning/teaching it through functional programming in small steps, in Python.
Then I ended up taking a job as a high school CS teacher. Being around the kids was a huge boost. It really helped me zone in on the joyful parts of discrete mathematical computation.
Ironically, over time, being around the school itself (hugely conservative for many reasons, especially its IT) almost completely removed all that joy. Repl.it was such a positive experience that it’s no exaggeration to say it came to the rescue. It really did come along at the most perfect time for me, back in 2020.
Again, not everything was great. Seeing the effects of TikTok on children was pretty depressing. So too was seeing how disrespectful parents are towards teachers, compared to a generation or two ago. Teachers are servants of the parents’ staunch individualist whims, whereas my experience in my own and my peers’ upbringing was that they were deferred to as figures of authority.
Build yourself a ray tracer that renders ASCII art and you can drown out the troubles of the world. Similarly if you get a bunch of 14 year olds to do image edge detection in one page of code. Grade A escapism, all round.
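The edge-detection exercise really can fit on a page. Here's a purely illustrative Python sketch of the idea (the function name, threshold, and the toy list-of-lists "image" are my own inventions, not from any actual lesson): mark a pixel as an edge where the brightness difference between its neighbours is large.

```python
def detect_edges(image, threshold=2):
    """Return a grid of '#' (edge) and '.' (flat) for a 2D brightness grid.

    A crude gradient check: compare left/right and up/down neighbours
    of each interior pixel; big jumps in brightness mean an edge.
    Real images would come from a library like PIL, but a plain
    list-of-lists keeps the whole idea visible in a few lines.
    """
    h, w = len(image), len(image[0])
    out = [["." for _ in range(w)] for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = image[y][x + 1] - image[y][x - 1]  # horizontal gradient
            gy = image[y + 1][x] - image[y - 1][x]  # vertical gradient
            if abs(gx) + abs(gy) > threshold:
                out[y][x] = "#"
    return ["".join(row) for row in out]

# A 5x5 "image": dark left half, bright right half -> a vertical edge.
img = [
    [0, 0, 0, 9, 9],
    [0, 0, 0, 9, 9],
    [0, 0, 0, 9, 9],
    [0, 0, 0, 9, 9],
    [0, 0, 0, 9, 9],
]
for line in detect_edges(img):
    print(line)
```

Run it and the `#` column traces the boundary between the dark and bright halves — one page of code, exactly the kind of small win that makes the exercise land with students.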
> functional programming in small steps, in Python
Care to elaborate? Python would not be the first language I'd choose for learning/teaching functional programming. (I'd choose Elm!)
Elm is (maybe) a good choice if you want to focus on the functional aspect. Python is arguably a better choice for high school students as it gives them a large amount of utility. You can use Python to do useful things quickly—Elm, not so much. Not saying I don’t like Elm, but it just doesn’t make sense to me for this case.
I don’t like computers, I feel like we’ve lost ownership of our own software. Linux definitely helps with this and I’ve since transitioned.
It’s mobile devices that really bother me. I’ve since got my screen time down to 39m daily, and I might just get rid of my phone and only use a PC for computer activities.
Small tangent, I’m a massive fan of e-ink. I’m waiting for the day when I can get a proper monitor at a reasonable price and use natural or backlighting when programming.
> Small tangent, I’m a massive fan of e-ink, I’m waiting for the day where I can get a proper monitor at a reasonable price and use natural or backlighting when programming.
Me too. Kobo devices run Linux and are easy to hack. Perhaps they can spark some joy or provide you with a cheap prototype of what you want.
One thing that I guess is responsible for some of the excitement around computers is that it's a thing to excel in, a potential career. At least subconsciously we knew we should find something to be good at, and you couldn't go wrong with the arcane knowledge of these new machines. And the deal was implicitly understood: Instead of completing a rigid course for every subskill, a freeform exploration was allowed to acquire the skills, a series of personal projects provided the intangibles, hardened us to face weird errors, to read the logs, to know the commandline, taught us to quickly read up on and understand how everything fits together after we've taken it apart. So let's come up with some love for that Raspberry Pi, shall we?
Now, as the career matures, more Linux won't help us any more, so we focus on team dynamics, social skills, business logic, and, if you're entirely over it, eventually leadership.
But, if the practical quantum computer came along tomorrow, opening up another tier of income and glory if we'd spend our weekends tinkering with the prototypes, we'd sure love it all again!
Computing and what is enabled by computing are different things. I like the computing aspect and continue to do so in my older age even if my interests change, but I also like a lot of stuff enabled by computing. However, the burden remains on me and everyone to take things for what they are. Computing and networking enabled broader access to literature and information, so it is fair to have this analogy: if there are a few bad books, particularly ones that create undesirable impulses, I shouldn't hate all books or the printing press that made them. I am not going to associate computing or computers with bad actors, or accept that "computing" for the 2020s means consuming endless timeline feeds. I have accounts on those platforms, but staring at those things too long feels like using a device for the sake of using a device, instead of computing for the sake of computing (writing a program to do a novel thing or finding ways to make old hardware run 'modern' workloads are things that I think of as "computing").
"the older I get, the better it was"
It's interesting to me how much of "I got over X" is framed as "X is no longer interesting".
It reminds me of how "younger generations are going to ruin everything" has been said by older generations since at least the ancient Greeks.
It's a rather self-centered way of looking at things, IMO.
Yes operating systems are crufty wastelands. Yes the modern internet is a bordello of privacy snooping con men stealing our minds with dopamine engineering.
But most of all, when many of us (not all!) age, our passions shift from abstractions to the real. I still think lots of computer ideas are quite neat, but they don't fill me with passion like they used to. But that's the same way for many more abstract things that used to keep my brain and heart churning.
Now I appreciate things like a perfect summer sunset evening more. It’s a goofy evening playing board games and laughing with my family. Maybe it’s a growing sense that these moments aren’t infinite in quantity. Who knows. But it is a natural progression for many of us.
A shout out to Adam Williamson, the author of this post. He's one of the most excellent open source people I've ever worked with. He tirelessly helps run and lead Fedora Linux's QA. It's always fun troubleshooting a gnarly bug with Adam.
I'm in my mid-fifties, and I am still working on programming projects, finding new games to play, learning new things and just screwing around with tech as much as I used to.
Things have definitely changed, I'm no longer staying up all night trying to beat a particular game or doing absolutely nothing for an entire weekend but adding a new feature to a hobby program. Now I just spend most of a weekend doing that.
I think one thing about people who were into tech before the internet that makes us somewhat different is that computers didn't start out as a device for consuming things for us; it was a device to learn and to see what we could make it do. Even games, which were the most consumption-oriented things I used them for, were interactive.
Of course I do consume some content. I'll fall down a youtube rabbit hole on something trivial and I read a lot of tech news on sites where you can discuss things with others if I'm feeling so inclined. I have no interest in reading news on webpages that have no discussion mechanic, though. I think I fall down youtube rabbit holes because I still have that sense of wonder that you can now learn all about anything you really want to as deeply as you want to that the pre-internet me would have killed for.
I abhor most social media mainly because every thing I looked at early on felt toxic to me, so I never got involved enough to get addicted to it. I ditched cable TV a decade or two ago after I spent one weekend mindlessly flipping through channels feeling like a vampire had just taken a huge bite from me. That was a weekend I could have spent sniping people on Quake servers or trying out a new idea for compressing images I had the other day or whatever.
I totally agree about the lack of control. Starting out programming on a Tandy Color Computer where you could change the screen resolution by filling a few memory addresses with data and the OS was a BASIC prompt leaves you feeling a bit helpless in today's computing world. I mitigate that by using linux at home, like others here have mentioned.
I wouldn't want to go back to the old days, though. Pre-internet would kill me knowing what I have access to now.
This article should be called "I don't like how I chose to spend my own time."
FTA:
> I dunno where I'm going with this. I don't have any big thesis. I just wanted to write it down.
the invectives of “what did you expect?” seem at best tangential to the author’s intent here. sounds like they’re just trying to express perspective, not mount an argument with a specific aim.
to the author’s point, wasn’t it great when the internet had convinced itself that “zero sum”/scarcity were an unnecessary constraint of a physically constrained world? that ideal may not have shaken out, but there’s no reason to disparage someone for lamenting it (especially when they’ve actually put sweat into that vision).
not everything is a pitch. speech isn’t always an advertisement or a polemic.
That’s not at all the vibe I got from reading this. Instead, the author pretty specifically lists the things they instead choose to spend their time on and why it’s not computers _anymore_.
I agree -- it's pretty clear to me that the author has found a difference between their work, which they still find enjoyment from (perhaps because they are good at it?), and their non-work life.
I can relate somewhat -- I love tech -- robotics, in particular -- but have lately become a bit disappointed with its current state in the product world. Thankfully I'm also in a state where I can reflect upon why I feel that way, and then muster the energy to do something about it.
Perhaps this is a reflection of the author's age too. I suspect as I get older, I'm going to gravitate towards more "wholesome" things -- as I define that word.
That being said, to each their own.
Yeah it's more like, "Here's why I was pushed away from what I used to really like."
I find working as a software engineer very boring. But my interest in computers is still big. Personally I like to ignore all the "YOU MUST LEARN <trendy language/framework etc>" and instead learn the older but very powerful tools and languages etc, I almost have nostalgia for them even though I was too young when they were first being created - the unix tools like sed, awk etc, perl, and of course Linux itself. I love learning about and using these for small fun tasks. Basically Linux is fun.
I think the frustration is misguided.
Maybe it's not computers that he doesn't like — it's what people today do with those computers these days that frustrates him.
All these responses really resonate with me. I'm not burned out on computers, but too often I find myself just doomscrolling and looking for something else to do. So I decided to take up virtual retrocomputing, learning about the Apple II and CP/M. It's not at all useful, but to me it's all new (I come from a Commodore background). It's challenging and keeps me fresh.
This feeling comes to me in waves and it’s been like that since I was still an awkward teen in the late 90s.
I wonder if it’s just because I tend to expend so much energy on a very interesting (to me, at least) computer-related problem that I need to step back afterwards for X number of days/weeks/months before I’m ready for the next one.
My key takeaways:
> I spent most of my spare time either reading books or messing around on the internet - and for all you kids out there, this was the 1990s internet, when 'the internet' was mostly email, usenet and FTP, and you accessed it over a dial-up modem

> One of my major criteria for buying things is whether they can do their job without a network connection

> I use computers for...well, I use them for reading stuff

> Somewhere along the way, in the last OH GOD TWENTY YEARS, we - along with a bunch of vulture capitalists and wacky Valley libertarians and government spooks and whoever else - built this whole big crazy thing out of that 1990s Internet and...I don't like it any more
I actually don't use the internet the same way I used to. Now I typically interact with it through a text-based version of the content, collected through a command-line utility integrating Lynx.
tbh, sounds like the user hates the cloud... and rightly so. It's a mess and strips users of more freedoms than even proprietary tools that stop working one day.
the laundry list of things he doesn't like to use computers for were all things that all the forefathers of computing also didn't use computers for
like me I don't think this guy really dislikes computers, but modern information systems and their role in society
which is fine, because it is getting cumbersome now, especially with how smartphones have become so ubiquitous that they're practically a required tool to live a normal human life
>I don't listen to podcasts.
Well, I do. One from a Spanish scientific foundation (Naukas), and some libre-culture-based ones, such as The Red Panda.
I don't like computers because the halting problem has no solution. This is why every software-based product will always have annoying varying latencies. My automatic watch, my piano, and my light switches will always respond immediately because they are not turing complete.
A smartwatch, a virtual synthesizer, and IoT switches will invariably fail to respond immediately one day. And because of that I will always activate them with anxiety.
That's why I don't like computers.
Ever heard of real-time systems? edit: man this place is filling up with pseudo-intelligent naysayers at an alarming pace.
How can a system solve the halting problem?
7 replies →
The author has the luxury of not needing to do constant side projects and professional development. I envy his professional achievement/job security/indifference.
To be fair I don’t know a single software engineer above 25 with a side project… and I do know many software engineers!
Tron still relevant.