Comment by kybernetikos
13 years ago
The main thing that's changed for what you call infrastructure is that a massive proportion of people are connected from a plethora of different devices on different kinds of networks.
That wouldn't have been possible without the web. The web technologies you decry are more irritating in many ways than any one specific proprietary system from 20 years ago, but that view ignores the huge benefits of openness, standards, and platform independence.
I can write a program and have it run without modification on my phone, on an embedded computer running sensors, on a tablet, on my laptop, my desktop computer, my hifi, my network hard drive, or on a huge server rented to me on the other side of the world. I can even make it easily distributable so it can run safely on the computers of other people who don't fully trust me. It's unimaginable that any proprietary system from 20 years ago would have been able to produce that state of affairs.
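To make that concrete, here's a rough sketch of the kind of code I mean (portable.js and the summarize function are made up for the example, not from any real project): plain JavaScript that touches no platform-specific APIs, so the same file can load in a browser, in Node.js on a server or a NAS, or inside a WebView on a phone.

    // portable.js (hypothetical) - plain ECMAScript with no platform-specific
    // APIs, so any JS engine can run it unchanged.
    function summarize(readings) {
      var total = readings.reduce(function (sum, r) { return sum + r; }, 0);
      return { count: readings.length, mean: total / readings.length };
    }

    // Export for CommonJS (Node) if present, otherwise attach to the global
    // object (assumes a non-strict browser script context).
    if (typeof module !== 'undefined' && module.exports) {
      module.exports = summarize;
    } else {
      this.summarize = summarize;
    }

Nothing clever is going on there, and that's the point: because it relies only on the language itself, every one of those devices can run it.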
I do agree that open standards are necessary to make the modern web work, but those aren't advances in technology per se, they're advances in organization. There's no technology in HTML/JS/WebGL that didn't exist in SGML/Obj-C/OpenGL. There are no breakthrough new concepts that "old programmers" have to wrap their minds around. Rather, it's just the organizational process of agreeing on the color of the bike shed (in this case puke green). Don't get me wrong, organization is important, but standardization isn't technological change.
I agree with the core point about experience remaining relevant, but I think you underestimate the changes that got us here. For example, we've learnt a lot about JITs in the last 20 years. The fact that we can write interpreted code that achieves near-compiled speeds in certain situations is amazing.
The speed of interpreted JavaScript has improved by two orders of magnitude on the same hardware in the last ten years, and you don't see technological advance? I can't think of another field that has advanced so quickly.
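As a rough illustration of the sort of code a modern JIT eats up (sumSquares and the numbers are invented for the example, this isn't a real benchmark):

    // A tight, monomorphic, numbers-only loop: exactly what an optimizing
    // JIT like V8 or SpiderMonkey will compile down to machine code after
    // a few warm-up iterations.
    function sumSquares(n) {
      var total = 0;
      for (var i = 0; i < n; i++) {
        total += i * i;
      }
      return total;
    }

    var start = Date.now();
    sumSquares(1e8); // 100 million iterations
    console.log('hot loop took', Date.now() - start, 'ms');

The exact timing depends on your machine and engine, but the difference between running that in a baseline interpreter and in an optimizing JIT is the kind of gap I'm talking about.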
I don't want to diss other fields, but an awful lot of the most valuable improvements in infrastructure in the last ten years have been about standardization and bringing technology that existed years ago to the masses.
On top of that, the inconvenience of the Web platform is real but also overstated. Imagine you're writing a networked app on another platform: it's extremely unlikely you have access to an integrated network analyzer as good as Chrome's. How many other systems let you completely restyle your application while it's running, just to see how it looks, by playing with the developer tools? Having a REPL that lets you interact with the running system has been standard on the Web platform forever.
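For example, you can type something like this straight into the devtools console and restyle the live page on the spot, no rebuild or reload (the colour and font are arbitrary choices for the example):

    // Entered in the browser console against the running page.
    document.body.style.fontFamily = 'Georgia, serif';
    var links = document.querySelectorAll('a');
    for (var i = 0; i < links.length; i++) {
      links[i].style.color = '#663399';
    }

Try doing that to a running native GUI app without buying or building special tooling.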
I'm not even sure what you're looking for. What would constitute new technology if 100x speed-ups don't? Almost all of software is implied in the concept of the Turing machine, so complaining that you can't achieve anything you couldn't have achieved in the past with large, expensive, proprietary systems and specialized knowledge seems unfair. That's been true since Babbage at least.
Things like the CAP theorem, monads, functional programming and distributed systems knowledge, programmable graphics pipelines, and the pervasiveness of unit testing and CI infrastructure certainly look like advances in mainstream programming technology to me. Are you sure you're not defining "mainstream programming technology" to mean "what I know", and arguing that you don't feel like you know much more than when you started programming?
Functional programming is neither mainstream nor new. It dates to Lisp in the 1960s and was refined by SML in the 1970s. What's new in the mainstream of distributed systems? Programmable rendering pipelines date to RenderMan in 1988, if not earlier. I don't know what to tell you if you think unit testing and continuous integration are new.
"I can write a program and have it run without modification on my phone, on an embedded computer running sensors, on a tablet, on my laptop, my desktop computer, my hifi, my network hard drive, a huge server being rented to me on the other side of the world. I can even make it easily distributable so it can run safely on other peoples computers who don't fully trust me. It's unimaginable that any proprietary system from 20 years ago would have been able to produce that state of affairs."
That's what Java was about. According to http://en.wikipedia.org/wiki/Java_(programming_language) it appeared in 1995, 18 years ago.
As someone who has written J2ME applications, I can tell you that you couldn't run them on the desktop except in an emulator, so no, you couldn't take one program and run it across all those systems.
Java failed badly at its dream and ended up occupying an entirely different niche from the one expected.
But you're right, I shouldn't have said 'unimaginable'. I suppose it's just about possible that if Java had 'won', we might have ended up with something not massively dissimilar to what I described.