I hate to sound hyperbolic, but I can't overstate how impressive this work is. For me, it evokes nothing so much as Tracy Kidder's The Soul of a New Machine [0] for opening up an obscure world (the one many HN posters live in, but obscure to most people). I am amazed both by the technical fidelity and by the quality of the storytelling.
I agree. This is the best piece of writing I think I've ever read on the web. This touches on everything, and so accurately, and so concisely... this article is giving me a stroke I think.
Same, one of the best things I've read. It may rank above Programming Sucks[1], which is my go-to reference for friends when they ask me to explain to them what it is I do.
The big red semicolon picture near the bottom had me cracking up so hard. I will need to re-read this article every week for the rest of my life, in order to fully enjoy it.
> Writing this article was a nightmare because I know that no matter how many people review it, I’ll have missed something, some thread that a reader can pull and say, “He missed the essence of the subject.” I know you’re out there, ashamed to have me as an advocate. I accept that. I know that what I describe as “compilation” is but a tiny strand of that subject; I know that the ways I characterize programming languages are reductive. This was supposed to be a brief article, and it became a brief book. So my apologies for anything that absolutely should have been here, but isn’t. They gave me only one magazine.
Just to clarify, my earlier comment is a direct quote from the article. I am not the author, just thought it was an apt anticipation of some of criticisms in this thread.
Isn't that a seriously mind-bendy kind of article to appear on Bloomberg? Also, isn't it very cool that a whole class of people who may not know a thing about coding (but may be interested) might get to know something about the craft and culture?
And it's presented in a very fun, off-kilter sort of way. That must have been a hell of a lot of work. I actually skimmed the second half and the little robot told me I read it all in 16 minutes which was not possible and who was I kidding!
I had a thought the other day while browsing Etsy. If software really is a craft, could I fashion a bespoke software creation and sell it on Etsy? I know this might seem like a non sequitur. But, you know, what is code? Why couldn't I do something like that?
It's such a strange but vital profession. (Seriously, I would have thought there are a _lot_ more than 11,000,000 professional coders worldwide) and one that is still coming to terms with itself. Inspiring. Note to self, do not think outside the box, code your way out of the box.
could I fashion a bespoke software creation and sell it on Etsy
There's Tindie (Etsy for electronics), but due to the infinitely cloneable nature of code, giving it away works much better than trying to sell it for tiny amounts. In some ways the demoscene is this area of software craft for the sake of it.
Thanks for the heads-up on Tindie. Looks interesting.
And I agree with you about the demoscene. Very much one-off creations, which is more what I had in mind. I'm imagining extending this idea to software objects that people would like to own, that were personalised to them, that had a strong crafting element, and so on. The reason I'm having trouble articulating it is that I don't think the category of thing exists (yet?)
I was thinking that. But then why is software-dev-as-craft cordoned off from jewellery-making-as-craft and print-making-as-craft and so on. What makes software so special it needs its own little commercial corner of the world? Serious question :)
The activity on the article's accompanying GitHub repo (https://github.com/BloombergMedia/whatiscode) is really interesting. Users have suggested edits not only to the code in the article but even to add citations.
This adds another dimension to the content by including the open source community, such that the subject matter (coders) can influence (and improve!) the article's content.
This is supposed to be an introduction just to the abstract concept of code, yet it includes a section that asks the reader to take a test on whether or not they agree with the author on the effectiveness of domain-specific snippets of JavaScript (http://www.bloomberg.com/graphics/2015-paul-ford-what-is-cod...) - one that replies to your selections with obtuse references to the code's use of promises and callbacks.
As an outsider, I just love it when I read something presented as an introductory text and I'm confronted with an elaborate series of self-serving in-jokes that go "ha ha ha, ha ha ha, you don't know what I'm talking about!"
It's just a fun little quiz. I kind of like it as a "reality check" to show the reader that while they may understand the concepts, the reality is much more difficult and fraught with subtle considerations. It also serves as a subtle reminder to readers (who may be the frustrated business-type from the opening of the article) that there's a reason software projects are so hard and cost so much money. Software development isn't something you can grok from reading an article, even a book-length one.
"This is supposed to be an introduction just to the abstract concept of code"
Are you sure that's what it's supposed to be? I mean, the author didn't supply unit tests, all we have to go on is the specification, which was that the editor of BusinessWeek asked Paul Ford "Can you tell me what code is," and Paul Ford said "No", and instead wrote this.
My father sent a small comment chain to me as well on a topic like this on one of his blog posts.
<First guy>
June 3, 2015 at 10:12 am
I think software developers like to impress people with how many lines of code they can write.
<Second guy>
June 3, 2015 at 3:31 pm
That is not true. A good day is when you leave the office with more powerful software, but fewer lines of code.
<First guy>
June 4, 2015 at 4:31 am
So why is software always getting bigger? Is it because the marketing people want to add new features all the time? Does this even apply to free software like browsers and email clients?
-----
Personally, I like writing less code, or reducing code to less code. Less to think about.
> Does this even apply to free software like browsers and email clients?
The more I get involved in open source the more I think most code bloat is due to people needing their egos validated by getting a commit into a project, regardless of whether the commit is all that useful or not.
Same. I sent this to my boyfriend yesterday: he's trying to learn programming (I'm trying to teach him Python; in response, I'm trying to learn some foreign languages) and I'm hoping that this gives him the lay of the land of how our strange world works.
Wow, I did the exact same thing, for exactly the same reason. My dad called me a few hours later and said, "I finished the 38,000-word article you sent." I checked and he wasn't far off; it's around 29,000 words.
> There have been countless attempts to make software easier to write...Decades of efforts have gone into helping civilians write code...Nothing yet has done away with developers, developers, developers, developers.
I still believe. Someday, somewhere, something incredible will emerge for the right-brained bourgeoisie and literati.
There are tons of successes; we just refuse to count them. Photoshop, as hinted in the article, is a super-special-purpose language for doing image operations. It no longer "looks" like coding, so we don't count it as coding for the masses. Excel is a much more general-purpose language used by tons of "non-coders" (and arguably the most popular programming language on earth). Again, it doesn't (often) look like normal programming, but then again, shouldn't this be expected? If it looked like normal programming, it would be normal programming and not successful.
Excel is a perfect example. The core Excel experience is basically functional programming on a virtual machine with a matrix address space laid out visually right in front of you. It even looks like traditional programming if you dive into the VBA stuff, which plenty of non-technical specialists, including MBAs and managers, do on a regular basis in the pursuit of solving their problems.
Any specialist user willing to invest some time in learning their tools can do this. A culture develops around it.
And replying to parent: those efforts around teaching 'civilians' to code are probably misguided. The investment needs to be in adding scripting and programmability into existing line-of-business tools, not in encouraging people to sit in front of an isolated REPL disconnected from any business value or context.
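The "Excel as functional programming" point above can be sketched in a few lines of Python. This is a toy model of the idea, not Excel's actual engine: each cell is either a plain value or a pure formula over other cells, and evaluation recursively resolves the dependency graph, just like a spreadsheet recalculating.

```python
# Toy spreadsheet sketch (an assumed model, not Excel's real engine):
# a cell is a value or a pure formula over other cells; evaluating a
# cell recursively evaluates everything it depends on.
cells = {
    "A1": 10,
    "A2": 32,
    "B1": lambda get: get("A1") + get("A2"),  # like =A1+A2
    "B2": lambda get: get("B1") * 2,          # like =B1*2
}

def get(name):
    value = cells[name]
    return value(get) if callable(value) else value

print(get("B2"))  # 84
```

The user never writes a loop or declares a variable; they just state relationships between cells, which is exactly why it doesn't "look" like programming.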
Two other more contemporary examples are the Android app Tasker, and the website IFTTT ( https://ifttt.com/ ).
There's something about calling it programming that turns certain people off. I remember a story about a freshman in a physical mechanics class who complained about all the MATLAB code they had to write. The professor's retort was that they were free to use a slide rule instead, and that particular freshman stopped complaining.
But you're right. The mere act of calling it programming is somehow a problem. It's as if doing programming pigeonholes you into being a programmer until the end of days.
I concur with your assessment. The point is not to turn a syntax into an AST into processor code. The point is to provide things of value, and 'easy' computing platforms targeting users who are not professional programmers create tremendous amounts of end-user value.
Sadly, when I point this out, professional programmers often go 'pffft - that's not real programming', as if being knee-deep in stack traces and gigantic code bases were something with intrinsic value.
I'm not sure I agree with you about Photoshop. Perhaps (probably) there are photoshop macros or pipelines that are closer to programming, but most people use Photoshop purely in an interactive mode. They enter commands directly, and the logic stays in the users' heads, not in the computer.
Photoshop is more like a REPL tied to an image-processing library than it is a programming language.
I would say that Photoshop is a toolbox of predefined tools. If using that is coding, then people are also coding when they choose and use a screwdriver, some sandpaper, a hammer and some glue in succession from their physical toolbox to get a job done. That the operations happen electronically and that they are implemented as complex mathematical transformations of pixels doesn't change that.
Excel isn't much different in my view: most people are only using a very limited set of predefined tools to get a job done. Often badly: it is well known that there are many bugs in important, company critical Excel sheets. Excel seems like coding because it is mostly used to perform the fundamental mathematical operations we all associate with coding. But if that is coding, then so is constructing a Rube-Goldberg machine for a specific task from the parts you happen to have available. A nice exercise in problem solving under constraints. Which certainly has something in common with coding. But that doesn't make it coding.
Yep, great examples. Also, pretty much any experienced Photoshop user will create their own actions to automate common operations. And then you have things like workflows in Alfred.
The trouble is that so much of what we call "programming" is actually the process of identifying all the implicit assumptions that go along with an idea and making them explicit. In other words, if you knew what to ask for in an unambiguous way then most of the "programming" would be done already.
I'm working on a longer essay but that's the short version.
This is a driving force behind the pedagogy of SICP: to think in recursive functions. If you can break down your task into sub-problems and describe them clearly, you just have to put a few parentheses around it.
Programming isn't so much about the "code" or syntax; it's semantics and intent aligned with the machine.
If you can express your problem clearly then the program practically writes itself.
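As a small illustration of that SICP-style decomposition (sketched here in Python rather than Scheme), consider the book's classic counting-change problem: once you state the two sub-problems clearly, the recursion practically writes itself.

```python
# SICP's counting-change example, transliterated into Python: the number
# of ways to make `amount` from `coins` splits into two sub-problems:
# ways that skip the first coin kind, plus ways that use at least one.
def count_change(amount, coins=(50, 25, 10, 5, 1)):
    if amount == 0:
        return 1                      # one way to make zero: use no coins
    if amount < 0 or not coins:
        return 0                      # dead end: overshot, or out of coin kinds
    return (count_change(amount, coins[1:]) +
            count_change(amount - coins[0], coins))

print(count_change(100))  # 292 ways to make change for a dollar
```

The hard part was never the parentheses; it was describing the split into sub-problems unambiguously.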
Not just the right-brained. I want that as a professional software developer. I want a computer that will do what I thought, instead of what I foolishly typed.
Basically, I want a computer as smart as a good junior dev so I can just yell my brilliant ideas at it, and it will do the dirty work for me.
That's only because you think your thought exists and is correct. Programming forces you to confront the fact that it isn't, and that there are many aspects of it that you've overlooked.
As someone who has worked with junior devs before, that sounds like an absolute nightmare.
There is, of course, nothing wrong with being junior. But the rate that requirements are misinterpreted even by intelligent humans is, I think, a fundamental reason why programming isn't doable by the masses yet.
It's not because computers are hard, it's because knowing what we actually specifically want them to do is.
I remember someone referring to this as "intent-driven programming". I once worked for a company that made a business rules engine. The idea was to describe a flowchart to the system, and the software would ask what you wanted to do at each step of the flowchart in a top-down manner until you fleshed out the whole program.
That probably took millions of dollars and a team of engineers back in 2000. In 2015, a reasonably computer-literate person could do something close to that with SquareSpace or Wordpress.com in probably... a week?[1]
But that site would never pass muster in 2015. Something 10x (if not 100x) more complex is required for NYTimes.com in 2015, plus various native apps, plus a subscription service, and so forth. So you still need a team of engineers...
____
[1] I'm talking about the act of putting the articles onto a website, not reporting and writing the articles, obviously.
I don't understand why people believe 'everyone' should be able to write code without much trouble. Virtually every activity requires focus and practice to learn. Example: you can't just take a hammer and start building furniture, or you will create a mess. If you want something nice and useful, you need to think about what you want to build, how you are going to construct it, which materials you need, which tools you need. You need to experience how the materials behave, try out certain subconstructions, research what specialized tools exist. I believe this idea that 'everyone' should be able to code is like expecting everyone to be able to build furniture. If you want to, you can learn how to build furniture, but it's not easy and will never be easy. Why would coding be any different?
I believe one area of improvement for the excessive need for precision (most of us can agree that it is excessive) is that our current tools don't use context information enough. Humans deal a lot in uncertain areas and context is what helps us.
The wrong people are working on this problem. The only ones who think about it are typically programmers who have made their peace with the machines as they are today. The interface of code seems simple and logical with little reason to try to improve on it. It would take a team of artists, musicians, human factors engineers, ethnographers and some clever computer scientists to do it. Such an enterprise would be high risk and very difficult to fund because of it.
I'm reminded of rms's anecdote about secretarial staff learning to write Emacs macros at MIT because they didn't realise it was programming.
This suggests that current, important and well-meaning attempts to get non-programmers to meet code head-on as code may be misguided. Programming is generally easier if you're not thinking about how much it isn't something you do.
I don't believe there is such a thing as a "right-brained" person. I think this is a cultural myth, same as "you only use 10% of your brain's power." These, and similar, are false memes; in other words, hokum.
Precision is not the only important element in programming. There's a level of abstraction, such as found in the field of semiotics, that is highly important to the world of computer science.
I've seen non-developers try to write specs in whichever format they like: Word, Excel, drawings, handwritten, in speech, mockup tools, anything. They decide exactly how they want to express their idea without any constraints. And yet, they always fail.
There are always too many edge cases they do not think of. They only cover the "happy path", and quite often not even that. Just take the email conversation from the article as an example: they didn't even touch the subject of implementation, and it was already gibberish even for a developer. You need someone to actually sit down and squint their eyes over something, do research, and run some test cases for a few hours before these emerge. Once you start doing this, you are already by definition a software developer.
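A tiny hypothetical example of the happy-path problem (the names and the one-line "spec" here are invented for illustration): a requirement like "split the full name into first and last name" sounds complete until the edge cases nobody wrote down show up.

```python
# A hypothetical one-line "spec": split a full name into first and last.
# The happy path works fine...
def split_name(full_name):
    first, last = full_name.split(" ")
    return first, last

print(split_name("Ada Lovelace"))  # ('Ada', 'Lovelace')

# ...but the edge cases nobody wrote down are where the real work starts:
#   split_name("Madonna")                 -> ValueError (no last name)
#   split_name("Gabriel García Márquez")  -> ValueError (three parts)
#   split_name("")                        -> ValueError (empty input)
# Deciding what should happen in each case is the part that turns a
# one-sentence spec into actual software.
```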
Taken to the extreme, could you not consider raising a child the ultimate programming exercise for humans?
Perhaps the "right-brained" are already very good at programming other people, working with faulty, non-deterministic, somewhat chaotic computing environments where "left-brained" patterns of software development fall short....
"something incredible will emerge for the right-brained bourgeoisie and literati."
Yes, a real quantum computer. As long as we're dealing with 1's and 0's, there's an insurmountable barrier for those who would get creative with computing.
I've always wanted to attempt this piece: to take all the many layers of abstraction that we deal with, parse them, convert them, and render them through my formidable linguistic talents into one elegant, beautifully constructed piece of prose that magically makes it all comprehensible to lay readers. I haven't yet attempted it, but I give props to Mr. Ford for trying. I'm not surprised he ended up with a novella.
Oh, and why does bloomberg.com want to use my web cam?
A quick google search showed that about 300 words per minute is average for an adult reading pace. I'm a slow reader so I'm probably right around there. So that's ~127 minutes to read all of this, not including time spent playing with the great animations. Probably better for me to get a bit more work done before I tackle the rest of this one (only read section 1 so far).
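For what it's worth, the arithmetic checks out, assuming the ~38,000-word count mentioned elsewhere in this thread:

```python
# Back-of-the-envelope reading-time check: ~38,000 words at an average
# adult pace of ~300 words per minute.
words = 38_000
words_per_minute = 300
minutes = words / words_per_minute
print(round(minutes))  # 127: a bit over two hours
```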
> I've always wanted to attempt this piece: to take all the many layers of abstraction that we deal with, parse them, convert them, and render them through my formidable linguistic talents into one elegant, beautifully constructed piece of prose that magically makes it all comprehensible to lay readers.
I agree with this and below comments. It took... considerable dedication... to keep reading it all the way to the end. I can't even comment on it directly as it was all over the place but with a nice integration/flow of the sub-topics. All I can call it is an experience lol.
I downloaded my certificate. Might put it on my resume. Evidence of willpower if nothing else. :)
That was an interesting piece of journalism. I think, though, that more and more VPs and SVPs actually do know what 'code' is. While there was a time when it was all mysterious to upper management, senior positions (up to CEO, even) are rapidly being filled by folks who know it. What new coders don't always get, though, is that there aren't a lot of new problems in computer science or developing code. Sure, people have tried different methodologies to get around the problems in different ways, but the problems were the same (how do I test, how do I track defects, how do I schedule, etc.). This is good and bad: good because companies get more effective at getting things done, and bad because it can lead to some tension if you can't explain in a quantifiable way why the new way is better than the old way.
A story like this is probably dangerous - it touches on so many ideas everyone will find something to gripe with, and it's hard to make a comprehensive and consistent story.
The last time I read something that so awesomely bridged high level abstractions and low-level implementations with a human touch was Godel, Escher, Bach (albeit with a very different feel). Well done.
Interesting, I wouldn't put this anywhere near the level of Godel, Escher, Bach.
If you've read GEB and do software development, what actually did you find interesting and beautiful in the article? I've just finished reading the whole piece and I don't think I've learned much at all. Presumably as someone who already knows programming and theoretical CS I'm not the target audience, but then I'm also surprised why this has so many upvotes here on HN.
This article is a sociology of code, not a technical manual. You absolutely could get a lot out of it.
I get the GP's point about this being reminiscent of GEB, not in the sense that it covers the same topics or is at the same 'level', but in that it describes an intangible idea by approaching it from different angles and describing that same core concept from the shadow it casts in different directions. In GEB that core concept of self-reference was tackled from multiple perspectives so that an image of this common theme emerges as you read these different views onto it. Similarly, this article tries to conjure an image of 'code' as a cultural artifact, by portraying the shapes it casts in different directions - on the people who create it, the people who have to fund it, the tools and artifacts it generates. And it does so, like GEB, with wit and intelligence.
It's 2015. The audience of this article shouldn't even exist. The reader, as described in the article, is a VP who has so little understanding about what it is his company does, that the only meaningful abstraction he can mentally picture is that of his employees "burning barrels of money".
Imagine an auto company VP who says "I don't know anything about engines and drivetrains and all that technical stuff. All I know is that when you guys are in a meeting talking about your variable valve timing system, all I smell is money burning!"
That would not be acceptable. Yet, here we are, over 30 years after the original IBM PC was released, and there's still a corner-office audience for "what is a computer?"
You're living in a pretty isolated world, my friend. I'd say 90% of my friends would be the audience for this. I've been coding 20 years, most of my friends are successful, grad-degree educated people in a variety of fields, and some of them are even my coworkers.
Who is supposed to teach people what code is? Our schools? Who with a CS degree and programming experience would willfully choose to teach in the USA's education system?
Or maybe the companies who make all their money from code? I think not- it wouldn't help the economic position of Apple, Google, FB, or Microsoft if everyone knew what code is and how it works. It strengthens the tech economy's stranglehold on society when code is treated as something inscrutable.
So there are really very few resources for people, even educated, successful, technically literate folk, to grok "what is code?"
Many coders would do well to read a similar article, if there was one, called, "What is Society?"
> Many coders would do well to read a similar article, if there was one, called, "What is Society?"
There was a thread a while back where software developers told me I was unreasonable to expect them to know who the vice president of the country they lived in was. I feel you here a whole bunch.
I think that Parent made the observation that it is sad that in our information society people (in responsible positions (regarding ICT)) don't know about the fundamentals of the information society. At least, that's what I took from it and I concur.
We, as a society, should have started integrating computational thinking (Wing, 2006) as a core competency in the k-12 curriculum from the late 1980s onwards. We didn't.
anecdote
In 1991, when I was in 5th grade, I saw the first computer enter the classroom in my primary school. It wasn't used for much but some remedial mathematics training for a student or two, and I believe the teacher did a computer course with it.
In 2010 I became a high school computer science teacher. There were three computer rooms (about 30 computers each) for the whole school (of about 1500 students), running Windows XP + IE 6. Besides my class, the computer rooms were mostly used for making reports and "searching for information". Some departments did have specialized software installed (most of which came with the textbooks), but used it sparingly at best. On top of that, this software mostly consisted of simple, inflexible, non-interactive, non-collaborative, "pre-fab" instructional materials. Often this software was not much more than a "digitized" version of parts of the textbook with some animations, games, and procedural trainers mixed in.
"it wouldn't help the economic position of Apple, Google, FB, or Microsoft if everyone knew what code is and how it works. It strengthens the tech economy's stranglehold on society when code is treated as something inscrutable."
That is just cynical. Tech companies aren't so fragile as to depend on general ignorance among the human population. If more people understood code, these companies could create more code. I think you point to an underlying misconception that somehow technology is just a barrier to entry and doesn't provide intrinsic value. But I do not believe this to be true.
> Who is supposed to teach people what code is? Our schools? Who with a CS degree and programming experience would willfully choose to teach in the USA's education system?
Grade school teachers aren't expected to be specialists in the field they teach. They're expected to be specialists in teaching. Usually they have a bachelors (or masters) in education and maybe another degree, but it may or may not be the thing they teach (if they even teach only one thing at all).
To put it another way, this is a bit like asking which physicists would willingly teach in the USA's education system. The answer is obviously not very many, but that's beside the point. Physics still gets taught.
'But no one told me I needed to know this' stops being an excuse as of adulthood, if not earlier. Anyone who cares what coding is only needs curiosity and an internet connection, the web is full of introductory material for almost every level from almost every angle. If a professional in the current world doesn't know 'what coding is', they just don't care.
What? I imagine there are many, many auto manufacturing VP's who think, "I don't know anything about engines and drivetrains and all that technical stuff [– and I don't need to]."
Why should the VP of Human Resources need to know how a drivetrain works? Or the CTO? Or the CFO?
They are experts in their area of focus. It's ridiculous to expect every manager to understand everything about their business. Would you expect the CTO of Starbucks to be able to tell you how all of their drinks are made?
I wouldn't. And I wouldn't care if they could.
In general, a good executive doesn't need to know the minutiae. They need to know how to motivate people, how to keep projects on track, how to recognize talent, how to delegate, how to budget, how to distill information for other executives, etc.
Sure, knowing the minutiae usually helps. It's easier to sniff out all the BS people feed you, etc. But it's far from the most important knowledge and skills a great leader needs.
I've only skimmed the article so far, but the part that stuck out to me was this (technical manager talking to the VP):
“My people are split on platform,” he continues. “Some want to use Drupal 7 and make it work with Magento—which is still PHP.” He frowns. “The other option is just doing the back end in Node.js with Backbone in front.”
Now, that's an example of a terrible trait for an executive. TMitTB clearly has very little ability to communicate with people outside of his area of expertise. The ability to convey complex ideas simply is crucial. Why would a non-technical executive care about the framework you're using? That's asinine. Worrying about the implementation is TMitTB's job. When meeting with the VP, TMitTB should talk about the business impact of options. This option is cheapest but doesn't give us these features that the marketing department says they must have. This option is best, but it's much more expensive to hire developers with those skills right now.
"When meeting with the VP, TMitTB should talk about the business impact of options."
I don't think he even knows what the business impacts are. TMitTB is just a tech guy who works for the new CTO. Presumably, the CTO (being an executive in charge of technology) can speak both the language of tech and the language of business and could make a business case to the VP, in terms he understands, as to why the company needs the new software. The CTO should not have sent her tech guy to talk to the VP.
There are always going to be people like this, in any field. TSR (of Dungeons & Dragons fame) actually had a CEO who forbade her employees from playtesting their products during work hours, calling it "playing games on company time." http://1d4chan.org/wiki/Lorraine_Williams
Dear God, what a disconnect. How does someone with that line of thinking even get a job in the gaming industry?
Then again, I'm reminded of a boss I used to have who would ask me to fix the shipping calculator on our web server, then ten minutes later he would poke his head in the door and tell me to "quit playing on the goddamn computer and get some work done!" He didn't realize that "working on the web server" is done at a workstation, not physically taking the server (remotely hosted of course) apart and putting it back together. All he knew was the customers were complaining about the shopping cart module not calculating shipping correctly.
Yet, they do: many VP's in this position are promoted from parts of the company that have nothing to do with tech. I even wonder if this is the majority. Not sure. A huge chunk of VP's out there, though.
It's got to be the majority, since most non-tech companies (i.e., the vast majority of companies in the world) only use tech as a tool - it's not the focus of their business. The executives of banks, retail businesses, airlines, etc. are not likely to have been promoted from the IT Department.
The VP is not in charge of a software company. Presumably some sort of widget/manufacturing operation ("cycle reduction"), so it is unfair to accuse him of not knowing what is going on in IT (at the level of engines/drivetrains for an auto company).
This comment falls neatly into the category of my favorite trendy term of 2014/2015, the "hot take". No matter how informative or well-intentioned a piece of writing may be, some people will just immediately go looking for something wrong with it, reason or authorial intention be damned.
Maybe instead of ridiculing the majority of people who don't understand what we do, we should celebrate a piece like this that makes an effort to educate. Maybe instead of lamenting their ignorance, we should commend the VPs and everyone else with enough curiosity about code to make it through this behemoth of an article.
It's perfectly acceptable. Business optimises for least-knowledge-you-can-get-away-with.
Three generations of my dad's side of the family were in the print industry: everything from printing Vogue and Playboy to fancy art books to dull but well-paid corporate stuff (annual reports, mergers, and stock-issue documents: 500-page books that the SEC makes you print, filled with legalese that nobody reads).
Most people working in big print companies know nothing about print. They don't know about how paper works or how ink works. They have no understanding of how colour works or why you can't print certain colours on certain materials, or how long certain types of print work takes. Not at the junior level and not at the management level.
Hell, if you took half the people in a big print management company and asked them to explain the basics of offset printing, they couldn't give you a "lead paragraph of Wikipedia"-level description. And that technology has been around since 1875.
For all but a small set of technical and management roles, a lot of businesses are far less interested in technical know-how than "soft skills". In a shocking number of places, the ability to build a tower out of rolled up newspaper and sticky tape in a team building exercise is valued over an ability to know the details of how the industry or its core technologies work.
I think you're taking their conceit of a VP audience too seriously. To me that just feels like a fun narrative choice, and the article is just intended for any adult dealing with and slightly bewildered by code.
A lot of people find themselves in this role: people who specialize in other fields but still need to interact with developers, because software is eating the world and, even faster, eating their roles.
Actually, I've met a lot of programmers who are deeply resistant to learning anything about the core business of the firm they work for, especially if it can't be automated or if it resists automation because of strongly established legacy practices, as well as the tech short-sightedness that so often prevails: e.g., telling the putative listener they won't need $85,000 in Oracle licensing any more, oblivious to the fact that such licenses involve periodic contractual obligations.
> Imagine an auto company VP who says "I don't know anything about engines and drivetrains and all that technical stuff. All I know is that when you guys are in a meeting talking about your variable valve timing system, all I smell is money burning!"
His job isn't to know about drivetrains and engines and stuff. His job is to manage products, cashflow, audit requirements, stock market regulations, and marketing. That's the stuff he has to deal with day to day.
Most VPs do not sell computers (or even software). They sell goods and services that depend on computers. What you're saying is more like "a VP at UPS or FedEx should understand everything about engines." That's ridiculous; you don't need to understand how trucks work to know they move goods from point A to point B.
Writing software is not what (many) companies do. Companies exist to create things of value for their clients. Software is one aspect of the secondary functions that enhance this value creation.
In the article, the VP worked at a company that sold things on the internet. The things they produced were their primary function; internet platform development is a secondary function, much like marketing, hiring, business development etc etc.
So the theoretical VP's core competency should not have been software dev, or even technology - that's what CTOs and technical leads are for. In fact his core competency probably wasn't product development anymore, if it ever was. He was a manager, and his role was managing resources within the company to optimise their primary function.
Hence the disconnect, and hence why articles like this (and audiences for them) exist.
This article doesn't attempt to answer the dull, dry, boring, well-understood question "What is code?" in the sense of "What is it that computers use to make them do what they do?". It tries to answer the much more interesting, intangible, existential question "what is code?", in the sense of what does it mean for the world to depend on a culturally isolated priesthood of technologists who control what computers do? How does that work?
I've seen a lot of 50something execs who have been passed by on the technical superhighway.
I met a CIO of a large insurance company who didn't know what Python was.
I know a senior banker and educator who struggles with the basics of Excel and Word. (Yes, there are still people out there who are used to the days when secretaries did all the work.)
I knew an IT exec at a large consumer products company who didn't like using computers at all.
It should make sense that this article & VP exists. 7 billion people in the world and only 11 million professionally code, and 7 million are hobby coders, according to the author of the article and the IDC.
After auditing the software of at least 20 different startups, I'd have to say there will always be people in positions of power who know nothing. Just look at our politicians.
I thought the imaginary VP was pretty clearly not working in a software company. What if random companies suddenly needed to design and build their own automobiles in-house? There would be a similar lack of comprehension about just what the automotive engineers were actually doing all day.
At least, I know I would be lost and would greatly appreciate a guide like this.
> Yet, here we are, over 30 years after the original IBM PC was released, and there's still a corner-office audience for "what is a computer?"
As someone who got their start on DOS and BASIC back in the 80s, I say you raise a pretty good point. There are so few languages depicted in this web brochure that it does not illuminate anything.
So many people stopped learning in the 80s that the 50s are starting to catch up with them again.
This is the stuff that cardboard box forts are made of.
I think of Minecraft as a visual representation of a database. Every block you see has a set of values, starting with the 3 that determine its location within the world (coordinates) and extending to include block type, which determines other values.
And the same goes for the open spaces, because Minecraft reminds you that a block can occupy any space. Indeed, an open space is just a set of blocks whose block type is "open", which makes them transparent to light from neighboring blocks and lets the player move through them.
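To make the "world as database" idea concrete, here's a hypothetical sketch in JavaScript: every name here (`setBlock`, `blockAt`, the `lightLevel` property) is made up for illustration, not Minecraft's real internals. The world is a map from coordinates to block records, and unset coordinates default to an "open" block, so empty space is just another row in the table.

```javascript
// A toy "world as database": keys encode (x, y, z), values are block records.
const world = new Map();

function key(x, y, z) {
  return `${x},${y},${z}`;
}

function setBlock(x, y, z, type, props = {}) {
  world.set(key(x, y, z), { x, y, z, type, ...props });
}

function blockAt(x, y, z) {
  // Any coordinate we never wrote to is an "open" (air) block by default.
  return world.get(key(x, y, z)) ?? { x, y, z, type: "open" };
}

setBlock(0, 64, 0, "stone");
setBlock(0, 65, 0, "torch", { lightLevel: 14 });
// blockAt(1, 64, 0).type is "open": the empty space is a record too
```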
Huh, is this what caused the whole page to go pixelated and ask for permission to use my camera each time I changed orientation on my tablet? I had to reload the page each time this happened and scroll down to where I was before. Very annoying.
We make the vast majority of our money from selling our software subscriptions and have ~4k employees in R&D. Depending on what data you use[1], if we were a public company we'd be the 4th largest in the world by revenue.
"C is a simple language, simple like a shotgun that can blow off your foot... Think of C as sort of a plain-spoken grandfather who grew up trapping beavers and served in several wars but can still do 50 pullups." Most certainly.
The whole post is just a stream of consciousness brain dump that a layman would never understand. I believe it's possible to explain these things without circular reasoning.
"What are the major programming languages, and what are they used for?" is a different question than "Can you tell me what code is?" The second can include the first. But if your and Paul Ford's answers were swapped, both questioners would have good reason to say, "That's helpful but not what I was looking for." To extend your kitchen metaphor, you haven't mentioned anything about why or how your instructions would ever work. You've left out the compiler, interpreter, executor: the human.
Granted, I think Ford expanded the domain of his question a little further than he needed to, for the sake of what looks to be fun. And I think he occasionally picks a piece of jargon where a clearer, more ordinary word would have done just as well—though I'm not in the mood to dive back into the article to find a case.
In my opinion, this is what society today needs. I don't feel like we need everyone to be able to code, but rather just have a sense on some level that computers are nothing mysterious or magical, unconquerable or incomprehensible, but rather just machines of human creation.
Not sure if we were reading the same article or not, but this is not "what society needs." While there is a part of me that loves computing and wants to share it with the world, I also realize that the nuances of compilation or futures are entirely inside baseball and irrelevant to the vast majority of society at large.
The computer, for most people, is a tool. A means to an entirely unrelated end.
You'll see similar resource consumption when using event listeners tied to mouse movement. It's generally not noticed by the general populace, but it gives every developer pause. The page does seem to struggle at times.
Very informative, thanks. In native land we can listen to mouse motion but it is more CPU friendly to have a timer and poll the mouse position periodically, particularly if the location of the cursor causes further processing (like working out what data to display in a popup hint). The good thing with the mouseEnter / mouseLeave is that you can stop the timer and only restart polling when they enter again.
Is there a way of doing this on web pages or is it really still just callbacks for mouse motion?
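On the web it's still callbacks: there's no way to poll the cursor position directly from JavaScript. The usual compromise is to keep the `mousemove` callback but throttle it, which recovers most of the CPU benefit of polling. A minimal sketch (the helper name, interval, and injectable clock are my own, added to keep the example self-contained):

```javascript
// Wrap a handler so it runs at most once per `intervalMs`, however often
// the underlying event fires. `now` is injectable only so the helper can
// be exercised with a fake clock.
function throttle(handler, intervalMs, now = Date.now) {
  let last = -Infinity;
  return (...args) => {
    const t = now();
    if (t - last >= intervalMs) {
      last = t;
      handler(...args);
    }
  };
}

// In a page you'd attach it like this (fires at most ~10 times per second):
// document.addEventListener("mousemove", throttle(updateHint, 100));
```

This keeps the expensive work (like computing what to show in a popup hint) on a polling-like cadence while the browser still delivers every raw event.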
It's like Myspace and Tumblr barfed all over Businessweek. I opened the article in Firefox with the mobile-emulation feature turned on. Because good god almighty this thing is a trainwreck otherwise.
Interesting, I was able to read the article just fine on my 2011 Kobo touch. 800 MHz ARM Cortex A8 and whatever Webkit was around in early 2011. The border animations are off but all the text and plain images work.
Whatever effects they're running, they did an impressive job with graceful degradation.
Aggregator sites like Hacker News and Reddit make distribution of news and ideas very quick and viral compared to more organic growth, such as word of mouth and Google.
I had no idea they made an hour long educational video on windows 95 with the cast of Friends! That is awesomely 90's. This is a really cool write up, clearly a lot of work went into it
Imagine a world where Wikipedia isn't an encyclopedia, but a crowd-sourced collection of all code meticulously indexed and documented that could be written for one language.
> You know what, though? Cobol has a great data-description language. If you spend a lot of time formatting dates and currency, and so forth, it’s got you. (If you’re curious, search for “Cobol Picture clause.”)
Paul Ford really is in a class all by himself. Everything I've read by him is truly wonderful.
As a writer, it's both inspiring ("look how amazing nerdy non-fiction can be!") and soul-crushing ("look how much better someone else is at writing!"). I try to focus on the former, but, man, he really makes the rest of us look like Celine Dion showing up at your dive bar's shitty karaoke night.
There are so many ways to mute my computer, I wouldn't even know how to list them all. But on top of the list I would start with the volume keys on the keyboard and then with the volume panel in the menubar and then with the audio panel of the system preferences. Towards the far end of the list I would cut the wires to the speakers.
Well, now you've got me investigating. There seems to be a video in the article. However, it is only displayed as a white box in my browser (Safari), without any controls. I can't stop it; I wouldn't even know it's there, except that it's playing audio. Strange...
Part of me looks at this and thinks, "This is preaching to the choir"...because while the engineer in me appreciates all the layers and explorations...It must be incredibly bewildering to anyone who is not a coder, which is the ostensible audience given that the story starts off with, 'We are here because the editor of this magazine asked me, “Can you tell me what code is?”'
But then I see the interactive circuit simulation and think "Fuck it, who cares, this is awesome!". Designing circuits is one of those things that, if I were a self-taught coder instead of a comp. eng major, I would never have delved into... yet learning how to build an adder circuit and getting an appreciation of the most basic building block of computation (and how surprisingly complex it is to just add 1s and 0s) is a profound lesson that, for me personally, was essential to really grok programming. All the sections about culture and conferences and so on are a little bit off-field for me... it's not that I think code and life and human thought and behavior aren't intertwined. * I just think the discussion about conferences reads as if the author doesn't realize that all disciplines spawn conferences and conference culture. There's nothing particularly unique about code conferences. Not the sexism, not even the nerdiness.
I would love to see the OP's editor respond in a not-quite-as-lengthy essay. What did they learn about code after reading the piece that they didn't understand before?
edit: * I'm emphatically not arguing "Oh but everyone does conferences shittily so tech conferences shouldn't be shamed". Just that having it in this "What is Code" essay makes it seem as if it's a notable "feature" of programming...but that understates the problem by an order of magnitude. Sadly, it's a feature in most every discipline, and the inherent feature is the gender imbalance, not the topic of the conference.
edit: Also, I wished that the section on Debugging was much higher than it is...Robert Read's "How to be a Programmer" [1] makes it the first skill, and that's about the right spot for it in the hierarchy of things. Maybe it gets overlooked because it has the connotation of something you do after you've fucked up. But, besides the fact that programming is almost inherently about fucking up, the skill of debugging really underscores the deterministic, logical nature of programming, the idea that if we have to, we can trace things down to the bit to know exactly what has been fucked up in even the most complex of programs. And that's an incredibly powerful feature of programming...and not very well-emphasized to most non-coders.
Not to your main point, but the circuit simulation reminds me of Silon by SLaks: http://silon.slaks.net/
Edit: Also, as a late-bloomer and self-taught (self-teaching) programmer, I am on the other side of the paradigm you're talking about. Petzold's Code is one of the first books a self-taught programmer should pick up. It is an awesome introduction.
One of the few worthy things I felt I got out of school was the moment I grokked the whole stack from sequential logic to the program counter and control logic from a cpu, how each clock tick formed a new circuit. That was really mentally expanding. I got it from reading a prescribed book for a class I wasn't taking from a professor who was a tool, so it is possible to learn these things outside of class. In fact, that's where the real learning, IMO, happens.
One of the most memorable weeks in my Engineering degree was using Cadence to build a CPU from the ground up. Every transistor, every connection, the ALU, etc was laid down by someone in our little group of students, and then wired together to make a thing with a few thousand transistors. And it friggin worked.
It also showed how the chip itself would be laid out, where the dopants would be and such.
> Part of me looks at this and thinks, "This is preaching to the choir"...because while the engineer in me appreciates all the layers and explorations...It must be incredibly bewildering to anyone who is not a coder, which is the ostensible audience given that the story starts off with, 'We are here because the editor of this magazine asked me, “Can you tell me what code is?”'
I completely agree. I got a third of the way through it before I just couldn't stand the obfuscation and decoration any further.
What's sad (as I [tweeted][1]) was that there's a 1972 article by Stewart Brand, published in Rolling Stone of all places, that does a better job of actually explaining what computers can do, without resorting to jargon and jive: http://stuartpb.github.io/spacewar-article/spacewar.html
Was the bit about PHP standing for Personal Home Page a joke? I always thought it was "PHP Hypertext Preprocessor". The coolest thing about PHP is the infinite recursion in its name!
The article ended up just yanking my Firefox session to somewhere in the middle of the page, followed by some Clippy expy nagging me about how fast I'm supposedly reading the article.
One thing I noticed though is that the author is definitely stuck in the old "Microsoft is the great Satan" mindset. If he ever finds out about all the open-source stuff MS is doing these days under Satya Nadella, I think his head would probably explode.
He doesn't know what to say to a C# developer (nothing in common), but automatically trusts a Python developer? Really? sigh
The scroll performance was bothering me so much I had to add transform: translateZ(0); to the #background-canvas element to stop the screen from repainting on every fking scroll, so I could continue reading in peace without my eyes bleeding. Great article though :)
> Smalltalk’s history is often described as slightly tragic, because many of its best ideas never permeated the culture of code. But it’s still around, still has users, and anyone can use Squeak or Pharo. Also—
> 1. Java is an object-oriented language, influenced by C++, that runs on a virtual machine (just like Smalltalk).
> 2. Objective-C, per its name, jammed C and Smalltalk together with no apologies.
> 3. C# (pronounced “C sharp”) is based on C and influenced by Java, but it was created by Microsoft for use in its .NET framework.
> 4. C++ is an object-oriented version of C, although its roots are more in Simula.
> The number of digital things conceived in 1972 that are still under regular discussion is quite small. (It was the year of The Godfather and Al Green’s Let’s Stay Together.) The world corrupts the pure vision of great ideas. I pronounce Smalltalk a raging, wild, global success.
The specious reasoning on display in this paragraph is almost offensive in its glib uncomprehension. Calling Smalltalk "a raging, wild, global success" because modern programming languages call themselves "object-oriented" is like saying women in technology are well-represented because Ada Lovelace was the first programmer.
I get that it's supposed to be tongue-in-cheek, but like the rest of the writing in this article, it's supposed to be tongue-in-cheek in a way that gestures toward what the author actually thinks. In this case, what it's gesturing at is the notion that Smalltalk has had a large-scale tangible influence (if not wholesale adoption) on modern programming languages, which, if you actually take the time to understand the subject, is just not true.
> “No,” I said. “First of all, I’m not good at the math. I’m a programmer, yes, but I’m an East Coast programmer, not one of these serious platform people from the Bay Area.”
1. This clearly took A LOT of work, and I have not finished reading it. I intend to, but as another comment calculated below, that will take around 127 minutes. This comment is simply about the beginning.
2. I'm not 100% certain yet what the intended goal of this article is, so I may just be off base. That being said, my criticisms should be interpreted more as questions, since I'm deeply fascinated with how to make programming more accessible. I hope they are taken as such, and people share their experiences/successes/failures in getting people to understand "what we do". Again, like other commenters here I have suffered the fate of parents not really understanding what you do (unlike the even superficial understanding of what a physicist does).
3. People learn differently; this is me pretending not to know anything and reading this article. It is thus flawed on two axes: I can't know for sure how I would have taken it in, and even if I could, it may be great for most people but bad for me.
All that being said, I had a few issues with this article('s beginning) if the goal is to make programming seem understandable to non-programmers. It seems to jump around a lot at the beginning and focus on just how complex everything is. If the goal is "programmers are justified in their work, look how complex everything they deal with is!", then this may be an OK approach. However, if the goal is to help them understand what we do day to day, it may not.
Some examples:
1. The early references to math. I once upon a time thought math was a pre-requisite to programming. I have now met enough awesome programmers that are absolute rubbish at math that I no longer believe that to be true. I believe referring to the "math" of things a lot scares people off (makes it seem like "one of those math things math people do" and inaccessible, when in reality your everyday programmer does not do a lot of (complex) math).
2. The early references to circuits, compilation, and keyboard codes. This is a tremendous amount of scope that is unnecessary in my opinion, and it just makes everything seem so obtuse. Showing keyboard codes goes a long way in conveying how much a computer does, but I feel it's very confusing in relation to programming. I don't deal with "keyboard codes". We could also get into, for example, the actual hardware and how even debouncing a key is hard! But I think everyone would see why that isn't great for the introduction of a programming explanation.
3. The circuits I believe are pretty and let you do things interactively, but I have a hard time believing they convey any information to people not familiar with programming. No one knows what XOR means (which you can flip the gates to), and just furthers the idea that code is this weird incantation we do. More putting them in "awe" of programming than understanding it.
Then again, I've been criticized for relying too heavily on analogy. My explanation would probably start with a lot of hand-waving: "let's tell the computer to get a sandwich, shall we?", then trying to get deeper bit by bit, etc. Others have probably tried this and failed, so I am genuinely curious whether people walk away from this article feeling like they have a better understanding of things.
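On the XOR point above, for what it's worth: the XOR gate the simulation lets you flip to is exactly the heart of binary addition. Here's a half adder sketched in JavaScript rather than wires (the function names are just illustrative):

```javascript
// A half adder built from two gates: XOR produces the sum bit, AND the
// carry bit. This is the building block behind "adding 1s and 0s".
const XOR = (a, b) => a ^ b;
const AND = (a, b) => a & b;

function halfAdder(a, b) {
  return { sum: XOR(a, b), carry: AND(a, b) };
}

// halfAdder(1, 1) → { sum: 0, carry: 1 }, i.e. 1 + 1 = binary 10
```

Whether seeing it as code rather than gates actually helps a non-programmer is exactly the open question in the comment above.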
You may be surprised how many programmers don't use the CLI. I have clients on Windows machines who saw me running git commands in a terminal and said, "So glad we have a GUI to manage this for us."
I would try to explain it as levels of abstraction and how they extend beyond the computers that execute the code. You can go down through the levels of abstraction, 1 by 1, until the point is made rather than attempting to start from the bottom and work up.
So, for example, when talking to the non-technical executive, the first level of abstraction is the technical expert who tries to explain complex technical issues. Below that, there might be a technical management layer that deals with technical issues on a more granular level but still isn't looking at the code. Below that are the actual developers, who are writing code and are concerned with the actual logic the computer is executing. Below that are the framework authors, who abstract away the common parts of writing an application of a certain type. Below that are the language platform authors, who write compilers or interpreters that translate the code typed by the programmers into a format that either the computer or a lower-level abstraction (LLVM, etc.) deals with. At this point, it's probably not necessary to go any lower, but you can go all the way down to the CPU/machine-architecture level if necessary.
The key point is that even highly technical people have to trust the layers of abstraction below the point where they have full understanding. I've been coding for over 20 years and I still only have a cursory understanding of how my compiler translates the code I write into machine code, let alone how the actual hardware runs that code. I took EE courses in college and understand the theory, but the implementation by the folks at Intel and other hardware vendors is opaque to me, and I'm forced to trust that it works.
The coders employed by your company may be able to dig into framework code, but the chances are that they're fully trusting the runtimes that they work with. That trust may be the result of a well-earned reputation or through testing that the claims made by the language runtime are empirically true, but it's still trusting something that they're unequipped to verify themselves. This need to trust bubbles all the way up to senior management. The systems are just too complex for anyone concerned with the finished product to understand the whole picture.
That means that, as an executive, you're likely trusting your senior technical leadership. The only way you avoid doing that is to dig in and better understand the abstraction layer they're providing. You can also make that trust easier by doing the same sorts of things that a coder does with their language runtime...give tasks to your abstraction layer and test whether they're completed successfully. And, when those tasks are not completed successfully, don't accept techno-babble responses, dig in to understand the wheres and whys of where things broke down. Likely, the chain of trust of those abstraction layers was broken at some point...figure out where that point was so you can prevent it from happening again.
Every abstraction layer adds uncertainty to the system. A CPU engineer can tell you how long a small task will take to within a nanosecond or so. A compiler engineer can tell you how many CPU cycles an expression will result in and compute an approximate time for a given processor to within microseconds. And it continues as you go up the chain until you're talking to senior management, who give you SWAGs with a margin of error of months. Understanding this goes a long way toward explaining the behaviors that are so confusing to the non-technical executive. It's intimidating, but the good news is that many of the skills of a good manager are exactly what's needed to demystify it. The way you begin to understand these layers of abstraction is through inquiry. Ask the right questions and, over time, you'll understand more and more of how software development happens.
Ignoring the content, the structure of this article is amazing. It feels like an entire magazine in a single essay. The background animations that change as you scroll, the contextual content (try scrolling really fast). I'm not even all that keen on the bright oversaturated aesthetic, but it's just so cool. I'd love to see a short piece on how they made it.
Intro articles like this do a lot to reveal biases and misunderstandings. Like with Java.
The article says "Java = enterprise," but I can tell you the best user experiences I ever saw delivered over the web were those done with Java Web Start (not applets, but applications launched in a JVM from the web). I developed several back in the day that continued to run for years, because users loved them and they were safe and secure.
Why Web Start didn't take over, I have no idea. It was also a superb platform for mobile delivery.
> Intro articles like this do a lot to reveal biases and misunderstandings.
This is one of the reasons I barely recommend any intro articles in Lean Notes (http://www.leannotes.com/): almost every single one is just a stream of incomplete and incorrect statements about how the world works, based on the author's myopic personal experiences.
Rather than properly generalizing and consolidating what needs to be said to convey a full understanding of the topic, most intros settle for the first example they can think of that could be remotely construed as related to the words they've previously used for whatever subject, regardless of whether it has meaning in any context. (Example: saying that type safety prevents you from trying to "multiply seven by cats".)
It seems like a pretty Dunning-Kruger thing: the less broad your knowledge is, the more justified you feel in writing an introductory text to the field.
The only time I've ever seen somebody actually qualified to write an introductory text actually doing so (as I can immediately recall) is Charles Petzold's [Code: The Hidden Language of Computer Hardware and Software][Code] (although I suspect, from the few excerpts of it I've seen, that Brian Kernighan's "D is for Digital" is good, too).
I can't understand the rationale behind this gaudy redesign job that Bloomberg carried out. I just can't wrap my head around it. It just violates everything that I know about web design and usability for news/corporate websites/portals.
Maybe they were trying to pull off a Craigslist here but still I can't really stomach these changes.
This is not the main BBG website, but yeah, I don't like the new BBG home page design either. This post's design (it's a post from bbg/graphics) is actually really awesome, and its main purpose is to be stereotypically "nerdy".
Watch out when capitalists incessantly push people to learn coding. They're trying very hard to cut the costs of their input "materials," and they will do everything they can to devalue us in every way possible.
So, if you're a talented and competent dev, be super aggressive with these predators and take everything your hands can grab before they gain the upper hand and show us their true colors.
This is a really ugly, selfish attitude. It's like opposing literacy because it will put pressure on jobs for those who can read and write. It's circling the wagons around people who had the privilege and opportunity to learn these things before everybody else.
"...the balance of power between investors and entrepreneurs that marks the early, frontier days of a major technology wave (Moore’s Law and the Internet in this case) has fallen apart. Investors have won, and their dealings with the entrepreneur class now look far more like the dealings between management and labor (with overtones of parent/child and teacher/student). Those who are attracted to true entrepreneurship are figuring out new ways to work around the traditional investor class. The investor class in turn is struggling to deal with the unpleasant consequences of an outright victory..."
This sounds contradicted by the mathematical/empirical/quantitative observations. Startups built on texting two alphabetic characters are attracting million-dollar financing rounds. Startups based on texts that erase themselves on a timeout are declining 3 billion dollar acquisition offers. VCs are trying to get deal flow by building a reputation for being the most helpful to entrepreneurs. Interest rates are at historic lows, and hundred-billion-dollar pension and mutual funds are pouring money into every 1st- to 3rd-tier VC to chase returns. tl;dr: this sounds like some bs.
I don't see that... or rather, if it's true here it's true to a significantly greater extent in most other industries.
Money continues to be available, and often lots of it. It's available on better terms than most others in most other professions can even imagine receiving.
To put it bluntly: in most industries you are meat and own nothing and never have any chance of owning anything. This has been the condition for nearly all human beings who have ever lived, today and in the past.
There are also more alternatives to VC today: larger angel rounds, crowd funding, etc. It's also easier to bootstrap since everything (but people) has fallen in price. Those two things together have made the funding environment more competitive for VCs -- they have to offer more value or compete at the higher end.
EDIT: Just being sarcastic. You are clearly incompetent, coasting along in your job, and afraid of someone with 6 months of experience being better than you.
Seems to have worked well enough for doctors and lawyers. They're unionized (through the AMA and ABA), upper-class professionals who command much more respect from the general public than we do, and whereas our salaries tend to max out at around $150k, theirs can easily exceed $500k (in the case of medical specialists or law firm partners).
Capitalists only want to reduce unnecessary costs, not all costs.
A programmer is more like factory equipment than a factory worker. If a company invests in superior equipment, they can produce better quality products and net higher profits.
If we're going down in the name of efficiency, the managerial and legal professionals are going first.
Watch out when capitalists incessantly push people to learn coding. They're trying very hard to cut the costs of their input "materials", and they will do everything they can to devalue us in every way possible.
Did you actually read this article? The article doesn't aim to teach Bloomberg's audience (which consists of VPs, SVPs, and managers, as implied in the first couple of paragraphs) how to code or replace the average developer.
Capitalists want people to be generally happy. Happy people buy more things. People who have useful, well-paid jobs are generally happy (and can buy more things). Everybody knows there will be a huge drop in available jobs in the next few years, but software development skills will still be in high demand. Hence, capitalists want more people to learn how to code, so they can keep their jobs, be generally happy, and buy more things.
Capitalism _is_ profit-oriented, but happy people bring more profit.
"How often are you going to be multiplying sevens and cats? Soooo much."
Where the fuck does this meme of "fundamental type mismatches come up all the time in ordinary code" come from? What kind of defective system are people writing where it's normal for strings and numbers to be interpreted relationally (even accidentally)?
It sounds like the author is trying to demonstrate the significance of things like syntax transformations and format conversions (like transforming an email address to a mailto link), but that's nothing like "multiplying sevens and cats". It's manipulating things that aren't inherently incompatible - if anything, it's multiplying sevens and "7"s.
All these batshit insane contrived examples in asides like http://www.bloomberg.com/graphics/2015-paul-ford-what-is-cod... do is make code seem less accessible and comprehensible to anybody who isn't already intimately familiar with what's safe to interpret as sarcasm or hyperbole and what's not, which goes exactly contrary to the stated thesis of the article.
It can happen accidentally quite easily. Someone new to a codebase starts hacking in a feature and mistypes a variable as 'value' instead of 'values'. They fail to realize there's already a 'value' variable in the global namespace (perhaps it's a gigantic spaghetti code mess of a file). They don't have good test cases that exercise this exact line and fail to see the bug. Code ships to production, three months later the line runs and explodes.
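To make that failure mode concrete, here is a minimal JavaScript sketch of the scenario (the variable and function names are hypothetical):

```javascript
// A long-forgotten global already exists somewhere in the spaghetti.
var value = 7;

// The newcomer's feature is supposed to read this array...
var values = ["alpha", "beta", "gamma"];

function firstItem() {
  // ...but they typo `value` for `values`. No error is raised: indexing
  // a number just yields undefined, so the bug ships silently.
  return value[0];
}

console.log(firstItem()); // undefined, not "alpha"
```

Nothing crashes at the typo itself; the wrong result only surfaces wherever `firstItem()`'s output is eventually used.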
On the web it's sort of all strings, so it's not hard to be in a situation where you have "length=7" & "cat=tabby" and get into a problem. Beyond that, many developers are in the habit of using primitives for everything, which makes these sorts of errors much more common.
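For anyone outside the web world, a small JavaScript sketch of how easily this happens (the query-string names are made up):

```javascript
// Values parsed from a URL like ?length=7&cat=tabby arrive as strings.
const params = { length: "7", cat: "tabby" };

// Multiplication coerces the string to a number: "7" * 2 is 14.
const area = params.length * 2;

// But addition concatenates instead: "7" + 2 is the string "72".
const padded = params.length + 2;

// And arithmetic on a genuinely non-numeric string quietly yields NaN
// rather than raising an error.
const nonsense = params.cat * 2;

console.log(area, padded, nonsense); // 14 "72" NaN
```

The point isn't that anyone multiplies cats on purpose; it's that the coercion rules let the `"72"` case slip through without a peep.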
I'm glad I came here to read the comments that urged me to read on, because I stopped at the point where the VP was whining that his job was on the line and the software guy's wasn't. Made me a little sick to my stomach. In what company is that ever the case? Even if the VP's job is lost (a rare occurrence in my experience), the severance package is more than a year of the software person's salary.
My understanding was that the development manager in the taupe blazer was an IT consultant brought in to run the project, making it a little easier for that person to disappear to the next gig no matter how disastrously the project turned out.
I hate to sound hyperbolic, but I can't overstate how impressive this work is. For me, it evokes nothing so much as Tracy Kidder's The Soul of A New Machine [0] for opening up an obscure world (the one many HN posters live in, but obscure to most people). I am amazed both by the technical fidelity and by the quality of the story telling.
[0] http://www.amazon.com/Soul-New-Machine-Tracy-Kidder/dp/03164...
I agree. This is the best piece of writing I think I've ever read on the web. This touches on everything, and so accurately, and so concisely... this article is giving me a stroke I think.
I've never seen anything like this on a website before. The writing, the formatting, the structure, the animations; it's near perfect.
4 replies →
Same, one of the best things I've read. It may rank above Programming Sucks[1] which is my go to reference for friends when they ask me to explain to them what it is I do.
1. http://www.stilldrinking.org/programming-sucks
The big red semicolon picture near the bottom had me cracking up so hard. I will need to re-read this article every week for the rest of my life, in order to fully enjoy it.
> Writing this article was a nightmare because I know that no matter how many people review it, I’ll have missed something, some thread that a reader can pull and say, “He missed the essence of the subject.” I know you’re out there, ashamed to have me as an advocate. I accept that. I know that what I describe as “compilation” is but a tiny strand of that subject; I know that the ways I characterize programming languages are reductive. This was supposed to be a brief article, and it became a brief book. So my apologies for anything that absolutely should have been here, but isn’t. They gave me only one magazine.
Keep writing. The space is there for you.
Just to clarify, my earlier comment is a direct quote from the article. I am not the author, just thought it was an apt anticipation of some of criticisms in this thread.
This is too freaking awesome!
Isn't that a seriously mind-bendy kind of article to appear on Bloomberg? Also, isn't it very cool that a whole class of people who may not know a thing about coding (but may be interested) might get to know something about the craft and culture?
And it's presented in a very fun, off-kilter sort of way. That must have been a hell of a lot of work. I actually skimmed the second half and the little robot told me I read it all in 16 minutes which was not possible and who was I kidding!
I had a thought the other day while browsing Etsy. If software really is a craft, could I fashion a bespoke software creation and sell it on Etsy? I know this might seem like a non sequitur. But, you know, what is code? Why couldn't I do something like that?
It's such a strange but vital profession (seriously, I would have thought there were a _lot_ more than 11,000,000 professional coders worldwide), and one that is still coming to terms with itself. Inspiring. Note to self: do not think outside the box; code your way out of the box.
could I fashion a bespoke software creation and sell it on Etsy
There's Tindie (Etsy for electronics), but due to the infinitely cloneable nature of code, giving it away works much better than trying to sell it for tiny amounts. In some ways the demoscene is this: software craft for the sake of it.
Thanks for the heads-up on Tindie. Looks interesting.
And I agree with you about the demoscene. Very much one-off creations, which is more what I had in mind. I'm imagining extending this idea to software objects that people would like to own, that were personalised to them, that had a strong crafting element, and so on. The reason I'm having trouble articulating it is that I don't think the category of thing exists (yet?).
Apple's App Store is the Etsy of software.
I was thinking that. But then why is software-dev-as-craft cordoned off from jewellery-making-as-craft and print-making-as-craft and so on. What makes software so special it needs its own little commercial corner of the world? Serious question :)
The activity on the article's accompanying GitHub repo (https://github.com/BloombergMedia/whatiscode) is really interesting. Users have suggested edits not only to the code in the article but even to add citations.
This adds another dimension to the content by including the open-source community, such that the article's subjects (coders) can influence (and improve!) its content.
Thanks for the link! I had no idea it had been open-sourced!
This is supposed to be an introduction just to the abstract concept of code, yet it includes a section that asks the reader to take a test on whether or not they agree with the author on the effectiveness of domain-specific snippets of JavaScript (http://www.bloomberg.com/graphics/2015-paul-ford-what-is-cod...) - one that replies to your selections with obtuse references to the code's use of promises and callbacks.
As an outsider, I just love it when I read something presented as an introductory text and I'm confronted with an elaborate series of self-serving in-jokes that go "ha ha ha, ha ha ha, you don't know what I'm talking about!"
It's just a fun little quiz. I kind of like it as a "reality check" to show the reader that while they may understand the concepts, the reality is much more difficult and fraught with subtle considerations. It also serves as a subtle reminder to readers (who may be the frustrated business-type from the opening of the article) that there's a reason software projects are so hard and cost so much money. Software development isn't something you can grok from reading an article, even a book-length one.
"This is supposed to be an introduction just to the abstract concept of code"
Are you sure that's what it's supposed to be? I mean, the author didn't supply unit tests, all we have to go on is the specification, which was that the editor of BusinessWeek asked Paul Ford "Can you tell me what code is," and Paul Ford said "No", and instead wrote this.
My Dad always tells me he flat out does not understand what I do. He respects it, knows it's challenging and fun, but just doesn't get it.
I've sent this to him -- he's about 1/4 of the way through and thoroughly enjoying it.
This is a very fun read that's worth leafing through.
My father sent a small comment chain to me as well on a topic like this on one of his blog posts.
<First guy> June 3, 2015 at 10:12 am I think software developers like to impress people with how many lines of code they can write.
<Second guy> June 3, 2015 at 3:31 pm That is not true. A good day is when you leave the office with more powerful software, but fewer lines of code.
<First guy> June 4, 2015 at 4:31 am So why is software always getting bigger? Is it because the marketing people want to add new features all the time? Does this even apply to free software like browsers and email clients?
-----
Personally, I like writing less code, or reducing code to less code. Less to think about.
My personal favourite is talking The Business out of things. Solving issues without any coding at all!
1 reply →
Many years ago I managed to convince my management that, if they had to use deltaLOC as a performance metric, they should at least use abs(deltaLOC).
I then spent the next year cutting huge chunks of crap out of a C++ application that I had inherited.
Was a most satisfying experience.
2 replies →
> Does this even apply to free software like browsers and email clients?
The more I get involved in open source the more I think most code bloat is due to people needing their egos validated by getting a commit into a project, regardless of whether the commit is all that useful or not.
1 reply →
Same. I sent this to my boyfriend yesterday: he's trying to learn programming (I'm trying to teach him Python; in response, I'm trying to learn some foreign languages) and I'm hoping that this gives him the lay of the land of how our strange world works.
Wow, I did the exact same thing, for exactly the same reason. My dad called me a few hours later and said, "I finished the 38,000-word article you sent." I checked and he wasn't far off; it's around 29000 words.
The certificate of participation at the end claims it's 38,000 words...
1 reply →
> There have been countless attempts to make software easier to write...Decades of efforts have gone into helping civilians write code...Nothing yet has done away with developers, developers, developers, developers.
I still believe. Someday, somewhere, something incredible will emerge for the right-brained bourgeoisie and literati.
There are tons of successes; we just refuse to count them. Photoshop (as hinted in the article) is a super-special-purpose language for doing image operations. It no longer "looks" like coding, so we don't count it as coding for the masses. Excel is a much more general-purpose language used by tons of "non-coders" (and arguably the most popular programming language on earth). Again, it doesn't (often) look like normal programming, but then again, shouldn't this be expected? If it looked like normal programming, it would be normal programming and not successful.
Excel is a perfect example. The core Excel experience is basically functional programming on a virtual machine with a matrix address space laid out visually right in front of you. It even looks like traditional programming if you dive into the VBA stuff, which plenty of non-technical specialists, including MBAs and managers, do on a regular basis in the pursuit of solving their problems.
Any specialist user willing to invest some time in learning their tools can do this. A culture develops around it.
And replying to parent: those efforts around teaching 'civilians' to code are probably misguided. The investment needs to be in adding scripting and programmability into existing line of business tool, not on encouraging people to sit in front of an isolated REPL disconnected from any business value or context.
5 replies →
Two other more contemporary examples are the Android app Tasker, and the website IFTTT ( https://ifttt.com/ ).
There's something about calling it programming that turns certain people off. I remember a story about a freshman in a physical mechanics class who complained about all the MATLAB code they had to write. The professor's retort was that they were free to use a slide rule instead, and that particular freshman stopped complaining.
But you're right. The mere act of calling it programming is somehow a problem. It's as if doing programming pigeonholes you into being a programmer until the end of days.
3 replies →
I concur with your assessment. The point is not to turn syntax into an AST into processor code. The point is to provide things of value, and 'easy' computing platforms targeting users who are not professional programmers create tremendous amounts of end-user value.
Sadly, when I point this out, professional programmers often go 'pffft, that's not real programming,' as if being knee-deep in stack traces and gigantic codebases were something with intrinsic value.
I'm not sure I agree with you about Photoshop. Perhaps (probably) there are Photoshop macros or pipelines that are closer to programming, but most people use Photoshop purely in an interactive mode. They enter commands directly, and the logic stays in the users' heads, not in the computer.
Photoshop is more like a REPL tied to an image-processing library than it is a programming language.
7 replies →
Remember when search engines had Boolean operators?
7 replies →
I would say that Photoshop is a toolbox of predefined tools. If using that is coding, then people are also coding when they choose and use a screwdriver, some sandpaper, a hammer and some glue in succession from their physical toolbox to get a job done. That the operations happen electronically and that they are implemented as complex mathematical transformations of pixels doesn't change that.
Excel isn't much different in my view: most people are only using a very limited set of predefined tools to get a job done. Often badly: it is well known that there are many bugs in important, company critical Excel sheets. Excel seems like coding because it is mostly used to perform the fundamental mathematical operations we all associate with coding. But if that is coding, then so is constructing a Rube-Goldberg machine for a specific task from the parts you happen to have available. A nice exercise in problem solving under constraints. Which certainly has something in common with coding. But that doesn't make it coding.
5 replies →
Yep, great examples. Also, pretty much any experienced Photoshop user will create their own actions to automate common operations. And then you have things like workflows in Alfred.
The trouble is that so much of what we call "programming" is actually the process of identifying all the implicit assumptions that go along with an idea and making them explicit. In other words, if you knew what to ask for in an unambiguous way then most of the "programming" would be done already.
I'm working on a longer essay but that's the short version.
This is a driving force behind the pedagogy of SICP -- to think in recursive functions. If you can break down your task into sub-problems and describe them clearly you just have to put a few parenthesis around it.
Programming isn't so much about the "code" or syntax; it's semantics and intent aligned with the machine.
If you can express your problem clearly then the program practically writes itself.
1 reply →
Not just the right-brained. I want that as a professional software developer. I want a computer that will do what I thought, instead of what I foolishly typed.
Basically, I want a computer as smart as a good junior dev so I can just yell my brilliant ideas at it, and it will do the dirty work for me.
That's only because you think your thought exists and is correct. Programming forces you to confront the fact that it isn't, and that there are many aspects of it that you've overlooked.
5 replies →
As someone who has worked with junior devs before, that sounds like an absolute nightmare.
There is, of course, nothing wrong with being junior. But the rate that requirements are misinterpreted even by intelligent humans is, I think, a fundamental reason why programming isn't doable by the masses yet.
It's not because computers are hard, it's because knowing what we actually specifically want them to do is.
I remember someone referring to this as "intent-driven programming". I once worked for a company that made a business rules engine. The idea was to describe a flowchart to the system, and the software would ask what you wanted to do at each step of the flowchart in a top-down manner until you'd fleshed out the whole program.
2 replies →
That's likely AI-complete.
I don't see it ever happening: because the bar of expectations rises at the same rate at which the tools improve.
For example, think about what the NYTimes website looked like in February 2000:
That probably took millions of dollars and a team of engineers back in 2000. In 2015, a reasonably computer-literate person could do something close to it with Squarespace or Wordpress.com in probably... a week?[1]
But that site would never pass muster in 2015. Something 10x (if not 100x) more complex is required for NYTimes.com in 2015, plus various native apps, plus a subscription service, and so forth. So you still need a team of engineers...
____ [1] I'm talking about the act of putting the articles onto a website, not reporting and writing the articles, obviously.
Scratch seems to me like an example of what a "real programming language for everyone" could look like.
Indeed, Lego Mindstorms is based on a similar principle, and it's used for programming robots!
I don't understand why people believe 'everyone' should be able to write code without much trouble. Virtually every activity requires focus and exercise to learn. Example: you can't just take a hammer and start building furniture or you will create a mess. If you want something nice and useful, you need to think about what you want to build, how you are going to construct it, which materials you need, which tools you need. You need to experience how the materials behave, try out certain subconstructions, research what specialized tools exist. I believe this idea that 'everyone' should be able to code is like expecting everyone to be able to build furniture. If you want to, you can learn how to build furniture, but it's not easy and will never be easy. Why would coding be any different?
I believe one way to improve on the excessive need for precision (most of us can agree it is excessive) is that our current tools don't use contextual information enough. Humans deal a lot in uncertain areas, and context is what helps us.
Love it or hate it, Meteor has empowered a number of sales-types and small-business owners to create real tools to solve their own real problems.
I've seen it happen in front of my own eyes!
I omitted: ...and create so many new problems there will be years of work in it to clean the mess up.
Been in "IT" for a few decades, I've seen it happen in front of my own eyes time and time again.
The only self-help tool that has lasted so far has been the spreadsheet, and even that has gone horribly off the rails in many companies.
1 reply →
The wrong people are working on this problem. The only ones who think about it are typically programmers who have made their peace with the machines as they are today. The interface of code seems simple and logical with little reason to try to improve on it. It would take a team of artists, musicians, human factors engineers, ethnographers and some clever computer scientists to do it. Such an enterprise would be high risk and very difficult to fund because of it.
The precision required in programming makes it hard for the right brained person who won't meet the computer at least part of the way.
I'm reminded of rms's anecdote about secretarial staff learning to write Emacs macros at MIT because they didn't realise it was programming.
This suggests that current, important and well-meaning attempts to get non-programmers to meet code head-on as code may be misguided. Programming is generally easier if you're not thinking about how much it isn't something you do.
I believe there is no such thing as a right-brained person. I think this is a cultural myth, same as "you only use 10% of your brain's power". These and similar claims are false memes; in other words, hokum.
1 reply →
Precision is not the only important element in programming. There's a level of abstraction, such as found in the field of semiotics, that is highly important to the world of computer science.
Not likely.
I've seen non-developers try to write specs in whichever format they like: Word, Excel, drawings, handwritten notes, speech, mockup tools, anything. They decide exactly how they want to express their idea without any constraints. And yet, they always fail.
There are always too many edge cases they do not think of. They only cover the "happy path", and quite often not even that. Just take the email conversation from the article as an example: they didn't even touch the subject of implementation, and it was already gibberish, even for a developer. You need someone to actually sit down and squint their eyes over something, do research, and run some test cases for a few hours before these emerge. Once you start doing this, you are already by definition a software developer.
Taken to the extreme, could you not consider raising a child the ultimate programming exercise for humans?
Perhaps the "right-brained" are already very good at programming other people, working with faulty, non-deterministic, somewhat chaotic computing environments where "left-brained" patterns of software development fall short....
AI that takes natural language as input as spits out binaries for you =P
"something incredible will emerge for the right-brained bourgeoisie and literati."
Yes, a real quantum computer. As long as we're dealing with 1's and 0's, there's an insurmountable barrier for those who would get creative with computing.
I've always wanted to attempt this piece: to take all the many layers of abstraction that we deal with, parse them, convert them, and render them through my formidable linguistic talents into one elegant, beautifully constructed piece of prose that magically makes it all comprehensible to lay readers. I haven't yet attempted it, but I give props to Mr. Ford for trying. I'm not surprised he ended up with a novella.
Oh, and why does bloomberg.com want to use my web cam?
Worth noting -- it is roughly 38k words and is the longest piece ever published by Bloomberg.
A quick Google search showed that about 300 words per minute is an average reading pace for an adult. I'm a slow reader, so I'm probably right around there. So that's ~127 minutes to read all of this, not including time spent playing with the great animations. Probably better for me to get a bit more work done before I tackle the rest of this one (I've only read section 1 so far).
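The arithmetic, as a one-liner (using the ~38,000-word figure mentioned elsewhere in the thread):

```javascript
const words = 38000;          // approximate length of the article
const wordsPerMinute = 300;   // average adult reading pace

console.log(Math.round(words / wordsPerMinute)); // 127 minutes
```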
Yes, it will tell you that at the end, and mock you if you arrive there too quickly to have read it all :).
5 replies →
> Oh, and why does bloomberg.com want to use my web cam?
To capture your photo in the certificate of completion.
> I've always wanted to attempt this piece: to take all the many layers of abstraction that we deal with, parse them, convert them, and render them through my formidable linguistic talents into one elegant, beautifully constructed piece of prose that magically makes it all comprehensible to lay readers.
This is my own, personal, incomplete work in progress in that vein: http://www.leannotes.com/
I think I like your approach better... and no Clippy.
I agree with this and below comments. It took... considerable dedication... to keep reading it all the way to the end. I can't even comment on it directly as it was all over the place but with a nice integration/flow of the sub-topics. All I can call it is an experience lol.
I downloaded my certificate. Might put it on my resume. Evidence of willpower if nothing else. :)
That was an interesting piece of journalism. I think, though, that more and more VPs and SVPs actually do know what 'code' is. While there was a time when it was all mysterious to upper management, we're rapidly seeing senior positions (up to CEO, even) filled by folks who get it. What new coders don't always get, though, is that there aren't a lot of new problems in computer science or in developing code. Sure, people have tried different methodologies to get around the problems in different ways, but the problems are the same (how do I test, how do I track defects, how do I schedule, etc.). This is good and bad: good because companies get more effective at getting things done, and bad because it can lead to some tension if you can't explain in a quantifiable way why the new way is better than the old way.
What an ambitious and beautiful piece!
A story like this is probably dangerous: it touches on so many ideas that everyone will find something to gripe about, and it's hard to make a comprehensive and consistent story.
The last time I read something that so awesomely bridged high level abstractions and low-level implementations with a human touch was Godel, Escher, Bach (albeit with a very different feel). Well done.
Interesting, I wouldn't put this anywhere near the level of Godel, Escher, Bach.
If you've read GEB and do software development, what actually did you find interesting and beautiful in the article? I've just finished reading the whole piece and I don't think I've learned much at all. Presumably as someone who already knows programming and theoretical CS I'm not the target audience, but then I'm also surprised why this has so many upvotes here on HN.
This article is a sociology of code, not a technical manual. You absolutely could get a lot out of it.
I get the GP's point about this being reminiscent of GEB, not in the sense that it covers the same topics or is at the same 'level', but in that it describes an intangible idea by approaching it from different angles and describing that same core concept from the shadow it casts in different directions. In GEB that core concept of self-reference was tackled from multiple perspectives so that an image of this common theme emerges as you read these different views onto it. Similarly, this article tries to conjure an image of 'code' as a cultural artifact, by portraying the shapes it casts in different directions - on the people who create it, the people who have to fund it, the tools and artifacts it generates. And it does so, like GEB, with wit and intelligence.
He does mention a "golden braid forever weaving", which is reminiscent of: "Gödel, Escher, Bach. An Eternal Golden Braid"
or 'a mind forever voyaging'.
It's 2015. The audience of this article shouldn't even exist. The reader, as described in the article, is a VP who has so little understanding about what it is his company does, that the only meaningful abstraction he can mentally picture is that of his employees "burning barrels of money".
Imagine an auto company VP who says "I don't know anything about engines and drivetrains and all that technical stuff. All I know is that when you guys are in a meeting talking about your variable valve timing system, all I smell is money burning!"
That would not be acceptable. Yet, here we are, over 30 years after the original IBM PC was released, and there's still a corner-office audience for "what is a computer?"
You're living in a pretty isolated world, my friend. I'd say 90% of my friends would be the audience for this. I've been coding 20 years, most of my friends are successful, grad-degree educated people in a variety of fields, and some of them are even my coworkers.
Who is supposed to teach people what code is? Our schools? Who with a CS degree and programming experience would willfully choose to teach in the USA's education system?
Or maybe the companies who make all their money from code? I think not- it wouldn't help the economic position of Apple, Google, FB, or Microsoft if everyone knew what code is and how it works. It strengthens the tech economy's stranglehold on society when code is treated as something inscrutable.
So there's really very few resources for people- even educated, successful, technically literate folk- to grok "what is code?"
Many coders would do well to read a similar article, if there was one, called, "What is Society?"
> Many coders would do well to read a similar article, if there was one, called, "What is Society?"
There was a thread a while back where software developers told me I was unreasonable to expect them to know who the vice president of the country they lived in was. I feel you here a whole bunch.
14 replies →
I think the parent made the observation that it is sad that, in our information society, people in positions of responsibility regarding ICT don't know the fundamentals of that society. At least, that's what I took from it, and I concur.
We, as a society, should have started integrating computational thinking (Wing, 2006) as a core competency in the k-12 curriculum from the late 1980s onwards. We didn't.
An anecdote:
In 1991, when I was in 5th grade, I saw the first computer enter the classroom in my primary school. It wasn't used except for some remedial mathematics training for a student or two, and I believe the teacher did a computer course with it.
In 2010 I became a high school computer science teacher. There were three computer rooms (about 30 computers each) for the whole school (of about 1500 students), running Windows XP and IE 6. Besides my class, the computer rooms were mostly used for making reports and "searching for information". Some departments did have specialized software installed (most of which came with the textbooks), but used it sparingly at best. On top of that, this software was mostly simple, inflexible, non-interactive, non-collaborative, "pre-fab" instructional material. Often it was not much more than a "digitized" version of parts of the textbook with some animations, games, and procedural trainers mixed in.
7 replies →
You make a good, if harsh point.
But this:
"it wouldn't help the economic position of Apple, Google, FB, or Microsoft if everyone knew what code is and how it works. It strengthens the tech economy's stranglehold on society when code is treated as something inscrutable."
Is just cynical. Tech companies aren't so fragile as to depend on general ignorance among the human population. If more people understood code, these companies could create more code. I think you point to an underlying misconception that some how technology is just a barrier to entry, and doesn't provide intrinsic value. But I do not believe this to be true.
> Who is supposed to teach people what code is? Our schools? Who with a CS degree and programming experience would willfully choose to teach in the USA's education system?
Grade school teachers aren't expected to be specialists in the field they teach. They're expected to be specialists in teaching. Usually they have a bachelor's (or master's) in education and maybe another degree, but it may or may not be in the thing they teach (if they even teach only one thing at all).
To put it another way, this is a bit like asking what physicists would willingly teach in the USA's education system? The answer is obviously not very many, but that's beside the point. Physics still gets taught.
> Who is supposed to teach people what code is?
'But no one told me I needed to know this' stops being an excuse as of adulthood, if not earlier. Anyone who cares what coding is only needs curiosity and an internet connection, the web is full of introductory material for almost every level from almost every angle. If a professional in the current world doesn't know 'what coding is', they just don't care.
What do you mean? There's tons, TONS of resources for learning the basics of programming.
What? I imagine there are many, many auto manufacturing VP's who think, "I don't know anything about engines and drivetrains and all that technical stuff [– and I don't need to]."
Why should the VP of Human Resources need to know how a drivetrain works? Or the CTO? Or the CFO?
They are experts in their area of focus. It's ridiculous to expect every manager to understand everything about their business. Would you expect the CTO of Starbucks to be able to tell you how all of their drinks are made?
I wouldn't. And I wouldn't care if they could.
In general, a good executive doesn't need to know the minutiae. They need to know how to motivate people, how to keep projects on track, how to recognize talent, how to delegate, how to budget, how to distill information for other executives, etc.
Sure, knowing the minutiae usually helps. It's easier to sniff out all the BS people feed you, etc. But it's far from the most important knowledge and skills a great leader needs.
I've only skimmed the article so far, but the part that stuck out to me was this (technical manager talking to the VP): “My people are split on platform,” he continues. “Some want to use Drupal 7 and make it work with Magento—which is still PHP.” He frowns. “The other option is just doing the back end in Node.js with Backbone in front.”
Now, that's an example of a terrible trait for an executive. TMitTB clearly has very little ability to communicate with people outside of his area of expertise. The ability to convey complex ideas simply is crucial. Why would a non-technical executive care about the framework you're using? That's asinine. Worrying about the implementation is TMitTB's job. When meeting with the VP, TMitTB should talk about the business impact of options. This option is cheapest but doesn't give us these features that the marketing department says they must have. This option is best, but it's much more expensive to hire developers with those skills right now.
"When meeting with the VP, TMitTB should talk about the business impact of options."
I don't think he even knows what the business impacts are. TMitTB is just a tech guy who works for the new CTO. Presumably, the CTO (being an executive in charge of technology) can speak both the language of tech and the language of business and could make a business case to the VP, in terms he understands, as to why the company needs the new software. The CTO should not have sent her tech guy to talk to the VP.
This is the only valid answer, here. Why is the guy not talking in terms of technical debt, business impact, KPIs etc.?
If he really knows his shit, he should be expected to break his technical insight down into layman-friendly terms.
Hell, we're expecting just that from our doctors all the time.
> there's still a corner-office audience for "what is a computer?"
There's a technical audience for "what is sales?", and that's thousands of years old. Generalists, especially good generalists, are rare.
There are always going to be people like this, in any field. TSR (of Dungeons & Dragons fame) actually had a CEO who forbade her employees from playtesting their products during work hours, calling it "playing games on company time." http://1d4chan.org/wiki/Lorraine_Williams
Dear God, what a disconnect. How does someone with that line of thinking even get a job in the gaming industry?
Then again, I'm reminded of a boss I used to have who would ask me to fix the shipping calculator on our web server, then ten minutes later he would poke his head in the door and tell me to "quit playing on the goddamn computer and get some work done!" He didn't realize that "working on the web server" is done at a workstation, not physically taking the server (remotely hosted of course) apart and putting it back together. All he knew was the customers were complaining about the shopping cart module not calculating shipping correctly.
Yet, they do: many VP's in this position are promoted from parts of the company that have nothing to do with tech. I even wonder if this is the majority. Not sure. A huge chunk of VP's out there, though.
It's got to be the majority, since most non-tech companies (i.e., the vast majority of companies in the world) only use tech as a tool - it's not the focus of their business. The executives of banks, retail businesses, airlines, etc. are not likely to have been promoted from the IT Department.
> auto company VP
The VP is not in charge of a software company. Presumably some sort of widget/manufacturing operation ("cycle reduction"), so it is unfair to accuse him of not knowing what is going on in IT (at the level of engines/drivetrains for an auto company).
This comment falls neatly into the category of my favorite trendy term of 2014/2015, the "hot take". No matter how informative or well-intentioned a piece of writing may be, some people will just immediately go looking for something wrong with it, reason or authorial intention be damned.
Maybe instead of ridiculing the majority of people who don't understand what we do, we should celebrate a piece like this that makes an effort to educate. Maybe instead of lamenting their ignorance, we should commend the VPs and everyone else with enough curiosity about code to make it through this behemoth of an article.
It's perfectly acceptable. Business optimises for least-knowledge-you-can-get-away-with.
Three generations of my dad's side of the family were in the print industry: everything from printing Vogue and Playboy to fancy art books to dull but well-paid corporate stuff (annual reports, mergers and stock issue documents—500 page books that the SEC make you print filled with legalese that nobody reads).
Most people working in big print companies know nothing about print. They don't know about how paper works or how ink works. They have no understanding of how colour works or why you can't print certain colours on certain materials, or how long certain types of print work takes. Not at the junior level and not at the management level.
Hell, if you took half the people in a big print management company and asked them to explain the basics of offset printing, they couldn't give you a "lead paragraph of Wikipedia"-level description. And that technology has been around since 1875.
For all but a small set of technical and management roles, a lot of businesses are far less interested in technical know-how than "soft skills". In a shocking number of places, the ability to build a tower out of rolled up newspaper and sticky tape in a team building exercise is valued over an ability to know the details of how the industry or its core technologies work.
I think you're taking their conceit of a VP audience too seriously. To me that just feels like a fun narrative choice, and the article is just intended for any adult dealing with and slightly bewildered by code.
A lot of people find themselves in this role, people who specialize in other fields but still need to interact with developers because software is eating the world, but more quickly it's eating their role.
Software doesn't only happen at software companies.
Actually, I've met a lot of programmers who are deeply resistant to learning anything about the core business of the firm they work for, especially if it can't be automated or resists automation because of strongly established legacy practices, as well as the tech short-sightedness that so often prevails: e.g., telling the putative listener they won't need $85,000 in Oracle licensing any more, oblivious to the fact that such licenses involve periodic contractual obligations.
> Imagine an auto company VP who says "I don't know anything about engines and drivetrains and all that technical stuff. All I know is that when you guys are in a meeting talking about your variable valve timing system, all I smell is money burning!"
His job isn't to know about drivetrains and engines and stuff. His job is to manage products, cashflow, audit requirements, stock market regulations, and marketing. That's the stuff he has to deal with day to day.
> what it is his company does
The company sells products. The website with a shopping cart is not the primary purpose of the company.
The company may in fact sell automobiles.
Most VP's do not sell computers (or even software). They sell goods and services that depend on computers. What you're saying is more like "a VP at UPS or FedEx should understand everything about engines." That's ridiculous, you don't need to understand how trucks work to know they move goods from point A to point B.
Writing software is not what (many) companies do. Companies exist to create things of value for their clients. Software is one aspect of the secondary functions that enhance this value creation.
In the article, the VP worked at a company that sold things on the internet. The things they produced were their primary function; internet platform development is a secondary function, much like marketing, hiring, business development etc etc.
So the theoretical VP's core competency should not have been software dev, or even technology - that's what CTOs and technical leads are for. In fact his core competency probably wasn't product development anymore, if it ever was. He was a manager, and his role was managing resources within the company to optimise their primary function.
Hence the disconnect, and hence why articles like this (and audiences for them) exist.
It's a caricature, much like the pointy haired boss in Dilbert. The reader is supposed to think "at least I'm not as dumb as that guy".
Also, the whole article is written as if it were 1997. The graphic design is rather HotWired-like, in a world where they had the tools we have now.
This article doesn't attempt to answer the dull, dry, boring, well-understood question "What is code?" in the sense of "What is it that computers use to make them do what they do?". It tries to answer the much more interesting, intangible, existential question "what is code?", in the sense of what does it mean for the world to depend on a culturally isolated priesthood of technologists who control what computers do? How does that work?
I've seen a lot of 50something execs who have been passed by on the technical superhighway.
I met a CIO of a large insurance company who didn't know what Python was.
I know a senior banker and educator who struggles with the basics of Excel and Word. (Yes, there are still people out there who are used to the days when secretaries did all the work.)
I knew an IT exec at a large consumer products company who didn't like using computers at all.
I work at a software company where no one likes computers either.
It should make sense that this article and its VP exist. There are 7 billion people in the world, and only 11 million code professionally, with another 7 million coding as hobbyists, according to the author of the article and the IDC.
After auditing the software of at least 20 different startups, I'd have to say there will always be people in positions of power who know nothing. Just look at our politicians.
I thought the imaginary VP was pretty clearly not working at a software company. What if random companies suddenly needed to design and build their own automobiles in house? There would be a similar lack of comprehension about just what the automotive engineers were actually doing all day.
At least, I know I would be lost and would greatly appreciate a guide like this.
"It's 2015. The audience of this article shouldn't even exist."
Well, the audience most certainly does exist whether it should or not. The thing to think about is what to do about that.
> Yet, here we are, over 30 years after the original IBM PC was released, and there's still a corner-office audience for "what is a computer?"
As someone who got their start on DOS and BASIC back in the 80s, I say you raise a pretty good point. There are so few languages depicted in this web brochure that it does not illuminate anything.
So many people stopped learning in the 80s that the 50s are starting to catch up with them again.
This is the stuff that cardboard box forts are made of.
Source for accompanying interactives - https://github.com/BloombergMedia/whatiscode
https://github.com/BloombergMedia/whatiscode/blob/master/scr... which outputs: https://dl.dropboxusercontent.com/s/fbppsat45fdzg25/Screensh...
I think of Minecraft as a visual representation of a database. Every block you see has a set of values, starting with the 3 that determine its location within the world (coordinates) and extending to include block type, which determines other values.
And the same goes for the open spaces. Minecraft reminds you that a block can occupy any space. Indeed, an open space is a set of blocks whose block type is "open", which makes them both transparent to light from neighboring blocks and non-blocking to player movement.
Most games are essentially just massive databases of pretend stuff with an enjoyable alternative to SQL as the interface.
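A toy sketch of that "world as a database" idea, with made-up names for illustration: each block is a record keyed by its coordinates, and "open" is just another block type that happens to mean empty space.

```javascript
// Hypothetical sketch of blocks-as-records: not how Minecraft actually
// stores its world, just the database analogy made concrete.
const world = new Map();
const key = (x, y, z) => `${x},${y},${z}`;

function setBlock(x, y, z, type) {
  // Each block is a row: coordinates plus a block type.
  world.set(key(x, y, z), { x, y, z, type });
}

function getBlock(x, y, z) {
  // Positions never written to default to open space.
  return world.get(key(x, y, z)) ?? { x, y, z, type: "open" };
}

function isPassable(x, y, z) {
  // "Open" blocks don't obstruct player movement.
  return getBlock(x, y, z).type === "open";
}
```

Queries like "can the player walk here?" then become simple lookups against the store, which is the sense in which the game plays like a friendlier interface to a database.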
I love how the page calls you out for skimming it instead of reading it.
Huh, is this what caused the whole page to go pixelated and ask for permission to use my camera each time I changed orientation on my tablet? I had to reload the page each time this happened and scroll down to where I was before. Very annoying.
I scrolled. I admit it. :)
I skimmed and didn't get mocked. I really am a slow reader :-P
Computer don't hurt me, don't hurt me, no more
The lyrics to the original actually make sense for this altered version:
Oh, I don't know why you're not there
I give you my love, but you don't care
So what is right and what is wrong
Gimme a sign
---
Oh, I don't know, what can I do
What else can I say, it's up to you
I know we're one, just me and you
I can't go on
source: http://www.lyricsondemand.com/onehitwonders/whatislovelyrics...
I'm happy to see that I'm not the only one singing this while reading the header XD
It's easy: the code is that part of the computer which can't be grabbed and slammed but only cursed.
It's easy to me: the code is the part of the computer which can't be grabbed or slammed but only crushed.
That's why I crush it. I crush code.
I did not expect this good of an article on this subject from a business publication. Well done.
Bloomberg is mainly a technology company.
We make the vast majority of our money from selling our software subscriptions and have ~4k employees in R&D. Depending on what data you use[1], if we were a public company we'd be the 4th largest in the world by revenue.
[1]: http://en.wikipedia.org/wiki/List_of_the_largest_software_co...
(My intention is not to bikeshed over who is or isn't in the "Software & Programming" industry or specific ranking, but to convey a sense of scale)
Time to follow @ftrain on Twitter then.
"C is a simple language, simple like a shotgun that can blow off your foot... Think of C as sort of a plain-spoken grandfather who grew up trapping beavers and served in several wars but can still do 50 pullups." Most certainly.
Ditto.
I kind of like my answer better: http://qr.ae/7NEnT9
The whole post is just a stream of consciousness brain dump that a layman would never understand. I believe it's possible to explain these things without circular reasoning.
"What are the major programming languages, and what are they used for?" is a different question than "Can you tell me what code is?" The second can include the first. But if your and Paul Ford's answers were swapped, both questioners would have good reason to say, "That's helpful but not what I was looking for." To extend your kitchen metaphor, you haven't mentioned anything about why or how your instructions would ever work. You've left out the compiler, interpreter, executor: the human.
Granted, I think Ford expanded the domain of his question a little further than he needed to, for the sake of what looks to be fun. And I think he occasionally picks a piece of jargon where a clearer, more ordinary word would have done just as well—though I'm not in the mood to dive back into the article to find a case.
In my opinion, this is what society today needs. I don't feel like we need everyone to be able to code, but rather just have a sense on some level that computers are nothing mysterious or magical, unconquerable or incomprehensible, but rather just machines of human creation.
Not sure if we were reading the same article or not, but this is not "what society needs." While there is a part of me that loves computing and wants to share it with the world, I also realize that the nuances of compilation or futures are entirely inside baseball and irrelevant to the vast majority of society at large.
The computer, for most people, is a tool. A means to an entirely unrelated end.
Holy CPU time! That site consumes 100% of my CPU (presumably 100% of one core) whenever it is in the front tab (Firefox/OS X).
Anyone else experiencing that or is it just my laptop running wild?
You'll see similar resource consumption when using event listeners tied to mouse movement. It's generally not noticed by the general populace, but it gives every developer pause. The page does seem to struggle at times.
Very informative, thanks. In native-app land we can listen for mouse motion, but it is more CPU-friendly to have a timer and poll the mouse position periodically, particularly if the location of the cursor causes further processing (like working out what data to display in a popup hint). The good thing with mouseEnter / mouseLeave is that you can stop the timer and only restart polling when the cursor enters again.
Is there a way of doing this on web pages or is it really still just callbacks for mouse motion?
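On the web it really is still callbacks underneath, but a common pattern gets you polling-like CPU behavior: let the mousemove callback do nothing but record the position, and do the real work at most once per frame. A sketch (function names are mine; in a browser the scheduler would be `requestAnimationFrame`):

```javascript
// Hypothetical sketch: batch bursts of mousemove events into one unit of
// expensive work per scheduled "frame". `schedule` is injected so it can
// be requestAnimationFrame in a browser or a stub elsewhere.
function makeMouseSampler(doExpensiveWork, schedule) {
  let lastX = 0;
  let lastY = 0;
  let scheduled = false;

  function flush() {
    scheduled = false;
    doExpensiveWork(lastX, lastY); // runs once per frame, with the latest position
  }

  // Attach this as the mousemove handler; it only stores coordinates.
  return function onMove(x, y) {
    lastX = x;
    lastY = y;
    if (!scheduled) {
      scheduled = true;
      schedule(flush);
    }
  };
}
```

In a page you'd wire it up roughly as `document.addEventListener("mousemove", e => onMove(e.clientX, e.clientY))` with `requestAnimationFrame` as the scheduler, so a hundred movement events per frame still trigger only one round of hit-testing or popup-hint work.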
It's like Myspace and Tumblr barfed all over Businessweek. I opened the article in Firefox with the mobile-emulation feature turned on. Because good god almighty this thing is a trainwreck otherwise.
I've heard tales of AdBlock Plus consuming huge amounts of CPU, especially on large and complex pages.
I'm only using Ghostery. Disabling it doesn't help though.
Interesting, I was able to read the article just fine on my 2011 Kobo touch. 800 MHz ARM Cortex A8 and whatever Webkit was around in early 2011. The border animations are off but all the text and plain images work.
Whatever effects they're running, they did an impressive job with graceful degradation.
Same here on my 2015 MBPr.
You need to Konami code this bad-boy.
wow. heh. strange world we live in when the result of that is on bloomberg. What a time to be alive.
Their 404 page gif is still their best work, but this is a close second.
"That’s how change enters into this world. Slowly at first, then on the front page of Hacker News."
How meta.
I didn't really understand that part. Care to explain?
Aggregate sites like Hacker News and Reddit make distribution of news and ideas very quick and viral compared to more organic growth such as word of mouth and google.
I had no idea they made an hour long educational video on windows 95 with the cast of Friends! That is awesomely 90's. This is a really cool write up, clearly a lot of work went into it
Imagine a world where everyone has their own social version of a Github page instead of a Facebook wall.
Imagine a world where Wikipedia isn't an encyclopedia, but a crowd-sourced collection of all code meticulously indexed and documented that could be written for one language.
Imagine there's no heaven (It's easy if you try)
Imagine a world where everyone read and writes.
This is the best write-up explaining software I have ever seen. Wow.
Smash the patriarchy! Check the console.
http://www.bloomberg.com/graphics/2015-paul-ford-what-is-cod...
> You know what, though? Cobol has a great data-description language. If you spend a lot of time formatting dates and currency, and so forth, it’s got you. (If you’re curious, search for “Cobol Picture clause.”)
https://www.google.com/search?q=%E2%80%9CCobol+Picture+claus...
What am I supposed to be looking at here?
Superb writing. I wish all professional writers could write this well.
Paul Ford really is in a class all by himself. Everything I've read by him is truly wonderful.
As a writer, it's both inspiring ("look how amazing nerdy non-fiction can be!") and soul-crushing ("look how much better someone else is at writing!"). I try to focus on the former, but, man, he really makes the rest of us look like Celine Dion showing up at your dive bar's shitty karaoke night.
Celine Dion would be booed out of my local dive's karaoke night.
tldr; watch the video: http://www.bloomberg.com/news/videos/2015-06-10/invisible-co...
What a strange, long, rambling novella on programming languages.
I like the idea, but is there really no way to mute the audio? Sadly I did not finish the article because of that.
"is there really no way to mute the audio?"
There are so many ways to mute my computer, I wouldn't even know how to list them all. But at the top of the list I would start with the volume keys on the keyboard, then the volume panel in the menu bar, then the audio panel of the system preferences. Towards the far end of the list, I would cut the wires to the speakers.
Very funny... I currently have other audio playing on my computer that I don't want to stop.
If you use chrome, you might enjoy this flag
chrome://flags/#enable-tab-audio-muting
There's audio? I should have unmuted my laptop.
What audio? There shouldn't be any sound unless you activate the konami code easter egg.
Well, now you've got me investigating. There seems to be a video in the article. However, it is only displayed as a white box in my browser (Safari), without any controls. I can't stop it; I don't even know it's there, except that it's playing audio. Strange...
A couple parts of it remind me a lot of JBQ's post on "dizzying but invisible depth": https://plus.google.com/+JeanBaptisteQueru/posts/dfydM2Cnepe
Part of me looks at this and thinks, "This is preaching to the choir"...because while the engineer in me appreciates all the layers and explorations...It must be incredibly bewildering to anyone who is not a coder, which is the ostensible audience given that the story starts off with, 'We are here because the editor of this magazine asked me, “Can you tell me what code is?”'
But then I see the interactive circuit simulation and think, "Fuck it, who cares, this is awesome!" Designing circuits is one of those things that, if I were a self-taught coder instead of a comp. eng. major, I would never have delved into... yet learning how to build an adder circuit and gaining an appreciation of the most basic building block of computation (and how surprisingly complex it is to just add 1s and 0s) is a profound lesson that I think is essential for me, personally, to really grok programming. All the sections about culture and conferences and so on are a little bit off-field for me... it's not that I think code and life and human thought and behavior aren't intertwined. * I just think the discussion about conferences reads as if the author doesn't realize that all disciplines spawn conferences and conference culture. There's nothing particularly unique about code conferences. Not the sexism, not even the nerdiness.
I would love to see the OP's editor respond in a not-quite-as-lengthy essay. What did they learn about code after reading the piece that they didn't understand before?
edit: * I'm emphatically not arguing "Oh but everyone does conferences shittily so tech conferences shouldn't be shamed". Just that having it in this "What is Code" essay makes it seem as if it's a notable "feature" of programming...but that understates the problem by an order of magnitude. Sadly, it's a feature in most every discipline, and the inherent feature is the gender imbalance, not the topic of the conference.
edit: Also, I wished that the section on Debugging was much higher than it is...Robert Read's "How to be a Programmer" [1] makes it the first skill, and that's about the right spot for it in the hierarchy of things. Maybe it gets overlooked because it has the connotation of something you do after you've fucked up. But, besides the fact that programming is almost inherently about fucking up, the skill of debugging really underscores the deterministic, logical nature of programming, the idea that if we have to, we can trace things down to the bit to know exactly what has been fucked up in even the most complex of programs. And that's an incredibly powerful feature of programming...and not very well-emphasized to most non-coders.
[1] http://samizdat.mines.edu/howto/HowToBeAProgrammer.html
Not to your main point, but the circuit simulation reminds me of Silon by SLaks: http://silon.slaks.net/
Edit: Also, as a late-bloomer and self-taught (self-teaching) programmer, I am on the other side of the paradigm you're talking about. Petzold's Code is one of the first books a self-taught programmer should pick up. It is an awesome introduction.
One of the few worthy things I felt I got out of school was the moment I grokked the whole stack from sequential logic to the program counter and control logic from a cpu, how each clock tick formed a new circuit. That was really mentally expanding. I got it from reading a prescribed book for a class I wasn't taking from a professor who was a tool, so it is possible to learn these things outside of class. In fact, that's where the real learning, IMO, happens.
What book? For those of us not there yet :)
One of the most memorable weeks in my Engineering degree was using Cadence to build a CPU from the ground up. Every transistor, every connection, the ALU, etc was laid down by someone in our little group of students, and then wired together to make a thing with a few thousand transistors. And it friggin worked.
It also showed how the chip itself would be laid out, where the dopants would be and such.
> Part of me looks at this and thinks, "This is preaching to the choir"...because while the engineer in me appreciates all the layers and explorations...It must be incredibly bewildering to anyone who is not a coder, which is the ostensible audience given that the story starts off with, 'We are here because the editor of this magazine asked me, “Can you tell me what code is?”'
I completely agree. I got a third of the way through it before I just couldn't stand the obfuscation and decoration any further.
What's sad (as I [tweeted][1]) was that there's a 1972 article by Stewart Brand, published in Rolling Stone of all places, that does a better job of actually explaining what computers can do, without resorting to jargon and jive: http://stuartpb.github.io/spacewar-article/spacewar.html
[1]: https://twitter.com/stuartpb/status/609035295002984448
baby don't hurt me baby don't hurt me no more
For reference: What is love by Haddaway https://www.youtube.com/watch?v=K5G1FmU-ldg
Created an account just to reply to this.
Was the bit about PHP standing for Personal Home Page a joke? I always thought it was "PHP Hypertext Preprocessor". The coolest thing about PHP is the infinite recursion in its name!
https://en.wikipedia.org/wiki/PHP#Release_history It was PHP 1.0, then PHP/FI 2.0, then became the PHP Hypertext Preprocessor at PHP 3.0 I believe.
Nope, it wasn't. The recursive acronym was tacked on later (probably still before 1.0).
The article ended up just yanking my Firefox session to somewhere in the middle of the page, followed by some Clippy expy nagging me about how fast I'm supposedly reading the article.
I think of it as a beautiful, colourful, crystalline structure.
Very interesting read... I enjoyed it.
One thing I noticed though is that the author is definitely stuck in the old "Microsoft is the great Satan" mindset. If he ever finds out about all the open-source stuff MS is doing these days under Satya Nadella, I think his head would probably explode.
He doesn't know what to say to a C# developer (nothing in common), but automatically trusts a Python developer? Really? sigh
I like Paul Ford -- first thing that made me subscribe to his Medium feed was this piece about brief, remembering and old computers: https://medium.com/message/networks-without-networks-7644933...
Anyone know what happens when you allow bloomberg.com to use your camera when you finish reading the article?
It adds a photo of you to the certificate, and then allows you to download it.
Can't be good, if Bloomberg is for it
Do you consider a photo for your certificate of achievement to be good?
The scroll performance was bothering me so much I had to add transform: translateZ(0); to the #background-canvas element of the page to stop the screen from repainting on every fking scroll, so I could continue to read in peace without my eyes bleeding. Great article though :)
I had to switch to view/source to read the article. Halfway through there was a shopping cart on wheels obstructing the text (ironic).
<div class="videoWrapper">
<script src='//cdn.gotraffic.net/projector/latest/bplayer.js'>BPlayer(null, {"id":"P4_i7PihRGiWcPh3gdNMhg","htmlChildId":"bbg-video-player-P4_i7PihRGiWcPh3gdNMhg","serverUrl":"http://www.bloomberg.com/api/embed","idType":"BMMR","autopla...
</div>
Also - I have no CPU activity at all, so presumably some plugins that are running for others, aren't being executed in my copy of chrome.
Sounds like a better experience than on Firefox, which fails to load anything, even text, past the first video.
> Smalltalk’s history is often described as slightly tragic, because many of its best ideas never permeated the culture of code. But it’s still around, still has users, and anyone can use Squeak or Pharo. Also—
> 1. Java is an object-oriented language, influenced by C++, that runs on a virtual machine (just like Smalltalk).
> 2. Objective-C, per its name, jammed C and Smalltalk together with no apologies.
> 3. C# (pronounced “C sharp”) is based on C and influenced by Java, but it was created by Microsoft for use in its .NET framework.
> 4. C++ is an object-oriented version of C, although its roots are more in Simula.
> The number of digital things conceived in 1972 that are still under regular discussion is quite small. (It was the year of The Godfather and Al Green’s Let’s Stay Together.) The world corrupts the pure vision of great ideas. I pronounce Smalltalk a raging, wild, global success.
Except that these examples are "object-oriented" in almost none of the ways Smalltalk was object-oriented: http://www.paulgraham.com/reesoo.html
The specious reasoning on display in this paragraph is almost offensive in its glib incomprehension. Calling Smalltalk "a raging, wild, global success" because modern programming languages call themselves "object-oriented" is like saying women in technology are well-represented because Ada Lovelace was the first programmer.
I get that it's supposed to be tongue-in-cheek, but like the rest of the writing in this article, it's supposed to be tongue-in-cheek in a way that gestures toward what the author actually thinks. In this case, what it's gesturing at is the notion that Smalltalk has had a large-scale tangible influence (if not wholesale adoption) on modern programming languages, which, if you actually take the time to understand the subject, is just not true.
The print edition has landed!
https://twitter.com/ftrain/status/609388625596416001
The Charlie Rose interview about this piece: http://charlierose.com/watch/60575137
> “No,” I said. “First of all, I’m not good at the math. I’m a programmer, yes, but I’m an East Coast programmer, not one of these serious platform people from the Bay Area.”
seriously?
Well, in the same article he says "A computer is a clock with benefits." So I believe that no, he isn't serious.
That jumped out at me too. It colored the remainder of my reading experience.
yeah ... trying to put it behind me. this has otherwise been a great high-level introduction to coding.
It's a joke.
http://twitter.com/whatiscode
I would have loved the new Safari mute features on that page....
How dare you pollute my ears with garbage without a mute option.
Fun article. Even to this day sometimes I look at the software we have all built and wonder:
> It’s amazing any of it works at all.
Resizing my browser window (Firefox 38) in the middle of reading causes a section of the story to loop infinitely.
Code is a demonstration; the hypotheses are the prerequisites. Actually, the test class is the proof!
very cool article
although to a layman I would try to answer "what is code" more simply: code is just instructions.
instructions for how to tie a windsor knot or cook a recipe or play a piano piece can be thought of as "code" executed by the human.
I viewed source to see how they did the custom skin and noticed this:
Shouldn't it be if(window.console)?
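For context, here is a hedged sketch of why that guard matters (hypothetical code; the page's actual source isn't reproduced in this thread). In old browsers without a console object (e.g., IE8 with dev tools closed), a bare `if (console)` throws a ReferenceError, whereas reading it as a property of the global object, or via `typeof`, safely yields `undefined`:

```javascript
// Hypothetical logging helper. Referencing an undeclared bare identifier
// throws a ReferenceError, but `typeof console` (or `window.console` in a
// browser) is safe even when `console` was never defined.
function safeLog(msg) {
  if (typeof console !== "undefined" && typeof console.log === "function") {
    console.log(msg);
    return true;   // logged
  }
  return false;    // console unavailable; fail quietly
}
```

In Node or any modern browser `safeLog` logs and returns true; in a console-less environment it would quietly return false instead of crashing the script.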
> It’s a comedy of ego, made possible by logic gates.
Anyone else find the easter egg ?
(..old school video game code)
What a cpu and memory hog that page is!
Many disclaimers:
1. This clearly took A LOT of work, and I have not finished reading it. I intend to, but as another comment calculated below, that will take around 127 minutes. This comment is simply about the beginning.
2. I'm not 100% certain yet what the intended goal of this article is, so I may just be off base. That being said, my criticisms should be interpreted more as questions, since I'm deeply fascinated with how to make programming more accessible. I hope they are taken as such, and that people share their experiences/successes/failures in getting people to understand "what we do". Again, like other commenters here, I have suffered the fate of parents not really understanding what I do (unlike even the superficial understanding people have of what a physicist does).
3. People learn differently; this is me pretending to not know anything and reading this article. It is thus flawed on two axes: I can't know for sure how I would have taken it in, and even if I did, it may be great for most people but bad for me.
All that being said, I had a few issues with this article('s beginning) if the goal is to make programming seem understandable to non-programmers. It seems to jump around a lot at the beginning and focus on just how complex everything is. If the goal is "programmers are justified in their work, look how complex everything they deal with is!", then this may be an OK approach. However, if the goal is to help them understand what we do day to day, it may not.
Some examples:
1. The early references to math. I once upon a time thought math was a pre-requisite to programming. I have now met enough awesome programmers that are absolute rubbish at math that I no longer believe that to be true. I believe referring to the "math" of things a lot scares people off (makes it seem like "one of those math things math people do" and inaccessible, when in reality your everyday programmer does not do a lot of (complex) math).
2. The early reference to circuits, compilation, and keyboard codes. This is a tremendous amount of scope that is unnecessary in my opinion, and just makes everything seem so obtuse. Showing keyboard codes goes a long way in conveying how much a computer does, but I feel it is very confusing in relation to programming. I don't deal with "keyboard codes". We could also get into, for example, the actual hardware and how even having to deal with debouncing a key is hard! But I think everyone would see why that isn't great for the (introduction of a) programming explanation.
3. The circuits, I believe, are pretty and let you do things interactively, but I have a hard time believing they convey any information to people not familiar with programming. No one knows what XOR means (which you can flip the gates to), and they just further the idea that code is this weird incantation we do; they put people in "awe" of programming more than they help them understand it.
Then again, I've been criticized for relying too heavily on analogy. My explanation would probably start with a lot of hand-waving: "let's tell the computer to get a sandwich, shall we?", then trying to get deeper bit by bit, etc. Others have probably tried this and failed, so I am genuinely curious if people walk away from this article feeling like they have a better understanding of things.
"If you’re old enough to remember DOS, you know what a command line is."
This is a joke right?
why is it a joke? it's true.
it's funny because it's true.
You may be surprised how many programmers don't use CLI. I have clients on windows machines that saw me running git commands in terminal and said "So glad we have a gui to manage this for us"
TLDR, good lord
I would try to explain it as levels of abstraction and how they extend beyond the computers that execute the code. You can go down through the levels of abstraction, 1 by 1, until the point is made rather than attempting to start from the bottom and work up.
So, for example, when talking to the non-technical executive, the first level of abstraction is the technical expert who tries to explain complex technical issues. Below that, there might be a technical management layer that deals with technical issues on a more granular level but still isn't looking at the code. Below that are the actual developers who are writing code and are concerned with the actual logic the computer is executing. Below that are the framework authors who abstract away the common parts of writing an application of a certain type. Below that are the language platform authors who write compilers or interpreters that translate the code typed by the programmers into a format that either the computer or a lower-level abstraction (LLVM, etc.) deals with. At this point, it's probably not necessary to go any lower, but you can go all the way down to the CPU/machine architecture level, if necessary.
The key point is that even highly technical people have to trust the layers of abstraction below the point where they have full understanding. I've been coding for over 20 years and I still only have a cursory understanding of how my compiler translates the code I write into machine code, let alone how the actual hardware runs that code. I took EE courses in college and understand the theory, but the implementation by the folks at Intel and other hardware vendors is opaque to me, and I'm forced to trust that it works.
The coders employed by your company may be able to dig into framework code, but the chances are that they're fully trusting the runtimes that they work with. That trust may be the result of a well-earned reputation or through testing that the claims made by the language runtime are empirically true, but it's still trusting something that they're unequipped to verify themselves. This need to trust bubbles all the way up to senior management. The systems are just too complex for anyone concerned with the finished product to understand the whole picture.
That means that, as an executive, you're likely trusting your senior technical leadership. The only way you avoid doing that is to dig in and better understand the abstraction layer they're providing. You can also make that trust easier by doing the same sorts of things that a coder does with their language runtime: give tasks to your abstraction layer and test whether they're completed successfully. And, when those tasks are not completed successfully, don't accept techno-babble responses; dig in to understand where and why things broke down. Likely, the chain of trust of those abstraction layers was broken at some point; figure out where that point was so you can prevent it from happening again.
Every abstraction layer adds uncertainty to the system. A CPU engineer can tell you how long a small task will take to within a ns or so. A compiler engineer can tell you how many CPU cycles an expression will result in and compute an approximate time for a given processor to within microseconds. And it continues as you go up the chain until you're talking to senior management, and they're giving you SWAGs with a margin of error of months. Understanding this goes a long way toward explaining the behaviors that are so confusing to the non-technical executive. It's intimidating, but the good news is that many of the skills of a good manager are what's necessary to achieve the necessary level of demystification. The way that you begin to understand these layers of abstraction is through inquiry. Ask the right questions and, over time, you'll understand more and more of how software development happens.
awesome
Ignoring the content, the structure of this article is amazing. It feels like an entire magazine in a single essay. The background animations that change as you scroll, the contextual content (try scrolling really fast). I'm not even all that keen on the bright oversaturated aesthetic, but it's just so cool. I'd love to see a short piece on how they made it.
Have you seen their error pages?
http://www.bloomberg.com/lookathis
Also this one http://www.bloomberg.com/500
I was rather disappointed. I wanted someone to point a finger and yell HAAAAXXXXX!!!
It is, in fact, an entire magazine in one essay. It's the only thing in the most recent issue of BW.
Intro articles like this do a lot to reveal biases and misunderstandings. Like with Java.
The article says "Java= enterprise" but I can tell you the best user experiences I ever saw delivered over the web were those done with Java Web Start (not applets- applications launched in a JVM from the web). I developed several in the day that continued to run for years- because users loved them and they were safe and secure.
Why Web Start didn't take over, I have no idea. It was also a superb platform for mobile delivery.
> Intro articles like this do a lot to reveal biases and misunderstandings.
This is one of the reasons I barely recommend any intro articles in Lean Notes (http://www.leannotes.com/): almost every single one is just a stream of incomplete and incorrect statements about how the world works, based on the author's myopic personal experiences.
Rather than properly generalizing and consolidating what needs to be said to convey a full understanding of the topic, most intros settle for the first example they can think of that could be remotely construed as related to the words they've previously used for whatever subject, regardless of whether it has meaning in any context. (Example: saying that type safety prevents you from trying to "multiply seven by cats".)
It seems like a pretty Dunning-Kruger thing: the less broad your knowledge is, the more justified you feel in writing an introductory text to the field.
The only time I've ever seen somebody actually qualified to write an introductory text do so (that I can immediately recall) is Charles Petzold's [Code: The Hidden Language of Computer Hardware and Software][Code] (although I suspect, from the few excerpts I've seen, that Brian Kernighan's "D is for Digital" is good, too).
[Code]: http://www.amazon.com/Code-Language-Computer-Hardware-Softwa...
A couple parts of it remind me a lot of JBQ's post on "dizzying but invisible depth": https://plus.google.com/+JeanBaptisteQueru/posts/dfydM2Cnepe
I can't understand the rationale behind this gaudy redesign job that Bloomberg carried out. I just can't wrap my head around it. It just violates everything that I know about web design and usability for news/corporate websites/portals.
Maybe they were trying to pull off a Craigslist here but still I can't really stomach these changes.
This is not the main BBG website, but yeah, I don't like the new BBG home page design either. This post's design (it's a post from bbg/graphics) is actually really awesome, and its main purpose is to be stereotypically "nerdy".
Good article. Sub par web design.
Watch out when capitalists incessantly push people to learn coding. They're trying very hard to cut the costs of their input "materials" and they will do everything they can to devalue us in every way possible.
So, if you're a talented and competent dev, be super aggressive with these predators and take everything your hands can grab before they have the upper hand and show us their true colors.
Happy Coding!
This is a really ugly, selfish attitude. It's like opposing literacy because it will put pressure on jobs for those who can read and write. It's circling the wagons around people who had the privilege and opportunity to learn these things before everybody else.
Agreed, we need more Bloomberg articles that increase the supply of financial and negotiating skills.
> It's circling the wagons around people who had the privilege and opportunity to learn these things before everybody else.
The underprivileged Bloomberg readership.
http://www.forbes.com/sites/venkateshrao/2012/09/03/entrepre...
"...the balance of power between investors and entrepreneurs that marks the early, frontier days of a major technology wave (Moore’s Law and the Internet in this case) has fallen apart. Investors have won, and their dealings with the entrepreneur class now look far more like the dealings between management and labor (with overtones of parent/child and teacher/student). Those who are attracted to true entrepreneurship are figuring out new ways to work around the traditional investor class. The investor class in turn is struggling to deal with the unpleasant consequences of an outright victory..."
This sounds contradicted by the mathematical/empirical/quantitative observations. Startups built on texting two alphabetic characters are attracting million-dollar financing rounds. Startups based on texts that erase themselves after a timeout are declining 3-billion-dollar acquisition offers. VCs are trying to get deal flow by building a reputation for being the most helpful to entrepreneurs. Interest rates are at historic lows, and hundred-billion-dollar pension and mutual funds are pouring money into every 1st- to 3rd-tier VC to chase returns. tl;dr: this sounds like some bs.
I don't see that... or rather, if it's true here it's true to a significantly greater extent in most other industries.
Money continues to be available, and often lots of it. It's available on better terms than most others in most other professions can even imagine receiving.
To put it bluntly: in most industries you are meat and own nothing and never have any chance of owning anything. This has been the condition for nearly all human beings who have ever lived, today and in the past.
There are also more alternatives to VC today: larger angel rounds, crowd funding, etc. It's also easier to bootstrap since everything (but people) has fallen in price. Those two things together have made the funding environment more competitive for VCs -- they have to offer more value or compete at the higher end.
Let us keep the sacred arts secret, brothers.
EDIT: Just being sarcastic. You are clearly incompetent, coasting along in your job, and afraid of someone with 6 months of experience being better than you.
Seems to have worked well enough for doctors and lawyers. They're unionized (through the AMA and ABA), upper-class professionals who command much more respect from the general public than we do, and whereas our salaries tend to max out at around $150k, theirs can easily exceed $500k (in the case of medical specialists or law firm partners).
And you are an idiot.
Capitalists only want to reduce unnecessary costs, not all costs.
A programmer is more like factory equipment than a factory worker. If a company invests in superior equipment, they can produce better quality products and net higher profits.
If we're going down in the name of efficiency, the managerial and legal professionals are going first.
In this industry it seems that product success is only loosely related to product quality.
The accountants are in even bigger trouble than the lawyers, fwiw.
> Watch it when capitalists pushing incessantly people to learn coding. They're trying very hard to cut the costs of their input "materials" and they will do everything that they could to devalue us in every way possible.
Did you actually read this article? The article doesn't aim to teach Bloomberg's audience (which consists of VPs, SVPs, and managers, as implied in the first couple of paragraphs) how to code or replace the average developer.
This is egregious if you replace "coding" with "writing" or "reading".
Historically, any perceived detriments of mass education have been significantly outweighed by benefits.
That's true, but it's also true that you can only add so much stuff to the syllabus, and then there'll be too much school.
You've got to draw the line at some point. Why not draw it at boring computer stuff that barely anybody needs to know?
Capitalists want people to be generally happy. Happy people buy more things. People who have useful, well-paid jobs are generally happy (and can buy more things). Everybody knows that there will be a huge drop in available jobs in the next few years, but software development skills will still be in high demand. Hence, capitalists want more people to learn how to code, so they can keep their jobs, be generally happy, and buy more things.
Capitalism _is_ profit-oriented, but happy people bring more profit.
Indeed. Labor costs.
Why did Bloomberg ask to use my camera while reading the article
So it could take a photo of you for your certificate of achievement
"How often are you going to be multiplying sevens and cats? Soooo much."
Where the fuck does this meme of "fundamental type mismatches come up all the time in ordinary code" come from? What kind of defective system are people writing where it's normal for strings and numbers to be interpreted relationally (even accidentally)?
It sounds like the author is trying to demonstrate the significance of things like syntax transformations and format conversions (like transforming an email address to a mailto link), but that's nothing like "multiplying sevens and cats". It's manipulating things that aren't inherently incompatible - if anything, it's multiplying sevens and "7"s.
All these batshit insane contrived examples in asides like http://www.bloomberg.com/graphics/2015-paul-ford-what-is-cod... do is make code seem less accessible and comprehensible to anybody who isn't already intimately familiar with what's safe to interpret as sarcasm or hyperbole and what's not, which goes exactly contrary to the stated thesis of the article.
It can happen accidentally quite easily. Someone new to a codebase starts hacking in a feature and mistypes a variable as 'value' instead of 'values'. They fail to realize there's already a 'value' variable in the global namespace (perhaps it's a gigantic spaghetti code mess of a file). They don't have good test cases that exercise this exact line and fail to see the bug. Code ships to production, three months later the line runs and explodes.
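A minimal sketch of that failure mode (hypothetical names, not from any real codebase): without a `var`/`let` declaration, the mistyped assignment resolves to an existing variable in the enclosing scope and silently clobbers it.

```javascript
// Hypothetical illustration of the typo described above: the author meant
// to accumulate into a fresh local, but the undeclared name resolves to an
// unrelated outer variable and silently overwrites it.
var value = 100;                  // unrelated state used elsewhere

function sumAll(values) {
  value = 0;                      // typo: no `var`, so this hits the outer `value`
  for (var i = 0; i < values.length; i++) {
    value += values[i];
  }
  return value;
}

sumAll([1, 2, 3]);                // returns 6, and `value` is now 6, not 100
```

Nothing errors at the call site; the damage only shows up later, wherever the clobbered `value` is read, which is exactly why it can survive three months before exploding in production.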
Your example is quite good, although there are far more bulletproof ways than exhaustive test cases to make sure this doesn't happen.
On the web it's sort of all strings, so it's not hard to be in a situation where you have "length=7" & "cat=tabby" and get into a problem. Beyond that, many developers are in the habit of using primitives for everything, which makes these sorts of errors much more common.
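A hedged sketch of what "it's all strings" looks like in JavaScript (hypothetical query-string values): arithmetic operators coerce, so numeric-looking strings quietly work while genuinely incompatible ones become NaN instead of raising an error.

```javascript
// Everything arriving from a URL query string is a string. JavaScript's
// arithmetic operators coerce their operands, so numeric-looking strings
// silently behave like numbers and incompatible ones quietly become NaN.
var params = { length: "7", cat: "tabby" };   // hypothetical parsed query string

var doubled = params.length * 2;              // "7" * 2       -> 14 (silent coercion)
var nonsense = params.length * params.cat;    // "7" * "tabby" -> NaN, no error thrown
var glued = params.length + 2;                // "7" + 2       -> "72" (+ concatenates)
```

That last line is the really treacherous one: `*` coerces toward numbers, but `+` prefers string concatenation, so the same pair of values behaves differently depending on the operator.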
Yes, it's possible to have strings for two different things. In what world are those strings going to be cross-evaluated?
I'm glad I came here to read the comments that urged me to read on, because I stopped at the point where the VP was whining that his job was on the line and the software guy's wasn't. Made me a little sick to my stomach. In what company is that ever the case? Even if the VP's job is lost (a rare occurrence in my experience), the severance package is more than a year of the software person's salary.
My understanding was that the development manager in the taupe blazer was an IT consultant brought in to run the project, making it a little easier for that person to disappear to the next gig no matter how disastrous the project turned out.
That's what I was thinking too. But then they seem like employees in the rest of the story. A little ambiguous.
Hmm, that did come out quite negative. I'm sorry. Personal stress coming through.