Reminds me of Bram Moolenaar's (the author of Vim) talk "7 Habits of Effective Text Editing 2.0". The core idea, in my current interpretation, is awareness: look for repetition; repeated actions are candidates for automation or for a more effective technique. Rule of three: if you do something (anything) three times, figure out whether it can be done more efficiently. In Vim (or any other editor) an example might be shifting the indentation of code: maybe you only remember how to do it a line at a time; figure out how to do it on blocks of code, then on a whole file, then across a whole project.
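To make that "rule of three" concrete, here is a minimal Python sketch of the same idea applied outside the editor: once a manual edit has shown up a few times, script it. The script, the file arguments, and the four-space shift are hypothetical illustrations, not anything from the talk.

```python
# Illustrative only: once an edit shows up three times, script it.
# This shifts every line in the given files right by four spaces --
# the kind of bulk edit you would otherwise repeat by hand.
import sys
import textwrap
from pathlib import Path


def indent_file(path: Path, prefix: str = "    ") -> None:
    """Rewrite `path` with every non-blank line shifted right by `prefix`."""
    text = path.read_text()
    path.write_text(textwrap.indent(text, prefix))


if __name__ == "__main__":
    for name in sys.argv[1:]:   # e.g. python indent.py src/*.py
        indent_file(Path(name))
```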
> I’ve noticed that if I respond to people’s emails quickly, they send me more emails.
An alternative explanation would be that if you don't take your time to understand people's mail and just rush to answer them as quickly as possible, things that would take two mails to communicate now end up being a thread of ten mails, two phone calls and an in-person meeting.
Nothing is more infuriating than a person that replies 30 seconds later with a message that suggests they didn't read past the first sentence.
I don't know the author or his ways of responding to emails, but in my experience the above often applies to people that value speed above all else.
I also don't respect people that always respond to email immediately because it makes me think they don't have anything important enough to work on that requires unbroken concentration.
I see a lot of responses that seem to reflect what I consider a misunderstanding of the real advantage of speed.
It's not to get the same amount of work done in a shorter time (the management fallacy referenced in some of the comments).
The point of speed is to increase the number of feedback opportunities. Each feedback datum allows for slight course corrections / confirmation of original hypothesis.
By analogy, think of it like sample size. If you accept time as a primary constraint, then faster iterations (even if you accomplish less!) tend to give you more samples. More samples mean less variance.
Making decisions on a better model (less variance) is very appealing to me.
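A rough simulation of that sample-size point, with entirely made-up numbers: estimate some "true" rate from a handful of feedback samples versus many, and watch the spread of the estimates shrink as the number of samples per time box grows.

```python
# Rough illustration (not from the article): more feedback samples in the
# same time box means a lower-variance estimate of whatever you measure.
# The true_rate and the sample counts below are made-up numbers.
import random
import statistics


def estimate_spread(true_rate: float, n_samples: int, n_trials: int = 2000) -> float:
    """Std dev of the estimated rate across many repeated experiments."""
    estimates = []
    for _ in range(n_trials):
        hits = sum(random.random() < true_rate for _ in range(n_samples))
        estimates.append(hits / n_samples)
    return statistics.stdev(estimates)


if __name__ == "__main__":
    for n in (5, 20, 80):   # slow, medium, fast iteration counts in one time box
        print(f"{n:3d} feedback samples -> estimate spread ~ {estimate_spread(0.3, n):.3f}")
```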
This article reminds me of one of my all-time favorite articles. Quantity Always Trumps Quality by Jeff Atwood [1]. There is a bunch of commentary on it on Less Wrong [2], some of which is interesting. Also an interesting parallel to a Paul Graham post [3]:
> I was taught in college that one ought to figure out a program completely on paper before even going near a computer. I found that I did not program this way. I found that I liked to program sitting in front of a computer, not a piece of paper. Worse still, instead of patiently writing out a complete program and assuring myself it was correct, I tended to just spew out code that was hopelessly broken, and gradually beat it into shape. Debugging, I was taught, was a kind of final pass where you caught typos and oversights. The way I worked, it seemed like programming consisted of debugging.
> For a long time I felt bad about this, just as I once felt bad that I didn't hold my pencil the way they taught me to in elementary school. If I had only looked over at the other makers, the painters or the architects, I would have realized that there was a name for what I was doing: sketching. As far as I can tell, the way they taught me to program in college was all wrong. You should figure out programs as you're writing them, just as writers and painters and architects do.
Before anything, the obsession with speed seems like a management fantasy that tries to squeeze more out of workers in less time. We shouldn't forget that we're all human, and there are myths and facts related to working fast.
Working "fast" is tricky. The ironic conclusion part of the article is an example to that. People will miss the point and screw up more while trying to be "faster". However if a task becomes more automatic, it will become faster, which means that actually doing something "fast" would consume less energy, since it is more or less automated and unconscious, while "trying" to be fast would consume more energy.
The key here seems to be that just do something "a lot", and it will become faster by itself through time by being more and more "automated". But take your time, and stop worrying about speed. Speed is just pressure which will bring more pressure as you do stuff faster. Don't create unrealistic pressure.
Another key is garbage collection. Just get stuff "done", and get rid of the old tasks, or treat your long to-do items as "later" lists. If something really matters, you will do it anyway. You won't even need a to-do list to keep track of your "important tasks".
Ok, I will hold up my hand: I'm a software manager, and nothing frustrates me more than developers trying to do everything at breakneck speed and just plain getting it wrong. They either fail to read or understand the requirements in the rush to get started, or they rapidly push out a pile of shit and feign surprise when it repeatedly gets rejected by QA. My best guys are the ones who take the time to read the specs and talk to the stakeholders about what they want, and who spend time making sure that what they produce will pass muster. Those guys are golden; they are worth 10 of the immature speed freaks.
The implied benefit is that repetition improves quality, so more repetitions, more quickly, means your quality will improve more quickly.
You need to self-reflect after each iteration, mind you, to make sure that you learn each time you do the thing. But, this is necessary for skill progress whether you're going fast or slow.
(And, having to make the corrections can be as informative or more so, depending on how you're able to reflect on the "why" you're refining or fixing the thing).
"The implied benefit is that repetition improves quality, so more repetitions, more quickly, means your quality will improve more quickly."
Yes, but that's frequently not the case. Developers at "sweat shops" tend to write far more code, and work many more hours, than the average dev at a top tier software house and yet they are usually much worse devs.
I feel a tendency towards perfectionism is implied by the tendency toward slowness. Working quickly could balance out the most self-defeating aspects of perfectionism.
Entrepreneurs would say that speed is the signal of mature markets: more and more effort for less and less return. Mobile games were very profitable in 2009-10 (low effort, high return), balanced in 2012 (speed was important at that stage), unbeatable and unbearable as a business in 2015 (however fast you are, you're just playing a lottery).
Thinking slowly doesn't imply working slowly. Thinking slowly means making a plan, doing some research, before working. Each one of these can be done fast.
- You can only hold so many details in your head at once, and you can only sustain that collection for a limited duration. Holding those details there can be crucial for doing good work, and the faster you work, the more use you make of them.
- Doing good work often requires a lot of experimentation and iteration. In a number of circumstances this is only practical if you can iterate fast enough.
I am reminded of the story of how Leonardo da Vinci took three years to paint The Last Supper and produced only a small number of paintings, generally considered masterpieces, during his life.
Working quickly is important because it's the better method to survive and thrive. Quickly producing 100 deliverables with qualities ranging from shit to pretty good will, on average, probably beat carefully crafting one deliverable in the same time frame.
Speed lets you try many alternatives, experiment with many different even opposite options, and draw out creativeness.
Another possible reason is that speed is the strength of the younger generations. In human history, if newcomers want to beat the current authorities (in business or politics) who have already mastered the intricacies of the current game, they have to propose and experiment with a large quantity of new alternatives and new rules for new games, even though most of the experiments may produce low-quality results judged by the established rules. But it's the better way to compete and survive.
The author rests his entire argument on allusions to "the mind" that are untested and impossible to test. For example, "If you work quickly, the cost of doing something new will seem lower in your mind." I'm not sure how we get into someone's mind to measure the "cost of doing something."
We could not confirm or disconfirm this as truth, say in the context of an experiment. This phenomenon is better described by the concept of immediacy of reinforcement. As one decreases the delay to reinforcement, the strength of behavior maintained by that reinforcer increases [0].
Strict behaviorism? I'm advocating for the experimental analysis of behavior (sometimes referred to as radical behaviorism). When applied to practical human problems, it is referred to as applied behavior analysis.
The study I cited provides support for the author's conclusion. However, his description is flawed because it doesn't provide a means for the prediction and control of behavior.
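For anyone who wants the claim in more concrete terms: one commonly used model of how delay weakens reinforcement is hyperbolic discounting, roughly V = A / (1 + kD). The cited study gives no formula here, so the model choice and the constant k below are my own illustrative assumptions.

```python
# Hedged sketch: hyperbolic delay discounting, V = A / (1 + k * D).
# The value of k is an arbitrary made-up constant for illustration only.


def discounted_value(amount: float, delay: float, k: float = 0.1) -> float:
    """Subjective value of a reinforcer of size `amount` delivered after `delay`."""
    return amount / (1.0 + k * delay)


if __name__ == "__main__":
    for delay in (0, 1, 10, 100):   # e.g. seconds or minutes until feedback arrives
        print(f"delay {delay:4} -> subjective value {discounted_value(10.0, delay):.2f}")
```

The point of the shape is simply that value falls off steeply as feedback gets slower, which matches the comment's claim that immediacy strengthens the behavior it follows.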
For every piece of code you write, there should be a unit test covering it that you can run fairly instantly with 1 keypress which catches at least internal-consistency errors. (Integration tests would cover external-consistency errors, but when you're refactoring/adding/deleting code, internal-consistency is violated far more often than external consistency, in my experience.)
Once you experience this for yourself, you will NOT want to go without it. It allows you to IMMEDIATELY get back to coding without waiting for test runs to finish and without creating bugs unknowingly that you only catch much later.
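As a sketch of what "one keypress" internal-consistency checking can look like (the module, function, and test names here are hypothetical), here is a stdlib-only test file you might bind to a single key in your editor, for example one that runs `python -m unittest test_cart -v`:

```python
# Minimal sketch (names hypothetical): fast internal-consistency tests
# you can bind to one editor keypress.
import unittest


def add_item(cart: dict, name: str, qty: int) -> dict:
    """Toy function under test: add qty of `name` to a cart dict."""
    cart = dict(cart)                     # keep the caller's dict unchanged
    cart[name] = cart.get(name, 0) + qty
    return cart


class TestCart(unittest.TestCase):
    def test_adds_new_item(self):
        self.assertEqual(add_item({}, "apple", 2), {"apple": 2})

    def test_accumulates_quantity(self):
        self.assertEqual(add_item({"apple": 2}, "apple", 3), {"apple": 5})

    def test_does_not_mutate_input(self):
        original = {"apple": 1}
        add_item(original, "apple", 1)
        self.assertEqual(original, {"apple": 1})


if __name__ == "__main__":
    unittest.main()
```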
This is why I love Haskell---the check, though approximate, is built in to the language as the type checker. Pair it up with the speed Haskell coder's best friend, "undefined", and you can be just 40 characters into writing part of a method and check it right there on the spot, getting into a good flow.
I see the appeal of Haskell, and I think the strict focus on typing does catch many possible bugs... but I don't think it catches all the types of bugs that an actual unit test would, so I'm almost afraid that it's leaned on a bit too much.
One apparent counter-example is that when practicing the piano, some common advice is to play pieces much slower than you think is necessary, but play them correctly. Then speed up once you have the right habits.
But the overall goal is to learn new pieces as efficiently as possible. Someone who is efficient at practicing will make much more progress (and have more fun doing it) than someone who isn't.
That's the catch, isn't it? You probably want to get high quality fast, but you can't get fast without losing quality, at least in the beginning. But I'm pretty sure speed trumps quality in most cases - having a good enough product out is eons better than having a perfect product years away...
I feel like there's a bunch of data over-fitting going on here. The examples here are merely the ones that fit with the author's model, are they not? I'm sure I could have had a theorem that points to the idea that "fast work is catastrophic", gone back in time, and shown how the slow individuals / components of a system were significantly more effective than the fast ones.
It's funny that they include google as a fast example. On my Nexus4 and Nexus7 Google search is the slowest thing you can do on the internet. Youtube HD videos are way faster than a google search and fail less often when on the train/bus. I always wonder how sending a String and getting back a list of Strings can be so much more expensive than a HD video but who am I to judge, right?
I think he has in mind the time when you had to type altavista-digital.com to reach a search site.
There were worlds between that and google.com (both entering the URL and getting the results).
I am not sure if the writer has thought out the drawbacks to working quickly but there is probably some benefit to deciding quickly.
The writer mentions that faster employees have more work assigned to them. Woo hoo, but what happens when the work queue fills up faster than it can be emptied? Then the once-fast employee now seems slow.
The article reminds me of when I was younger, at 27. Now I am 28, and I have grown wise and old enough this year to know that working quickly does exactly that: it produces quick results without much quality. It works well when you need a result or just something to build momentum, but it's not at all sustainable. Just like coding: if you speed along without thinking through the steps, you end up with a mountain of technical debt. Instead, it's actually faster to think slowly and carefully and do the right things, rather than feed your ego, which gets a huge kick out of doing more in quantity than in quality in a short period of time. Again, it works well in some scenarios which I can't think of right now, but it's not as important as this young buck has written.
So inspiring. This person is generous, admitting to being the slowest.
-=in defence of taking your sweet time=-
The arguments for speed and the arguments for quality are not either/or; they both contribute to a product that works.
To introduce the case for slowness, a natural precedent is appropriate: the development phase for human beings was on the order of 2 billion years, and in that time a whole lot of nothing happened. All those evolutionary competitors at every level of the tree of life were produced more rapidly than humans. And now humans rule the world, and in 200,000 years we have used that comprehensive development period to move rapidly and adapt so effectively that we have changed the world to support 7 billion of us and tripled our life spans. The hockey-stick curve of our technology speaks to the benefits of long development, and to the long tail of non-adaptive, more rapidly developed ideas that came to nought.
Those practising rapid development and launch save costs during the development stage and increase costs during the much longer operating stage: code that has more bugs, takes more to maintain, and is more brittle makes you slower to adapt to customers and competitors when it counts, that is, when you have customers, are burning operating costs, and are competing. It works better to develop comprehensively while it is cheap to do so and build the most efficient product, so it is really useful when you run with it: longer development, then move faster in operations. Otherwise you end up solving your terrible code base by hiring more brains, and those brains could be better put to use creating improvements for your customers rather than fixing the consequences you shipped in a sprint.
Pre-launch development is cheap, so it works to take your time and not optimize that process prematurely. Everything is more expensive after launch, when the stakes are real and when an advantage can be moving fast; if the code you crafted lets you adapt quickly, then you've minimized costs over the operating period, and your brains can work on growing and retaining customers rather than building an Ozymandias of monkey patches.
The time when speed is important is after launch, not before it. If you rush your pre-launch development, you will be a slow operator, and this will cost you exponentially more than the linear increase in cost from a longer development time. Also, the more robust the system you build is, the more slowly (as in slow thinking) you can make your decisions in the operating period and really consider strategy.
Let your competitors ship first and watch the things they miss; let them pay for the experiments you now decline to run. If someone seems to be capturing the market through their business model, then you are too slow anyway and the high-order bit for you isn't code anymore. If that is not your market, then it is filled with competitors who are mimicking each other's sub-monopoly strategies, so you can step in, be the last mover, and take the market. Invest time during development to make code and tools that work, grasp your business plan, and keep open the possibility of thinking strategically about the business in the operation phase, once you have launched.
From the comments, I think readers are generally failing to take an opportunistic attitude toward this article. Dig out useful ideas and ignore what doesn't help.
Yes, there are situations where working too quickly will bite you, so you can't always do it. But the key idea here is that by working quickly you reduce your expectations of how costly a new effort is. Then you'll actually start it. Inertia is a powerful adversary. Any weapon you can put on your belt to fight it is worthwhile. I'm keeping this idea.
One good takeaway: be slow at things you don't want to do a lot. Be slow in answering the kind of email you don't want to receive again, etc.
I like to call this "intentional incompetence" :)
"Opportunistic attitude" looks too much like "confirmation bias" to me to be comfortable with it.
The phrase may put you off but I think the sentiment is sound. Too many dismiss something/somebody in entirety because of one flaw. I think it's better to try to find value in everything; even if it's 99% shit, you can still learn from that 1%.
If it's an idea you're already using, then it isn't an opportunity.
I have to agree. My personal experience suggests if you want to get good at something, keep doing lots of it and aim for speed rather than perfection. You'll end up being speedy and get closer to perfection, than if you just try for perfection.
But one good counter example does come to mind - designing a database schema.
I'm trying to wrestle with what the difference might be. I think Markov processes, e.g. processes where the future state depends on a prior state, are relevant.
Maybe we could say tasks are either "strongly Markovian", meaning how well we do them now will influence our future work, and hence we should really think them through (e.g. designing a schema); "weakly Markovian", in which case there may be some future impact but not much, so we exercise caution but "done is better than perfect"; or non-Markovian (e.g. throwing out the garbage, cooking dinner, most emails), where getting the thing done is simply pass/fail, so we just have to do it and quality is relatively unimportant.
I think what I'm saying is that most tasks will be weakly or non-Markovian, so we should "move fast and break things", but every now and then there'll be something we need to do that is strongly Markovian. For such things we should be prepared to take a step back and give ourselves a little extra time, so things don't blow up further down the track.
In general I disagree. I learned far more in 6 months at a "high quality" shop working excruciatingly slow than I did in the 5 years at the previous job where I banged out code as fast as possible.
I feel like when you focus on speed you very quickly learn just enough to get the task done quickly. And then your progress stalls.
Perhaps it depends on your goals.
I find that when I'm doing any work that involves modifying existing code, there's a huge difference in how quickly I can work when I'm touching code that was originally written by people with different work styles. Usually I can sail through the more methodically-written stuff. When the time comes to make changes to code written by the people most fond of uttering, "The perfect is the enemy of the good," though, progress slows to a crawl. Adding new features without introducing new defects to that code is like trench warfare.
That said, much like the article says, the fast workers did tend to get assigned more new tasks. I think maybe this was a huge win for them. From a wider perspective, though, it's a bit tragic because it results in this inexorable downward spiral in terms of code quality. Of course that worked out well for them too because more defects meant more opportunities for them to cape up, swoop in, and save the day. I can't remember who it was who said, "Beware of your firefighters, they are probably your chief arsonists," but there's a lot of truth in that statement.
In that boat right now, but instead of 5 years in, 1.5 years in. I've been reading books on design patterns (we barely use them, which should give you an idea of how bad it is here) and just in general trying to be better.
I can think of lots more things where doing them fast initially makes sure you'll never learn them:
* Typing on a keyboard
* Sharpening a knife
* Driving a car.
It seems more a question of doing things a lot, and with focus, than of focusing on speed at first.
All of my car/motorcycle track instructors were big believers in "Slow is smooth, and smooth is fast".
Basically go slow and work on the correct form/line/being smooth/etc... Once you have that down, you are ready to go fast and you will be better/faster than someone who hasn't gotten the flow down at a manageable speed first.
I used to teach guitar and always had to tell students to practice it slowly, but perfectly first, and only try to speed things up once they had the basics mastered.
Speed is impressive, but you literally have to learn to walk before you can run!
Tried to teach myself to touch type. I can kind of do it, but it's just so slow compared to my normal typing.
It depends on your goal. Even with database schemas, iterating quickly optimizes for learning and improving your skills. Aiming for quality optimizes for creating a quality product sooner. Still, taking some time to hone your skills might be better than trying to design the perfect system on your first try.
I'd say it depends on your current skill level rather than on a task property like "strongly Markovian". Learning has diminishing returns. At some point more learning is not worth it anymore.
Don't bang frenetically on walls either. It's very, very good to be able to reach deep understanding without hyperfast iterations. There will also be times when you don't have that luxury. Make haste slowly; balance.
Yes and no.
Do things fast when the cost of doing them wrong is low. If you're learning something, or doing something with low risk, then doing it as fast as possible is a really good idea (for all the reasons set out in the article).
But...
Do things slowly if the cost of getting it wrong is so high that you'll have no opportunity to try again. For example, don't pack a parachute quickly.
The key is recognising that there's more than one way to approach something; selecting the right method for the problem at hand is the winning strategy.
I think the concept you're trying to describe is Reversible Decisions (unfortunately I can't recall who coined that phrase).
The idea is that any decision that is straightforward or easy to change should not be sweated over for any appreciable amount of time, and in fact can be deferred indefinitely (deciding not to decide).
Meanwhile, any decision or indecision that will have long term repercussions should be considered at length and with all due haste.
I mention indecision here deliberately, because things like deciding not to put authentication into your application in version 1 counts as a decision, one with far reaching and usually fairly aggravating (IME at least) long term effects on the project. Others would include thread safety, the ability to cluster or shard your design, multilingual support, audit trails, etc. If you are the only solution in the space then you often have time to correct these mistakes. But if one of your competitors figures these things out before you, you can find yourself in real trouble (one of the aspects of the Innovators Dilemma).
The parachute analogy is worth exploring.
Let's say, over a lifetime, the fatal accident rate due to packing a parachute quickly is 1/10000, and the rate for the slow-packing group is 1/1000000. Even though the fast group faces bigger danger than people in the slow group (or people sitting at home), other than the few who have bad luck, the rest of them will practice far more than the other group, jump more times, go to more places, and have bigger opportunities to become a world champion of parachute packing or whatever parachuting sport they pursue.
Sure, a few will be forgotten by the world.
The victors we see in the world are probably the people who are still alive in the fast group, and who have produced lots of results because of their speed and because they are still alive. Someone in that group will pay a huge price, but it's not necessarily you or any particular person.
You're attacking this analogy with made-up numbers and wild logical leaps. What is the real risk increment to packing a chute hastily, and does the real number help or hurt your position? Now make the stakes really high. Also consider the possibility that your choices have externalities, and others around you may not want to share their jumps with someone they perceive to be that reckless idiot who's going to get himself killed.
You didn't give specific numbers on how often someone can jump, but consider the realistic bounds on how much more often a person who packs hastily can skydive. How often is this person jumping? Are we in a scenario where the amount of time it takes to pack a parachute is really the limiting factor, to the point where the hasty packers can jump "way more?" Seems like what that would mean in concrete terms is that as soon as you hit the ground you're going to hit the john, re-pack your chute, and immediately be back in the plane. Is that a realistic scenario?
To be honest, if packing a parachute takes an hour longer and increases your chance of living by 100x, I'd say that's worthwhile. But then, I also wouldn't call 1/10000 an especially high risk. We face those sorts of odds just driving a car and they don't put many of us off[1].
If the probability of death from packing a parachute quickly were 1/100 then I think your argument would break down somewhat. Like I said, you should spend the appropriate time on something depending on the costs and risks associated with it. "Do everything fast" is wrong, but so is "Do everything slow".
[1] In 2003 the annual risk of being killed as a car user were 1/15261. http://www.medicine.ox.ac.uk/bandolier/booth/Risk/trasnsport...
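Taking the thread's made-up numbers at face value (and treating them as per-jump risks, which is itself an assumption), the back-of-the-envelope arithmetic looks like this:

```python
# Back-of-the-envelope arithmetic with the thread's made-up numbers:
# 1/10,000 per-jump fatality for hasty packing vs 1/1,000,000 for careful
# packing. Treating the quoted lifetime rates as per-jump rates is an
# assumption made purely for illustration.


def survival_probability(per_jump_risk: float, jumps: int) -> float:
    """Chance of surviving `jumps` independent jumps at the given per-jump risk."""
    return (1.0 - per_jump_risk) ** jumps


if __name__ == "__main__":
    for jumps in (100, 1000, 5000):
        fast = survival_probability(1 / 10_000, jumps)
        slow = survival_probability(1 / 1_000_000, jumps)
        print(f"{jumps:5d} jumps: fast packer survives {fast:.4f}, slow packer {slow:.6f}")
```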
> don't pack a parachute quickly.
Write a to-do list app super fast, but take time with medical software...
How long do you think Workflowy took to conceptualize and implement?
Speaking as one who is laid back in a personality sense, I could imagine nothing worse than a life lived non-stop frenetically under the imagined need to speed up all activity in the name of productivity.
Take time to pause, reflect, think, and enjoy your life experiences.
Even when it comes to work-related activity, there are times and places to do things quickly and there are times and places to do them deliberately. If nothing else, just for sanity's sake, it is important to pace yourself through a day, through a week, through a month, through a year, through a career. Even if speed were exactly correlated with maximum productivity and effectiveness, it is vital that you have times when you simply feel you can enjoy being at work, being with people, doing your activities, without everything feeling you have to work like a machine that will be evaluated by engineering standards only.
Even more, we all have different personalities and some people do not work well if they feel they are forced to work at some arbitrarily quick pace as opposed to one that suits their style.
Finally, even speed as a factor can vary with your activities as you develop skills in those activities. When I began years ago to try to write things, I was agonizingly slow about the process. I felt I had a quick mind but the process of getting what was in my mind down on paper made me feel plain stupid. Whatever I did, it would never come out right. Through a very tedious process of writing and re-writing, it would eventually become passable and that was it. It might take me a week in such cases to write something expository of modest length. Yet, realizing this was a weakness, I worked damned hard to fix it and, through a process of many years and countless hours of effort, I reached a breakthrough point where I could do "walls of text" (in the phrasing of some) in 10-15 minutes and produce quality stuff. I now write very quickly and effectively. But had I tried to do so years ago with my limited abilities at that time, all I would have produced was hash.
So, lighten up and do it in your own style. Yes, speed does matter. But it is only one of many factors that will determine how you do at work or, even more important, at life itself. By all means, apply yourself well - be diligent, hard-working, etc. but do it fast or slow as suits your needs and your own style. At least that is how I view it.
> a life lived non-stop frenetically under the imagined need to speed up all activity in the name of productivity.
I don't think that is quite what the article is suggesting. I think it was reminding us to think about the effects of the speed at which you accomplish things. If there is a behavior you are trying to encourage in others (e.g. requesting a code review), focusing on responding quickly can be crucial to helping foster that behavior. Similarly, if there is a communication channel (e.g. Slack vs. email) that is not being adopted, holding yourself back from quick responses to those emails while responding quickly to Slack messages will help foster the transition.
This doesn't mean that you need to do everything as quickly as possible. It does mean that if there is something that you do slowly that you want to improve on, it may be helpful to be aware of the additional mental cost you associate with the activity so that you can compensate for it. Similarly, if you are trying to improve the quality of your writing, focusing on improving the speed of your writing (or even just the speed of your typing) while maintaining the same quality might pay off faster than just focusing on improving quality.
One of the tenets of Extreme Programming is, "Quit when you're tired." Why? Because it's faster.
It's not faster today - if you kept working, you'd presumably get more than zero done. But you'd also create more bugs, and you'd come back more tired tomorrow. Coding is not an assembly line; your brain needs to be fresh.
Taking time to pause, reflect, and think is the same. It's slower in the next minute, maybe in the next hour. But stopping to think and realizing what the right thing to do is can save you days of waste.
My first boss said, "You need to learn when the most productive thing you can do is go look out the window for 15 minutes." After 30 years, it's still good advice.
I'm ignoring your point about work-life balance here. All I'm saying is, too much emphasis on speed slows you down, even only considering work.
As someone who's just finishing nearly four weeks of non-stop work including weekends, I approve of this message.
Or as the people who fight wars for a living say: "slow is smooth, smooth is fast."
Perhaps this applies more to shooting than software development, but...
Slow is smooth. Smooth is fast.
If I take 20 minutes more to code a module because I'm thinking about it, but spend 30 minutes less debugging problems with the module, that's fast.
If I take a day to respond to an email, but the person I'm conversing with gets the info they need, avoiding three more days of back and forth, that's fast.
If I take a week longer to iterate through a project idea, but nail the implementation, then I can know that I'm pivoting because the idea was wrong, not the implementation.
How do mere mortals become wizards to their peers? They take the time to read documentation and code and really understand the tools that they are working on. What software projects have stood the test of time? The ones that were painstakingly thought out and progressed slowly. Of course there's a balance, and our current system of financing software projects rewards fast and loose, and there will always be a place for fast prototyping to help understand the problem, but to create something truly amazing takes a lot of time.
"What software projects have stood the test of time? The ones that were painstakingly thought out and progressed slowly."
I kinda think that the opposite is true, or at least as true as your statement (meaning that at least as much half-baked stuff rushed out the door 'stood the test of time' as did stuff that took a lot of time to ship.) Unix, C, Windows, PHP, JavaScript...
To offer an example of a project that has stood the test of time far longer than the ones you mentioned, consider Fortran. The first compiler was released in 1957, several years after it was first proposed. The specification took a couple of years to complete. This was over 60 years ago, and even now they are releasing an update to Fortran (Fortran 2015).
Even in the examples you mentioned, Unix, C, and JavaScript are all run through standards bodies now. It took years for C11 to be finalized. It's been years since ES6 started development. PHP 7 took a few years (and a version number was abandoned before it was even released).
Windows, depending on who you're talking to, can be a good example of half-baked, or an example of why half-baked is terrible, with regards to Windows 8 and Windows 10 releases.
In addition to that, I like to try to answer every question I can at work, even if I have to just go Google it myself. Telling someone you don't know and they should just Google it is robbing yourself of an opportunity to 1) learn it yourself, and 2) explain it to someone else (which is a GREAT way of making sure you really understand it).
Plus, people seem to like it when they ask something, and you help research through it with them. They come back to you again, which gives you another free opportunity to share someone else's learning, and you quickly turn into that person who either knows everything, or knows where to find out.
I agree with you completely here, but you definitely have to be careful, particularly as you become more knowledgeable. Some developers get into the habit of simply asking every time they can't find an answer, or giving up after only a few minutes trying, without realizing that the searching builds a better foundation than the answer many times.
My general rule has been - always ask what they've tried first. If it seems a sincere effort has been put into it so far, by all means help out. It may be something you know immediately, but there is value in teaching people to learn for themselves.
"create something truly amazing takes a lot of time" Reminds of a front t-short slogan from Games People Play by Eric Berne.
Go slow so you can be fast :)
There are actually three lessons here:
* Momentum - To stick with something and finish it, you need to use momentum. Lacking momentum, projects languish. Quoting author Steven Pressfield: "Second only to habit, momentum is a writer’s (or artist’s or entrepreneur’s) mightiest ally in the struggle against Resistance."
* Waiting is painful. That's the point behind the examples in the middle section (waiting for an email reply, waiting for Google, waiting for an employee to finish a task). One thing this article makes me more aware of is the concept of getting impatient with yourself. Part of you is the boss or client that wants things accomplished, and it is sizing up the worker part of you, wondering if it assigns a task whether that task will get done quickly or require a lot of waiting, and there's a relationship to manage there. (Perhaps this dynamic underpins the phenomenon of momentum?)
* Quantity Always Trumps Quality - There's a blog post by Jeff Atwood with this title. The point is, if you want to get better at something, do a lot, and don't worry about quality while you're practicing.
Focusing on "speed" is probably a good mental trick for a couple reasons. First, it validates and acknowledges that we hate to wait. Second, it helps overcome perfectionism and the tendency to think and judge instead of doing and creating, by giving us something to measure that is not about quality but is instead correlated with action and progress.
> Quantity Always Trumps Quality - There's a blog post by Jeff Atwood with this title. The point is, if you want to get better at something, do a lot, and don't worry about quality while you're practicing.
I wholeheartedly disagree. My martial arts instructor had a saying: "Practice doesn't make perfect. Perfect practice makes perfect." I've found that to be true. After all, if you practice bad technique, how do you think you're going to perform in real situations?
Not to mention, you usually practice things in a controlled environment where you are evaluating your own technique. That's often not true in real situations, where your focus needs to be split on many different things. If you want to execute well in a real environment, good technique needs to be second nature so you don't have to think about it. You just do it. Being able to do that requires a large amount of good practice.
"Quality" here means the quality of your result. If you are learning to paint/sing/code, it means how good the painting/singing/code is. The advice is to not "worry" about the quality, in the sense that if you are not producing good quality output, that does not mean you are doing anything wrong. When you are just learning something, the output will not be good quality! You need to produce a lot of crap results, and that makes most people uncomfortable.
I can relate what you're saying to my experience taking singing lessons, though even there, I find that making progress is all about turning off the inner judge while you practice. All you need to practice something is a bit of intent and a bit of awareness; it's not important that you "worry" about the quality of what you are doing, per se.
I don't want you "practising" with my product; I'll take quality over quantity any day. If I want crap I'll outsource it.
That's why this point isn't so related to the other two, because it's specific to the goal of getting better at something.
Speed matters, and there are more ways to make things faster than just doing them often.
1. If you are coding, ensure that your debugging environment is good. The simplest example: if you write HTML, how long does it take to see the output, and does it auto-refresh on a second monitor when you change the file? (A minimal sketch of what I mean follows at the end of this comment.)
2. If you are building a project, how long does it take to deploy? Do you have CI with a one-button push to the production environment? If you do, you'll deploy faster and more often because there's no overhead; if not, you won't want to deploy often because of the overhead of each deploy.
etc.
The most important lesson I've learned about speed is: design your environment and your build for speed and remove the repetitiveness, and then you can be fast. You can apply this rule to many things. Blogging, for example: make sure your blog tool makes things easy. How easy is it to link stuff or find images? Is it really necessary to add an image to every post? Do you upload via FTP, or just copy and paste an image into the editor? What happens if it crashes, do you lose data or does it just recover?
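And here is the sketch I mentioned under point 1: a simple polling loop in Haskell. The "index.html" file and the "make preview" command are placeholders just for illustration; a real setup would more likely use a proper file-watcher or your editor's built-in live reload.

    import Control.Concurrent (threadDelay)
    import Control.Monad (forever, when)
    import Data.IORef (newIORef, readIORef, writeIORef)
    import System.Directory (getModificationTime)
    import System.Process (callCommand)

    -- Re-run a rebuild command whenever the watched file changes.
    -- "index.html" and "make preview" are placeholders, not real project files.
    main :: IO ()
    main = do
      let file = "index.html"
          cmd  = "make preview"
      lastSeen <- getModificationTime file >>= newIORef
      forever $ do
        threadDelay 500000                 -- poll every half second
        now  <- getModificationTime file
        prev <- readIORef lastSeen
        when (now /= prev) $ do
          writeIORef lastSeen now
          callCommand cmd                  -- rebuild / refresh the preview

The point isn't this particular loop; it's that once the refresh happens by itself, the cost of checking your work drops to zero and you check it constantly.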
Reminds me of Bram Moolenaar's (the author of Vim) talk "7 Habits For Effective Text Editing 2.0". The core idea, in my current interpretation, is awareness: look for repetition; repeated actions are candidates for automation or for being done more effectively. Rule of three: if you do something (anything) three times, figure out whether it can be done more efficiently. In Vim (or any other editor) an example might be shifting the indentation of code: maybe you only remember how to do it a line at a time; figure out how to do it on blocks of code, then on a whole file, then on a whole project.
https://m.youtube.com/watch?v=p6K4iIMlouI
> I’ve noticed that if I respond to people’s emails quickly, they send me more emails.
An alternative explanation would be that if you don't take the time to understand people's mail and just rush to answer as quickly as possible, things that would take two mails to communicate now end up being a thread of ten mails, two phone calls and an in-person meeting.
Nothing is more infuriating than a person that replies 30 seconds later with a message that suggests they didn't read past the first sentence.
I don't know the author or his ways of responding to emails, but in my experience the above often applies to people that value speed above all else.
Agreed.
I also don't respect people that always respond to email immediately because it makes me think they don't have anything important enough to work on that requires unbroken concentration.
I was going to quote the same line, but add that I don't want anyone to send me more emails. Who could possibly want more emails?
I see a lot of responses that seem to reflect what I consider a misunderstanding of the real advantage of speed.
It's not to get the same amount of work done in a shorter time (the management fallacy referenced in some of the comments).
The point of speed is to increase the number of feedback opportunities. Each feedback datum allows for slight course corrections / confirmation of original hypothesis.
By analogy, think of it like sample size. If you accept time as a primary constraint, then faster iterations (even if you accomplish less!) tend to give you more samples. More samples mean less variance.
Making decisions on a better model (less variance) is very appealing to me.
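To put a number on the analogy (a standard statistics fact, assuming the feedback samples are roughly independent): for n samples drawn from a process with variance \sigma^2, the sample mean has variance

    \mathrm{Var}(\bar{X}) = \frac{\sigma^2}{n}

so doubling the number of feedback cycles roughly halves the noise in your estimate of whether an approach is working.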
This article reminds me of one of my all-time favorite articles. Quantity Always Trumps Quality by Jeff Atwood [1]. There is a bunch of commentary on it on Less Wrong [2], some of which is interesting. Also an interesting parallel to a Paul Graham post [3]:
> I was taught in college that one ought to figure out a program completely on paper before even going near a computer. I found that I did not program this way. I found that I liked to program sitting in front of a computer, not a piece of paper. Worse still, instead of patiently writing out a complete program and assuring myself it was correct, I tended to just spew out code that was hopelessly broken, and gradually beat it into shape. Debugging, I was taught, was a kind of final pass where you caught typos and oversights. The way I worked, it seemed like programming consisted of debugging.
> For a long time I felt bad about this, just as I once felt bad that I didn't hold my pencil the way they taught me to in elementary school. If I had only looked over at the other makers, the painters or the architects, I would have realized that there was a name for what I was doing: sketching. As far as I can tell, the way they taught me to program in college was all wrong. You should figure out programs as you're writing them, just as writers and painters and architects do.
1: http://blog.codinghorror.com/quantity-always-trumps-quality/
2: http://lesswrong.com/lw/53e/just_try_it_quantity_trumps_qual...
3: http://www.paulgraham.com/hp.html
Yeah, iteration on an idea is important, so the faster you complete it (at least an MVP) and put it out to the world, the better.
It's one of the key lessons YC taught me.
Before anything, the obsession with speed seems like a management fantasy that tries to squeeze more out of workers in less time. We shouldn't forget that we're all human, and there are myths and facts related to working fast.
Working "fast" is tricky. The ironic conclusion part of the article is an example to that. People will miss the point and screw up more while trying to be "faster". However if a task becomes more automatic, it will become faster, which means that actually doing something "fast" would consume less energy, since it is more or less automated and unconscious, while "trying" to be fast would consume more energy.
The key here seems to be that just do something "a lot", and it will become faster by itself through time by being more and more "automated". But take your time, and stop worrying about speed. Speed is just pressure which will bring more pressure as you do stuff faster. Don't create unrealistic pressure.
Another key is garbage collection. Just get stuff "done", and get rid of the old tasks, or treat your long to-do items as "later" lists. If something really matters, you will do it anyway. You won't even need a to-do list to keep track of your "important tasks".
Ok, I will hold up my hand: I'm a software manager, and nothing frustrates me more than developers trying to do everything at breakneck speed and just plain getting it wrong. They either fail to read or understand the requirements in the rush to get started, or they rapidly push out a pile of shit and feign surprise when it repeatedly gets rejected by QA. My best guys are the ones who take the time to read the specs and talk to the stakeholders about what they want, and who spend time making sure that what they produce will pass muster. Those guys are golden; they are worth 10 of the immature speed freaks.
I believe the author has mistaken speed with efficiency. What's the benefit of being fast if you're sloppy and have to redo things often?
The implied benefit is that repetition improves quality, so more repetitions, more quickly, means your quality will improve more quickly.
You need to self-reflect after each iteration, mind you, to make sure that you learn each time you do the thing. But, this is necessary for skill progress whether you're going fast or slow.
(And, having to make the corrections can be as informative or more so, depending on how you're able to reflect on the "why" you're refining or fixing the thing).
"The implied benefit is that repetition improves quality, so more repetitions, more quickly, means your quality will improve more quickly."
Yes, but that's frequently not the case. Developers at "sweatshops" tend to write far more code, and work many more hours, than the average dev at a top-tier software house, and yet they are usually much worse devs.
I feel a tendency towards perfectionism is implied by the tendency toward slowness. Working quickly could balance out the most self-defeating aspects of perfectionism.
Entrepreneurs would say that speed is the signal of mature markets: more and more effort for less and less return. Mobile games were very profitable in 2009-10 (low effort, high return), balanced in 2012 (speed was important at that stage), unbeatable and unbearable as a business in 2015 (however fast you are, you're just playing a lottery).
not so long ago... https://news.ycombinator.com/item?id=9952875
Thinking slowly doesn't imply working slowly. Thinking slowly means making a plan, doing some research, before working. Each one of these can be done fast.
Then : Hacker News considered harmful.
Some other reasons why speed can be good:
- You can only hold so many details in your head at once, and you can only sustain that collection for a limited duration. Holding those details there can be crucial for doing good work and the faster you work the more you make use of them.
- Doing good work often requires a lot of experimentation and iteration. In a number of circumstances this may only be practical if you can iterate fast enough.
I am reminded of the story of how Leonardo da Vinci took three years to paint The Last Supper and produced only a small number of paintings, generally considered masterpieces, during his life.
Working quickly is important because it's the better method to survive and thrive. Quickly producing 100 deliverables with quality ranging from shit to pretty good will, on average, probably beat carefully crafting one deliverable in the same time frame.
Speed lets you try many alternatives, experiment with many different even opposite options, and draw out creativeness.
Another possible reason is that speed is the strength of the younger generations. Throughout human history, if newcomers wanted to beat the current authority (in business or politics), who had already mastered the intricacies of the current game, they had to propose and experiment with a large quantity of new alternatives, new rules, and new games, even though most of those experiments might produce low-quality results judged by the established rules. But it's the better way to compete and survive.
The author rests his entire argument on allusions to "the mind" that are untested and impossible to test. For example, "If you work quickly, the cost of doing something new will seem lower in your mind." I'm not sure how we get into someone's mind to measure the "cost of doing something."
We could not confirm or disconfirm this as truth, say in the context of an experiment. This phenomenon is better described by the concept of immediacy of reinforcement. As one decreases the delay to reinforcement, the strength of behavior maintained by that reinforcer increases [0].
0: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1332712/
Sounds like you're advocating a return to strict behaviorism.
Strict behaviorism? I'm advocating for the experimental analysis of behavior (sometimes referred to as radical behaviorism). When applied to practical human problems, it is referred to as applied behavior analysis.
The study I cited provides support for the author's conclusion. However, his description is flawed because it doesn't provide a means for the prediction and control of behavior.
As a developer, what this means to me is:
For every piece of code you write, there should be a unit test covering it that you can run fairly instantly with 1 keypress which catches at least internal-consistency errors. (Integration tests would cover external-consistency errors, but when you're refactoring/adding/deleting code, internal-consistency is violated far more often than external consistency, in my experience.)
Once you experience this for yourself, you will NOT want to go without it. It allows you to IMMEDIATELY get back to coding without waiting for test runs to finish and without creating bugs unknowingly that you only catch much later.
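A minimal sketch of the kind of check I mean, with a made-up function purely for illustration; the point is that the whole file runs in well under a second, so your editor can run it on a single keypress (e.g. via runghc):

    -- ConsistencyCheck.hs: a throwaway internal-consistency test,
    -- bound to one keypress in the editor (e.g. `runghc ConsistencyCheck.hs`).
    import Control.Monad (unless)
    import System.Exit (exitFailure)

    -- hypothetical function under test
    clampPercent :: Int -> Int
    clampPercent n = max 0 (min 100 n)

    main :: IO ()
    main = do
      let ok = clampPercent 150  == 100
            && clampPercent (-5) == 0
            && clampPercent 42   == 42
      unless ok (putStrLn "internal-consistency check failed" >> exitFailure)
      putStrLn "ok"

A real suite would be bigger and use a proper test framework, but the principle is the same: keep the feedback loop so short that running it never feels like an interruption.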
This is why I love Haskell---the check, though approximate, is built in to the language as the type checker. Pair it up with the speed Haskell coder's best friend, "undefined", and you can be just 40 characters into writing part of a method and check it right there on the spot, getting into a good flow.
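For instance (a made-up function, just to show the workflow): write the type signature, leave the parts you haven't thought through as undefined, and the type checker still validates whatever you have written so far.

    -- Half-written report function: the compiler checks the parts that exist,
    -- while `undefined` stands in for the parts that don't exist yet.
    summarize :: [Double] -> String
    summarize xs = header ++ body
      where
        header = "n = " ++ show (length xs) ++ "\n"
        body   = undefined   -- not decided yet; the module still type-checks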
I see the appeal of Haskell, and I think the strict focus on typing does catch many possible bugs... but I don't think it catches all the kinds of bugs that an actual unit test would, so I'm almost afraid that it's leaned on a bit too much.
One apparent counter-example is that when practicing the piano, some common advice is to play pieces much slower than you think is necessary, but play them correctly. Then speed up once you have the right habits.
But the overall goal is to learn new pieces as efficiently as possible. Someone who is efficient at practicing will make much more progress (and have more fun doing it) than someone who isn't.
High speed alone is not so good if it comes without quality.
That's the catch, isn't it? You probably want to get high quality fast, but you can't get fast without losing quality, at least in the beginning. But I'm pretty sure speed trumps quality in most cases - having a good enough product out is eons better than having a perfect product years away...
See: Worse is Better vs Worse is Worse.
I feel like there's a bunch of over-fitting going on here. The examples are merely the ones that fit the author's model, are they not? I'm sure I could have started from the thesis that "fast work is catastrophic", gone back in time, and shown how the slow individuals / components of a system were significantly more effective than the fast ones.
It's funny that they include google as a fast example. On my Nexus4 and Nexus7 Google search is the slowest thing you can do on the internet. Youtube HD videos are way faster than a google search and fail less often when on the train/bus. I always wonder how sending a String and getting back a list of Strings can be so much more expensive than a HD video but who am I to judge, right?
I think he has in mind the time when you had to type altavista-digital.com to reach a search site. There were worlds between that and google.com (both in entering the URL and in getting the results).
Not using flyspeck grey fonts on light gray background in a tiny column on a vast screen is more important for readability than it seems.
I am not sure if the writer has thought out the drawbacks to working quickly but there is probably some benefit to deciding quickly.
The writer mentions that faster employees have more work assigned to them. Woo hoo, but what happens when the work queue fills up faster than it can be emptied? Then the once-fast employee now seems slow.
I find myself really, tangentially curious now: what editor is he using that has a slow undo feature?
There is a big unasked question here - is your final work product the same or better quality when working fast?
If you can honestly answer "Yes" to that question, by all means, speed it up.
Move as quickly as possible while maintaining a high level of accuracy/quality. Some people underestimate themselves, moving too slowly!
I can see the benefit of making progress quickly. Haven't seen frenetic/"quick" working lead to that result.
Never mind the quality, feel the width :)
The article reminds me of myself when I was younger, at 27. Now I am 28, and I have grown wise and old enough in this year to know that working quickly does exactly that: it produces quick results without much quality. It works well in cases where you need a result or just something to build momentum, but it's not at all sustainable. Just like coding: if you speed along without thinking through the steps, you end up with a mountain of technical debt. Instead, it's actually faster to think slowly and carefully and do the right things, rather than feed your ego, which gets a huge kick out of doing more in quantity than in quality in a short period of time. Again, it works well in some scenarios which I can't think of right now, but it's not as important as this young buck has written.
so inspiring. this person is generous, admitting to being the slowest.
-=in defence of taking your sweet time=-
the arguments for speed and the arguments for quality are not either/or. they both contribute to a product that works.
to introduce the case for slowness a natural precedent is appropriate. the development phase for human beings was on the order of 2 billion years.
and in that time a whole lot of nothing happened. all those evolutionary competitors at every level of the tree of life were more rapidly produced than humans. and now humans rule the world and in 200000 years have used that comprehensive development period to move rapidly and adapt so effectively to the world that we have changed it to support 7 billion of us and tripled our life spans. the hockey stick curve of our technology speaks to the benefits of long development, and the long tail of non-adaptive more-rapidly developed ideas that come to nought.
those practising rapid development and launch save costs during the development stage and increase costs during the much longer operating stage: the code has more bugs, takes more to maintain, and is more brittle, and these things contribute to being slower to adapt to customers and competitors when it counts, that is, when you have customers, are burning operating costs, and are competing.
it works better to develop comprehensively while it is cheap to, and to build the most efficient product so it is really useful when you run with it. longer development, then move faster in operations.
otherwise you end up solving your terrible code base by hiring more brains, and those brains could be better put to use creating improvements for your customers and not fixing the consequences you shipped in a sprint.
pre-launch development is cheap, so it works to take your time and not optimize that __process__ prematurely. everything is more expensive after launch, when the stakes are real and where the advantage can be moving fast -- if the code you crafted lets you adapt quickly, then you've minimized costs over the operating period, and your brains can work on growing and retaining customers rather than building an ozymandias of monkey patches.
the time when speed is important is after launch, not before it. if you rush your pre-launch development, you will be a slow operator, and this will cost you exponentially more than the linear increase in cost from a longer development time.
also, the more robust the system you build is, the slower (as in slow thinking) you can make your decisions in the operating period, to really consider strategy.
let your competitors ship first and watch the things they miss. let them pay for the experiments you now decline to run. if someone seems to be capturing the market through their business model, then you are too slow anyway and the high-order bit for you isn't code anymore. if that is not your market, then it is filled with competitors who are mimicking each other's sub-monopoly strategies. so you can step in, be the last mover, and take the market. invest time during development to make code and tools that work, grasp your business plan, and have the possibility of thinking strategically about the business, in the operation phase, once you have launched.