June 3, 2012
by jsproul
0 comments
Seth’s Blog: Really Bad Powerpoint
“Powerpoint could be the most powerful tool on your computer. But it’s not. Countless innovations fail because their champions use PowerPoint the way Microsoft wants them to, instead of the right way.”
June 3, 2012
by jsproul
0 comments
Webstock ’12: Scott Hanselman – It’s not what you read, it’s what you ignore
June 3, 2012
by jsproul
0 comments
Bret Victor – Inventing on Principle
June 3, 2012
by jsproul
0 comments
Coding Horror: It’s Never Been Built Before
“[S]oftware projects truly aren’t like other engineering projects. I don’t say this out of a sense of entitlement, or out of some misguided attempt to obtain special treatment for software developers. I say it because the only kind of software we ever build is unproven, experimental software.”
June 3, 2012
by jsproul
0 comments
How to Beat the Odds at Judging Risk
Fast, clear feedback is crucial to gauging probabilities; for lessons, consult weathermen and gamblers
By DYLAN EVANS
Most of us have to estimate probabilities every day. Whether as a trader betting on the price of a stock, a lawyer gauging a witness’s reliability or a doctor pondering the accuracy of a diagnosis, we spend much of our time—consciously or not—guessing about the future based on incomplete information. Unfortunately, decades of research indicate that humans are not very good at this. Most of us, for example, tend to vastly overestimate our chances of winning the lottery, while similarly underestimating the chances that we will get divorced.
Weather forecasters tend to focus on a few clear questions, and their accuracy gets tested the very next day
Psychologists have tended to assume that such biases are universal and virtually impossible to avoid. But certain groups of people—such as meteorologists and professional gamblers—have managed to overcome these biases and are thus able to estimate probabilities much more accurately than the rest of us. Are they doing something the rest of us can learn? Can we improve our risk intelligence?
Sarah Lichtenstein, an expert in the field of decision science, points to several characteristics of groups that exhibit high intelligence with respect to risk. First, they tend to be comfortable assigning numerical probabilities to possible outcomes. Since 1965, for instance, U.S. National Weather Service forecasters have been required to say not just whether it will rain the next day, but how likely they think it is, in percentage terms. Sure enough, when researchers measured the risk intelligence of American forecasters a decade later, they found that it ranked among the highest ever recorded, according to a study in the Journal of the Royal Statistical Society.
It helps, too, if the group makes predictions only on a narrow range of topics. The question for weather forecasters, for example, is always roughly the same: Will it rain or not? Doctors, on the other hand, must consider all sorts of different questions: Is this rib broken? Is this growth malignant? Will this drug cocktail work? Studies have found that doctors score rather poorly on tests of risk intelligence.
Finally, groups with high risk intelligence tend to get prompt and well-defined feedback, which increases the chance that they will incorporate new information into their understanding. For weather forecasters, it either rains or it doesn’t. For battlefield commanders, targets are either disabled or not. For doctors, on the other hand, patients may not come back, or they may be referred elsewhere. Diagnoses may remain uncertain.
If Dr. Lichtenstein’s analysis is correct, we should be able to develop training programs for instilling greater risk intelligence by boosting and speeding up feedback. Royal Dutch Shell introduced just such a program in the 1970s. Senior executives had noticed that when newly hired geologists predicted oil strikes at four out of 10 new wells, only one or two actually produced. This overconfidence cost Royal Dutch Shell millions of dollars. In the training program, the company gave geologists details of previous explorations and asked them for numerical estimates of the chances of finding oil. The inexperienced geologists were then given feedback on the number of oil strikes that had actually been made. By the end of the program, their estimates roughly matched the actual number of oil strikes.
Intelligence agencies are also working to improve their approach to risk. In 2011, researchers began recruiting volunteers for a multiyear, Web-based study of people’s ability to predict world events. The Forecasting World Events Project, an experiment sponsored by the Director of National Intelligence, aims to discover whether some kinds of personalities are better than others at such exercises. Volunteers offer their best guesses about events and trends in realms such as international relations, economics, public health and technology.
Just by becoming aware of our tendency to be overconfident or underconfident in our estimates, we can go a long way toward correcting for our most common errors. Doctors, for instance, could provide numerical estimates of probability when making diagnoses and then get data about which ones turned out to be right. As for the rest of us, we could estimate the likelihood of various events in a given week, record our estimates in numerical terms, review them the next week and thus measure our risk intelligence in everyday life. A similar technique is used by many successful gamblers: They keep accurate and detailed records of their earnings and their losses and regularly review their strategies in order to learn from their mistakes.
No one can be great at estimating all types of probabilities in all situations. But given the right conditions and the right kind of self-reflection and practice, we can all make substantial improvements in our risk intelligence.
— From “Risk Intelligence” by Dylan Evans. Copyright © 2012 by Dylan Evans
A version of this article appeared May 12, 2012, on page C3 in the U.S. edition of The Wall Street Journal.
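The weekly self-scoring exercise Evans recommends is easy to put into practice. Here is a minimal sketch in Python (mine, not from the book) that grades a hypothetical week of probability estimates with a Brier score, one standard calibration measure: 0.0 is perfect, always guessing 50% scores 0.25, and lower is better.

# Hypothetical week of forecasts: (event, estimated probability, actual outcome).
forecasts = [
    ("It rains on Saturday",          0.70, True),
    ("My train is late on Monday",    0.20, False),
    ("The package arrives by Friday", 0.90, True),
    ("I finish the report on time",   0.80, False),
]

def brier_score(records):
    """Mean squared gap between estimated probabilities and outcomes (lower is better)."""
    return sum((p - (1.0 if happened else 0.0)) ** 2
               for _, p, happened in records) / len(records)

print(f"Brier score for the week: {brier_score(forecasts):.3f}")

Reviewing the score from week to week is the same habit the article credits to successful gamblers: keep detailed records, then learn from the misses.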
June 3, 2012
by jsproul
0 comments
Hacker Lifestyle: How I Feel Satisfied with Every Day
“I like to consider myself a very productive person. I do a lot of writing, I make a good living running my own business and I maintain many open source projects. And yet, by outward appearances, I don’t seem to work particularly hard, but I still manage to get a lot done; I go to bed feeling satisfied every night, and every morning I wake up eager to attack the day. I’d like to share with you how I do it.”
April 22, 2012
by jsproul
0 comments
The DDC 50 Point Plan to Ruin Yer Career
Point No. 01: Enjoy the Goddamned Moment
March 21, 2012
by jsproul
0 comments
“No compulsion in the world is stronger than the urge to edit someone else’s document.”
“…or their code.”
—Jason’s corollary
March 18, 2012
by jsproul
0 comments
Copyright Math: a quantitative reasoning master class by Rob Reid (video) | Ars Technica
“In embargoing their music from legal services, and greeting almost every element of today’s online music experience with lawsuits—not just MP3 players, but locker services, interactive radio services, and much more—the labels gave piracy a half-decade monopoly on awesomeness. I believe that the music industry would have something close to double its current US revenues today if it hadn’t blasted itself in the foot, shin, hip, torso, and chest by doing this.”