I think there are often times when people who have no experience in a particular field can bring helpful new perspectives — and questions — to those who have been working in that area for a long time. I’ve learned ways to become a better teacher from my student teachers and from my own students, for example. And, in my community organizing career, I saw that there was often truth to our axiom that the “people who are most directly affected by a problem often have excellent ideas on how to fix it” — regardless of their “expertise” in affordable housing, traffic control, crime prevention, etc.
At the same time, however, there are caveats to those points. Student teachers and students lack a great deal of experience, and though I solicit their suggestions and take them seriously (for the most part — student suggestions of “give us candy” might not warrant much consideration), my professional judgment takes precedence. I should also point out that I create many opportunities for students to teach their classmates and reflect on what they learn about the art of teaching — student suggestions for me after those experiences are at a much higher level than before.
In community organizing, before we would approach any decision-maker with our proposals, we would first have extensive “research actions” where we would meet with people who had professional expertise to learn what we needed to know. And when we would meet with officials, we would generally first explain the problem and then — before proposing our own solution — ask what ideas they might have for solving it. We would generally then discover that our problems existed either because decision-makers didn’t know the problem existed and just needed to be told about it; or because they knew the problem existed but didn’t know how to solve it, or needed our help to solve it; or, finally, because they knew about it but there was pressure on them not to solve it. If it was the last case, we knew we needed to create an equal or greater amount of pressure on them to get it fixed.
In any case, though, it’s critical to know what you don’t know before you start telling others how they should do the job they have been trained for and have spent years actually doing.
It seems to me that it’s not unusual to find that many of the most outspoken “school reformers” do not value “knowing what you don’t know.”
Here are some resources that I think make insightful points about this issue, and which I’ve previously shared. I thought it would be useful to bring them all together in one “The Best…” list:
It’s not my habit to quote from Forbes Magazine, but there is an exceptional column there by Conor Friedersdorf that’s titled Why Some Elites Know Not What They Run. In it, he shares correspondence he has had with a person who wants to reshape health policy, but does not want to spend any time “working in the trenches.” It provides a much more articulate representation of my concern than any that I could write.
Here’s an excerpt:
I understand why my correspondent doesn’t want to spend several years in the field or working at a hospital before she gets to even tackle the systemic problems that she wants to solve, but I’d wager anything that she’d be far better at improving outcomes on the ground if she understood first hand what went on there, rather than grasping it through case studies or theory, or even as an “on location” manager. There are reasons why we have a “management track” and a “worker track” in many areas of American life, but there are costs even in fields where this makes sense overall. One cost is a reduced ability to get the valuable knowledge held by people ultimately affected by rules, policies, reforms, etc. A related cost is that our meritocratic elite is often utterly unable to grasp how the systemic changes it proposes will play out at a practical level.
He also writes:
Organizations are always trying to attract extreme talent and ambition. Part of the reason why they let people skip steps, rather than hiring people who’ve proven themselves by working up every step of the ladder, is the same reason why NBA teams gamble on high schoolers at draft time: the safe bet is picking a solid role player like Shane Battier, but the opportunity cost is the chance at a superstar like Kobe Bryant, Kevin Garnett or Lebron James. These are examples where the gamble pays off, as it often does when Harvard or Yale kids are hired. The difference is that Kobe Bryant wasn’t stepping into a job that called for him to improve the collegiate basketball system that he skipped over.
Today, The Atlantic published a piece titled Any Old Genius Can’t Always Be a Political Genius. I think it’s very important to read the entire post, but here’s how it starts:
Politics is one of those industries, like writing and coaching professional sports teams, where everyone thinks they could do it better than the pros, without any practice, any training, or any real-world experience. This week, we have accounts of multiple guys who are extremely successful in their chosen field trying out as political strategists: baseball genius Bill James, computer genius Steve Jobs, casino genius Sheldon Adelson. But as Michael Jordan found out when he tried baseball, and Tyra Banks found out when she tried singing, and Ethan Hawke found out when he tried writing, just because you’re really good at one thing doesn’t mean you’ll be good at a different thing.
N+1 editor Keith Gessen explained this phenomenon in another context in a podcast last month:
“You reach a certain age and all your friends who became doctors or lawyers all of a sudden they pop up again… and say, ‘I wrote a novel. Can you help me get it published?’ And you just want to say to them, ‘Go back 10 years, or 15 years, and instead of being a lawyer or a doctor, become a writer. Because I don’t show up to your office… we don’t show up at the doctor’s office and start performing surgery…'”
Here are a couple of Dilbert comic strips that speak to this issue, too:
Jonah Lehrer has written a column for Wired titled Do Political Experts Know What They’re Talking About? In it, he interviews researcher Philip Tetlock about his research examining political pundits’ predictions. I think it also relates to the topic of this post. Here is an excerpt:
Tetlock: Some experts displayed a top-down style of reasoning: politics as a deductive art. They started with a big-idea premise about human nature, society, or economics and applied it to the specifics of the case. They tended to reach more confident conclusions about the future. And the positions they reached were easier to classify ideologically: that is the Keynesian prediction and that is the free-market fundamentalist prediction and that is the worst-case environmentalist prediction and that is the best case technology-driven growth prediction etc. Other experts displayed a bottom-up style of reasoning: politics as a much messier inductive art. They reached less confident conclusions and they are more likely to draw on a seemingly contradictory mix of ideas in reaching those conclusions (sometimes from the left, sometimes from the right).
We called the big-idea experts “hedgehogs” (they know one big thing) and the more eclectic experts “foxes” (they know many, not so big things).
Lehrer: Do these different styles correlate with levels of accuracy?
Tetlock: In assessing accuracy, it is crucial to make the “law of large numbers” work for you. Any fool can be lucky a few times. The key is consistency. So, in the first round of our studies, we assessed the accuracy of almost 30,000 predictions from almost 300 experts. We tested a lot of different hypotheses about the correlates of consistency and accuracy. Is ideology the key factor? Having a PhD? Having past access to classified information? And a lot of hypotheses bit the dust. The most consistent predictor of consistently more accurate forecasts was “style of reasoning”: experts with the more eclectic, self-critical, and modest cognitive styles tended to outperform the big-idea people (foxes tended to outperform hedgehogs).
Gary Klein has been researching decision-making and insight for the past thirty years. Edge has a lengthy interview with him that is fascinating to read. It focuses on his research with firefighters, and he connects those lessons to other areas; it also relates to the main point of this post. I’d strongly encourage you to read it. Here’s an excerpt:
That became part of our model — the question of how people with experience build up a repertoire of patterns so that they can immediately identify, classify, and categorize situations, and have a rapid impulse about what to do. Not just what to do, but they’re framing the situation, and their frame is telling them what are the important cues. That’s why they’re always looking, or usually looking, in the right place. They know what to ignore, and what they have to watch carefully.
It’s telling them what to expect, and so that’s why performance of experts is smoother than the performance of novices, because they’re not just doing the current job, they know what to expect next, so they’re getting ready for that. It’s telling them what are the relevant goals so that they can choose accordingly.
Sometimes you want to put a fire out, and sometimes the fire has spread too much and you want to make sure it doesn’t advance to other buildings near by, or sometimes you need to do search and rescue. They’ve got to pick an appropriate goal. It’s not just put the fire out each time.
It seems to me that Klein’s research can also be directly related to the fact that a teacher has to make 0.7 decisions each minute during the school day (you can read about that data at Larry Cuban’s blog).
Skeptics Say, ‘Do Your Own Research.’ It’s Not That Simple. is from The NY Times.
Why Is It So Hard to Be Rational? is from The New Yorker.
I hope readers can contribute other resources, too.
All feedback is welcome.