When most organizations design new work processes, they assume that team members will make the best possible use of them to improve team performance. That is, they assume that team members will act rationally. In most cases, this assumption is wrong.
Compelling evidence from research in psychology and behavioral economics suggests that people behave in ways that are far from rational. For instance, we systematically underestimate task-completion times — a tendency known as the "planning fallacy" — and repeatedly postpone tasks (i.e., we procrastinate).
We also tend to overestimate the accuracy of our own thoughts and the odds of our success — that is, we tend to be overconfident. These tendencies are common, yet when we make decisions intended to improve how effectively teams perform tasks, we regularly fail to account for them.
This was brought home to me in my field research. Here's a case in point:
In 2004, my HBS colleague Gary Pisano and I conducted a project at a leading manufacturer of highly sophisticated production equipment for the electronics industry, which I'll call "Exotech." Like many companies, Exotech struggled with serious time delays in its product-development projects. As the market became more competitive and customers more demanding, Exotech's senior management launched a systematic effort to improve its product-development performance through more rigorous upfront planning, a well-defined process, clear milestones for project reviews, cross-functional project teams with strong leaders, and rigorous post-project reviews to glean lessons learned.
As part of this effort, the company decided to use project-management tools to drive high levels of performance on a new project that required both hardware and software components. The hardware team had employed similar processes in the past and found them helpful, so the company assumed the software team would, too. The project set strict timelines, formulated schedules based on careful analysis of critical paths, and tracked progress using a sophisticated web-based scheduling tool that provided daily updates and early warnings about potential slips. The teams working on the project received thorough training in these tools and were aided by dedicated project coordinators who collected the relevant information, ran the analyses, and presented the results to the team in weekly meetings.
The results? Hardware development ran successfully, and the project-management tools worked exactly as expected. The software side was a different story: the software-development effort fell behind from the start and never caught up. The software was finished about six months behind schedule, delaying the shipment of the first commercial systems.
What was puzzling about Exotech's results was that the project-management tools were designed to provide exactly the information — early warning of problems — that could have helped the project team respond and reduce or even eliminate delays. But they didn't help the software-development team. The reason: team members were so confident of their ability to complete the project on time that they ignored the tools' signals that they were falling behind.
This case is not an isolated example. In my research, I have found that these behaviors are common.
Here are three potential solutions to address all-too-human irrational behavior:
1. Educate Team Members
In some cases, learning about our own irrationalities can help us understand why we make decision mistakes and begin to correct for them. The members of the software team at Exotech might have used the tools had they understood the psychological tendency people have to resist change even when change leads to better performance.
2. Encourage Dissenting Views
Unfortunately, even when people are aware of their behavioral tendencies, that awareness alone is not enough to overcome them. An antidote is to encourage members of the team to express dissenting views. Forcing individuals to interact with others who question or challenge their conclusions is likely to reduce overconfidence, combat the natural reluctance to embrace change, and counter other tendencies that may disrupt team effectiveness.
3. Change the Process, Not the People
When interpreting the world around us, we often overemphasize our own impact and underemphasize the role of environmental factors. But the environment greatly influences our behavior and can compensate for our irrationalities. So another way to compensate for or minimize the impact of irrational human behavior is to improve the environment or processes themselves. For instance, Microsoft tries to correct for the overly optimistic projections of individual software developers by setting rules about the amount of buffer time that should be added to projects.
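To make the idea concrete, here is a minimal sketch of what such a process-level buffer rule might look like. The buffer_ratio value and the buffered_schedule helper are purely illustrative assumptions, not Microsoft's (or Exotech's) actual policy; the point is simply that the padding is applied by the process itself rather than left to each developer's judgment.

```python
# Illustrative sketch of a process-level guard against the planning fallacy.
# Individual estimates are accepted as given, but the schedule adds a fixed
# buffer on top, so the externally committed dates already account for the
# typical optimism in self-reported estimates.

def buffered_schedule(task_estimates_days, buffer_ratio=0.3):
    """Return per-task committed durations and the total committed schedule.

    task_estimates_days: dict of task name -> developer's own estimate (days)
    buffer_ratio: hypothetical padding rule, e.g. 0.3 = add 30% to every estimate
    """
    committed = {task: days * (1 + buffer_ratio)
                 for task, days in task_estimates_days.items()}
    return committed, sum(committed.values())


if __name__ == "__main__":
    estimates = {"design": 10, "implementation": 25, "testing": 15}
    committed, total = buffered_schedule(estimates)
    print(committed)            # each task padded by the buffer rule
    print(f"{total:.1f} days")  # the schedule the team commits to externally
```

The specific ratio matters less than the fact that it is a standing rule: because the buffer is built into the process, no individual has to admit, task by task, that their own estimate is probably optimistic.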
Together, these three strategies can help you introduce tools and processes that combat irrational human behavior and enhance team performance.
This post is part of the HBR Insight Center on The Secrets of Great Teams.
By Francesca Gino
http://blogs.hbr.org/hbsfaculty/2012/03/when-designing-work-processes.html?awid=7713774882025583693-3271