Shared Information Bias in Group Decision-Making

Markéta Kučerová · Published in DESIGN KISK · Apr 27, 2021

At the beginning of a Design Sprint week, a team of stakeholders, designers, and developers gathers to learn as much as possible from one another. Their shared goal is to develop a working prototype of a product by the end of the week, and they need all the relevant information to support their decision-making along the way. Each team member can contribute to the discussion with their individual expertise, and a sprint facilitator ensures that this expertise is shared and decisions are made within a given timeframe. Part of the facilitator’s role is to foster open discussion and steer the team away from group biases that could compromise its information sharing and decision-making.

Groups are thought to make better key decisions than individuals: group members can look at a given problem from diverse perspectives and offer specific insights based on their expertise, which should make them collectively more capable than any individual alone [1]. However, when group members fail to share their expertise or to integrate others’ opinions, they can arrive at erroneous decisions despite deciding as a group.

In a design sprint, the facilitator ensures that the team benefits from this enhanced group decision-making ability and avoids common pitfalls of group work, such as biases. A bias is an inclination toward a particular opinion or behavior that is objectively unfair or disproportionate, and it can manifest in many ways in the interactions of both individuals and groups. In this article, we will discuss three of the most frequent biases that occur during group decision-making:

  • Groupthink occurs when the group members agree to a dysfunctional solution as they suppress their critical thinking in order to reach a consensus with the rest of the group. [3]
  • Shared information bias is a tendency of group members “to discuss information that they all have access to while ignoring equally important information that is available to only a few of the members.” [1] They will spend more time and effort on discussing information that is familiar to most members of the group and will tend to consider that shared information more significant. [6]
  • Hidden profiles occur when the group as a whole has all the information necessary to make a decision, but no single member has all of it. A hidden profile can only be uncovered when this knowledge is shared across the whole group, which is what makes informed decision-making possible. [2]

How biases harm group work

To mitigate these natural group biases, the team members and their facilitator need to learn to recognize the symptoms. Each team has a different background and group dynamics, which affect the way biases manifest in its work [5]. When a team knows which bias it is most prone to, it can continuously mitigate it in its discussions. Generally, group biases manifest themselves in the following ways:

Shunning the critical thinkers and praising the conformists

Under the influence of groupthink, groups seek quick conclusions and consensus. Consequently, they prefer opinions consistent with the group’s majority over the disruptive opinions of critical thinkers; in effect, they prefer consensus over the right decision.

When a critical thinker attempts to discuss unshared information that could delay reaching consensus, the other group members tend to ignore the disruptor. In contrast, members who present opinions in line with the group’s general mindset get praised and acknowledged. [3, 4]

Repeated information seems more important

Due to the shared information bias, groups tend to spend more time discussing information that most members are already familiar with and to omit unshared information. As “the shared information is discussed repeatedly, it is likely to be seen as more valid and to have a greater influence on decisions as a result of its high cognitive accessibility.” [1]

False sense of confidence

Because a group tends to discuss information that is already familiar to its members and to omit information that would challenge its shared beliefs, the group members can experience a false sense of well-being and confidence. Consequently, they will consider the group overly competent or even infallible and neglect fact-checking and critical thinking. [3]

Abilene paradox

When individual members fail to share their expertise and preferences with the group, the group may collectively agree on a decision that runs counter to the preferences of the majority of its members. In other words, the paradox occurs when individual members believe they are the only ones who would disagree with the group and therefore remain silent or vote for what they perceive as the majority opinion, while in fact they are endorsing a minority view that simply happened to be voiced more often.

Research on group biases

Intuitively, a group’s inclination toward biases can be affected by the relationships and hierarchy within the group, the structure of the organization the group belongs to, or the situational context of the discussion [3]. Numerous studies have examined what causes and influences the way groups share information and make decisions. However, experimental studies are limited by their artificial setting: a real group solving a real problem may behave differently from a group given a task to solve during an experiment session.

In their research from 1985, Stasser and Titus [6] simulated a situation in which a group had to collectively choose the best candidate for student body president. They “defined three experimental conditions according to how the information about the candidates was distributed over the 4 group members before discussion. In the shared condition, participants read descriptions that contained all of the profile information about each candidate. Two unshared conditions were used; in both, a participant was given only partial information about each candidate. However, the distribution of information across a group’s members was designed so that a group, collectively, had all of the information and potentially could recreate the complete candidate profiles during discussion.” [6]

The experiment helped to define the shared information bias: information about the candidates that was shared among all the discussion members was discussed more frequently and with greater emphasis, while the unshared information was discussed less. Furthermore, analysis of the research “also suggests that this failure to consider unique information is most likely when the unique information counters the prevailing sentiment in the group and could change its final decision.” In other words, if the group members held an opinion about a candidate before the discussion, they were less likely to integrate unshared information that spoke against their preferred candidate.
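To get an intuition for why shared information dominates, it helps to think of the discussion as a sampling process. The sketch below is a minimal toy illustration, not a reproduction of the cited studies: it assumes a four-member group in which each member who knows a given item independently brings it up with some fixed probability (the `p_mention` value of 0.3 is an arbitrary assumption). Under that model, an item everyone knows has four chances to surface, while an item only one member knows has a single chance.

```python
import random

def item_surfaces(n_members_who_know: int, p_mention: float = 0.3) -> bool:
    """True if at least one member who knows the item brings it up."""
    return any(random.random() < p_mention for _ in range(n_members_who_know))

def surfacing_rate(n_members_who_know: int, trials: int = 100_000) -> float:
    """Monte Carlo estimate of how often an item enters the discussion."""
    hits = sum(item_surfaces(n_members_who_know) for _ in range(trials))
    return hits / trials

if __name__ == "__main__":
    # Shared item: known to all 4 members; unshared item: known to only 1.
    print(f"shared item surfaces in   ~{surfacing_rate(4):.0%} of discussions")
    print(f"unshared item surfaces in ~{surfacing_rate(1):.0%} of discussions")
    # Closed form: P(surface) = 1 - (1 - p)^k, i.e. 1 - 0.7**4 ≈ 76% vs. 30%.
```

Even before repetition or prior preferences come into play, shared items start with a large structural advantage simply because more people are able to mention them, which is consistent with the biased information sampling that Stasser and Titus describe.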

The influence of information distribution prior to the discussion was further explored in a 2010 study by Reimer, Reimer, and Hinsz [5], who created “a new condition, in which group members received their information regarding the choice alternatives at the beginning of their group session” [5] and compared these groups with groups that received the information before the discussion. The research focused on the groups’ ability to detect hidden profiles and integrate unshared information.

The study found that “presenting information at the beginning or prior to a discussion influenced when group members processed their information, as well as how they processed it.” In accordance with previous studies, it also concluded that group members who had formed opinions before the discussion were less likely to change them after receiving additional information. Furthermore, as indicated by research by Mojzisch and Schulz-Hardt [2], when group members express their preferences at the beginning of a discussion, the group is less likely to uncover hidden profiles and integrate unshared information; knowing others’ preferences prior to the discussion “reduces the attention devoted to encoding the information exchanged, which, in turn, negatively affects decision quality.” [2]

Prevention and control

During the design process, a facilitator can use various design methods for ideation and problem definition to suppress group biases. Furthermore, they can consider the group’s characteristics and choose methods that suit its dynamics. Some of the commonly used methods are:

  • Acknowledging an expert’s role. People who have unshared information often do not realize that the information is not commonly known. Hence they do not share the information because they consider it obvious. When a group member is granted the role of an expert, they are encouraged to share their specific knowledge and explain even what might seem obvious to them. [7]
  • Choosing the devil’s advocate. To support critical thinking, a facilitator can assign a group member the role of a devil’s advocate. The member’s task will be to challenge the group’s assumptions and provide alternative perspectives. [8]
  • Group heterogeneity. Homogeneous groups are more prone to groupthink; by increasing diversity among the group members, the group becomes more likely to consider diverse perspectives and make better decisions. In practice, this can mean having enough diversity within existing teams or inviting people from outside the team to provide their insight.

Conclusion

In the design process of a new product, it is essential to gather information from various sources; subsequently, the information is used to support the decision-making process of a team or a workgroup. The way information is shared with or among the group members will influence their discussions and, consequently, their decisions. When some information remains unshared, groups can come to erroneous conclusions and make decisions that will later prove dysfunctional.

Information sharing in a group can be affected by group biases that stem from the group’s need for cohesion and consensus. A facilitator of a group discussion can use various methods to mitigate these biases and support an open climate. Ultimately, it is the responsibility of the group members, the facilitator, and even the organization to create processes that decrease the probability of erroneous decisions.

References

[1] R. Jhangiani and H. Tarry, “Group Decision Making,” Opentextbc.ca, Sep. 26, 2014. https://opentextbc.ca/socialpsychology/chapter/group-decision-making/ (accessed Apr. 09, 2021).

[2] A. Mojzisch and S. Schulz-Hardt, “Knowing others’ preferences degrades the quality of group decisions,” Journal of Personality and Social Psychology, vol. 98, no. 5, pp. 794–808, 2010, doi: 10.1037/a0017627.

[3] E. Griffin, “Groupthink of Irving Janis,” in A First Look at Communication Theory, New York: McGraw-Hill, 2008, pp. 235–246.

[4] R. Brenner, “Effects of Shared Information Bias: I,” Chaco Canyon Consulting, Dec. 05, 2018. https://chacocanyon.com/pointlookout/181205.shtml (accessed Apr. 08, 2021).

[5] T. Reimer, A. Reimer, and V. B. Hinsz, “Naïve Groups Can Solve the Hidden-Profile Problem,” Human Communication Research, vol. 36, no. 3, pp. 443–467, Jun. 2010, doi: 10.1111/j.1468-2958.2010.01383.x.

[6] G. Stasser and W. Titus, “Pooling of unshared information in group decision making: Biased information sampling during discussion,” Journal of Personality and Social Psychology, vol. 48, no. 6, pp. 1467–1478, Jun. 1985, doi: 10.1037/0022-3514.48.6.1467.

[7] D. D. Stewart and G. Stasser, “Expert role assignment and information sampling during collective recall and decision making,” Journal of Personality and Social Psychology, vol. 69, no. 4, pp. 619–628, 1995, doi: 10.1037/0022-3514.69.4.619.

[8] R. T. Hartwig, “Facilitating problem solving: A case study using the devil’s advocacy technique,” Nov. 2010, Accessed: Apr. 09, 2021. [Online]. Available: https://brainmass.com/file/274163/Research+Article+-+DA.pdf.
