Cognitive psychology is the study of how the mind (as distinguished from the brain) works. There is a long history of research on issues such as how humans perceive the world, how they understand language, how their memory works, and how they reason and make decisions. Much of this work is experimental and theoretically motivated. However, it is clear that the principles uncovered in this approach apply to many real-world issues, including how we deal with severe weather. Although there are numerous uncertainties in the weather domain, the biggest psychological uncertainty that cognitive psychology can address is how people interpret and use (or misuse) the information provided by forecasters to make weather-related decisions.

How new information is processed

“When we have an experience that matches our schema, we use the schematic framework to organize and understand it.”

Cognitive psychology has long been in the business of describing how the human mind processes and uses information. A key finding of research following this approach is that new incoming information is understood in the context of what the individual already knows. Incoming information is categorized according to the generic concepts with which the individual is already familiar.1 In addition, complex event information is categorized by “schemas.”2 Everyone has schemas for the sequence of subevents in familiar events (e.g., going to a restaurant: being seated, reading the menu, ordering, etc.). When we have an experience that matches our schema, we use the schematic framework to organize and understand it. A similar construct is a mental model.3 We have mental models for things we cannot apprehend perceptually—like the shape of the earth. To understand concepts like these, we build a 3D mental representation that we can “look at” from all angles with our mind’s eye. Concepts, schemas, and models are the background organizing principles—part of what is referred to as semantic memory, our memory for meanings4—that our mind uses every day to understand the world. We apply them automatically, so we are not usually aware of them or of what they are doing for us—they are largely unconscious assumptions.

When experts communicate risk information

Although there are huge similarities among people in terms of their concepts, schemas, and models—which accounts for the remarkable success of most communication—there are also differences, especially when it comes to specific areas of expertise. That is where uncertainty comes in—and communication can go awry. The person conveying the information may not realize that a complex set of models and schemas, which the recipient may not share, is providing the framework necessary to understand that information. As a result, the person conveying the information may not think to make this background explicit. If recipients are not privy to the same concepts, schemas, and models, they will use whatever they have to organize and understand the information, and that can result in misunderstandings.

When forecasters explain complex weather phenomena, they are subtly—probably not consciously—referring to a whole range of mental models and schemas that are common knowledge to them but not necessarily familiar to their audience. For example, I once sat in on a weather briefing for emergency managers concerning some potential wind issues for the Pacific Northwest. The forecaster giving the briefing talked a lot about the position of the low-pressure system, but gave very few specifics about the wind speeds and directions for the areas affected. To him, those things were obvious from the position of the low, because of the complex 3D mental model he had of the atmospheric situation. Few of his listeners had that same specific mental model, and most did not really understand what was going to happen in their jurisdictions. For them, it would have been better if he had also specified the range of wind speeds and directions.

We all do this to some extent whenever we have professional expertise—we forget that the basics are not common knowledge—because many of the cognitive processes that structure our thoughts operate largely on an unconscious level. You really have to stop and think: Does any of this information have unspoken assumptions or implications that I need to make crystal clear?

The manner in which people interpret information can also be affected by their momentary expectations and goals. People generally have specific decisions to which they need to apply weather information, and those decisions shape what they expect to hear. The more closely the expression of the weather information matches those expectations, the easier it will be for people to understand it.

“The recipient of the weather briefing, the emergency manager or the city official, has a different perspective.”

Emergency managers’ decisions concern outcomes in their specific jurisdictions: Where and when will there be high winds? How high? What areas will be flooded? However, a lot of the weather briefings they receive start by describing the atmospheric conditions causing the weather. This is how the forecaster (or any expert) thinks about the situation. In other words, the briefings tend to follow the thought processes by which the information was developed. The recipient of the weather briefing, the emergency manager or the city official, has a different perspective. They are thinking about the decisions that will need to be made—e.g., is this going to be bad enough in my area to require extra staffing? They are thinking about how the information will be used. Because of that, emergency managers and city officials expect forecasters to provide information that directly addresses those decisions. Emergency managers may not understand how the position of the low applies to their decision threshold (e.g., gusts of 60 mph, sustained winds of 40 mph). Therefore, if forecasters begin with the position of the low-pressure system, it might be confusing, partly because of the difference in mental models and partly because of the recipients’ expectations.

The power of expectations—an experiment

We conducted an interesting experiment that illustrates the power of expectations.5 We were going to provide 80 percent predictive intervals for various weather parameters on a website, but could not decide how to define the upper bound (“10 percent chance that the winds will be greater than X” or “90 percent chance that the winds will be less than X”). We worried about the first definition because of framing,6 a psychological effect in which people’s interpretations shift depending on which outcome is emphasized. Here, we worried that people would think wind speeds would be higher than we intended if we used the “greater than” expression.
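To make the two phrasings concrete, here is a minimal sketch, using a hypothetical ensemble of wind-speed forecasts and percentile bounds (the numbers and the procedure are illustrative assumptions, not the study’s actual method), of how an 80 percent predictive interval and the two statements of its upper bound relate:

```python
# Illustrative sketch: an 80% predictive interval from a hypothetical
# ensemble of wind-speed forecasts (the values in knots are made up).
import numpy as np

ensemble_kt = np.array([14, 16, 17, 18, 19, 20, 21, 22, 24, 27])

# The 10th and 90th percentiles bound the central 80% of expected outcomes.
lower, upper = np.percentile(ensemble_kt, [10, 90])

print(f"80% predictive interval: {lower:.0f} to {upper:.0f} knots")

# Two logically equivalent phrasings of the same upper bound:
print(f"10% chance that the winds will be greater than {upper:.0f} knots")
print(f"90% chance that the winds will be less than {upper:.0f} knots")
```

Both statements describe exactly the same upper bound; only the framing differs, which is why the choice of wording was worth testing.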

So we tested it. We gave participants a decision task (e.g., post a high-wind advisory if they thought winds would be greater than 20 knots) and then a forecast. Half of the participants read the “greater than” expression (“10 percent chance that the winds will be greater than X”), and half read the “less than” expression (“90 percent chance that the winds will be less than X”). We found that people understood the forecast better and translated the information into the appropriate advice for the public when the “greater than” definition was used (moreover, there was no framing effect).

However, when we changed the task to posting a freeze warning, indicating temperatures below 32 degrees, and tested it on a new group of participants, we found that people understood the “less than” forecast better (“90 percent chance that the temperature will be less than X”), while the “greater than” forecast was confusing and caused people to make decision errors. They behaved as though the forecast was telling them exactly the opposite of what it meant.

“The lesson here is to think about how the recipient of the information intends to use it—that is what they will be expecting to hear about.”

The experiment revealed that the uncertainty expression made the most sense when it met people’s expectations, and their expectations were tied to the decision goal. If the decision was whether to post a high-wind warning, indicating that winds would be greater than 20 knots, the “greater than” definition made sense. If the decision was whether to post a low-temperature warning, indicating that the temperature would fall below 32 degrees, then the “less than” definition made more sense. The lesson here is to think about how the recipient of the information intends to use it—that is what they will be expecting to hear about. If the information matches those expectations, it will be much easier to process and better understood. Incidentally, we also found (and consistently find) that people made much better decisions when they had the numeric uncertainty information, in this case the 80 percent predictive interval, than when given a single-value forecast.

Conclusion

Psychological uncertainty is inherent in the communication process itself. Inevitably, some aspect of the mental organizational structure or background information of the speaker and the recipient will differ to some degree. The difference in expertise between weather professionals and other professionals or members of the public amplifies those differences, leading to uncertainty about how the message will be understood. Thus, when academics or practitioners communicate information to people who do not share their expertise or goals, they must think about the assumptions, implications, and background information that need to be spelled out, as well as about how the information will be used. Communication strategies that take these issues into account will reduce uncertainty about how the information will be understood and applied.

References:

1. Eleanor Rosch, “Principles of Categorization,” in Readings in Cognitive Science: A Perspective From Psychology and Artificial Intelligence, ed. Allan Collins and Edward E. Smith (Morgan Kaufmann Publishers, 1988), 312–22.
2. Frederic C. Bartlett, Remembering: A Study in Experimental and Social Psychology (Cambridge: Cambridge University Press, 1932).
3. Philip N. Johnson-Laird, “Mental Models and Human Reasoning,” Proceedings of the National Academy of Sciences of the United States of America 107, no. 43 (2010): 18243–18250.
4. Allan M. Collins and Elizabeth F. Loftus, “A Spreading-Activation Theory of Semantic Processing,” Psychological Review 82, no. 6 (1975): 407–428.
5. Susan Joslyn et al., “The Effects of Wording on the Understanding and Use of Uncertainty Information in a Threshold Forecasting Decision,” Applied Cognitive Psychology 23, no. 1 (2008): 55–72.
6. Daniel Kahneman and Amos Tversky, “Choices, Values, and Frames,” American Psychologist 39, no. 4 (1984): 341–50.