Not Great, Not Terrible

Vjeran Buselic
In Search for Knowledge
13 min read · Sep 27, 2024

In the last two columns I intentionally made a break/distraction, explaining WHY it is necessary to upgrade your knowledge-gathering channels to Generative AI, and especially WHY now.

We will cover WHY, HOW, and WHAT in another section dedicated to the art of questioning.

The simple answer, if you didn’t have a chance to read them, or if you just want me to summarize the key points: quite serious societal changes are coming in the way we create and consume content, but also in how we work, learn, and interact.

So, it is important to invest now in mastering Generative AI (for your own benefit), rather than going the mainstream, easy (wrong) way.

And I also set a course in a necessary (maybe not sufficient) direction: understanding (and improving) the way WE understand (and learn about) the world.

And to learn it and do it recursively through Generative AI, your new best friend (with some limits you will learn to compensate for).

After we thoroughly checked Generative AI’s teeth (strengths and weaknesses) and understood WHY it has certain limits (LLMs use associative, not rule-based reasoning), it is only fair to perform the same health check on ourselves:

Mirror, mirror on the wall …

Which is neither easy, because we are not trained to do so, nor efficient, because we don’t have a measurement scale to compare against.

Thus, we can only conclude: 3.6 roentgen — not great, not terrible ☹.

Generalization — a blessing AND a curse

It depends on why (and how) you use it!

I emphasized AND, hoping you notice the obligatory part of it. It is different from OR, or from maybe, because it includes both sides as mandatory.

So, no one can take Generalization just as a blessing (or a curse): it is both!

In order not to do any unnecessary psycho-introspection, and possibly ruin our much-needed self-confidence, let’s use some science-based reasoning and then ruin it by generalization. 😊

In our strategy of understanding (Generative AI/our world/the problem we are solving), we already agreed to use models for simplification, while still identifying the tigers (the most important features), so the model can be useful.

For our specific, not general 😊 purpose!

And generalization is our best friend, if we use it properly.

From the inversion principle, we are all aware that if someone says men are stronger than women, it does not imply that all men are stronger, nor that one specific man (you) is stronger than some randomly chosen woman.

It is just generalization — recognizing patterns or principles in a sample and applying those insights to the larger group.

This is the borderline where the most woke purists will react: you cannot use generalizations, nor stereotypes, to simplify something as complex as a societal environment and its relationships.

Hello!?

We are on a mission of simplifying complex phenomena, NEVER just one individual, so generalizations, and stereotypes in particular, are our friends/tools.

Take stereotyping as an example.

Stereotypes are not true for every individual, but they form because we have noticed that the MAJORITY of people are/behave that way. The observation itself is legitimate, not offensive.

What is offensive is projecting a stereotype onto a particular individual. And that is wrong to do, as well.

Having explained that, let’s understand how people (in general, as a species) think, so that maybe it is applicable to you.

Please do not be offended whether it does or does not apply; that is not within the scope of this article.

Human flaws in reasoning

Off the top of my head, I could not think of anyone more important (and simpler to understand) than two last-century scholars, Daniel Kahneman and Amos Tversky.

Their colossal work in that field left us with an enormous opus on human errors in reasoning, plus the System 1 and System 2 model of thinking, which explains not only how we think and make decisions, but why and how we make all these mistakes.

I will give you a very brief (condensed) history of their opus, and direct you to the Knowing More section in case you want to really understand how fragile our (rational) thinking is.

And to better understand our current best dual-mode model of reasoning: the System 1 and System 2 model.

Their work, which started in the mid-seventies, is centered on the idea that humans do not always make rational decisions, even when presented with clear information.

Instead, we often rely on cognitive shortcuts, called heuristics, which can lead to systematic errors, or biases.

Heuristics are mental shortcuts that allow people to make quick decisions without having to process every piece of information.

While heuristics can be efficient and often useful in everyday decision-making, they are not foolproof and can lead to flawed reasoning.

Availability, Representativeness, and Anchoring

Kahneman and Tversky’s seminal 1974 paper “Judgment under Uncertainty: Heuristics and Biases” laid the foundation for understanding how people think in situations of uncertainty.

It introduced the three most important heuristics that individuals use to make judgments and decisions: availability, representativeness, and anchoring.

Today researchers have identified over 180 cognitive biases and heuristics.

These cognitive shortcuts reflect the ways in which people simplify complex judgments and can often lead to systematic errors. While the exact number can vary depending on how biases are classified and grouped, the most well-known biases are part of a comprehensive framework used in psychology, behavioral economics, and related fields. Major categories with examples are:

  1. Decision-making biases (e.g., loss aversion, overconfidence bias)
  2. Memory-related biases (e.g., availability heuristic, hindsight bias)
  3. Social biases (e.g., fundamental attribution error, in-group bias)
  4. Belief and judgment biases (e.g., confirmation bias, anchoring bias)
  5. Emotional and motivational biases (e.g., optimism bias, status quo bias)

Loss Aversion

Their second very influential contribution was introduced in the 1979 paper “Prospect Theory: An Analysis of Decision under Risk.”

This theory challenged the traditional economic model of expected utility, which assumes that individuals make decisions by maximizing expected utility based on the likelihood and value of outcomes.

BTW, LLMs predict the next word solely based on likelihood of outcomes, with no value attached to them, so they do not fall for the loss aversion bias.
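To make that concrete, here is a minimal sketch of next-token sampling (a toy vocabulary and made-up probabilities, not a real model): the choice is driven by likelihood alone, and no gain or loss is attached to any outcome.

```python
import random

# Made-up probabilities a model might assign to the next token.
next_token_probs = {
    "dog": 0.55,
    "cat": 0.30,
    "car": 0.10,
    "idea": 0.05,
}

tokens = list(next_token_probs)
weights = list(next_token_probs.values())

# The token is sampled in proportion to its likelihood alone;
# no notion of gain, loss, or regret enters the choice.
print(random.choices(tokens, weights=weights, k=1)[0])
```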

Prospect Theory posits that people evaluate potential gains and losses differently and that losses loom larger than gains — a concept known as loss aversion.

For example, the emotional impact of losing $100 is typically stronger than the pleasure of gaining $100 (or more!).

This phenomenon explains why individuals are often risk-averse when it comes to potential gains but may take irrational risks to avoid losses.

Framing effect

Kahneman and Tversky’s work also demonstrated the importance of how information is presented, a concept known as the framing effect.

According to their research, individuals’ choices can vary significantly depending on how a situation is framed, even if the underlying information is the same.

For instance, people tend to be more risk-averse when options are framed as potential gains (e.g., “You have a 90% chance of surviving the surgery”) but more willing to take risks when options are framed as losses (e.g., “There is a 10% chance of dying from the surgery”).

This demonstrates how subtle differences in wording or presentation can lead to dramatically different decisions.
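A tiny sketch makes that equivalence explicit; the numbers below simply restate the surgery example as a probability distribution.

```python
# Both frames from the surgery example encode the exact same
# distribution over outcomes; only the wording differs.
gain_frame = {"survive": 0.90, "die": 0.10}  # "90% chance of surviving"
loss_frame = {"die": 0.10, "survive": 0.90}  # "10% chance of dying"

assert gain_frame == loss_frame  # identical information, different reactions
```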

Framing effects have profound implications for communication, marketing, politics, and public policy, as the way choices are presented can influence public behavior.

Two modes of thinking: System 1 and System 2

Kahneman’s later work, particularly in his 2011 popular science book ‘Thinking, Fast and Slow’, expands on the idea of two modes of thinking: System 1 and System 2.

While this framework is more closely associated with Kahneman’s solo work, it is built on the above-mentioned foundational research conducted with Tversky, who passed away in 1996.

· System 1 thinking is fast, intuitive, and automatic. It relies on heuristics and is often influenced by biases. This mode of thinking is efficient for everyday tasks but prone to errors in complex or unfamiliar situations.

· System 2 thinking is slow, deliberate, and analytical. It requires conscious effort and is less susceptible to biases, though it can be mentally taxing and inefficient for routine decisions.

The distinction between these two systems helps explain why people often make errors in judgment and decision-making.

While System 1 is necessary for quick decisions in daily life, its reliance on heuristics makes it vulnerable to biases. System 2, while more accurate, is not always engaged when it should be due to cognitive laziness or time constraints.

In everyday life, System 1 and System 2 are not mutually exclusive; they work together to complement each other’s strengths and compensate for each other’s weaknesses. Most decisions we make involve an interplay between the two systems, where System 1 handles routine or low-risk decisions, while System 2 steps in for more complex or uncertain tasks.

System 1’s quick, intuitive responses are invaluable for handling routine tasks and time-sensitive decisions, but its reliance on heuristics leaves it prone to biases and errors.

System 2, though slower and more effortful, provides the critical thinking and logic needed to solve complex problems, resist impulsive behaviors, and make rational decisions.

By becoming more aware of when and how these two systems operate, individuals can better manage their cognitive resources, minimize biases, and make more informed decisions.

This awareness is particularly crucial in high-stakes situations, where the automaticity of System 1 can lead to suboptimal choices, and the deliberative power of System 2 is needed to achieve the best outcomes.

In a nutshell, most cognitive biases are the result of System 1 dominance!

Examples:

  • Confirmation Bias: The tendency to seek out information that confirms preexisting beliefs is a hallmark of System 1’s quick, associative thinking. System 1 processes familiar information favorably and may ignore or dismiss disconfirming evidence, unless System 2 intervenes.
  • Overconfidence Bias: System 1 often gives individuals a sense of certainty in their judgments, leading to overconfidence. Without System 2’s critical evaluation, people may overestimate the accuracy of their intuitions or predictions.
  • Framing Effect: The framing effect occurs when the way information is presented influences decisions. System 1 quickly reacts to the emotional or contextual framing of a situation, whereas System 2 would assess the underlying content more objectively.
    As we learned from the loss aversion bias, people are more likely to take risks when outcomes are framed as avoiding losses rather than achieving gains, even if the outcomes are identical.

So, please make a deliberate effort and start using System 2. It is hard, but if the stakes are high, you have to use it!

Knowing More

Judgment under Uncertainty: Heuristics and Biases

In everyday life, individuals heavily rely on heuristics — mental shortcuts or rules of thumb — to simplify complex decisions. While heuristics can be efficient and helpful, they often lead to systematic biases or errors in judgment. The best known and most important ones are:

· Availability Heuristic
The availability heuristic describes how people judge the probability of events based on how easily examples come to mind. If a particular event or outcome is more readily recalled — due to media coverage, personal experience, or recency — people assume it is more likely to occur. For instance, after seeing news reports of plane crashes, individuals might overestimate the dangers of flying compared to driving, which is statistically riskier.
The availability heuristic often leads to overestimation of rare but memorable events and underestimation of common but less dramatic ones.

· Representativeness Heuristic
The representativeness heuristic occurs when individuals judge the likelihood of something based on how closely it matches their existing stereotypes or experiences. For example, if someone meets a quiet and detail-oriented individual, they might assume that person is a librarian rather than a salesperson, based on a stereotypical view of what librarians are like. However, they might ignore base rate information — that there are many more salespeople than librarians.
This can lead to the base rate fallacy, where people neglect statistical realities in favor of surface similarities; the small Bayes calculation after this list makes the point concrete.

· Anchoring and Adjustment Heuristic
The anchoring and adjustment heuristic occurs when individuals rely too heavily on an initial piece of information (the “anchor”) when making decisions.
Even when the anchor is arbitrary or irrelevant, it can significantly influence the final judgment.
For instance, when negotiating prices, the initial figure mentioned often sets the reference point, and subsequent discussions adjust from that number rather than from an objective assessment of value.
Anchoring demonstrates how early exposure to information can distort rational decision-making, particularly when individuals fail to sufficiently adjust from the anchor.
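To see how badly neglecting base rates can mislead, here is a small Bayes calculation for the librarian example above. All the numbers are assumptions chosen for illustration, not real occupational statistics.

```python
# Assumed numbers (hypothetical): librarians are rare, salespeople common.
p_librarian = 0.01          # base rate: 1 person in 100 is a librarian
p_salesperson = 0.10        # base rate: 1 person in 10 is a salesperson
p_match_given_lib = 0.90    # P(quiet, detail-oriented | librarian)
p_match_given_sales = 0.15  # P(quiet, detail-oriented | salesperson)

# Bayes' theorem, restricted to these two professions:
evidence = (p_match_given_lib * p_librarian
            + p_match_given_sales * p_salesperson)
p_lib_given_match = p_match_given_lib * p_librarian / evidence

print(f"P(librarian | profile) = {p_lib_given_match:.2f}")  # 0.38
```

Even though the description fits a librarian far better, the sheer number of salespeople makes “salesperson” the better bet, and that is exactly the statistical reality the representativeness heuristic ignores.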

Prospect Theory: Loss Aversion, A New Understanding of Risk

Loss aversion is the tendency to prefer avoiding losses over achieving equivalent gains.

Broadly speaking, people feel the pain of losses much more acutely than they feel the pleasure of gains of the same size.

This means that people are more motivated to avoid losses than to achieve equivalent gains.

In “Thinking, Fast and Slow”, Kahneman illustrates that the loss aversion ratio tends to be around 1.5 to 2.5: losses weigh roughly 1.5 to 2.5 times more than equal-sized gains. This means that, for many people, the pain of losing $100 is as impactful as the pleasure of gaining $150 to $250.
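As a minimal sketch, assuming a simple piecewise-linear value function with a loss-aversion coefficient of 2 (the middle of the range above; the actual Prospect Theory curve is also concave for gains and convex for losses, which is ignored here), this is why a fair coin flip feels like a bad deal:

```python
LAMBDA = 2.0  # assumed loss-aversion coefficient: losses weigh twice as much

def subjective_value(outcome_dollars: float) -> float:
    """Felt value of a gain (+) or loss (-) relative to the reference point."""
    if outcome_dollars >= 0:
        return outcome_dollars
    return LAMBDA * outcome_dollars  # losses are amplified

# A fair coin flip: win $100 or lose $100.
# The expected dollar value is 0, but the felt value is negative,
# which is why most people decline the bet.
felt = 0.5 * subjective_value(+100) + 0.5 * subjective_value(-100)
print(felt)  # -50.0
```

Under plain expected (dollar) value the flip is worth exactly zero; the negative felt value is the gap between the expected-utility model and Prospect Theory.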

The theory also introduces the idea of reference points, suggesting that people evaluate outcomes relative to a particular reference point, rather than in absolute terms.

This helps explain why people might feel disappointed by a salary raise if it falls below their expectations, even though they are objectively better off.

System 1 and System 2: Introduction to Dual-Process Theory

The dual-process theory, most famously articulated by Daniel Kahneman in Thinking, Fast and Slow (2011), divides human thinking into two distinct systems: System 1 and System 2. These two systems represent different modes of cognitive processing that we engage in when faced with decisions, problems, and reasoning tasks.

· System 1 is fast, automatic, intuitive, and often subconscious. It operates effortlessly and quickly, responding to stimuli based on learned patterns, instincts, and heuristics.

· System 2 is slow, deliberate, analytical, and conscious. It requires effort and attention and is activated when more complex reasoning or problem-solving is needed.

These systems work in tandem to guide our thinking and behavior, but their interplay often leads to cognitive biases, errors, or flawed decision-making. Understanding how these systems function and interact can help individuals identify when they are likely to fall into cognitive traps and how to engage in more rational, deliberate thought when necessary.

System 1: Fast, Intuitive, and Automatic Thinking

System 1 thinking is our brain’s default mode in familiar, low-stakes situations where quick responses are necessary. It draws on experience, heuristics (cognitive shortcuts), and learned associations to make fast judgments and decisions.
This mode is evolutionarily ancient, shaped to help humans navigate environments where immediate responses were critical to survival.

Characteristics of System 1

· Speed and Efficiency: System 1 operates rapidly, often in milliseconds, allowing us to make decisions without conscious deliberation.
· Automaticity: Responses from System 1 are automatic and often feel instinctive, bypassing conscious thought.
· Pattern Recognition: System 1 relies on recognizing patterns, drawing on prior experience to guide decisions. For example, driving on familiar routes or recognizing emotions on people’s faces are largely System 1 tasks.
· Heuristics: This system is heavily reliant on mental shortcuts, such as the availability heuristic or representativeness heuristic, which simplify decision-making but can lead to systematic biases.
· Emotional Influence: System 1 thinking is intertwined with emotions, making decisions based on gut reactions rather than careful analysis.

Advantages of System 1

· Speed: The primary advantage of System 1 is its speed, which allows individuals to act quickly in situations that demand immediate responses. For example, when crossing a street, System 1 assesses oncoming traffic and makes a split-second decision.
· Cognitive Efficiency: System 1 conserves cognitive energy by handling routine tasks effortlessly. It enables individuals to perform everyday actions — like walking, talking, and recognizing familiar objects — without taxing the brain.

Risks and Drawbacks of System 1

· Prone to Bias: The reliance on heuristics makes System 1 susceptible to cognitive biases. Because it simplifies decision-making, System 1 often overlooks critical details or context, leading to errors in judgment.
· Inaccuracy in Complex Situations: While System 1 is useful in familiar, routine scenarios, it often fails in complex or novel situations where careful analysis is required. For example, relying on intuition in financial decisions or medical diagnoses can lead to flawed outcomes.
· Overconfidence: System 1 decisions often feel certain or obvious, which can give individuals a false sense of confidence in their judgments. This is why people may feel sure about an incorrect answer or decision even without fully understanding the problem.

System 2: Slow, Analytical, and Deliberate Thinking

System 2 is the conscious, reflective, and analytical counterpart to System 1. It is activated when we need to think critically, solve complex problems, or evaluate multiple options. System 2 is responsible for reasoning, logic, and weighing evidence, making it indispensable for tasks that require careful thought and reflection.

Characteristics of System 2

· Deliberateness: System 2 operates slowly and methodically. It engages when individuals consciously analyze data, evaluate options, or consider abstract concepts.
· Effort and Focus: Engaging System 2 requires mental effort and sustained attention. Tasks like solving a math problem, planning a project, or critically assessing an argument demand System 2 processing.
· Logical Reasoning: System 2 is responsible for formal reasoning and logic. It follows systematic steps to reach conclusions, rather than relying on automatic responses or gut feelings.
· Capacity for Reflection: System 2 enables individuals to question their own judgments and decisions, offering the ability to reflect on and revise their initial instincts.

Advantages of System 2

· Accuracy: System 2 provides the accuracy and depth needed for complex problem-solving. By carefully weighing pros and cons, assessing evidence, and making reasoned judgments, it mitigates the risk of cognitive biases.
· Self-Control: System 2 is crucial for regulating impulses and emotions. It overrides instinctual reactions from System 1, allowing individuals to resist temptation, delay gratification, or make long-term plans.
· Adaptability in Novel Situations: In unfamiliar or ambiguous situations, System 2 steps in to analyze new information and develop appropriate responses, where System 1 might fail due to a lack of prior experience or patterns.

Risks and Drawbacks of System 2

· Cognitive Strain: System 2 processing is slow and cognitively taxing. It requires effort, and individuals often experience mental fatigue when engaging in prolonged analytical thinking.
· Laziness and Inertia: Because System 2 is effortful, individuals tend to avoid engaging it unless absolutely necessary. This leads to cognitive miserliness, where people default to System 1 thinking, even when the situation calls for careful analysis.
· Overloading: System 2 has a limited capacity for attention and working memory. When faced with too much information or too many complex tasks at once, System 2 becomes overwhelmed, leading to errors or suboptimal decisions.

In Search for Knowledge publication
Mastering Insightful Dialogue with Gen AI

<PREV Marshall’s Plan
NEXT> N/A YET


Vjeran Buselic
In Search for Knowledge

30 years in IT, 10+ in Education teaching life changing courses. Delighted by GenAI abilities in personalized learning. Enjoying and sharing the experience.